David Griesing | Work Life Reward Author | Philadelphia


Companies That Wear Their Values on Their Sleeves

March 31, 2019 By David Griesing

We lead with our values because it’s easier to connect with somebody who shares our priorities. For companies that want our loyalty, though, it’s a trickier proposition, because we have rarely been so divided about what is important to us.

Deepening divisions over our common commitments make Apple’s roll-out of a suite of new services this week both riveting—and potentially fraught.

As the company’s newly announced services like AppleTV+ and AppleNews+ tie themselves more closely to Hollywood and the cover stories that sell glossy magazines, is Apple cloaking itself in values that could alienate or confuse many of the customers it aims to bond with more closely?

In 1997, on the eve of launching a new, national advertising campaign, Steve Jobs gave a short talk about the link between Apple’s values and its customers. “At our core, we believe in making the world a better place,” he said.  So our ads will “honor people who have changed the world for the better—[because] if they ever used a computer, it would be a Mac.”  In its marketing, Apple aligned itself with tech innovators like Thomas Edison, a genius who had already changed the world as Jobs and Apple were about to do.

A little more than 20 years later, with Apple having followed in the footsteps of Edison’s light-bulb company to industry dominance, the question is whether it has a better chance of preserving that dominance by once again aligning itself with technology innovators who have already changed the world, or with figures like Steven Spielberg and Oprah Winfrey, who aim to do so by relying on their own approaches to social betterment. To set the stage for your possible reactions, here is a link, beginning with Spielberg and ending with Winfrey, that covers the highlights of Apple’s new services launch this past week in a vivid, all-over-the-place 15 minutes.

I should confess that I have a bit of a horse in this race because I want Apple to keep winning by bringing me world-class products and customer care, but I’m not sure that it can by pursuing services (like entertainment, news and games through the new AppleArcade) that put the company in lockstep with industries that muddy its focus and dilute its values proposition.

Instead of bringing me a global version of Oprah’s book club or more of Steven Spielberg’s progressive reminiscences, I was hoping to hear that Apple would be providing—even in stages over months or years—an elegantly conceived and designed piece of technology that would (1) allow me to cut the cable cord with my internet provider while (2) upgrading me to an interface where I could see and pay for whatever visual programming I choose whenever I want it. An ApplePay-enabled entertainment router. Now that would be the kind of tech innovation that would change my world for the better again (and maybe yours too) while staying true to its founder’s messaging from twenty-odd years ago.

Tech commentator Christopher Mims talked this week about why Apple was claiming values (like those embedded in Oprah’s notion of social responsibility) to maintain its market share, but he never really debated whether it should do so. I’d argue that Apple should continue to make its own case for the social benefits of its tech solutions instead of mixing its message about priorities with the aspirations of our celebrity culture.

When it comes to Silicon Valley’s mostly hypocritical claims about social responsibility, I start with the skepticism of observers like Anand Giridharadas. To him, Facebook, Google, Amazon and Apple all use values as tools to gain the profits they’re after, covering their self-serving agendas with feel-good marketing.

In a post last October, I discussed some of his observations about Facebook (and by implication, most of the others) in the context of his recent book, Winners Take All.

The problem, said Giridharadas, is that while these companies are always taking credit for the efficiencies and other benefits they have brought, they take no responsibility whatsoever for the harms… In their exercise of corporate social responsibility, there is a mismatch between the solutions that the tech entrepreneurs can and want to bring and the problems we have that need to be solved. “Tending to the public welfare is not an efficiency problem. The work of governing a society is tending to everybody. It’s figuring out universal rules and norms and programs that express the value of the whole and take care of the common welfare.” By contrast, the tech industry sees the world more narrowly. For example, the fake news controversy led Facebook not to a comprehensive solution for providing reliable information but to what Giridharadas calls “the Trying-to-Solve-the-Problem-with-the-Tools-that-Caused-It” quandary.

In the face of judgments like his, I’d argue that Apple should be vetting its corporate messaging with the inside guidance of those who understand the power of values before it squanders the high ground it still holds. 
 
Beyond “sticking with the tech innovations that it’s good at” and the Edison-type analogies that add to their luster, what follows are three proposals for how the company might build on its values-based loyalty while still looking us in the eye when it does so.
 
Each one has Apple talking about what its tech-appreciating customers still care about most when they think—with healthy nostalgia—about all the things that Apple has done for them already.

The fate that the company is aiming to avoid

1. Apple should keep reminding us about its unparalleled customer service

The unbelievable service I have come to expect from Apple feeds my brand loyalty. I feel that we share the value of trustworthiness. When someone relies on me for something, I stand behind it instead of telling them I don’t have the time or it’s too expensive to fix. For me, Apple has consistently done the same.
 
So I was surprised when I had to argue a bit harder than I thought was necessary for Apple’s battery fix for an older iPhone, and I started following other customer complaints against the company to see if a change of priorities was in the air. I’m writing to you on a MacBook Air, and problems with the Air’s later-generation keyboards have apparently escalated to the point that national class-action litigation is in the offing. Not unlike the iPhone battery fix, Apple has gone on record as being willing to replace any sticky keyboard for free within four years of purchase, but is it really as easy as it sounds? As recently as last week, a tech reviewer in a national newspaper, after profiling a litany of customer difficulties, made this plea to Apple:

For any Apple engineers and executives reading: This is the experience you’re providing customers who shell out $1200 or more—sometimes a lot more. This is the experience after THREE attempts at this keyboard design.

When you are one of the richest companies in history, relatively inexpensive problems like this need to be solved before they get this far. A reputation for world-class customer service is a terrible thing to waste. Be glad to fix your technology on those rare occasions when it breaks down, and solve the technology problem with these keyboards before you sell or replace any more of them. Don’t make customers who were loyal enough to pay a premium for an Apple device take you to court because they can’t get enough of your attention any other way. Caring for your customers is a core value that needs polishing before its shine begins to fade and your customer loyalty slips away.

2. Apple should keep telling us how much it’s different from Google, Facebook and Amazon

The uses that the dominant tech platforms are making of our personal data are on everyone’s mind.
 
Beyond invasions of privacy, deep concerns are also being voiced about the impact of “surveillance capitalism” on Western democracy, not only because of meddling with our elections but, even more fundamentally, because of how this new economic model disrupts “the organic reciprocities involving employment and consumption” that undergird democratic market systems. These are profound and increasingly widespread concerns, and Apple for one seems to share them.
 
This is from another post last October called “Looking Out For the Human Side of Technology”: 

I was also struck this week by Apple CEO Tim Cook’s explosive testimony at a privacy conference organized by the European Union…:
 
‘Our own information—from the everyday to the deeply personal—is being weaponized against us with military efficiency. Today, that trade has exploded into a data-industrial complex.
 
These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded, and sold. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them. This should make us very uncomfortable.
 
Technology is and must always be rooted in the faith people have in it. We also recognize not everyone sees it that way—in a way, the desire to put profits over privacy is nothing new.’
 
‘Weaponized’ technology delivered with ‘military efficiency.’ ‘A data-industrial complex.’ One of the benefits of competition is that rivals call you out, while directing unwanted attention away from themselves…so Cook’s (and Apple’s) motives here have more than a dollop of competitive self-interest where [companies like] Google and Facebook are concerned. On the other hand, Apple is properly credited with limiting the data it makes available to third parties and rendering the data it does provide anonymous. There is a bit more to the story, however.
 
If data privacy were as paramount to Apple as it sounded this week, it would be impossible to reconcile Apple’s receiving more than $5 billion a year from Google to make it the default search engine on all Apple devices.

In its star-studded launch of TV, News and Arcade services this week, Apple’s presenters repeatedly emphasized that none of these services would be “ad-selling models” targeting Apple users. These are good reminders of Apple’s values, but while $5 billion is a lot of revenue to forsake if new purchasers of Apple devices got to pick their own search engines, it’s also a significant amount of support for an antithetical business model.

Not selling my data to others for use against me, like Apple’s standing behind the functionality of my devices, is a core contributor to the company’s good reputation in my mind, and never more so than today. If Apple continues to differentiate itself from its competitors on the use of our data—and I think that it should—it needs to find ways to be more forthright about its own conflicts of interest while doing so.

When you stand on your values and back them up with your actions, the competitors you are challenging will always seek to undermine you when your actions are inconsistent with your words. Why let that inconsistency be tomorrow’s headline, Tim? Why not be ready to talk more forthrightly about the quandary with Google, and how the company is trying to address it, when asked to comment before a “gotcha” story like this gets published?

3. Apple should get ahead of its new services launch by proactively addressing likely problems with real consequences for the rest of us

In its roll-out presentation, Apple announced plans for a new service that will link players to games that Apple will be creating. Few tech areas have seen more startling impacts from behavioral data than on-line gaming, where the companies behind the games gather that data from players as they play. I recently talked here about how the programmers and monitors behind Fortnite and the updated Tetris games are using data about how players react to as many as 200 “persuasive design elements” in these games to enhance the intensity of the player experience while making it more addictive to boys and other susceptible populations.

Apple’s engineers know about these issues already. Its programmers are making its new games ready for primetime as soon as next fall. To differentiate itself from others in the on-line gaming industry, to strike a more principled note than its competitors have, and to broaden the scope of Apple’s values when it comes to personal data, the company could tell us some or all of the following in the coming months:

- whether it will be using behavioral data it generates from players through real time play to make its games more absorbing or addictive;

- whether it intends to restrict certain classes of users (like pre-teen boys) from playing certain games or restrict the hours that they can play them;

- what other safeguards it will be implementing to limit the amount of “player attention” that these games will be capturing;

- whether it will be selling game-related merchandise in the Apple store so its financial incentives to encourage extensive game-playing are clear from the outset; and

- whether it will be using data about player behavior to encourage improved learning, collaborative problem-solving, community building and other so-called “pro-social” skills in any of the games it will be offering.

I have no reason to doubt that Apple is serious about protecting the user data that its devices and services generate. Its new venture into gaming provides an opportunity to build on Apple’s reputation for safeguarding the use of its customers’ information. Tim Cook and Apple need to be talking to the rest of us, both now and next fall, about how the company will be applying its data-related values to problems its customers care about today in the brave new world of on-line gaming.

+ + +

Apple’s stated values will hold its current customers and attract new ones when there is “a match” between the company’s solutions and the problems the rest of us have that need solving. Affiliation and loyalty grow when there are shared priorities in communities, in politics and in the marketplace.

That means Apple should keep talking about what its tech-appreciating customers care about most in the light of the wonders that Apple has given us already. Despite its recently announced forays into entertainment, the company should never take its eye too far from what it does best—which is to make world-changing devices—even when they take more time to develop than short-term financial performance seems to demand. 

When a company knows what it is and acts accordingly, it can always take risks for the rewards that can come from wearing its values on its sleeves. 

This post was adapted from my March 31, 2019 newsletter. You can subscribe and receive it in your inbox every Sunday morning.


These Tech Platforms Threaten Our Freedom

December 9, 2018 By David Griesing

We’re being led by the nose about what to think, buy, do next, or remember about what we’ve already seen or done.  Oh, and how we’re supposed to be happy, what we like and don’t like, what’s wrong with our generation, why we work. We’re being led to conclusions about a thousand different things and don’t even know it.

The image that captures the erosion of our free thinking by influence peddlers is the frog in the saucepan. The heat is on, the water’s getting warmer, and by the time it’s boiling it’s too late for her to climb back out. Boiled frog, preceded by pleasantly warm and oblivious frog, captures the critical path pretty well. But instead of slow cooking, it’s shorter and shorter attention spans, the slow retreat of perspective and critical thought, and the final loss of freedom.

We’ve been letting the control booths behind the technology reduce the free exercise of our lives and work and we’re barely aware of it. The problem, of course, is that the grounding for good work and a good life is having the autonomy to decide what is good for us.

This kind of tech-enabled domination is hardly a new concern, but we’re wrong in thinking that it remains in the realm of science fiction.

An authority’s struggle to control our feelings, thoughts and decisions was the theme of George Orwell’s 1984, which was written some 35 years before the fateful year that he envisioned. “Power,” said Orwell, “is in tearing human minds to pieces and putting them together again in new shapes of your own choosing.” Power persuades you to buy something when you don’t want or need it. It convinces you about this candidate’s, that party’s or some country’s evil motivations. It tricks you into accepting someone else’s motivations as your own. In 1984, free wills were weakened and constrained until they were no longer free. “If you want a picture of the future,” Orwell wrote, “imagine a boot stamping on a human face—for ever.”

Maybe this reflection of the present seems too extreme to you.

After all, Orwell’s jackbooted fascists and communists were defeated by our Enlightenment values. Didn’t the first President Bush, whom we buried this week, preside over some of it? The authoritarians were down and seemed out in the last decade of the last century—Freedom Finally Won!—which just happened to be the very same span of years when new technologies and communication platforms began to enable the next generation of dominators.

(There is no true victory over one man’s will to deprive another of his freedom, only a truce until the next assault begins.)

Twenty years later, in his book Who Owns the Future? (2013), Jaron Lanier argued that a new battle for freedom must be fought against powerful corporations fueled by advertisers and other “influencers” who are obsessed with directing our thoughts today.

In exchange for “free” information from Google, “free” networking from Facebook, and “free” deliveries from Amazon, we open our minds to what Lanier calls “siren servers,” the cloud computing networks that drive much of the internet’s traffic. Machine-driven algorithms collect data about who we are to convince us to buy products, judge candidates for public office, or determine how the majority in a country like Myanmar should deal with a minority like the Rohingya.

Companies, governments and groups with good and bad motivations use our data to influence our future buying and other decisions on technology platforms that didn’t even exist when the first George Bush was president but now seem indispensable to nearly all of our commerce and communication. Says Lanier:

When you are wearing sensors on your body all the time, such as the GPS and camera on your smartphone, and constantly piping data to a megacomputer owned by a corporation that is paid by ‘advertisers’ to subtly manipulate you…you are gradually becoming less free.

And all the while we were blissfully unaware that this was happening because the bath was so convenient and the water inside it seemed so warm. Franklin Foer, who addresses tech issues in The Atlantic and wrote 2017’s World Without Mind: The Existential Threat of Big Tech, talks about this calculated seduction in an interview he gave this week:

Facebook and Google [and Amazon] are constantly organizing things in ways in which we’re not really cognizant, and we’re not even taught to be cognizant, and most people aren’t… Our data is this cartography of the inside of our psyche. They know our weaknesses, and they know the things that give us pleasure and the things that cause us anxiety and anger. They use that information in order to keep us addicted. That makes [these] companies the enemies of independent thought.

The poor frog never understood that accepting all these “free” invitations to the saucepan meant that her freedom to climb back out was gradually being taken away from her.

Of course, we know that nothing is truly free of charge, with no strings attached. But appreciating the danger in these data-driven exchanges—and being alert to the persuasive tools that are being arrayed against us—are not the only wake-up calls that seem necessary today. We also can (and should) confront two other tendencies that undermine our autonomy while we’re bombarded with too much information from too many different directions. They are our confirmation bias and what’s been called our illusion of explanatory depth.

Confirmation bias leads us to stop gathering information when the evidence we’ve gathered so far confirms the views (or biases) that we would like to be true. In other words, we ignore or reject new information, maintaining an echo chamber of sorts around what we’d prefer to believe. This kind of mindset is the opposite of self-confidence, because all we’re truly interested in doing outside ourselves is searching for evidence to shore up our egos.

Of course, the thought controllers know about our propensity for confirmation bias and seek to exploit it, particularly when we’re overwhelmed by too many opposing facts, have too little time to process the information, and long for simple black and white truths. Manipulators and other influencers have also learned from social science that our reduced attention spans are easily tricked by the illusion of explanatory depth, or our belief that we understand things far better than we actually do.

The illusion that we know more than we actually do extends to anything that we can misunderstand. It comes about because we consume knowledge widely but not deeply, and since that is rarely enough for understanding, our egos claim a mastery we don’t have. We all know that ignorant people are the most over-confident in their knowledge, but how easily we delude ourselves about the majesty of our own ignorance. I regularly ask people questions about all sorts of things that they might know about. It’s almost the end of the year as I write this, and I can count on one hand the number of them who have responded to my questions by saying “I don’t know” over the past twelve months. Most have no idea how little understanding they bring to whatever they’re talking about. It’s simply more comforting to pretend that we have all of this confusing information fully processed and under control.

Luckily, for confirmation bias or the illusion of explanatory depth, the cure is as simple as finding a skeptic and putting him on the other side of the conversation so he will hear us out and respond to or challenge whatever it is that we’re saying. When our egos are strong enough for that kind of exchange, we have an opportunity to explain our understanding of the subject at hand. If, as often happens, the effort of explaining reveals how little we actually know, we are almost forced to become more modest about our knowledge and less confirming of the biases that have taken hold of us.  A true conversation like this can migrate from a polarizing battle of certainties into an opportunity to discover what we might learn from one another.

The more that we admit to ourselves and to others what we don’t know, the more likely we are to want to fill in the blanks. Instead of false certainties and bravado, curiosity takes over—and it feels liberating precisely because becoming well-rounded in our understanding is a well-spring of autonomy.

When we open ourselves like this instead of remaining closed, we’re less receptive to, and far better able to resist, the “siren servers” that would manipulate our thoughts and emotions by playing to our biases and illusions. When we engage in conversation, we also realize that devices like our cell phones and platforms like our social networks are actually, in Foer’s words, “enemies of contemplation” that are “preventing us from thinking.”

Lanier describes the shift from this shallow, tech-driven stimulus/response to a deeper assertion of personal freedom in a profile that was written about him in the New Yorker a few years back. Before he started speaking at a South by Southwest Interactive conference, Lanier asked his audience not to blog, text or tweet while he spoke. He later wrote that his message to the crowd had been:

If you listen first, and write later, then whatever you write will have had time to filter through your brain, and you’ll be in what you say. This is what makes you exist. If you are only a reflector of information, are you really there?

Lanier makes two essential points about autonomy in this remark. Instead of processing on the fly, where the dangers of bias and illusions of understanding are rampant, allow what is happening “to filter through your brain,” because when it does, there is a far better chance that whoever you really are, whatever you truly understand, will be “in” what you ultimately have to say.

His other point is about what you risk becoming if you fail to claim a space for your freedom to assert itself in your life and work. When you’re reduced to “a reflector of information,” are you there at all anymore, or merely reflecting the reality that somebody else wants you to have?

We all have a better chance of being contented and sustained in our lives and work when we’re expressing our freedom, but it’s gotten a lot more difficult to exercise it given the dominant platforms that we’re relying upon for our information and communications today.

This post was adapted from my December 9, 2018 newsletter.

