David Griesing | Work Life Reward Author | Philadelphia


Who’s Winning Our Tugs-of-War Over On-Line Privacy & Autonomy?

February 1, 2021 By David Griesing

We know that our on-line privacy and autonomy (or freedom from outside control) are threatened in two particularly alarming ways today. There are the undisclosed privacy invasions that flow from our on-line activities, and there is the loss of opportunities to speak our minds without censorship.

These alarm bells ring because of the dominance of on-line social media platforms like Facebook, YouTube and Twitter and text-based exchanges like WhatsApp and the other instant messaging services—most of which barely existed a decade ago. With unprecedented speed, they’ve become the town squares of modern life where we meet, talk, shop, learn, voice opinions and engage politically. But as ubiquitous and essential as they’ve become, their costs to vital zones of personal privacy and autonomy have caused a significant backlash, and this past week we got an important preview of where this backlash is likely to take us.

Privacy advocates worry about the harmful consequences when personal data is extracted from users of these platforms and services. They say our own data is being used “against us” to influence what we buy (the targeted ads that we see and don’t see), manipulate our politics (increasing our emotional engagement by showing us increasingly polarizing content), and exert control over our social behavior (by enabling data-gathering agencies like the police, FBI or NSA). Privacy advocates are also offended that third parties are monetizing personal data “that belongs to us” in ways that we never agreed to, amounting to a kind of theft of our personal property by unauthorized strangers.

For their part, censorship opponents decry content monitors who can bar particular statements, or even bar participation on dominant platforms altogether, for arbitrary and biased reasons. When they are deprived of the full use of our most powerful channels of mass communication, they argue, their right to peaceably assemble is eviscerated by what they experience as “a culture war” against them. 

Both groups say they have a privacy right to be left alone and act autonomously on-line: to make choices and decisions for themselves without undue influence from outsiders; to be free from ceaseless monitoring, profiling and surveillance; to be able to speak their minds without the threat of “silencing;” and, “to gather” for any lawful purpose without harassment. 

So how are these tugs-of-war over two of our most basic rights going?

This past week provided some important indications.

This week’s contest over on-line privacy pitted tech giant Apple against rivals with business models that depend upon selling their users’ data to advertisers and other third parties—most prominently, Facebook and Google.

Apple announced this week that it would immediately start offering its leading smartphone users additional privacy protections. One relates to its dominant App Store and developers like Facebook, Google and the thousands of other companies that sell their apps (or platform interfaces) to iPhone users.

Going forward—on what Apple chief Tim Cook calls “a privacy nutrition label”—every app that the company offers for installation on its phones will need to disclose its data collection and privacy practices before purchase, in ways that Apple will ensure “every user can understand and act on.” Instead of reading (and then ignoring) multiple pages of legalese, every new Twitter or YouTube user, for example, will for the first time be able through their iPhones to either opt in to or refuse an app’s data collection practices after reading plain language that describes the personal data that will be collected and what will be done with it. In a similar vein, iPhone users will gain a second advantage over apps that are already installed on their phones. With the new App Tracking Transparency feature, iPhone users will be able to control how each app gathers and shares their personal data. For every application on your iPhone, you can now choose whether a Facebook or a Google has access to your personal data or not.

While teeing up these new privacy initiatives at an industry conference this week, Apple chief Tim Cook was sharply critical of companies that take our personal data for profit, citing several of the real world consequences when they do so. I quote at length from his remarks last Thursday because I enjoyed hearing someone of Cook’s stature speaking to these issues so pointedly, and thought you might too:

A little more than two years ago…I spoke in Brussels about the emergence of a data-industrial complex… At that gathering we asked ourselves: “what kind of world do we want to live in?” Two years later, we should now take a hard look at how we’ve answered that question. 

The fact is that an interconnected ecosystem of companies and data brokers, of purveyors of fake news and peddlers of division, of trackers and hucksters just looking to make a quick buck, is more present in our lives than it has ever been. 

And it has never been so clear how it degrades our fundamental right to privacy first, and our social fabric by consequence.

As I’ve said before, ‘if we accept as normal and unavoidable that everything in our lives can be aggregated and sold, then we lose so much more than data. We lose the freedom to be human.’….

Together, we must send a universal, humanistic response to those who claim a right to users’ private information about what should not and will not be tolerated….

At Apple…, [w]e have worked to not only deepen our own core privacy principles, but to create ripples of positive change across the industry as a whole. 

We’ve spoken out, time and again, for strong encryption without backdoors, recognizing that security is the foundation of privacy. 

We’ve set new industry standards for data minimization, user control and on-device processing for everything from location data to your contacts and photos. 

At the same time that we’ve led the way in features that keep you healthy and well, we’ve made sure that technologies like a blood-oxygen sensor and an ECG come with peace of mind that your health data stays yours.

And, last but not least, we are deploying powerful, new requirements to advance user privacy throughout the App Store ecosystem…. 

Technology does not need vast troves of personal data, stitched together across dozens of websites and apps, in order to succeed. Advertising existed and thrived for decades without it. And we’re here today because the path of least resistance is rarely the path of wisdom. 

If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform….

At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement — the longer the better — and all with the goal of collecting as much data as possible.

Too many are still asking the question, “how much can we get away with?,” when they need to be asking, “what are the consequences?” What are the consequences of prioritizing conspiracy theories and violent incitement simply because of their high rates of engagement? What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations? What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?….

[N]o one needs to trade away the rights of their users to deliver a great product. 

With its new “data nutrition labels” and “app tracking transparency,” many (if not most) of Apple’s iPhone users are likely to reject other companies’ data collection and sharing practices once they understand the magnitude of what’s being taken from them. Moreover, these votes for greater data privacy could deal a major financial blow to the companies extracting our data. Apple sold more smartphones globally than any other vendor in the last quarter of 2020; almost half of Americans use iPhones (45.3% of the market, according to one analyst); more people access social media and messaging platforms from their phones than from other devices; and the personal data pipelines these data-extracting companies rely upon could start constricting immediately.   
 
In this tug-of-war between competing business models, the outcry this week was particularly fierce from Facebook, which one analyst predicts could start to take “a 7% revenue hit” (that’s real cash at roughly $6 billion) as early as the second quarter of this year. (Facebook’s revenue take in 2020 was $86 billion, much of it from ad sales fueled by user data.) Mark Zuckerberg charged that Apple’s move tracks its competitive interests, saying that Apple “has every incentive to use their dominant platform position to interfere with how our apps and other apps work,” among other things, a dig at ongoing antitrust investigations involving Apple’s App Store. In a rare expression of solidarity with the little guy, Zuckerberg also argued that small businesses which access customers through Facebook would suffer disproportionately from Apple’s move because of their reliance on targeted advertising. 
 
There’s no question that Apple was flaunting its righteousness on data privacy this week and that Facebook’s “ouches” were the most audible reactions. But there is also no question that a business model fueled by the extraction of personal data has finally been challenged by another dominant market player. In coming weeks and months we’ll find out how interested Apple users are in protecting their privacy on their iPhones and whether their eagerness prompts other tech companies to offer similar safeguards. We’ll get signals from how advertising dollars are being spent as the “underlying profile data” becomes more limited and less reliable. We may also begin to see the gradual evolution of an on-line public space that’s somewhat more respectful of our personal privacy and autonomy.
 
What’s clearer today is that tech users concerned about the privacy of their data and freedom from data-driven manipulation on-line can now limit at least some of the flow of that information to unwelcome strangers in ways that they never had at their disposal before.

All of us should be worried about censorship of our views by content moderators at private companies (whether in journalism or social media) and by governmental authorities that wish to stifle dissenting opinions.  But many of the strongest voices behind regulating the tech giants’ penchant “to moderate content” today come from those who are convinced that press, media and social networking channels both limit access to and censor content from those who differ with “their liberal or progressive points of view.” Their opposition speaks not only to the extraordinary dominance of these tech giants in the public square today but also to the air of grievance that colors the political debates that we’ve been having there.
 
Particularly after President Trump’s removal from Facebook and Twitter earlier this month and the temporary shutdown of social media upstart Parler after Amazon cut off its cloud computing services, there has been a concerted drive to find new ways for individuals and groups to communicate with one another on-line in ways that cannot be censored or “de-platformed” altogether. Like the tug-of-war over personal data privacy, a new polarity over on-line censorship and the ways to get around it could fundamentally alter the character of our on-line public squares.
 
Instead of birthing a gaggle of new “Right-leaning” social media companies with managers who might still be tempted to interfere with irritating content, blockchain software technology is now being utilized to create what amount to “moderation-proof” communication networks.
 
To cover basic blockchain mechanics, here is how I described them in 2018.

A blockchain is a web-based chain of connections, most often with no central monitor, regulator or editor. Its software applications enable every node in its web of connections to record data which can then be seen and reviewed by every other connection. It maintains its accuracy through this transparency. Everyone with access can see what every other connection has recorded in what amounts to a digital ledger…

Blockchain-based software can be launched by individuals, organizations or even governments. Software access can be limited to a closed network of participants or open to everyone. A blockchain is usually established to overcome the need for and cost of a “middleman” (like a bank) or some other impediment (like currency regulations, tariffs or burdensome bureaucracy). It promotes “the freer flow” of legal as well as illegal goods, services and information. Blockchain is already driving both modernization and globalization. Over the next several years, it will also have profound impacts on us as individuals. 

If you’d benefit from a visual description, this short video from The MIT Technology Review will also show you the basics of this software innovation.  
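For readers who think in code, the mechanics described above can be sketched in a few lines of Python. This is a toy illustration only, not any real blockchain implementation; the block fields and the “transactions” are invented for the example. It shows the one property the passage emphasizes: because every block records the hash of its predecessor, anyone with access to the ledger can independently detect after-the-fact tampering.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents (everything except its own hash)
    # deterministically, so every node computes the same value.
    payload = json.dumps(
        {k: block[k] for k in ("index", "data", "prev_hash")},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()

def add_block(chain, data):
    # Each new entry points at the hash of the previous one,
    # which is what makes retroactive edits detectable.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    chain.append(block)
    return chain

def chain_is_valid(chain):
    # Any participant can re-run this check: recompute each hash
    # and confirm each block still points at its predecessor.
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, "Alice pays Bob 5")
add_block(ledger, "Bob pays Carol 2")
print(chain_is_valid(ledger))   # prints True

# Quietly rewriting an earlier entry breaks the chain of hashes.
ledger[0]["data"] = "Alice pays Bob 500"
print(chain_is_valid(ledger))   # prints False
```

Real systems add networking, consensus rules and economic incentives on top of this, but the transparency-through-shared-verification idea is the same.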
 
I’ve written several times before about the promise of blockchain-driven systems. For example, Your Work is About to Change Forever (about a bit-coin-type financial future without banks or traditional currencies); Innovation Driving Values (how secure and transparent recording of property rights like land deeds can drive economic progress in the developing world); Blockchain Goes to Work (how this software can enable gig economy workers to monetize their work time in a global marketplace); Data Privacy & Accuracy During the Coronavirus (how a widely accessible global ledger that records accurate virus-related information can reduce misinformation); and, with some interesting echoes today, a 2017 post called Wish Fulfillment (about why a small social media platform called Steem-It was built on blockchain software).    
 
Last Tuesday, the New York Times ran an article titled: They Found a Way to Limit Big Tech’s Power: Using the Design of Bitcoin. That “Design” in the title was blockchain software. The piece highlighted:

a growing movement by technologists, investors and everyday users to replace some of the internet’s basic building blocks in ways that would be harder for tech giants like Facebook or Google [or, indeed, anyone outside of these self-contained platforms] to control.

Among other things, the article described how those “old” internet building blocks would be replaced by blockchain-driven software, enabling social media platforms that would be the successors to the one that Steem-It built several years ago. However, while Steem-It wanted to provide a safe and reliable way to pay contributors for their social media content, in this instance the overriding drive is “to make it much harder for any government or company to ban accounts or delete content.” 

It’s both an intoxicating and a chilling possibility.

While the Times reporter hinted about the risks with ominous quotes and references to the creation of “a decentralized web of hate,” it’s worth noting that nothing like it has materialized, yet. Also implied but never discussed was the urgency that many feel to avoid censorship of their minority viewpoints by people like Twitter’s Jack Dorsey or even the New York Times editors who effectively decide what to report on and what to ignore. So what’s the bottom line in this tech-enabled tug-of-war between political forces?

The public square that we occupy daily—for communication and commerce, family connection and dissent—a public square that the dominant social media platforms largely provide, cannot (and must not) be governed by @Jack, the sensibilities of mainstream media, or any group of esteemed private citizens like Facebook’s recently appointed Oversight Board. One of the most essential roles of government is to maintain safety and order in, and to set forth the rules of the road for, our public square. Because blockchain-enabled social networks will likely be claiming more of that public space in the near future—even as they strive to evade its common obligations through encryption and otherwise—government can and should enforce the rules for this brave new world.

Until now, our government has failed to confront either on-line censorship or its foreseeable consequences. Because our on-line public square has become (in a few short years) as essential to our way of life as our electricity or water, its social media and similar platforms should be licensed and regulated like those basic services, that is, like utilities—not only for our physical safety but also for the sake of our democratic institutions, which survived their most recent tests but may not survive their next ones if we fail to govern ourselves and our awesome technologies more responsibly.

In this second tug-of-war, we don’t have a moment to lose.

This post was adapted from my January 31, 2021 newsletter. Newsletters are delivered to subscribers’ in-boxes every Sunday morning. You can sign up by leaving your email address in the column to the right.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself Tagged With: app tracking transparency, Apple, autonomy, blockchain, censorship, commons, content monitoring, facebook, freedom of on-line assembly, human tech, privacy, privacy controls, privacy nutrition label, public square, social media platforms

Companies That Wear Their Values on Their Sleeves

March 31, 2019 By David Griesing

We lead with our values because it’s easier to connect with somebody who shares our priorities, but it’s a trickier proposition for companies that want our loyalty because we have rarely been so divided about what is important to us. 

Deepening divisions over our common commitments make Apple’s roll-out of a suite of new services this week both riveting—and potentially fraught.

As the company’s newly announced services like AppleTV+ and AppleNews+ tie themselves more closely to Hollywood and the cover stories that sell glossy magazines, is Apple cloaking itself in values that could alienate or confuse many of the customers it aims to bond with more closely?

In 1997, on the eve of launching a new, national advertising campaign, Steve Jobs gave a short talk about the link between Apple’s values and its customers. “At our core, we believe in making the world a better place,” he said.  So our ads will “honor people who have changed the world for the better—[because] if they ever used a computer, it would be a Mac.”  In its marketing, Apple aligned itself with tech innovators like Thomas Edison, a genius who had already changed the world as Jobs and Apple were about to do.

A little more than 20 years later, with Apple having followed in the footsteps of Edison’s light-bulb company to industry dominance, the question is whether it would have a better chance of preserving that dominance by once again aligning itself with technology innovators who have already changed the world instead of with those, like Steven Spielberg and Oprah Winfrey, who aim to do so through their own approaches to social betterment. To set the stage for your possible reactions, here is a link, beginning with Spielberg and ending with Winfrey, that covers the highlights from Apple’s new services launch this past week in a vivid, all-over-the-place 15 minutes.

I should confess that I have a bit of a horse in this race because I want Apple to keep winning by bringing me world-class products and customer care, but I’m not sure that it can by pursuing services (like entertainment, news and games through the new AppleArcade) that put the company in lockstep with industries that muddy its focus and dilute its values proposition.

Instead of bringing me a global version of Oprah’s book club or more of Steven Spielberg’s progressive reminiscences, I was hoping to hear that Apple would be providing—even in stages over months or years—an elegantly conceived and designed piece of technology that would (1) allow me to cut the cable cord with my internet provider while (2) upgrading me to an interface where I could see and pay for whatever visual programming I choose whenever I want it. An ApplePay-enabled entertainment router. Now that would be the kind of tech innovation that would change my world for the better again (and maybe yours too) while staying true to its founder’s messaging from twenty-odd years ago.  

Tech commentator Christopher Mims talked this week about why Apple was claiming values (like those embedded in Oprah’s notion of social responsibility) to maintain its market share, but he never really debated whether it should do so. I’d argue that Apple should continue to make its own case for the social benefits of its tech solutions instead of mixing its message about priorities with the aspirations of our celebrity culture.

When it comes to Silicon Valley’s mostly hypocritical claims about social responsibility, I start with the skepticism of observers like Anand Giridharadas. To him, Facebook, Google, Amazon and Apple all use values as tools to gain the profits they’re after, covering their self-serving agendas with feel-good marketing.

In a post last October, I discussed some of his observations about Facebook (and by implication, most of the others) in the context of his recent book, Winners Take All.

The problem, said Giridharadas, is that while these companies are always taking credit for the efficiencies and other benefits they have brought, they take no responsibility whatsoever for the harms… In their exercise of corporate social responsibility, there is a mismatch between the solutions that the tech entrepreneurs can and want to bring and the problems we have that need to be solved. “Tending to the public welfare is not an efficiency problem. The work of governing a society is tending to everybody. It’s figuring out universal rules and norms and programs that express the value of the whole and take care of the common welfare.” By contrast, the tech industry sees the world more narrowly. For example, the fake news controversy led Facebook not to a comprehensive solution for providing reliable information but to what Giridharadas calls “the Trying-to-Solve-the-Problem-with-the-Tools-that-Caused-It” quandary.

In the face of judgments like his, I’d argue that Apple should be vetting its corporate messaging with the inside guidance of those who understand the power of values before it squanders the high ground it still holds. 
 
Beyond “sticking with the tech innovations that it’s good at” and the Edison-type analogies that add to their luster, what follows are three proposals for how the company might build on its values-based loyalty while still looking us in the eye when it does so.
 
Each one has Apple talking about what its tech-appreciating customers still care about most when they think—with healthy nostalgia—about all the things that Apple has done for them already.

The fate that the company is aiming to avoid

1.         Apple should keep reminding us about its unparalleled customer service

The unbelievable service I have come to expect from Apple feeds my brand loyalty. I feel that we share the value of trustworthiness. When someone relies on me for something, I stand behind it instead of telling them I don’t have the time or it’s too expensive to fix. For me, Apple has consistently done the same.
 
So I was surprised when I had to argue a bit harder than I thought was necessary for Apple’s battery fix for an older iPhone, and I started following other customer complaints against the company to see if a change of priorities was in the air. Since I’m writing to you on a MacBook Air, I took particular note that problems with the Air’s later-generation keyboards have apparently escalated to the point that national class-action litigation is in the offing. Not unlike the iPhone battery fix, Apple has gone on record as being willing to replace any sticky keyboard for free within four years of purchase, but is it really as easy as it sounds? As recently as last week, there was this plea to Apple from a tech reviewer in a national newspaper who had profiled a litany of customer difficulties:

For any Apple engineers and executives reading: This is the experience you’re providing customers who shell out $1200 or more—sometimes a lot more. This is the experience after THREE attempts at this keyboard design.

When you are one of the richest companies in history, relatively inexpensive problems like this need to be solved before they get this far. A reputation for world-class customer service is a terrible thing to waste. Be glad to fix your technology on those rare occasions when it breaks down, and solve the technology problem with these keyboards before you sell or replace any more of them. Don’t make customers who were loyal enough to pay a premium for an Apple device take you to court because they can’t get enough of your attention any other way. Caring for your customers is a core value that needs polishing before its shine begins to fade and your customer loyalty slips away.

2.         Apple should keep telling us how much it’s different from Google, Facebook and Amazon

The uses that the dominant tech platforms are making of our personal data are on everyone’s mind.
 
Beyond invasions of privacy, deep concerns are also being voiced about the impact of “surveillance capitalism” on Western democracy, not only because of meddling with our elections but, even more fundamentally, because of how this new economic model disrupts “the organic reciprocities involving employment and consumption” that undergird democratic market systems. These are profound and increasingly widespread concerns, and Apple for one seems to share them. 
 
This is from another post last October called “Looking Out For the Human Side of Technology”: 

I was also struck this week by Apple CEO Tim Cook’s explosive testimony at a privacy conference organized by the European Union…:
 
‘Our own information—from the everyday to the deeply personal—is being weaponized against us with military efficiency. Today, that trade has exploded into a data-industrial complex.
 
These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded, and sold. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them. This should make us very uncomfortable.
 
Technology is and must always be rooted in the faith people have in it. We also recognize not everyone sees it that way—in a way, the desire to put profits over privacy is nothing new.’
 
‘Weaponized’ technology delivered with ‘military efficiency.’ ‘A data-industrial complex.’ One of the benefits of competition is that rivals call you out, while directing unwanted attention away from themselves…so Cook’s (and Apple’s) motives here have more than a dollop of competitive self-interest where [companies like] Google and Facebook are concerned. On the other hand, Apple is properly credited with limiting the data it makes available to third parties and rendering the data it does provide anonymous. There is a bit more to the story, however.
 
If data privacy were as paramount to Apple as it sounded this week, it would be impossible to reconcile Apple’s receiving more than $5 billion a year from Google to make it the default search engine on all Apple devices.

In its star-studded launch of TV, News and Arcade services this week, Apple’s presenters always reiterated that none of these services would be “ad selling models” targeting Apple users. They’re good reminders about Apple’s values but…while $5B is a lot of revenue to forsake if new purchasers of Apple devices got to pick their own search engines, it’s also a significant amount of support for an antithetical business model.

Not selling my data to others for use against me, like Apple’s standing behind the functionality of my devices, is a core contributor to the company’s good reputation in my mind, and never more so than today. If Apple continues to differentiate itself from its competitors on the use of our data—and I think that it should—it needs to find ways to be more forthright about its own conflicts of interest while doing so.

When you stand on your values and back them up with your actions, the competitors you are challenging will always seek to undermine you when your actions are inconsistent with your words. Why let that inconsistency be tomorrow’s headline, Tim? Why not be ready to talk more forthrightly about the quandary with Google, and how the company is trying to address it, when asked to comment before a “gotcha” story like this gets published?

3.         Apple should get ahead of its new services launch by proactively addressing likely problems with real consequences for the rest of us

In its roll-out presentation, Apple announced plans for a new service that will link players to games that Apple will be creating. Few tech areas have seen more startling impacts from the use of behavioral data than on-line gaming, where data is gathered from players by those behind the games. I recently talked here about how the programmers and monitors behind Fortnite and the updated Tetris games are using data about how players react to as many as 200 “persuasive design elements” in these games to enhance the intensity of the player experience while making it more addictive to boys and other susceptible populations. 

Apple’s engineers know about these issues already. Its programmers are making its new games ready for primetime as soon as next fall. To differentiate itself from others in the on-line gaming industry, to strike a more principled note than its competitors have, and to broaden the scope of Apple’s values when it comes to personal data, the company could tell us some or all of the following in the coming months:

-whether it will be using behavioral data it generates from players through real time play to make its games more absorbing or addictive;

-whether it intends to restrict certain classes of users (like pre-teen boys) from playing certain games or restrict the hours that they can play them;

-what other safeguards it will be implementing to limit the amount of “player attention” that these games will be capturing;

-whether it will be selling game-related merchandise in the Apple store so its financial incentives to encourage extensive game-playing are clear from the outset; and

-whether it will be using data about player behavior to encourage improved learning, collaborative problem-solving, community building and other so-called “pro-social” skills in any of the games it will be offering.

I have no reason to doubt that Apple is serious about protecting the user data that its devices and services generate. Its new venture into gaming provides an opportunity to build on Apple’s reputation for safeguarding the use of its customers’ information. Tim Cook and Apple need to be talking to the rest of us, both now and next fall, about how the company will be applying its data-related values to problems its customers care about today in the brave new world of on-line gaming.

+ + +

Apple’s stated values will hold its current customers and attract new ones when there is “a match” between the company’s solutions and the problems the rest of us have that need solving. Affiliation and loyalty grow when there are shared priorities in communities, in politics and in the marketplace.

That means Apple should keep talking about what its tech-appreciating customers care about most in the light of the wonders that Apple has given us already. Despite its recently announced forays into entertainment, the company should never take its eye too far from what it does best—which is to make world-changing devices—even when they take more time to develop than short-term financial performance seems to demand. 

When a company knows what it is and acts accordingly, it can always take risks for the rewards that can come from wearing its values on its sleeves. 

This post was adapted from my March 31, 2019 newsletter. You can subscribe (to the right) and receive it in your inbox every Sunday morning.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself, Building Your Values into Your Work Tagged With: Anand Giridharadas, Apple, behavioral data, Christopher Mims, core corporate values, corporate values, customer service, data, gaming, personal data use, priorities, Steve Jobs, surveillance capitalism, tech platforms, Tim Cook, values

Looking Out For the Human Side of Technology

October 28, 2018 By David Griesing Leave a Comment

Maintaining human priorities in the face of new technologies always feels like “a rearguard action.” You struggle to prevent something bad from happening even when it seems like it may be too late.

The promise of the next tool or system intoxicates us. Smart phones, social networks, gene splicing.  It’s the super-computer at our fingertips, the comfort of a boundless circle of friends, the ability to process massive amounts of data quickly or to short-cut labor intensive tasks, the opportunity to correct genetic mutations and cure disease. We’ve already accepted these promises before we pause to consider their costs—so it always feels like we’re catching up and may not have done so in time.

When you’re dazzled by possibility and the sun is in your eyes, who’s thinking “maybe I should build a fence?”

The future that’s been promised by tech giants like Facebook is not “the win-win” that we thought it was. Their primary objectives are to serve their financial interests—those of their founder-owners and other shareholders—by offering efficiency benefits like convenience and low cost to the rest of us. But as we’ve belatedly learned, they’ve taken no responsibility for the harms they’ve also caused along the way, including exploitation of our personal information, the proliferation of fake news and jeopardy to democratic processes, as I argued here last week.

Technologies that are not associated with particular companies also run with their own promise until someone gets around to checking them–a technology like artificial intelligence or AI for example. From an ethical perspective, we are usually playing catch up ball with them too. If there’s a buck to be made or a world to transform, the discipline to ask “but should we?” always seems like getting in the way of progress.

Because our lives and work are increasingly impacted, this week’s stories throw additional light on the technology juggernaut that threatens to overwhelm us and on our “rearguard” attempts to tame it with our human concerns.

To gain a fuller appreciation of the problem regarding Facebook, a two-part Frontline documentary airing this week is devoted to what one reviewer calls “the amorality” of the company’s relentless focus on adding users and compounding ad revenues while claiming to create the on-line “community” that all of us should want in the future. (The show airs tomorrow, October 29 at 9 p.m. and on Tuesday, October 30 at 10 p.m. EST on PBS.)

Frontline’s reporting covers Russian interference in past and current election cycles, Facebook’s role in whipping Myanmar’s Buddhists into a frenzy over its Rohingya minority, and how strongmen like Rodrigo Duterte in the Philippines have been manipulating the site to achieve their political objectives. Facebook CEO Mark Zuckerberg’s limitations as a leader are explored from a number of directions, but none as compelling as his off-screen impact on the five Facebook executives who were “given” to James Jacoby (the documentary’s director, writer and producer) to answer his questions. For the reviewer:

That they come off like deer in Mr. Jacoby’s headlights is revealing in itself. Their answers are mealy-mouthed at best, and the defensive posture they assume, and their evident fear, indicate a company unable to cope with, or confront, the corruption that has accompanied its absolute power in the social media marketplace.

You can judge for yourself. You can also ponder whether this is like holding a gun manufacturer liable when one of its guns is used to kill somebody.  I’ll be watching “The Facebook Dilemma” for what it has to say about a technology whose benefits have obscured its harms in the public mind for longer than it probably should have. But then I remember that Facebook barely existed ten years ago. The most important lesson from these Frontline episodes may be how quickly we need to get the stars out of our eyes after meeting these powerful new technologies if we are to have any hope of avoiding their most significant fallout.

Proceed With Caution

I was also struck this week by Apple CEO Tim Cook’s explosive testimony at a privacy conference organized by the European Union.

Not only was Cook bolstering his own company’s reputation for protecting Apple users’ personal information, he was also taking aim at competitors like Google and Facebook for implementing a far more harmful business plan, namely, selling user information to advertisers, reaping billions in ad dollar revenues in exchange, and claiming the bargain is providing their search engine or social network to users for “free.” This is some of what Cook had to say to European regulators this week:

Our own information—from the everyday to the deeply personal—is being weaponized against us with military efficiency. Today, that trade has exploded into a data-industrial complex.

These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded, and sold. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them. This should make us very uncomfortable.

Technology is and must always be rooted in the faith people have in it. We also recognize not everyone sees it that way—in a way, the desire to put profits over privacy is nothing new.

“Weaponized” technology delivered with “military efficiency.” “A data-industrial complex.” One of the benefits of competition is that rivals call you out, while directing unwanted attention away from themselves. One of my problems with tech giant Amazon, for example, is that it lacks a neck-and-neck rival to police its business practices, so Cook’s (and Apple’s) motives here have more than a dollop of competitive self-interest where Google and Facebook are concerned. On the other hand, Apple is properly credited with limiting the data it makes available to third parties and rendering the data it does provide anonymous. There is a bit more to the story, however.

If data privacy were as paramount to Apple as it sounded this week, it would be hard to reconcile that stance with the more than $5 billion a year Apple receives from Google to make it the default search engine on all Apple devices. However complicit it remains in today’s tech bargains, Apple pushed its rivals pretty hard this week to modify their business models and become less cynical about their use of our personal data as the focus of regulatory oversight moves from Europe to the U.S.

Keeping Humans in the Tech Equation

Technologies that aren’t proprietary to a particular company but are instead used across industries require getting over additional hurdles to ensure that they are meeting human needs and avoiding technology-specific harms for users and the rest of us. This week, I was reading up on a positive development regarding artificial intelligence (AI) that only came about because serious concerns were raised about the transparency of AI’s inner workings.

AI’s ability to solve problems (from processing big data sets to automating steps in a manufacturing process or tailoring a social program for a particular market) is only as good as the algorithms it uses. You may already know that an early criticism of artificial intelligence concerned personal identity markers such as race, gender and sexual preference: the author of an algorithm could be unwittingly building her own biases into it, leading to discriminatory and other anti-social results. As a result, various countermeasures are being undertaken to keep these kinds of biases out of AI code. With that in mind, I read a story this week about another systemic issue with AI: its “explainability.”

It’s the so-called “black box” problem. If users of systems that depend on AI don’t know how they work, they won’t trust them. Unfortunately, one of the prime advantages of AI is that it solves problems in ways that are not easily understood by users, which presents a quandary: AI-based systems might need to be “dumbed down” so that the humans using them can understand and then trust them. Of course, no one is happy with that result.

A recent article in Forbes describes the trust problem that users of machine-learning systems experience (“interacting with something we don’t understand can cause anxiety and make us feel like we’re losing control”) along with some of the experts who have been feeling that anxiety (cancer specialists who agreed with a “Watson for Oncology” system when it confirmed their judgments but thought it was wrong when it failed to do so because they couldn’t understand how it worked).

In a positive development, a U.S. Department of Defense agency called DARPA (or Defense Advanced Research Projects Agency) is grappling with the explainability problem. Says David Gunning, a DARPA program manager:

New machine-learning systems will have the ability to explain their rationale, characterize their strengths and weaknesses, and convey an understanding of how they will behave in the future.

In other words, these systems will get better at explaining themselves to their users, thereby overcoming at least some of the trust issue.

DARPA is investing $2 billion in what it calls “third-wave AI systems…where machines understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real-world phenomena,” according to Gunning. At least with the future of warfare at stake, a problem like “trust” in the human interface appears to have stimulated a solution. At some point, all machine-learning systems will likely be explaining themselves to the humans who are trying to keep up with them.
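To make the “explainability” idea concrete, here is a minimal sketch of a system that can explain its own rationale. The feature names and weights are hypothetical illustrations (not drawn from Watson, DARPA or any real system): a simple linear scorer whose output can be decomposed into per-feature contributions, so a human can see not just the answer but why the system produced it.

```python
# A minimal sketch of an "explainable" predictor: a linear scorer whose
# output decomposes into per-feature contributions. The feature names
# and weights below are hypothetical, chosen only for illustration.

def explain_score(weights, features):
    """Return (total score, per-feature contributions to that score)."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    return sum(contributions.values()), contributions

weights = {"tumor_size": 0.8, "patient_age": 0.1, "marker_level": 0.5}
patient = {"tumor_size": 2.0, "patient_age": 60, "marker_level": 1.5}

score, why = explain_score(weights, patient)
print(f"score = {score:.2f}")
# List contributions from most to least influential -- the "rationale".
for name, part in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {part:+.2f}")
```

A linear model like this is trivially transparent; the research challenge DARPA describes is getting the same kind of self-explanation out of far more opaque models, such as deep neural networks, without dumbing them down.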

Moving beyond AI, I’d argue that there is often as much “at stake” as successfully waging war when a specific technology is turned into a consumer product that we use in our workplaces and homes.

While there is heightened awareness today about the problems that Facebook poses, few were raising these concerns even a year ago despite their toxic effects. With other consumer-oriented technologies, there are a range of potential harms where little public dissent is being voiced despite serious warnings from within and around the tech industry. For example:

– how much is our time spent on social networks—in particular, how these networks reinforce or discourage certain of our behaviors—literally changing who we are?  
 
– since our kids may be spending more time with their smart phones than with their peers or family members, how is their personal development impacted, and what can we do to put this rabbit even partially back in the hat now that smart phone use seems to be a part of every child’s rite of passage into adulthood?
 
– will privacy and surveillance concerns become more prevalent when we’re even more surrounded than we are now by “the internet of things” and as our cars continue to morph into monitoring devices—or will there be more of an outcry for reasonable safeguards beforehand? 
 
– what are employers learning about us from our use of technology (theirs as well as ours) in the workplace and how are they using this information?

The technologies that we use demand that we understand their harms as well as their benefits. I’d argue we need to become more proactive about voicing our concerns and using the tools at our disposal (including the political process) to insist that company profit and consumer convenience are not the only measures of a technology’s impact.

Since the invention of the printing press half a millennium ago, it’s always been hard but necessary to catch up with technology and to try to tame its excesses as quickly as we can.

This post was adapted from my October 28, 2018 newsletter.

Filed Under: *All Posts, Building Your Values into Your Work, Continuous Learning Tagged With: Amazon, Apple, ethics, explainability, facebook, Google, practical ethics, privacy, social network harms, tech, technology, technology safeguards, the data industrial complex, workplace ethics

What Good is My English Degree?

November 25, 2012 By David Griesing Leave a Comment

What kind of job are you going to get with an English degree (or a degree in History, Classics, Religion, Music, Art History, Anthropology, Philosophy or French)? You’re wondering. Maybe your parents are too.

Or then again: you studied the humanities and now you’re 10 or 15 or 25 years out of school, but every workday you feel like you need a crash course in technology, social media, marketing, engineering and accounting. Why does what you studied seem to have so little value? Why do all these other things seem so important?

What you learned by studying the humanities does have value—tremendous value.  But there are many reasons you might not think so, and a brief look at some of them might be helpful before discussing how the humanities can bring the greatest value to your job today.

The Industrial Revolution kicked off an explosion of technological advancement that has only accelerated in our lifetimes. (It’s x amount of memory on that chip today, twice as much tomorrow, and so on.)  At the same time, advances in science created an experiment-based way of explaining the world that clashed with—and has now largely overtaken—a faith- or story-based worldview, at least in so-called “advanced societies.” (It’s less church attendance and more individualized spirituality, when faith remains a part of our lives at all.)

In their upward trajectory, technology and science were also vastly improving our standard of living. At the same time that questions of meaning and purpose became more personalized, many of us were also feeling that we no longer needed the humanities to improve the quality of our lives.  Technology and science were attending to our material comfort along with our wellbeing.

Or so we thought.

Several writers have lamented the sidelining of the humanities. For example, Anthony Kronman has argued that as the arts have lost their prominence in our schools, we have almost lost the ability to develop an important dimension in our lives.

Where education used to mean exposure to a canon of Western thought to help students determine “how I should live my life,” that canon has increasingly come under attack. Some viewed it as propaganda from a group of white, Eurocentric oppressors, while others challenged these texts for presenting “subjective” interpretations of reality instead of the “objective” (and therefore more reliable) view that science and technology provides.


So in the face of this powerful onslaught, where is the value in your English degree?

Its value is to give you something that science and technology never can: a personal story that gives your life as well as your work both meaning and purpose. Despite our human flaws and ultimate mortality, the story you’re writing recounts how you can make a difference for yourself and others in your community by what you choose to do every day. Through the humanities, you have lifelong access to role models and ideas that help you to live a good and fulfilling life.

It is the insight gained from these stories that business needs the most today.

In his “How to Avoid a Bonfire of the Humanities,” Michael S. Malone notes that since the best products and services aim at meeting real human needs and making our lives better, the best way to bring them to market is with stories that resonate in people’s lives.

Given the dominance of science and technology and its associated impacts today, fewer people know how to find what’s meaningful on their own, and fewer still can deliver it to them. Asked what made his company special, Steve Jobs said: “It’s in Apple’s DNA that technology alone is not enough—it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.”

Think about it. It’s not what the product is that makes you buy it, or how you use it, but why it makes your life better. (Simon Sinek’s much-viewed TED talk is about just this point.)  Your humanities degree has economic value precisely because it enables you to understand “the why.”

Where the English major is needed is at the intersection between the company and its customers. Having studied “humanity,” you have the ability to focus your company on meeting basic human needs in ways that neither science nor technology ever can. That is a priceless perspective, needed in marketing, sales, and customer service, but also at every stage of product development and design. Again: Apple ads, Apple stores, and Apple products satisfy Apple customers as much as they do because of Apple’s English majors.

Malone concludes his “Bonfires” article by noting that in the future the market advantage will go to companies like this:

that can effectively employ imagination, metaphor, and most of all, storytelling. And not just creative writing, but every discipline in the humanities, from the classics to rhetoric to philosophy.  Twenty-first century storytelling: multimedia, mass customizable, portable and scalable, drawing upon the myths and archetypes of the ancient world, on ethics, and upon a deep understanding of human nature and even religious faith.

The humanities have been undervalued and shunted aside, but what they have given us is more essential in the best jobs than ever.  Far from putting you at a disadvantage in the workforce, they give you a powerful advantage.  And the places where you should want to be working know it.

Filed Under: *All Posts, Continuous Learning Tagged With: Apple, customer service, education, fulfilling life and work, good life, humanities, marketing, perspective, product design, product development, sales, science, Steve Jobs, technology
