
Mobs Are Like Weapons Pointed at All of Us

November 30, 2021 By David Griesing

For years now, I’ve been drawn to the mind of Robert D. Kaplan. It’s not that I’ve always agreed with him, but his insights have rarely failed to pull me in. 
 
Maybe it’s for all the reasons that I wrote last week’s post, A Deeper Sense of Place is an Anchor in Turbulent Times. For much of his lengthy career, Kaplan has written about geography’s influence on politics and national power. He believes that where a people are located, the place they “call home”—its proximity to powerful nations, its access to river systems, the extent of its undeveloped frontier, its natural resources (or lack of them), whether it’s protected by mountain ranges or oceans—has “a determining effect” on how these people view themselves and the world around them.
 
The importance of place was my way into Kaplan’s writing because, in my gut, I always felt he was right.
 
Three years ago, I wrote about a book by Kaplan called Earning the Rockies: How Geography Shapes America’s Role in the World (2017). It was another illuminating read. This was a time when the executives helming America’s tech companies were still using their prominence and financial clout to lecture the rest of us about “what progress should look like” from their ideological points of view. In that post, I wondered how their rugged individualism, honed on technology’s frontier, jibed with Americans’ skepticism about ideologies and their commandments, another take-away from the “frontier mentality” that Kaplan ascribed to most Americans.

Frontiers [like America’s] test ideologies like nothing else. There is no time for the theoretical. That, ultimately, is why America has not been friendly to communism, fascism, or other, more benign forms of utopianism. Idealized concepts have rarely taken firm root in America and so intellectuals have had to look to Europe for inspiration. People here are too busy making money—an extension, of course, of the frontier ethos, with its emphasis on practical initiative…[A]long this icy, unforgiving frontier, the Enlightenment encountered reality and was ground down to an applied wisdom of ‘commonsense’ and ‘self evidence.’ In Europe an ideal could be beautiful or liberating all on its own, in frontier America it first had to show measurable results.

In the tumultuous years that have followed, my question has been answered in part by a populist, Know-Nothing revulsion aimed at “thought-leaders,” big-shots and experts of all kinds who think they know better. 
 
So perhaps it’s not a coincidence that these same three years have brought Kaplan to the short essay that appeared in the Wall Street Journal this week. It’s called “The Tyranny of the 21st Century Crowd” and it came with the following elaboration: “Mobs that form from the bottom up may prove even harder to defeat than totalitarian regimes.” (Here’s a link that makes his essay available beyond the usual paywall.)
 
What does any of this have to do with our work next week, the work that all of us should be doing, or the over-all quality of our lives at this place and time? As it turns out, quite a lot.
 
I had dinner this past week with a small business owner whose office is indirectly buffeted on a regular basis by mob-related mentalities. One of his longtime employees—someone who happens to have an advanced degree—is from a family of anti-vaxers, refuses to send her child to a school that requires vaccinations, and is pushing hard for an accommodation to move her work (from live to remote) to another state where she thinks the schools will be more lenient. A second longtime employee is a member of two oppressed groups (based on age and on race). This employee apparently doesn’t feel like working any more, but also holds the implicit (if meritless) threat of a discrimination action if he is either disciplined or fired. Of course, getting an 8-hour workday out of either of these “disgruntled employees” has turned into a daily minefield. 
 
I couldn’t help but sympathize.
 
Who needs the expense and aggravation of being dragged by either of these people into a courtroom because they believe (and therefore can claim, without evidence) that their employer is treating them unfairly by refusing to give them what they want? 
 
How can my friend (indeed, how can anyone) run a business today when employees can assert the abridgement of some freedom- or identity-based right when all he is demanding is that they come into the office and do the work that they’re being paid to do? 
 
I got a close-to-the-ground view of the mobs that loomed behind my friend’s two employees over dinner this week. But beyond examples like these, Kaplan foresees today’s mob-based threats causing wider, deeper and even more troubling consequences for a way of living and working that we assume is far more resilient than it actually is.

Pavlov’s dog parade is by a favorite artist, the late cartoonist and social commentator Saul Steinberg. (If it looks familiar, I also featured it in my post, We’re All Acting Like Dogs Today, on the refusal by regulators (and the public behind them) to confront the user manipulation and mob tendencies that are an inherent feature of dominant tech platforms like Google, Twitter and Facebook.)

While Kaplan implicitly acknowledges the American people’s general hostility to foreign ideologies like communism and fascism, along with the “heartland’s” hostility to the progressive ideologies of the East and West Coasts, he certainly recognizes the populist impulses that bubble beneath all of these debates.
 
For Kaplan, the people’s arguments over their deeply held political beliefs usually represent “a profound abasement of reason.” In other words, populists of all stripes generally feel the rightness of their views instead of reasoning themselves into the convictions that they hold. Under these circumstances, it’s difficult if not impossible to foresee how America will be able to maintain its democratic way of life when every quadrant of our politics is being actively overtaken by its own version of a mob. (While Kaplan doesn’t delve into these divisions, George Packer recently described “the four political belief systems” that are operating in the U.S. today in “How America Fractured Into Four Parts,” an article of his that I discussed here in June.)
 
What Kaplan does do is quote liberally from a book about mobs that I’d never heard of: Crowds and Power by Elias Canetti.

The crowd, Canetti says, emerges from the need of the lonely individual to conform with others. Because he can’t exert dominance on his own, he exerts it through a crowd that speaks with one voice. The crowd’s urge is always to grow, consuming all hierarchies, even as it feels persecuted and demands retribution. The crowd sees itself as entirely pure, having attained the highest virtue. 

Thus, one aim of the crowd is to hunt down the insufficiently virtuous. The tyranny of the crowd has many aspects, but Canetti says its most blatant form is that of the ‘questioner’ and the accuser. ‘When used as an intrusion of power,’ the accusing crowd ‘is like a knife cutting into the flesh of the victim. The questioner knows what there is to find, but he wants actually to touch it and bring it to light.’

The tyranny and violence of the mob reaches its crescendo when it exercises the monopoly that it believes it has on virtue. ‘If you don’t agree with us,’ Canetti says of them, ‘you are not only wrong but morally wanting, and as such should not only be denounced, but destroyed.’

Kaplan then deploys notions about nations and their exercise of power to provide historical perspective as well as a glimpse into the future of America’s power. Where once America’s (and the West’s) power resided in its political, educational and media institutions and in the civic cohesion they produced, today that foundation is increasingly undermined not by counter-institutions (that seek social change for the better) but by mob power (whose primary interest is in weakening, when not actively seeking to destroy, the institutions that once bound us together). 

Nazi Germany and the Soviet Union were defeated by U.S. military and industrial power. Civilizations rest not only on intellectual and cultural foundations but also on coarser aspects of strength and power. The historic West, which is ultimately about the freedom of the individual to rise above the crowd, survived the 20th century thanks to American hard power, itself maintained by a system of individual excellence in the arts and sciences, in turn nurtured by an independent and diverse media. But that media is now becoming immersed in the crowd, where it demands virtue in its purest ideological form, so that much of the media too often plays the role of Canetti’s accuser.

The lust for purity combined with the tyranny of social-media technology in the hands of the young—who have little sense of the past and of tradition—threatens to create an era of the most fearsome mobs in history. The upshot of such crowd coercion is widespread self-censorship: the cornerstone of all forms of totalitarianism….

This ultimately leads toward a controlled society driven by the bland, the trivial and the mundane, wearing the lobotomized face of CNN weekday afternoon television. Outright evil can surely be dealt with, but a self-righteous conformity is harder to resist. Left unchecked, this is how the West slowly dies.

The self-censorship that this kind of tyranny causes and the masks it forces us to wear are more isolating than any restrictions that were imposed during the pandemic. Reasonable people withdraw from free exchange for fear of having their livelihoods and reputations challenged by self-righteous mobs. Effectively “lobotomizing ourselves,” we mask up to avoid being “destroyed.”

One of the Saul Steinberg and Inge Morath images from The Mask Series (1959-1963).

Reading Kaplan’s essay reminded me of a book that I hadn’t read since college, The Revolt of the Masses by Ortega y Gasset, a Spanish essayist.

Sounding like an Old Testament prophet 85 years ago, Ortega wrote about the undermining of “liberalism” by mobs of communist and right-wing agitators. He feared the “tyranny of [any] majority” and the “collective mediocrity” of the “masses” (and the so-called “mass-men” that populated them). Ortega believed they threatened both individuality and freedom of thought with annihilation. Much like Kaplan, he wrote:

The mass crushes beneath it everything that is different, that is excellent, individual, qualified, and select. Anybody who is not like everybody, who does not think like everybody, runs the risk of being eliminated. And it is clear that this ‘everybody’ is not ‘everybody.’ ‘Everybody’ was normally the complex unity of the mass and the divergent, specialized elite groups. Nowadays, ‘everybody’ is the mass alone.

Twenty years later, in Homage to Catalonia (George Orwell’s sobering account of his own time fighting for the Republicans during the Spanish Civil War), the eventual author of 1984 and Animal Farm reached the same conclusion as Ortega about the mobs of the left and the right that were squeezing the lifeblood out of their homeland. It was an experience that eviscerated the romanticism that an idealistic young man had once felt for his own republican principles.
 
Even with their differences, Orwell, Ortega and Kaplan would probably agree that it was the power of America and the West—the only champions of “liberal” values left standing—that liberated at least some of the civilized world from the mobs that were overtaking it before World War II. As we sit here today, it’s hardly misplaced to wonder: Who, if anyone, will do so again?

In the course of his essay, Robert Kaplan doesn’t mention the mob that attempted to interrupt the Electoral College vote in Washington last January; not a woke mob enforcing its virtue from prominent positions in the nation’s media and universities, but a MAGA mob that was encouraged by a president who’d just been defeated at the polls. 

The “insurrection” was another side of the same coin.

In a post from a month before the Capitol assault, I wrote about “the big lie” that was told to the German people following their defeat in World War I. “You didn’t actually lose,” conspiracists told them. “Our terrible surrender was the result of a plot by leftists, Jews, bankers and others who stood to gain from it.” That it was a lie hardly mattered, because it fed so seamlessly into the resentment, anger and economic hardship that many German soldiers, their families and communities were already feeling. It was these “regular people” who fed the mobs that led to national socialism and, only twenty years later, a second world war.

I think the wrong question to take from these historical similarities is whether Donald Trump is another Adolf Hitler.  Instead, as I wrote a year ago:

Are there genuine parallels between Germany in the 1920s and ’30s and the U.S. in the 2020s and ’30s?  

Were there political leaders (both then and now) who were willing to tell “a big, almost preposterous lie” if it could stoke existing grievances and rally their supporters so they could gain additional power?  

Did the German people permit their leaders to send fellow Germans who were supposedly to blame for their tribulations to concentration camps?  

How could so many free people, who had enjoyed democracy and the right to determine their futures, have been overtaken by such a lie? 

Surely, they knew then (as we know now) what was happening around them, as reporters today are called ‘enemies of the state’ and election officials are targeted for assassination.

Did they pretend (and are we pretending now) not to see the breakdowns in the fabric of our society that continue and only seem to get worse?

To paraphrase [the poet, W.H.] Auden: “Did the best among us on both sides really lack conviction, while only the worst / were full of passionate intensity”?

In a new HBO documentary about last January’s revolt of the masses, called “Four Hours at the Capitol” (link to the film’s trailer), a police officer who was interviewed recalled, as he thought back to where he found himself that day, a piece of advice that he had gotten during his military training: 

Individuals aren’t usually a problem. But when they get together and create a mob, then, the mob is the weapon.

Too few in America and in the West today are actively trying to disarm these weapons, which are being stoked every day by social media, by too many in the legacy media, and by the demagogues who give voice to every flavor of them.
 
Will we need the purifying force of another world war—another battle to the death for the best and against the worst in our civilization—in order to break the hold that mob rule increasingly exerts over our politics, our freedom of speech, and our ability to be anything more than mass-men or -women in one frenzied crowd or another? 
 
Maybe Kaplan and his intellectual forebears give us an alternative vision to hold onto: a view of America and the West that once again has the fortitude to stand up against every kind of mob in the world–not because of our theoretical beliefs about democracy and our Enlightenment traditions, but because we cherish our freedom and individuality for their practical benefits and refuse to give them up because weapons keep being pointed in our direction.

This post was adapted from my October 17, 2021 newsletter. Newsletters are delivered to subscribers’ in-boxes every Sunday morning and occasionally I post the content from one of them here. You can subscribe by leaving your email address in the column to the right.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself, Work & Life Rewards Tagged With: a mob is a weapon, autonomy, Elias Canetti, freedom, George Orwell, individuality, mob, mob rule, mobs, Ortega y Gasset, populism, populist, Robert D Kaplan, self-censorship, tyranny of the crowd

Who’s Winning Our Tugs-of-War Over On-Line Privacy & Autonomy?

February 1, 2021 By David Griesing

We know that our on-line privacy and autonomy (or freedom from outside control) are threatened in two particularly alarming ways today: by the undisclosed privacy invasions that flow from our on-line activities, and by the loss of opportunities to speak our minds without censorship.

These alarm bells ring because of the dominance of on-line social media platforms like Facebook, YouTube and Twitter and text-based exchanges like WhatsApp and the other instant messaging services—most of which barely existed a decade ago. With unprecedented speed, they’ve become the town squares of modern life where we meet, talk, shop, learn, voice opinions and engage politically. But as ubiquitous and essential as they’ve become, their costs to vital zones of personal privacy and autonomy have caused a significant backlash, and this past week we got an important preview of where this backlash is likely to take us.

Privacy advocates worry about the harmful consequences when personal data is extracted from users of these platforms and services. They say our own data is being used “against us” to influence what we buy (the targeted ads that we see and don’t see), manipulate our politics (increasing our emotional engagement by showing us increasingly polarizing content), and exert control over our social behavior (by enabling data-gathering agencies like the police, FBI or NSA). Privacy advocates are also offended that third parties are monetizing personal data “that belongs to us” in ways that we never agreed to, amounting to a kind of theft of our personal property by unauthorized strangers.

For their part, censorship opponents decry content monitors who can bar particular statements or even participation on dominant platforms altogether for arbitrary and biased reasons. When deprived of the full use of our most powerful channels of mass communication, they argue that their right to peaceably assemble is being eviscerated by what they experience as “a culture war” against them. 

Both groups say they have a privacy right to be left alone and act autonomously on-line: to make choices and decisions for themselves without undue influence from outsiders; to be free from ceaseless monitoring, profiling and surveillance; to be able to speak their minds without the threat of “silencing;” and, “to gather” for any lawful purpose without harassment. 

So how are these tugs-of-war over two of our most basic rights going?

This past week provided some important indications.

This week’s contest over on-line privacy pitted tech giant Apple against rivals with business models that depend upon selling their users’ data to advertisers and other third parties—most prominently, Facebook and Google.

Apple announced this week that it would immediately start offering the users of its market-leading smartphones additional privacy protections. One relates to its dominant App Store and developers like Facebook, Google and the thousands of other companies that sell their apps (or platform interfaces) to iPhone users.

Going forward—on what Apple chief Tim Cook calls “a privacy nutrition label”—every app that the company offers for installation on its phones will need to disclose its data collection and privacy practices before purchase, in ways that Apple will ensure “every user can understand and act on.” Instead of reading (and then ignoring) multiple pages of legalese, every new Twitter or YouTube user, for example, will be able through their iPhones to either opt in to or refuse an app’s data collection practices after reading plain language that describes the personal data that will be collected and what will be done with it. In a similar vein, iPhone users will gain a second advantage over apps that have already been installed on their phones. With the new App Tracking Transparency feature, iPhone users will be able to control how each app gathers and shares their personal data. For every application on your iPhone, you can now choose whether a Facebook or a Google has access to your personal data or not.
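If you’re curious what the tracking half of this looks like from the developer’s side, here is a bare-bones sketch using Apple’s AppTrackingTransparency framework. (The helper’s name and where an app would call it are my own illustrative choices, not Apple’s.)

```swift
import AppTrackingTransparency
import AdSupport

// A hypothetical helper an app might call before doing any ad personalization.
// On iOS 14 and later the system shows the new permission prompt; the user's
// answer, not the app, decides whether the advertising identifier is handed over.
func askPermissionToTrack() {
    if #available(iOS 14, *) {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // The user opted in, so cross-app tracking is allowed.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking allowed, IDFA: \(idfa)")
            case .denied, .restricted, .notDetermined:
                // The user said no (or was never asked); the identifier comes back zeroed out.
                print("Tracking not allowed")
            @unknown default:
                print("Unrecognized authorization status")
            }
        }
    }
}
```

An app also has to declare, in plain language, why it wants to track (the NSUserTrackingUsageDescription entry in its Info.plist), and that text is what the user actually reads in the prompt.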

While teeing up these new privacy initiatives at an industry conference this week, Apple chief Tim Cook was sharply critical of companies that take our personal data for profit, citing several of the real world consequences when they do so. I quote at length from his remarks last Thursday because I enjoyed hearing someone of Cook’s stature speaking to these issues so pointedly, and thought you might too:

A little more than two years ago…I spoke in Brussels about the emergence of a data-industrial complex… At that gathering we asked ourselves: “what kind of world do we want to live in?” Two years later, we should now take a hard look at how we’ve answered that question. 

The fact is that an interconnected ecosystem of companies and data brokers, of purveyors of fake news and peddlers of division, of trackers and hucksters just looking to make a quick buck, is more present in our lives than it has ever been. 

And it has never been so clear how it degrades our fundamental right to privacy first, and our social fabric by consequence.

As I’ve said before, ‘if we accept as normal and unavoidable that everything in our lives can be aggregated and sold, then we lose so much more than data. We lose the freedom to be human.’….

Together, we must send a universal, humanistic response to those who claim a right to users’ private information about what should not and will not be tolerated….

At Apple…, [w]e have worked to not only deepen our own core privacy principles, but to create ripples of positive change across the industry as a whole. 

We’ve spoken out, time and again, for strong encryption without backdoors, recognizing that security is the foundation of privacy. 

We’ve set new industry standards for data minimization, user control and on-device processing for everything from location data to your contacts and photos. 

At the same time that we’ve led the way in features that keep you healthy and well, we’ve made sure that technologies like a blood-oxygen sensor and an ECG come with peace of mind that your health data stays yours.

And, last but not least, we are deploying powerful, new requirements to advance user privacy throughout the App Store ecosystem…. 

Technology does not need vast troves of personal data, stitched together across dozens of websites and apps, in order to succeed. Advertising existed and thrived for decades without it. And we’re here today because the path of least resistance is rarely the path of wisdom. 

If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform….

At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement — the longer the better — and all with the goal of collecting as much data as possible.

Too many are still asking the question, “how much can we get away with?,” when they need to be asking, “what are the consequences?” What are the consequences of prioritizing conspiracy theories and violent incitement simply because of their high rates of engagement? What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations? What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?….

[N]o one needs to trade away the rights of their users to deliver a great product. 

With its new “data nutrition labels” and “app tracking transparency,” many (if not most) of Apple’s iPhone users are likely to reject other companies’ data collection and sharing practices once they understand the magnitude of what’s being taken from them. Moreover, these votes for greater data privacy could be a major financial blow to the companies extracting our data because Apple sold more smartphones globally than any other vendor in the last quarter of 2020, almost half of Americans use iPhones (45.3% of the market according to one analyst), more people access social media and messaging platforms from their phones than from other devices, and the personal data pipelines these data extracting companies rely upon could start constricting immediately.   
 
In this tug-of-war between competing business models, the outcry this week was particularly fierce from Facebook, which one analyst predicts could start to take “a 7% revenue hit” (that’s real cash at $6 billion) as early as the second quarter of this year. (Facebook’s revenue take in 2020 was $86 billion, much of it from ad sales fueled by user data.) Mark Zuckerberg charged that Apple’s move tracks its competitive interests, saying its rival “has every incentive to use their dominant platform position to interfere with how our apps and other apps work,” among other things, a dig at on-going antitrust investigations involving Apple’s App Store. In a rare expression of solidarity with the little guy, Zuckerberg also argued that small businesses which access customers through Facebook would suffer disproportionately from Apple’s move because of their reliance on targeted advertising. 
 
There’s no question that Apple was flaunting its righteousness on data privacy this week and that Facebook’s “ouches” were the most audible reactions. But there is also no question that a business model fueled by the extraction of personal data has finally been challenged by another dominant market player. In coming weeks and months we’ll find out how interested Apple users are in protecting their privacy on their iPhones and whether their eagerness prompts other tech companies to offer similar safeguards. We’ll get signals from how advertising dollars are being spent as the “underlying profile data” becomes more limited and less reliable. We may also begin to see the gradual evolution of an on-line public space that’s somewhat more respectful of our personal privacy and autonomy.
 
What’s clearer today is that tech users concerned about the privacy of their data and freedom from data-driven manipulation on-line can now limit at least some of the flow of that information to unwelcome strangers in ways that they never had at their disposal before.

All of us should be worried about censorship of our views by content moderators at private companies (whether in journalism or social media) and by governmental authorities that wish to stifle dissenting opinions.  But many of the strongest voices behind regulating the tech giants’ penchant “to moderate content” today come from those who are convinced that press, media and social networking channels both limit access to and censor content from those who differ with “their liberal or progressive points of view.” Their opposition speaks not only to the extraordinary dominance of these tech giants in the public square today but also to the air of grievance that colors the political debates that we’ve been having there.
 
Particularly after President Trump’s removal from Facebook and Twitter earlier this month and the temporary shutdown of social media upstart Parler after Amazon cut off its cloud computing services, there has been a concerted drive to find new ways for individuals and groups to communicate with one another on-line in ways that cannot be censored or “de-platformed” altogether. Like the tug-of-war over personal data privacy, a new polarity over on-line censorship and the ways to get around it could fundamentally alter the character of our on-line public squares.
 
Instead of merely birthing a gaggle of new “Right-leaning” social media companies whose managers might still be tempted to interfere with irritating content, this drive is now utilizing blockchain software technology to create what amount to “moderation-proof” communication networks.
 
To help with basic blockchain mechanics, this is how I described it here in 2018.

A blockchain is a web-based chain of connections, most often with no central monitor, regulator or editor. Its software applications enable every node in its web of connections to record data which can then be seen and reviewed by every other connection. It maintains its accuracy through this transparency. Everyone with access can see what every other connection has recorded in what amounts to a digital ledger…

Blockchain-based software can be launched by individuals, organizations or even governments. Software access can be limited to a closed network of participants or open to everyone. A blockchain is usually established to overcome the need for and cost of a “middleman” (like a bank) or some other impediment (like currency regulations, tariffs or burdensome bureaucracy). It promotes “the freer flow” of legal as well as illegal goods, services and information. Blockchain is already driving both modernization and globalization. Over the next several years, it will also have profound impacts on us as individuals. 

If you’d gain from a visual description, this short video from The MIT Technology Review will also show you the basics about this software innovation.  
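And if code is more your speed, here is a toy sketch of the same idea in Swift (the names are mine and it is purely illustrative: a real blockchain adds networking, consensus rules and cryptographic signatures on top of this). Each block records an entry plus the hash of the block before it, which is what lets everyone holding a copy check that nothing has been quietly altered.

```swift
import Foundation
import CryptoKit

// A toy "block": one recorded entry plus the hash of the block before it.
struct Block {
    let index: Int
    let data: String
    let previousHash: String
    let hash: String

    init(index: Int, data: String, previousHash: String) {
        self.index = index
        self.data = data
        self.previousHash = previousHash
        // The hash covers this block's contents *and* the previous hash, so
        // altering any earlier block breaks every link that comes after it.
        let payload = "\(index)|\(data)|\(previousHash)"
        self.hash = SHA256.hash(data: Data(payload.utf8))
            .map { String(format: "%02x", $0) }
            .joined()
    }
}

// The shared ledger that every participant can hold a copy of and verify.
struct Ledger {
    private(set) var blocks = [Block(index: 0, data: "genesis", previousHash: "0")]

    mutating func record(_ entry: String) {
        let last = blocks[blocks.count - 1]
        blocks.append(Block(index: last.index + 1, data: entry, previousHash: last.hash))
    }

    // Transparency in miniature: anyone can confirm nothing was changed after the fact.
    func isValid() -> Bool {
        for i in 1..<blocks.count where blocks[i].previousHash != blocks[i - 1].hash {
            return false
        }
        return true
    }
}

var ledger = Ledger()
ledger.record("Deed to parcel 44 transferred from A to B")
ledger.record("B paid 2 units to C for an hour of translation")
print(ledger.isValid())   // true, until someone tampers with an earlier block
```

Swap the sample entries for posts, payments or land deeds and you have the bare skeleton that the systems described below elaborate on.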
 
I’ve written several times before about the promise of blockchain-driven systems. For example, Your Work is About to Change Forever (about a bit-coin-type financial future without banks or traditional currencies); Innovation Driving Values (how secure and transparent recording of property rights like land deeds can drive economic progress in the developing world); Blockchain Goes to Work (how this software can enable gig economy workers to monetize their work time in a global marketplace); Data Privacy & Accuracy During the Coronavirus (how a widely accessible global ledger that records accurate virus-related information can reduce misinformation); and, with some interesting echoes today, a 2017 post called Wish Fulfillment (about why a small social media platform called Steem-It was built on blockchain software).    
 
Last Tuesday, the New York Times ran an article titled: They Found a Way to Limit Big Tech’s Power: Using the Design of Bitcoin. That “Design” in the title was blockchain software. The piece highlighted:

a growing movement by technologists, investors and everyday users to replace some of the internet’s basic building blocks in ways that would be harder for tech giants like Facebook or Google [or, indeed, anyone outside of these self-contained platforms] to control.

Among other things, the article described how those “old” internet building blocks would be replaced by blockchain-driven software, enabling social media platforms that would be the successors to the one that Steem-It built several years ago. However, while Steem-It wanted to provide a safe and reliable way to pay contributors for their social media content, in this instance the over-riding drive is “to make it much harder for any government or company to ban accounts or delete content.” 

It’s both an intoxicating and a chilling possibility.

While the Times reporter hinted about the risks with ominous quotes and references to the creation of “a decentralized web of hate,” it’s worth noting that nothing like it has materialized, yet. Also implied but never discussed was the urgency that many feel to avoid censorship of their minority viewpoints by people like Twitter’s Jack Dorsey or even the New York Times editors who effectively decide what to report on and what to ignore. So what’s the bottom line in this tech-enabled tug-of-war between political forces?

The public square that we occupy daily—for communication and commerce, family connection and dissent—a public square that the dominant social media platforms largely provide, cannot (and must not) be governed by @Jack, the sensibilities of mainstream media, or any group of esteemed private citizens like Facebook’s recently appointed Oversight Board. One of the most essential roles of government is to maintain safety and order in, and to set forth the rules of the road for, our public square. Because blockchain-enabled social networks will likely be claiming more of that public space in the near future—even as they strive to evade its common obligations through encryption and otherwise—government can and should enforce the rules for this brave new world.

Until now, our government has failed to confront either on-line censorship or its foreseeable consequences. Because our on-line public square has become (in a few short years) as essential to our way of life as our electricity or water, its social media and similar platforms should be licensed and regulated like those basic services, that is, like utilities—not only for our physical safety but also for the sake of our democratic institutions, which survived their most recent tests but may not survive their next ones if we fail to govern ourselves and our awesome technologies more responsibly.

In this second tug-of-war, we don’t have a moment to lose.

This post was adapted from my January 31, 2021 newsletter. Newsletters are delivered to subscribers’ in-boxes every Sunday morning. You can sign up by leaving your email address in the column to the right.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself Tagged With: app tracking transparency, Apple, autonomy, blockchain, censorship, commons, content monitoring, facebook, freedom of on-line assembly, human tech, privacy, privacy controls, privacy nutrition label, public square, social media platforms

Finding the Will to Protect Our Humanity

December 16, 2019 By David Griesing

I want to share with you a short, jarring essay I read in the New York Times this week, but first a little background. 
 
For some time now, I’ve been worrying about how tech companies (and the technologies they employ) harm us when they exploit the very qualities that make us human, like our curiosity and pleasure-seeking. Of course, one of the outrages here is that companies like Google and Facebook are also monetizing our data while they’re addicting us to their platforms. But it’s the addiction-end of this unfortunate deal (and not the property we’re giving away) that bothers me most, because it cuts so close to the bone. When they exploit us, these companies are reducing our autonomy–or the freedom to act that each of us embodies. 
 
Today, it’s advertising dollars from our clicking on their ads, but  tomorrow, it’s mind-control or distraction addiction: the alternate (and equally terrible) futures that George Orwell and Aldous Huxley were worried about 80 years ago in the cartoon essay I shared with you a couple of weeks ago.
 
In “These Tech Platforms Threaten Our Freedom,” a post from exactly a year ago, I tried to argue that the price for exchanging our personal data for “free” search engines, social networks and home deliveries is giving up more and more control over our thoughts and willpower. Instead of responding “mindlessly” to tech company come-ons, we could pause, close our eyes, and re-think our knee-jerk reactions before clicking, scrolling, buying and losing track of what we should really want. 
 
But is this mind-check even close to enough?
 
After I considered the addictive properties of on-line games (particularly for adolescent boys) in a post last March, my answer was a pretty emphatic “No!”  Games like Fortnite are using the behavioral information they siphon from young players to reduce their ability to exit the game and start eating, sleeping, doing homework, going outside or interacting (live and in person) with friends and family.
 
But until this week, I never thought that maybe our human brains aren’t wired to resist the distracting, addicting and autonomy-sapping power of these technologies. 
 
Maybe we’re at the tipping point where our “fight or flight” instincts are finally over-matched.
 
Maybe we are already inhabiting Orwell’s and Huxley’s science fiction. 
 
(Like with global warming, I guess I still believed that there was time for us to avoid technology’s harshest consequences.)
 
When I read Tristan Harris’s essay “Our Brains Are No Match for Our Technology” this week, I wanted to know the science, instead of the science fiction, behind its title. But Harris begins with more of a conclusion than a proof, quoting one of the late 20th Century’s most creative minds, Edward O. Wilson. When asked a decade ago whether the human race would be able to overcome the crises that will confront us over the next hundred years, Wilson said:

Yes, if we are honest and smart. [But] the real problem of humanity is [that] we have Paleolithic emotions, medieval institutions and godlike technology.

Somehow, we have to find a way to reduce this three-part dissonance, Harris argues. But in the meantime, we need to acknowledge that “the natural capacities of our brains are being overwhelmed” by technologies like smartphones and social networks.

Even if we could solve the data privacy problem, these technologies would still reduce humanity to distraction by encouraging our self-centered pleasures and stoking our fears. Echoing Huxley in Brave New World, Harris argues that “[o]ur addiction to social validation and bursts of ‘likes’ would continue to destroy our attention spans.” Echoing Orwell in Animal Farm, Harris is equally convinced that “[c]ontent algorithms would continue to drive us down rabbit holes toward extremism and conspiracy theories.” 

While technology’s distractions reduce our ability to act as autonomous beings, its impact on our primitive brains also “compromises our ability to take collective action” with others.

[O]ur Paleolithic brains aren’t built for omniscient awareness of the world’s suffering. Our online news feeds aggregate all the world’s pain and cruelty, dragging our brains into a kind of learned helplessness. Technology that provides us with near complete knowledge without a commensurate level of agency isn’t humane….Simply put, technology has outmatched our brains, diminishing our capacity to address the world’s most pressing challenges….The attention [or distraction] economy has turned us into a civilization maladapted for its own survival.

Harris argues that we’re overwhelmed by 24/7 genocide, oppression, environmental catastrophe and political chaos; we feel “helpless” in the face of the over-load; and our technology leaves us high-and-dry instead of providing us with the means (or the “agency”) to feel that we could ever make a difference. 
 
Harris’s essay describes technology’s assault on our autonomy—on our free will to act—but he never describes or provides scientific support for why our brain wiring is unable to resist that assault in the first place. It left me wondering: are all humans susceptible to distraction and manipulation from online technologies or just some of us, to some extent, some of the time? 
 
Harris heads an organization called the Center for Humane Tech, but its website (“Our mission is to reverse human downgrading by realigning technology with our humanity”) only scratches the surface of that question. 
 
For example, it links to a University of Chicago study involving the distraction that’s caused by smartphones we carry with us, even when they’re turned off. These particular researchers theorized that having these devices nearby “can reduce cognitive capacity by taxing the attentional resources that reside at the core of both working memory and fluid intelligence.”  In other words, we’re so preoccupied when our smartphones are around that our brain’s ability to process information is reduced. 
 
I couldn’t find additional research on the site, but I’m certain there was a broad body of knowledge fueling Edward O. Wilson’s concern, ten years ago, about the misalignment of our emotions, institutions and technology. It’s the state of today’s knowledge that could justify Harris’s alarm about what is happening when “our Paleolithic brains” confront “our godlike technologies,” and I’m sure he’s familiar with these findings. But that research needs to be mustered and conclusions drawn from it so we can understand, as an impacted community, the risks that “our brains” actually face, and then determine together how to protect ourselves from them. 

To enable us to reach this capable place, science needs to rally (as it did in an open letter about artificial intelligence and has been doing on a daily basis to confront global warming) and make its best case about technology’s assault on human autonomy. 
 
If our civilization is truly “maladapted to its own survival,” we need to find our “agency” now before any more of it is lost. But we can only move beyond resignation when our sense of urgency arises from a well-understood (and much chewed-upon) base of knowledge. 

This post was adapted from my December 15, 2019 newsletter. When you subscribe, a new newsletter/post will be delivered to your inbox every Sunday morning.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself, Daily Preparation Tagged With: agency, Aldous Huxley, autonomy, distraction, free will, George Orwell, human tech, humane technology, instincts, on-line addiction, technology, Tristan Harris

Blockchain Goes to Work

May 20, 2019 By David Griesing

This week I’ve re-worked a post from last August as the first of a two-part consideration of the future of work. Today, it’s envisioning a workforce where more of us will be working for ourselves, selling increments of our time and talent in what amounts to a series of paying jobs. While it’s a response to the loss of “traditional jobs” to automation, it also holds the promise of greater autonomy, abundance and prosperity if we choose to value the right things by standing up for and safeguarding our human priorities along the way.

The future of work is being designed today. Perhaps the most exciting part is that each one of us has a role to play–is part of a broader negotiation–about how that future should unfold.

1. An Optimistic Vision

The future of work has never looked more abundant, although many don’t see it that way.
 
Some are busy projecting job losses from automation and brain-replacing artificial intelligence, telling us we’ll all be idled and that much poorer for it. Or they’re identifying the brainpower careers that will remain so we can point ourselves or our tuition payments in their direction. For these forecasters, the future of work is at best the pursuit of diminishing returns.
 
Some of the most pessimistic (or politically ambitious) among them have been formulating universal income plans to replace today’s more limited safety nets. They tell us that a stipend like this will liberate us to pursue our passions since new government checks will cover our basic necessities. This seems misguided to me. As George Orwell noted, some utopians simply cannot “imagine happiness except in the form of relief, either from effort or pain.”
 
An alternate vision focuses on innovations that could enable us to do more and better work while unlocking greater prosperity. 
 
One of the enabling technologies that is already ushering in this future is blockchain. Just as the protocols for transmitting data across digital networks led to the Internet, blockchain-based software applications could fundamentally change the ways that we work.
 
A blockchain is a web-based chain of connections, most commonly with no central monitor or regulator. The technology enables every block in the chain to record data that can be seen and reviewed by every other block, maintaining its accuracy through its security protections and transparency. Everyone with access can see what every other connection has recorded in a digital ledger or transaction log. The need for and costs of a “middleman” (like a bank) and other impediments (like legal and financial gatekeepers) are avoided. Unlike traditional recordkeeping, there is no central database for meddlers to corrupt.
 
Blockchain technology supports the sale and use of digital currencies (like bitcoin) and just as importantly, “smart contracts” that enforce the rules about how value is exchanged by parties when they reach agreement. Ethereum utilizes its blockchain platform to host most of the projects that attract, manage and pay for time and talent in decentralized ways today. Tantalizing glimpses into this future are also available at the social network Steemit and on the payment platform Bitwage. 
 
Steemit uses a digital currency called Steem that you can redeem for cash for your contributions to the social network’s “hivemind.” For example, users are paid for their posts, for the number of people liking their posts, and for how quickly they spot another post that becomes popular, that is, for the value of their contributions to the network. Users are funding jobs like travel blogging while they crisscross the world and, reportedly, one early adopter has already earned more than a million dollars worth of Steem. In more traditional buying-and-selling transactions, Bitwage’s payment application allows employees or freelancers to receive their wages in bitcoin without requiring either their employers or clients to use a digital currency exchange. 
 
For work-based ecosystems built on blockchains to evolve further, they will need to become faster and more scalable without sacrificing the security and decentralization that are their hallmarks. In this pursuit, Ethereum and a raft of competitors are experimenting with a protocol called Lightning that can settle millions of digital currency transactions more quickly and cheaply but that needs “to go off the blockchain” in order to do so. These companies are also exploring structural changes to basic blockchain technology. The prize that drives them is an online platform that is durable enough to support a global marketplace where every kind of work can be bought and sold. 
 
Let’s call it a work2benefit exchange. 
 
Because your time and talent have value and are in limited supply, you could sell them in a market that’s vibrant enough to buy them. A blockchain-based exchange might easily handle transactions that involve very small as well as larger, project-oriented jobs. Because you have capabilities that you’ve sold before and others that you’ve given away because there was no way to be compensated, an exchange like this could help secure prior income streams while providing you with new ones. Such a marketplace would easily dwarf Walmart’s in size without the downsides of a company middleman taking his profits, making you keep his work schedule, commute to his place of business or contribute to his overhead. 
 
Previously unrealized income streams—even small ones—will be particularly welcome.
 
Suppose you’re asked to provide 5 minutes of feedback on your recent doctor’s visit. Your scarce resources are the time and judgment that you might not provide if you weren’t being paid for them. Their one-time value might be modest, but as the demands for your input keep coming, payments for it will add up. A blockchain exchange could pay you for editing a resume in 20 minutes or designing a company’s logo in 2 hours; providing traffic-cam information on heavily traveled routes you are already taking; matchmaking acquaintances with service providers that have something they need; selling your personal data to marketers who want you to buy their products;  maybe even a government incentive for completing your tax returns or voting in the next election. Similarly, when I need the benefit of someone else’s work, this marketplace could connect me to it, even if the time and talent is half a world away.
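To make the mechanics a bit more concrete, here is one way a single micro-job on such an exchange might be modeled. This is a speculative sketch in Swift; the names and rules are my own inventions, since, as noted below, no such exchange exists yet. The point is the “smart contract” logic: payment is locked up when the job is accepted and released automatically when the agreed condition is met, with no middleman deciding.

```swift
import Foundation

// A speculative model of a single micro-job on a work2benefit exchange.
// The "smart contract" idea it illustrates: the buyer's payment is escrowed
// when the job is accepted and then released by rule, not by a middleman.
struct MicroJob {
    enum State { case escrowed, paid, refunded }

    let buyer: String
    let seller: String
    let jobDescription: String
    let price: Decimal              // e.g. a few dollars for 20 minutes of editing
    let deadline: Date
    var state: State = .escrowed    // the buyer's funds are locked up at acceptance

    // The seller submits the finished work and the agreement settles itself.
    mutating func deliver(at time: Date = Date()) {
        guard state == .escrowed else { return }
        state = time <= deadline ? .paid : .refunded
    }
}

var job = MicroJob(buyer: "patient-407",
                   seller: "reviewer-19",
                   jobDescription: "5 minutes of feedback on a doctor's visit",
                   price: 2,
                   deadline: Date().addingTimeInterval(24 * 60 * 60))
job.deliver()
print(job.state)   // .paid if delivered in time, .refunded if not
```

On a real blockchain-based exchange, a rule like this would be enforced by code that every node runs and no single party can change, which is what gives a smart contract its bite.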
 
Work2benefit exchanges that can handle incremental transactions like these haven’t been built yet, let alone populated by enough buyers and sellers to make them viable—but they’re coming. You’ll still need your judgment, vision and hustle, but before long it will be possible to make a living in a marketplace where you (and maybe billions of others) will each be blocks in a global blockchain. Many people will continue to work in groups. Offices and factories won’t vanish.  But traditional jobs that once came with pensions, health benefits and provable credit will become increasingly scarce. The stripped-down, “independent contractor” work that’s left will almost certainly be supplemented by new ways of getting paid for your human resources. 
 
Blockchain and related technologies will unlock new categories of personal wealth and autonomy. They could fill the future of work with greater abundance for us to share with one another. Tomorrow’s challenge won’t be finding enough work to make a living but reimagining and re-bundling job securities like health care and creditworthiness around all the new jobs we’ll be doing. Next week, I’ll introduce you to some of the people and companies that are helping to build these protections around our increasingly autonomous workforce. 

2. The Future Begins With a Vision

A vision should linger and inspire for long enough that it fixes in the mind’s eye, where it becomes part of the imagination, a cause for hope, and the fuel that’s needed to overcome the obstacles that will always stand in its way. Here, in brief, are some of the challenges that a bold-enough vision will need to see us through, starting with the inevitable turf wars and technology challenges:
 
-There is resistance from the mainstream banking community to digital currencies and the exchanges that convert them into cash for gig economy paychecks. For example, a story in today’s Wall Street Journal chronicles the banking controversy that has already embroiled one digital currency exchange. Some of the current banking industry will need to be disrupted so that new “fin-tech” mechanisms can take their place.
 
-There are technology challenges to making digital platforms large enough to handle the smart contracts that will bring all these new buyers and sellers of work together. The ecosystem of applications will need to be robust enough to attract, manage and compensate the sale of goods and talent in a global marketplace. To meet these challenges, new applications are being developed outside of blockchain’s architecture (with its attendant security risks and middleman costs) while some of the fundamentals behind blockchain technology itself are being reconsidered. If you’re interested in a deeper dive, more about blockchain’s “scalability” hurdles can be found here.
 
-Managing yourself to a stable, reliable income from many jobs in a way that meets your needs and your family’s needs requires its own expertise. The freedom to decide when to work and how often to work is liberating, but as the recent strikes by Uber drivers illustrate, it isn’t easy to cobble a patchwork of compensated time together “into a living” while also selling your services at “a market price.”  We’ll all have to learn more about how to put our livelihoods together while finding new ways to bargain effectively for what we need from each one of our work-based exchanges.
 
-Not everyone is naturally suited to be an entrepreneur, so we’ll have to learn how to embrace additional parts of our entrepreneurial spirit too. Working for yourself involves not only doing your paying jobs but also functioning as your own back and front office: doing your marketing, accounting and taxes, establishing and monitoring your co-working relationships, maintaining your skill levels, and setting the prices for your goods and services. Most 9-5 jobs didn’t require you to do all these things, but as jobs like this disappear, you’ll be doing more of them yourself—with both the upsides and downsides that new opportunities for growth and mastery can bring.
 
Thinking through the hurdles hopefully reminds us of the promises. We’ll thrive with greater freedom, convenience and efficiency by working where, when and how we want to. We’ll be paid for increments of our time that we used to give away for free. We’ll increasingly stand both behind our work and out in front of it in ways that will make “what we do” an even more powerful demonstration of who we are and what is important to us. 
 
This future of work is being written today. 

We’re building it with our ideas and conversations as new ecosystems gradually evolve around it.

What comes next will be exciting and daunting, both creative and destructive, as the familiar is replaced by something that few of us have experienced before. 
 
This future can have a human face, an opportunity for workers, families and communities to flourish, as long as we don’t leave the ideas and conversations about how that can happen to someone else.

This post was adapted from my May 19, 2019 newsletter. When you subscribe, a new newsletter/post will be delivered to your inbox every Sunday morning. 

Filed Under: *All Posts, Continuous Learning, Entrepreneurship, Introducing Yourself & Your Work Tagged With: autonomy, Bitwage, blockchain, blockchain scalability, crypto currency, digital currency, entrepreneurship, future of work, gig economy, gig workers, gig workforce, independent contractor, smart contracts, Steemit

These Tech Platforms Threaten Our Freedom

December 9, 2018 By David Griesing

We’re being led by the nose about what to think, buy, do next, or remember about what we’ve already seen or done.  Oh, and how we’re supposed to be happy, what we like and don’t like, what’s wrong with our generation, why we work. We’re being led to conclusions about a thousand different things and don’t even know it.

The image that captures the erosion of our free thinking by influence peddlers is the frog in the saucepan. The heat is on, the water’s getting warmer, and by the time it’s boiling it’s too late for her to climb back out. Boiled frog, preceded by pleasantly warm and oblivious frog, captures the critical path pretty well. But instead of slow cooking, it’s shorter and shorter attention spans, the slow retreat of perspective and critical thought, and the final loss of freedom.

We’ve been letting the control booths behind the technology reduce the free exercise of our lives and work and we’re barely aware of it. The problem, of course, is that the grounding for good work and a good life is having the autonomy to decide what is good for us.

This kind of tech-enabled domination is hardly a new concern, but we’re wrong in thinking that it remains in the realm of science fiction.

An authority’s struggle to control our feelings, thoughts and decisions was the theme of George Orwell’s 1984, which was written some 35 years before the fateful year that he envisioned. “Power,” said Orwell, “is in tearing human minds to pieces and putting them together again in new shapes of your own choosing.” Power persuades you to buy something when you don’t want or need it. It convinces you about this candidate’s, that party’s or some country’s evil motivations. It tricks you into accepting someone else’s motivations as your own. In 1984, free wills were weakened and constrained until they were no longer free. “If you want a picture of the future,” Orwell wrote, “imagine a boot stamping on a human face—for ever.”

Maybe this reflection of the present seems too extreme to you.

After all, Orwell’s jackbooted fascists and communists were defeated by our Enlightenment values. Didn’t the first President Bush, whom we buried this week, preside over some of it? The authoritarians were down and seemed out in the last decade of the last century—Freedom Finally Won!—which just happened to be the very same span of years when new technologies and communication platforms began to enable the next generation of dominators.

(There is no true victory over one man’s will to deprive another of his freedom, only a truce until the next assault begins.)

20 years later, in his book Who Owns the Future (2013), Jaron Lanier argued that a new battle for freedom must be fought against powerful corporations fueled by advertisers and other “influencers” who are obsessed with directing our thoughts today.

In exchange for “free” information from Google, “free” networking from Facebook, and “free” deliveries from Amazon, we open our minds to what Lanier calls “siren servers,” the cloud computing networks that drive much of the internet’s traffic. Machine-driven algorithms collect data about who we are to convince us to buy products, judge candidates for public office, or determine how the majority in a country like Myanmar should deal with a minority like the Rohingya.

Companies, governments, and groups with good and bad motivations use our data to influence our future buying and other decisions on technology platforms that didn’t even exist when the first George Bush was president but now, barely a generation later, seem indispensable to nearly all of our commerce and communication. Says Lanier:

When you are wearing sensors on your body all the time, such as the GPS and camera on your smartphone and constantly piping data to a megacomputer owned by a corporation that is paid by ‘advertisers’ to subtly manipulate you…you are gradually becoming less free.

And all the while we were blissfully unaware that this was happening because the bath was so convenient and the water inside it seemed so warm. Franklin Foer, who addresses tech issues in The Atlantic and wrote 2017’s World Without Mind: The Existential Threat of Big Tech, talks about this calculated seduction in an interview he gave this week:

Facebook and Google [and Amazon] are constantly organizing things in ways in which we’re not really cognizant, and we’re not even taught to be cognizant, and most people aren’t… Our data is this cartography of the inside of our psyche. They know our weaknesses, and they know the things that give us pleasure and the things that cause us anxiety and anger. They use that information in order to keep us addicted. That makes [these] companies the enemies of independent thought.

The poor frog never understood that accepting all these “free” invitations to the saucepan meant that her freedom to climb back out was gradually being taken away from her.

Of course, we know that nothing is truly free of charge, with no strings attached. But appreciating the danger in these data-driven exchanges—and being alert to the persuasive tools that are being arrayed against us—are not the only wake-up calls that seem necessary today. We can (and should) also confront two other tendencies that undermine our autonomy while we’re bombarded with too much information from too many directions: our confirmation bias and what’s been called our illusion of explanatory depth.

Confirmation bias leads us to stop gathering information when the evidence we’ve gathered so far confirms the views (or biases) that we would like to be true. In other words, we ignore or reject new information, maintaining an echo chamber of sorts around what we’d prefer to believe. This kind of mindset is the opposite of self-confidence, because all we’re truly interested in doing outside ourselves is searching for evidence to shore up our egos.

Of course, the thought controllers know about our propensity for confirmation bias and seek to exploit it, particularly when we’re overwhelmed by too many opposing facts, have too little time to process the information, and long for simple black and white truths. Manipulators and other influencers have also learned from social science that our reduced attention spans are easily tricked by the illusion of explanatory depth, or our belief that we understand things far better than we actually do.

The illusion that we know more than we actually do extends to anything we can misunderstand. It arises because we consume knowledge widely but not deeply, and since that is rarely enough for real understanding, our egos fill in the gap. We all know that ignorant people are the most over-confident in their knowledge, yet we rarely recognize the extent of our own ignorance. I regularly ask people questions about all sorts of things they might know about, and as this year draws to a close I can count on one hand the number who have answered “I don’t know” over the past twelve months. Most have no idea how little understanding they bring to whatever they’re talking about. It’s simply more comforting to pretend that we have all of this confusing information fully processed and under control.

Luckily, for both confirmation bias and the illusion of explanatory depth, the cure is as simple as finding a skeptic and putting him on the other side of the conversation, someone who will hear us out and respond to or challenge whatever we’re saying. When our egos are strong enough for that kind of exchange, we have an opportunity to explain our understanding of the subject at hand. If, as often happens, the effort of explaining reveals how little we actually know, we are almost forced to become more modest about our knowledge and less attached to the biases that have taken hold of us. A true conversation like this can migrate from a polarizing battle of certainties into an opportunity to discover what we might learn from one another.

The more that we admit to ourselves and to others what we don’t know, the more likely we are to want to fill in the blanks. Instead of false certainties and bravado, curiosity takes over—and it feels liberating precisely because becoming well-rounded in our understanding is a well-spring of autonomy.

When we open ourselves like this instead of remaining closed, we’re less receptive to, and far better able to resist, the “siren servers” that would manipulate our thoughts and emotions by playing to our biases and illusions. When we engage in conversation, we also realize that devices like our cell phones and platforms like our social networks are, in Foer’s words, actually “enemies of contemplation” that are “preventing us from thinking.”

Lanier describes the shift from this shallow, tech-driven stimulus/response to a deeper assertion of personal freedom in a profile that was written about him in the New Yorker a few years back. Before he started speaking at a South by Southwest Interactive conference, Lanier asked his audience not to blog, text or tweet while he spoke. He later wrote that his message to the crowd had been:

If you listen first, and write later, then whatever you write will have had time to filter through your brain, and you’ll be in what you say. This is what makes you exist. If you are only a reflector of information, are you really there?

Lanier makes two essential points about autonomy in this remark. Instead of processing on the fly, where the dangers of bias and illusions of understanding are rampant, allow what is happening “to filter through your brain,” because when it does, there is a far better chance that whoever you really are, whatever you truly understand, will be “in” what you ultimately have to say.

His other point is about what you risk becoming if you fail to claim a space for your freedom to assert itself in your life and work. When you’re reduced to “a reflector of information,” are you there at all anymore, or merely reflecting the reality that somebody else wants you to have?

We all have a better chance of being contented and sustained in our lives and work when we’re expressing our freedom, but it’s gotten a lot more difficult to exercise it given the dominant platforms that we’re relying upon for our information and communications today.

This post was adapted from my December 9, 2018 newsletter.

Filed Under: *All Posts, Building Your Values into Your Work, Continuous Learning, Work & Life Rewards Tagged With: Amazon, autonomy, communication, confirmation bias, Facebook, Franklin Foer, free thinking, freedom, Google, illusion of explanatory depth, information, information overload, Jaron Lanier, tech, tech platforms, technology

