David Griesing | Work Life Reward Author | Philadelphia


The Good Work of Getting What We Need As Patients

December 2, 2019 By David Griesing

Since my recent posts here and here about work in healthcare—posts discussing burnout among health professionals, concerns about misuse of patient data, and questions about who is policing our rapidly changing health system—I’ve continued to follow developments in the field.
 
Over the past few weeks, some of you have also shared your troubled reactions about how work in healthcare has been evolving.
 
The net of these developments is that while there are grounds for alarm about the uses of our health data, its proliferation presents some extraordinary opportunities too. Concepts like “precision medicine” become more realistic as the amount and quality of the data improves. More and better data will also help us to live longer and healthier lives. On the other hand, whether AI and other data-related technologies can enable us to improve the quality of the work experience for millions of healthcare professionals is a stubbornly open question.
 
In this last healthcare-related post for a while, there are two practical rules of thumb that might give us a greater sense of control over how our healthcare data is being used, as well as a couple of ways in which more and better health-related information is already producing better patient outcomes.
 
The good work of getting the healthcare that we need as patients (both for ourselves and for others that we’re caring for) requires healthy doses of optimism as well as pessimism, together with understanding as much as we can about when excitement or alarm are warranted.

Two Rules of Thumb To Inhibit Misuse of Our Medical Data

The first rule of thumb involves insisting upon as much transparency as possible around the uses of our medical information. That includes knowing who is using it (beyond the healthcare provider) and minimizing the risks of anyone’s misuse of it.

Unfortunately, more of this burden falls on patients today. As health systems increasingly look to their bottom lines, they may be less incentivized to protect our personal data streams. And even when our interests are aligned, doctors and hospitals may not be able to protect our data adequately. As I wondered here a couple of weeks ago: “Can these providers ‘master the learning curve’ [of big data-related technologies] quickly enough to prevent sophisticated consultants like Google from exploiting us, or will the fox effectively be running the chicken coop going forward?”

An article last weekend in the Wall Street Journal called “Your Medical Data Isn’t As Safe As You Think It Is” raised a couple of additional issues. As patients, we may be lulled into complacency by the fact that much of our data is rendered “anonymous” (or stripped of our personal identifiers) before it is shared in big databases. But as this article describes at length, “de-identified” data in the hands of one of the tech companies can easily be “triangulated” with other data they already have on you to track your medical information back to you. That means they remain able to target you personally in ways you can imagine and some you cannot.
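The mechanics of that “triangulation” are simple enough to sketch. In this toy Python example (all names and data are invented for illustration), records stripped of names are linked back to individuals by matching on a handful of “quasi-identifiers”—ZIP code, birth year and sex—that also appear in a second dataset the company already holds:

```python
# Toy illustration of re-identification: "de-identified" medical records
# joined back to named profiles on quasi-identifiers. All data is invented.

anonymized_records = [
    {"zip": "19103", "birth_year": 1972, "sex": "F", "diagnosis": "type 2 diabetes"},
    {"zip": "19104", "birth_year": 1985, "sex": "M", "diagnosis": "hypertension"},
]

# A second dataset a platform might already hold (ad profiles, public records).
identified_profiles = [
    {"name": "Jane Doe", "zip": "19103", "birth_year": 1972, "sex": "F"},
    {"name": "John Roe", "zip": "19104", "birth_year": 1985, "sex": "M"},
]

def reidentify(anon, profiles):
    """Link each 'anonymous' record to a name when all quasi-identifiers match."""
    matches = []
    for record in anon:
        for person in profiles:
            if all(record[k] == person[k] for k in ("zip", "birth_year", "sex")):
                matches.append((person["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymized_records, identified_profiles))
# prints [('Jane Doe', 'type 2 diabetes'), ('John Roe', 'hypertension')]
```

Real-world linkage attacks are statistical rather than exact matches like this one, but the principle is the same: the more outside data a company holds about you, the less “anonymous” your stripped-down medical record really is.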

Moreover, even if it remains anonymous, your medical data “in a stranger’s hands” may still come back to haunt you. As one expert in data sharing observed, companies that monetize personal data currently provide very little information about their operations. That means we know some of the risks to us but are in the dark about others. Of the known risks around data dispersal, you may suddenly find yourself paying higher health-related insurance premiums or barred from obtaining any coverage at all:

Google will be in a good position to start selling actuarial tables to insurance companies—like predictions on when a white male in his 40s with certain characteristics might be likely to get sick and expensive. When it comes to life and disability insurance, antidiscrimination laws are weak, he says. ‘That’s what creates the risk of having one entity having a really godlike view of you as a person that can use it against you in ways you wouldn’t even know.’

Our first rule of thumb as customers in the health system is to insist upon transparency around how our providers are sharing our medical information, along with the right to prevent it from being shared if we are concerned about how it will be used or who will be using it.
 
The second rule of thumb has always existed in healthcare, but may be more important now than ever. You should always be asking: is my medical information going to be used in a way that’s good for me?  If it’s being used solely to maximize Google’s revenues, the answer is clearly “No.” But if your information is headed for a health researcher’s big data set, you should ask some additional questions: “Was someone like me considered as the study was being constructed so the study’s results are likely to be relevant to me?”  “Will I be updated on the findings so my ongoing treatment can benefit from them?” (More questions about informed consent before sharing your medical data were set forth in another article this past week.) 

Of course, understanding “the benefits to you beforehand” can also help you determine whether a test, drug or treatment program is really necessary, that is, if it’s possible to assess the pros and cons with your doctor in the limited time that you have before he or she orders it.
 
With medical practitioners becoming profit (or loss) centers for health systems that operate more like businesses, the good work of protecting yourself and your loved ones from misuse of your data requires both attention and vigilance at a time when you’re likely to be pre-occupied by a range of other issues.

More and Better Data Is a Cause for Excitement Too

There is an outfit called Singularity University that holds an annual conference with speakers who discuss recent innovations in a range of fields. Its staff also posts weekly about the most exciting developments in technology on a platform called Singularity Hub. One of its recent posts and one of the speakers at its conference in September highlight why more and better medical data is also a cause for excitement.
 
To understand the promise of today’s medical data gathering, it helps to recall what medical information looked like until very recently. Most patient information stayed in medical offices and was never shared with anyone. When groups of patients were studied, the research results varied widely in quality and were not always reconciled with similar patient studies. Medicine advanced through peer reviewed papers and debates over relatively small datasets in scholarly journals. Big data is upending that system today.
 
For us as patients, the most exciting development is that more high quality data will give us greater control over our own health and longevity. This plays out in (at least) two ways.
 
In the first instance, big data will give each of us “better baselines” than we have today about our current health and future prospects. According to the Singularity Hub post, companies as well as government agencies are already involved in large-scale projects to:

measure baseline physiological factors from thousands of people of different ages, races, genders, and socio-economic backgrounds. The goal is to slowly build a database that paints a comprehensive picture of what a healthy person looks like for a given demographic…These baselines can then be used to develop more personalized treatments, based on a particular patient.

Although it sounds like science fiction, the goal is essentially “to build a digital twin of every patient,” using it in simulations to optimize diagnoses, prevention and treatments. It is one way in which we will have personalized treatment plans that are grounded in far more accurate baseline information than has ever been available before.
 
The second breakthrough will involve changes in what we measure, moving organized medicine from treatment of our illnesses to avoidance of most illnesses altogether and the greater longevity that comes with improved health. As these developments play out, it could become commonplace for more of us to live well beyond a hundred years.
 
At Singularity University’s conference two months ago, Dr. David Karow spoke about the data we should be collecting today to treat a broad spectrum of medical problems in their early stages and increase our life expectancy. He argues that his start-up, Human Longevity Inc., has a role to play in that future.
 
Four years ago, Karow conducted a trial involving nearly 1,200 presumably healthy individuals. In the course of giving them comprehensive medical checkups, he utilized several cutting-edge diagnostic technologies. These included whole-genome and microbiome sequencing, various biochemical measurements and advanced imaging. By analyzing the data, his team found a surprisingly large number of early-stage tumors, brain aneurysms, and heart disease that could be treated before they produced any lasting consequences. In another 14% of the trial participants, significant, previously undetected conditions that required immediate treatment were discovered.
 
Karow’s argument is that we’re “not measuring what matters” today and that we should be “hacking longevity” with more pre-symptomatic diagnoses. For example, if testing indicates that you have the risk factors for developing dementia, you can minimize at least some of those risks now “because a third of the risks are modifiable.”
 
Every start-up company needs its evangelists, and Karow is selling “a fountain of youth” that “starts with a fountain of data.” This kind of personal data gathering is expensive today and not widely available, but it gestures towards a future where this sort of “deep testing” may be far more affordable and commonplace.
 
We need these promises of more personalized and preventative medicine—the hope of a better future—to have the stamina to confront the current risks of our medical data being monetized and misused long before we ever get there. As with so many other things, we need to hold optimism in one hand, pessimism in the other, and the ability to shuttle between them.

This post was adapted from my December 1, 2019 newsletter. When you subscribe, a new newsletter/post will be delivered to your inbox every Sunday morning.


Companies That Wear Their Values on Their Sleeves

March 31, 2019 By David Griesing

We lead with our values because it’s easier to connect with somebody who shares our priorities, but it’s a trickier proposition for companies that want our loyalty because we have rarely been so divided about what is important to us. 

Deepening divisions over our common commitments make Apple’s roll-out of a suite of new services this week both riveting—and potentially fraught.

As the company’s newly announced services like AppleTV+ and AppleNews+ tie themselves more closely to Hollywood and the cover stories that sell glossy magazines, is Apple cloaking itself in values that could alienate or confuse many of the customers it aims to bond with more closely?

In 1997, on the eve of launching a new, national advertising campaign, Steve Jobs gave a short talk about the link between Apple’s values and its customers. “At our core, we believe in making the world a better place,” he said.  So our ads will “honor people who have changed the world for the better—[because] if they ever used a computer, it would be a Mac.”  In its marketing, Apple aligned itself with tech innovators like Thomas Edison, a genius who had already changed the world as Jobs and Apple were about to do.

A little more than 20 years later, with Apple having followed in the footsteps of Edison’s light-bulb company to industry dominance, the question is whether it would have a better chance of preserving that dominance by once again aligning itself with technology innovators who have already changed the world, instead of with celebrities like Steven Spielberg and Oprah Winfrey, who aim to change it through their own approaches to social betterment. To set the stage for your possible reactions, here is a link, beginning with Spielberg and ending with Winfrey, that covers the highlights from Apple’s new services launch this past week in a vivid, all-over-the-place 15 minutes.

I should confess that I have a bit of a horse in this race because I want Apple to keep winning by bringing me world-class products and customer care, but I’m not sure that it can by pursuing services (like entertainment, news and games through the new AppleArcade) that put the company in lockstep with industries that muddy its focus and dilute its value proposition.

Instead of bringing me a global version of Oprah’s book club or more of Steven Spielberg’s progressive reminiscences, I was hoping to hear that Apple would be providing—even in stages over months or years—an elegantly conceived and designed piece of technology that would (1) allow me to cut the cable cord with my internet provider while (2) upgrading me to an interface where I could see and pay for whatever visual programming I choose whenever I want it. An ApplePay-enabled entertainment router. Now that would be the kind of tech innovation that would change my world for the better again (and maybe yours too) while staying true to its founder’s messaging from twenty-odd years ago.

Tech commentator Christopher Mims talked this week about why Apple was claiming values (like those embedded in Oprah’s notion of social responsibility) to maintain its market share, but he never really debated whether it should do so. I’d argue that Apple should continue to make its own case for the social benefits of its tech solutions instead of mixing its message about priorities with the aspirations of our celebrity culture.

When it comes to Silicon Valley’s mostly hypocritical claims about social responsibility, I start with the skepticism of observers like Anand Giridharadas. To him, Facebook, Google, Amazon and Apple all use values as tools to gain the profits they’re after, covering their self-serving agendas with feel-good marketing.

In a post last October, I discussed some of his observations about Facebook (and by implication, most of the others) in the context of his recent book, Winners Take All.

The problem, said Giridharadas, is that while these companies are always taking credit for the efficiencies and other benefits they have brought, they take no responsibility whatsoever for the harms… In their exercise of corporate social responsibility, there is a mismatch between the solutions that the tech entrepreneurs can and want to bring and the problems we have that need to be solved. “Tending to the public welfare is not an efficiency problem. The work of governing a society is tending to everybody. It’s figuring out universal rules and norms and programs that express the value of the whole and take care of the common welfare.” By contrast, the tech industry sees the world more narrowly. For example, the fake news controversy led Facebook not to a comprehensive solution for providing reliable information but to what Giridharadas calls “the Trying-to-Solve-the-Problem-with-the-Tools-that-Caused-It” quandary.

In the face of judgments like his, I’d argue that Apple should be vetting its corporate messaging with the inside guidance of those who understand the power of values before it squanders the high ground it still holds. 
 
Beyond “sticking with the tech innovations that it’s good at” and the Edison-type analogies that add to their luster, what follows are three proposals for how the company might build on its values-based loyalty while still looking us in the eye when it does so.
 
Each one has Apple talking about what its tech-appreciating customers still care about most when they think—with healthy nostalgia—about all the things that Apple has done for them already.

The fate that the company is aiming to avoid

1. Apple should keep reminding us about its unparalleled customer service

The unbelievable service I have come to expect from Apple feeds my brand loyalty. I feel that we share the value of trustworthiness. When someone relies on me for something, I stand behind it instead of telling them I don’t have the time or it’s too expensive to fix. For me, Apple has consistently done the same.
 
So I was surprised when I had to argue a bit harder than I thought was necessary for Apple’s battery fix for an older iPhone, and I started following other customer complaints against the company to see if a change of priorities was in the air. Since I’m writing to you on a MacBook Air, problems with the Air’s later-generation keyboards have apparently escalated to the point that national class-action litigation is in the offing. Not unlike the iPhone battery fix, Apple has gone on record as being willing to replace any sticky keyboard for free within 4 years of purchase, but is it really as easy as it sounds? As recently as last week, there was this plea to Apple from a tech reviewer in a national newspaper after profiling a litany of customer difficulties:

For any Apple engineers and executives reading: This is the experience you’re providing customers who shell out $1200 or more—sometimes a lot more. This is the experience after THREE attempts at this keyboard design.

When you are one of the richest companies in history, relatively inexpensive problems like this need to be solved before they get this far. A reputation for world-class customer service is a terrible thing to waste. Be glad to fix your technology on those rare occasions when it breaks down, and solve the technology problem with these keyboards before you sell or replace any more of them. Don’t make customers who were loyal enough to pay a premium for an Apple device take you to court because they can’t get enough of your attention any other way. Caring for your customers is a core value that needs polishing before its shine begins to fade and your customer loyalty slips away.

2. Apple should keep telling us how much it’s different from Google, Facebook and Amazon

The uses that the dominant tech platforms are making of our personal data are on everyone’s mind.
 
Beyond invasions of privacy, deep concerns are also being voiced about the impact of “surveillance capitalism” on Western democracy, not only because of meddling with our elections but, even more fundamentally, because of how this new economic model disrupts “the organic reciprocities involving employment and consumption” that undergird democratic market systems. These are profound and increasingly widespread concerns, and Apple for one seems to share them.
 
This is from another post last October called “Looking Out For the Human Side of Technology”: 

I was also struck this week by Apple CEO Tim Cook’s explosive testimony at a privacy conference organized by the European Union…:
 
‘Our own information—from the everyday to the deeply personal—is being weaponized against us with military efficiency. Today, that trade has exploded into a data-industrial complex.
 
These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded, and sold. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them. This should make us very uncomfortable.
 
Technology is and must always be rooted in the faith people have in it. We also recognize not everyone sees it that way—in a way, the desire to put profits over privacy is nothing new.’
 
‘Weaponized’ technology delivered with ‘military efficiency.’ ‘A data-industrial complex.’ One of the benefits of competition is that rivals call you out, while directing unwanted attention away from themselves…so Cook’s (and Apple’s) motives here have more than a dollop of competitive self-interest where [companies like] Google and Facebook are concerned. On the other hand, Apple is properly credited with limiting the data it makes available to third parties and rendering the data it does provide anonymous. There is a bit more to the story, however.
 
If data privacy were as paramount to Apple as it sounded this week, it would be impossible to reconcile Apple’s receiving more than $5 billion a year from Google to make it the default search engine on all Apple devices.

In its star-studded launch of TV, News and Arcade services this week, Apple’s presenters always reiterated that none of these services would be “ad selling models” targeting Apple users. These are good reminders about Apple’s values, but while $5B is a lot of revenue to forsake if new purchasers of Apple devices got to pick their own search engines, it’s also a significant amount of support for an antithetical business model.

Not selling my data to others for use against me and standing behind the functionality of my devices are core contributors to the company’s good reputation in my mind, and never more so than today. If Apple continues to differentiate itself from its competitors on the use of our data—and I think that it should—it needs to find ways to be more forthright about its own conflicts of interest while doing so.

When you stand on your values and back them up with your actions, the competitors you are challenging will always seek to undermine you wherever your actions are inconsistent with your words. Why let that inconsistency be tomorrow’s headline, Tim? Why not be ready to talk more forthrightly about the quandary with Google, and how the company is trying to address it, when asked to comment before a “gotcha” story like this gets published?

3. Apple should get ahead of its new services launch by proactively addressing likely problems with real consequences for the rest of us

In its roll-out presentation, Apple announced plans for a new service that will link players to games that Apple will be creating. Few tech areas have seen more startling impacts from the use of behavioral data that’s being gathered from players by those who are behind these on-line games. I recently talked here about how the programmers and monitors behind Fortnite and the updated Tetris games are using data about how players react to as many as 200 “persuasive design elements” in these games to enhance the intensity of the player experience while making it more addictive to boys and other susceptible populations. 

Apple’s engineers know about these issues already. Its programmers are making its new games ready for primetime as soon as next fall. To differentiate itself from others in the on-line gaming industry, to strike a more principled note than its competitors have, and to broaden the scope of Apple’s values when it comes to personal data, the company could tell us some or all of the following in the coming months:

-whether it will be using behavioral data it generates from players through real time play to make its games more absorbing or addictive;

-whether it intends to restrict certain classes of users (like pre-teen boys) from playing certain games or restrict the hours that they can play them;

-what other safeguards it will be implementing to limit the amount of “player attention” that these games will be capturing;

-whether it will be selling game-related merchandise in the Apple store so its financial incentives to encourage extensive game-playing are clear from the outset; and

-whether it will be using data about player behavior to encourage improved learning, collaborative problem-solving, community building and other so-called “pro-social” skills in any of the games it will be offering.

I have no reason to doubt that Apple is serious about protecting the user data that its devices and services generate. Its new venture into gaming provides an opportunity to build on Apple’s reputation for safeguarding the use of its customers’ information. Tim Cook and Apple need to be talking to the rest of us, both now and next fall, about how the company will be applying its data-related values to problems its customers care about today in the brave new world of on-line gaming.

+ + +

Apple’s stated values will hold its current customers and attract new ones when there is “a match” between the company’s solutions and the problems the rest of us have that need solving. Affiliation and loyalty grow when there are shared priorities in communities, in politics and in the marketplace.

That means Apple should keep talking about what its tech-appreciating customers care about most in the light of the wonders that Apple has given us already. Despite its recently announced forays into entertainment, the company should never take its eye too far from what it does best—which is to make world-changing devices—even when they take more time to develop than short-term financial performance seems to demand. 

When a company knows what it is and acts accordingly, it can always take risks for the rewards that can come from wearing its values on its sleeves. 

This post was adapted from my March 31, 2019 newsletter. You can subscribe (to the right) and receive it in your inbox every Sunday morning.

