David Griesing | Work Life Reward Author | Philadelphia


The Good Work of Getting What We Need As Patients

December 2, 2019 By David Griesing

Since recent posts here and here about work in healthcare—discussing burnout among health professionals, concerns about misuse of patient data, and questions about who is policing our rapidly changing health system—I’ve continued to follow developments in the field.  
 
Over the past few weeks, some of you have also shared your troubled reactions about how work in healthcare has been evolving.
 
The net of these developments is that while there are grounds for alarm about the uses of our health data, its proliferation presents some extraordinary opportunities too. Concepts like “precision medicine” become more realistic as the amount and quality of the data improves. More and better data will also help us to live longer and healthier lives. On the other hand, whether AI and other data-related technologies can enable us to improve the quality of the work experience for millions of healthcare professionals is a stubbornly open question.
 
In this last healthcare-related post for a while, there are two practical rules of thumb that might give us a greater sense of control over how our healthcare data is being used, as well as a couple of ways in which more and better health-related information is already producing better patient outcomes.
 
The good work of getting the healthcare that we need as patients (both for ourselves and for others that we’re caring for) requires healthy doses of optimism as well as pessimism, together with understanding as much as we can about when excitement or alarm is warranted.

Two Rules of Thumb To Inhibit Misuse of Our Medical Data

The first rule of thumb involves insisting upon as much transparency as possible around the uses of our medical information. That includes knowing who is using it (beyond the healthcare provider) and minimizing the risks of anyone’s misuse of it.

Unfortunately, more of this burden falls on patients today. As health systems increasingly look to their bottom lines, they may be less incentivized to protect our personal data streams. And even when our interests are aligned, doctors and hospitals may not be able to protect our data adequately. As I wondered here a couple of weeks ago: “Can these providers ‘master the learning curve’ [of big data-related technologies] quickly enough to prevent sophisticated consultants like Google from exploiting us, or will the fox effectively be running the chicken coop going forward?”

An article last weekend in the Wall Street Journal called “Your Medical Data Isn’t As Safe As You Think It Is” raised a couple of additional issues. As patients, we may be lulled into complacency by the fact that much of our data is rendered “anonymous” (or stripped of our personal identifiers) before it is shared in big databases. But as this article describes at length, “de-identified” data in the hands of one of the tech companies can easily be “triangulated” with other data they already have on you to track your medical information back to you. That means they remain able to target you personally in ways you can imagine and some you cannot.
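To make the “triangulation” concern concrete, here is a minimal, hypothetical sketch of a linkage attack in Python: joining a “de-identified” medical record to a named profile using quasi-identifiers like ZIP code, birth year, and sex. The data, field names, and matching rule are illustrative assumptions, not details from the Journal’s reporting.

```python
# Hypothetical illustration of re-identifying "de-identified" records by
# linking them to a named dataset on shared quasi-identifiers.
# All names, fields, and records below are invented for this sketch.

anonymized_lab_results = [
    {"zip": "19103", "birth_year": 1972, "sex": "F", "diagnosis": "type 2 diabetes"},
    {"zip": "19104", "birth_year": 1985, "sex": "M", "diagnosis": "hypertension"},
]

# Auxiliary data a large platform might already hold about named users.
identified_profiles = [
    {"name": "Jane Doe", "zip": "19103", "birth_year": 1972, "sex": "F"},
    {"name": "John Roe", "zip": "19104", "birth_year": 1985, "sex": "M"},
]

def triangulate(anon_rows, known_rows, keys=("zip", "birth_year", "sex")):
    """Join the two datasets wherever all quasi-identifiers match."""
    matches = []
    for anon in anon_rows:
        for known in known_rows:
            if all(anon[k] == known[k] for k in keys):
                matches.append({**known, "diagnosis": anon["diagnosis"]})
    return matches

for match in triangulate(anonymized_lab_results, identified_profiles):
    print(f'{match["name"]} -> {match["diagnosis"]}')
```

The point of the sketch is simply that once a data holder already knows your ZIP code, birth year, and sex, stripping your name from a medical record may not keep it anonymous for long.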

Moreover, even if it remains anonymous, your medical data “in a stranger’s hands” may still come back to haunt you. As one expert in data sharing observed, companies that monetize personal data currently provide very little information about their operations. That means we know some of the risks to us but are in the dark about others. Of the known risks around data dispersal, you may suddenly find yourself paying higher health-related insurance premiums or barred from obtaining any coverage at all:

Google will be in a good position to start selling actuarial tables to insurance companies—like predictions on when a white male in his 40s with certain characteristics might be likely to get sick and expensive. When it comes to life and disability insurance, antidiscrimination laws are weak, he says. ‘That’s what creates the risk of having one entity having a really godlike view of you as a person that can use it against you in ways you wouldn’t even know.’

Our first rule of thumb as customers in the health system is to insist upon transparency around how our providers are sharing our medical information, along with the right to prevent it from being shared if we are concerned about how it will be used or who will be using it.
 
The second rule of thumb has always existed in healthcare, but may be more important now than ever. You should always be asking: is my medical information going to be used in a way that’s good for me?  If it’s being used solely to maximize Google’s revenues, the answer is clearly “No.” But if your information is headed for a health researcher’s big data set, you should ask some additional questions: “Was someone like me considered as the study was being constructed so the study’s results are likely to be relevant to me?”  “Will I be updated on the findings so my ongoing treatment can benefit from them?” (More questions about informed consent before sharing your medical data were set forth in another article this past week.) 

Of course, understanding “the benefits to you beforehand” can also help you determine whether a test, drug or treatment program is really necessary, that is, if it’s possible to assess the pros and cons with your doctor in the limited time that you have before he or she orders it.
 
With medical practitioners becoming profit (or loss) centers for health systems that operate more like businesses, the good work of protecting yourself and your loved ones from misuse of your data requires both attention and vigilance at a time when you’re likely to be preoccupied by a range of other issues.

More and Better Data Is a Cause for Excitement Too

There is an outfit called Singularity University that holds an annual conference with speakers who discuss recent innovations in a range of fields. Its staff also posts weekly about the most exciting developments in technology on a platform called Singularity Hub. One of its recent posts and one of the speakers at its conference in September highlight why more and better medical data is also a cause for excitement.
 
To understand the promise of today’s medical data gathering, it helps to recall what medical information looked like until very recently. Most patient information stayed in medical offices and was never shared with anyone. When groups of patients were studied, the research results varied widely in quality and were not always reconciled with similar patient studies. Medicine advanced through peer reviewed papers and debates over relatively small datasets in scholarly journals. Big data is upending that system today.
 
For us as patients, the most exciting development is that more high quality data will give us greater control over our own health and longevity. This plays out in (at least) two ways.
 
In the first instance, big data will give each of us “better baselines” than we have today about our current health and future prospects. According to the Singularity Hub post, companies as well as government agencies are already involved in large-scale projects to:

measure baseline physiological factors from thousands of people of different ages, races, genders, and socio-economic backgrounds. The goal is to slowly build a database that paints a comprehensive picture of what a healthy person looks like for a given demographic…These baselines can then be used to develop more personalized treatments, based on a particular patient.

Although it sounds like science fiction, the goal is essentially “to build a digital twin of every patient,” using it in simulations to optimize diagnoses, prevention and treatments. It is one way in which we will have personalized treatment plans that are grounded in far more accurate baseline information than has ever been available before.
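As a rough illustration of the “better baselines” idea, the sketch below compares one patient’s measurement against the average for a matching demographic cohort. The cohort data, field names, and the single metric used are hypothetical; real baseline projects draw on far richer physiological data than this.

```python
# Hypothetical sketch: compute a demographic baseline and a patient's
# deviation from it. Data and thresholds are invented for illustration.

from statistics import mean

cohort = [
    {"age_band": "40-49", "sex": "M", "resting_hr": 62},
    {"age_band": "40-49", "sex": "M", "resting_hr": 58},
    {"age_band": "40-49", "sex": "M", "resting_hr": 66},
]

patient = {"age_band": "40-49", "sex": "M", "resting_hr": 81}

def demographic_baseline(records, age_band, sex, metric):
    """Average a metric over cohort members who match the patient's demographic."""
    values = [r[metric] for r in records
              if r["age_band"] == age_band and r["sex"] == sex]
    return mean(values)

baseline = demographic_baseline(cohort, patient["age_band"], patient["sex"], "resting_hr")
deviation = patient["resting_hr"] - baseline
print(f"Cohort baseline: {baseline:.1f} bpm; patient deviation: {deviation:+.1f} bpm")
```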
 
The second breakthrough will involve changes in what we measure, moving organized medicine from treatment of our illnesses to avoidance of most illnesses altogether and the greater longevity that comes with improved health. As these developments play out, it could become commonplace for more of us to live well beyond a hundred years.
 
At Singularity University’s conference two months ago, Dr. David Karow spoke about the data we should be collecting today to treat a broad spectrum of medical problems in their early stages and increase our life expectancy. He argues that his start-up, Human Longevity Inc., has a role to play in that future.
 
Four years ago, Karow conducted a trial involving nearly 1,200 presumably healthy individuals. In the course of giving them comprehensive medical checkups, he utilized several cutting-edge diagnostic technologies. These included whole-genome and microbiome sequencing, various biochemical measurements and advanced imaging. By analyzing the data, his team found a surprisingly large number of early-stage tumors, brain aneurysms, and cases of heart disease that could be treated before they produced any lasting consequences. In another 14% of the trial participants, significant, previously undetected conditions that required immediate treatment were discovered.
 
Karow’s argument is that we’re “not measuring what matters” today and that we should be “hacking longevity” with more pre-symptomatic diagnoses. For example, if testing indicates that you have the risk factors for developing dementia, you can minimize at least some of those risks now “because a third of the risks are modifiable.”
 
Every start-up company needs its evangelists, and Karow is selling “a fountain of youth” that “starts with a fountain of data.” This kind of personal data gathering is expensive today and not widely available, but it gestures toward a future where this sort of “deep testing” may be far more affordable and commonplace.
 
We need these promises of more personalized and preventative medicine—the hope of a better future—to have the stamina to confront the current risks of our medical data being monetized and misused long before we ever get there. As with so many other things, we need to hold optimism in one hand, pessimism in the other, and the ability to shuttle between them.

This post was adapted from my December 1, 2019 newsletter. When you subscribe, a new newsletter/post will be delivered to your inbox every Sunday morning.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself, Building Your Values into Your Work, Continuous Learning Tagged With: baseline health measures, big data, control of your data, data, ethics, health and longevity, health care industry, healthcare, misuse of patient data, pre-symptomatic diagnoses, work, work of being a patient

An Unhappy Healthcare System

November 19, 2019 By David Griesing

It came as no surprise. 

After writing about the growing challenges and responsibilities facing medical professionals and patients last week, I happened upon two posts about the burnout rates for the professionals who are charged with promoting health and healing in the rest of us.

The first was a PBS NewsHour segment about the extent of the problem and, more recently, some possible cures. It cited studies from recent years finding that doctors commit suicide at twice the rate of the general population, that 86 percent of the nurses at one hospital met the criteria for burnout syndrome, that 22 percent had symptoms of post-traumatic stress disorder, and that the PTSD numbers for critical care nurses were comparable to those of war veterans returning from Afghanistan and Iraq. The reporter described what is happening as “a public health crisis.” In a small ray of hope, providers have also begun to create outlets—like arts programs—so healthcare workers can “process some of the trauma” they are experiencing on a daily basis and begin to recover.

The second post, in Fast Company, discussed the most stressful jobs that are being done by women today, including the category “nurses, psychiatric healthcare providers and home health aides.” It noted that registered nurses are 14-16% more likely to have poor cardiovascular health than the rest of the workforce, “a surprising result” because the job is so physically active and nurses are more knowledgeable about the risk factors for cardiovascular disease than the workforce in general.

Several of you who work in the health care industry wrote to me this week about your experiences at work, which (sadly) mirror these discouraging reports.

The other follow-up development relates to the data that is being gathered from patients by the health care industry. Earlier this week, the Wall Street Journal reported that Google had struck “a secret deal” with Ascension, one of the nation’s largest hospital systems, to gather and analyze patient data including lab results, doctor diagnoses and hospital records. Called Project Nightingale by Google and “Project Nightmare” by others, the data extraction and analysis “amounts to a complete health history, including patient names and dates of birth.” Having all of our medical information instantly available for analysis in one place is clearly a game changer.

The first alarm bells sounded about Project Nightingale involved the privacy of patient data. (Indeed, the day after its initial report, the Journal reported that the government had launched an investigation into Google’s medical data gathering on the basis of these concerns.) Among the privacy-related questions: will access to a patient’s data be restricted to practitioners who are involved in improving that patient’s outcomes? If this data can be used by others, how will it be used and how is the hospital system ensuring that those uses are consistent with that provider’s privacy policies? The governing statute, the Health Insurance Portability and Accountability Act of 1996, provides only the loosest of restrictions today. (Hospitals can share data with business partners without telling patients as long as the information is used “only to help the covered entity carry out its health care functions.”) 

On the positive side, the aggregation of patient data can facilitate more accurate diagnoses and more effective patient treatment.

Google in this case is using the data in part to design new software, underpinned by advanced artificial intelligence and machine learning that zeroes in on individual patients to suggest changes to their care.

More troubling, given Medicine’s continued drift from “profession” to “business,” is how providers can realize more profits from their patients by prescribing more medications, tests and procedures. How can patients distinguish between what they truly need to promote their healing and what is profit-making by the health care provider? As the Journal story also reports:

Ascension, the second-largest health system in the U.S., aims in part to improve patient care. It also hopes to mine data to identify additional tests that could be necessary or other ways in which the system could generate more revenue from patients, documents show.

How will patients be protected from unnecessary interventions and expense, or, unlike today, be enabled by industry reporting about medical outcomes to protect themselves? As I argued last week, the ethical responsibilities for everyone in healthcare–including for patients–are shifting in real time.
 
Earlier this year, I posted (here, here and here) on a similar Google initiative regarding smart cities. In places like Toronto, the company is helping governments gather and “crunch” data that will allow their cities to operate “smarter” and provide greater benefits for their citizens from the efficiencies that are achieved. As with Project Nightingale, there are privacy concerns that Google is attempting to address. But there are also key differences between this tech giant’s plans for monetizing citizen data in smart cities and its plans for monetizing patient data in the medical system.
 
In healthcare, your most personal information is being taken and used. This data is far more vital to your personal integrity and survival than information about your local traffic patterns or energy usage.

Moreover, in smart cities there are governments and long-established regulatory bodies that can channel citizen concerns back to government and its tech consultants, like Google. Because these interfaces are largely absent in health care, monitoring and enforcement are up to individual patients or hospital-sponsored patients’ rights committees. In other words, if you (as a patient) aren’t “watching the store,” almost no one will be doing so on your behalf.
 
To this sort of concern, Google responds both early and often, “Trust us. We’ve got your interests at heart,” but there are many reasons to be skeptical. Another Fast Company article that was posted yesterday documented (with a series of links) some of Google’s recent history of mishandling user data.

Google has gotten in trouble with European lawmakers for failing to disclose how it collects data and U.S. regulators for sucking up information on children and then advertising to them. The company has exposed the data of some 52 million users thanks to a bug in its Google+ API, a platform that has been shutdown. Even in the field of health, it has already made missteps. In 2017, the U.K.’s Information Commissioner’s Office found the way patient data was shared between the Royal Free Hospital of London and [Google affiliate] DeepMind for a health project to be unlawful. The app involved…has since controversially been moved under the Google Health umbrella. More recently, a lawsuit accused Google, the University of Chicago Medical Center, and the University of Chicago of gross misconduct in handling patient records.

Much of moving forward here depends on trust.

Will health care providers, which suddenly have the profit-making potential of big data, protect us as patients or see us only as revenue generators?

Can these providers “master the learning curve” quickly enough to prevent sophisticated consultants like Google from exploiting us, or will the fox effectively be running the chicken coop going forward?

What will Google and the other data-gatherers do to recover trust that seems to be damaged almost daily wherever their revenues depend upon selling our data to advertisers and others who want to influence us?

Is Google’s business model simply incompatible with the business model that is evolving today in health care?

As the future of medicine gets debated, we all have a say in the matter.

This post was adapted from my November 17, 2019 newsletter. When you subscribe, a new newsletter/post will be delivered to your inbox every Sunday morning.

Filed Under: *All Posts, Being Part of Something Bigger than Yourself, Continuous Learning Tagged With: Ascension Health System, big data, Google, health care industry, medical professional burnout, nurse burnout, patient due diligence, patient privacy rights, Project Nightingale, provider profit motives, PTSD in medical profession, unnecessary medical treatments
