2024 The Al Neuharth Innovation in Investigative Journalism Award, Medium Newsroom winner

Denied by AI: Consequences for Sick and Vulnerable Americans

About the Project

In an explosive, four-part investigative series, STAT revealed that health insurance companies are using a computer algorithm to deny care to seriously ill older and disabled patients — overriding the advice of their own doctors. Reporters Casey Ross and Bob Herman uncovered hidden details surrounding the use of an unregulated predictive algorithm to prematurely end rehab care for sick and injured Medicare Advantage beneficiaries. Relying on internal sources, confidential company documents, and court records, they exposed an explicit strategy by UnitedHealth Group, the nation’s largest health insurer, to increase profits in its Medicare Advantage business at the expense of vulnerable Americans. It’s a business strategy that may breach Medicare’s coverage rules, and one that reduces older adults and people with disabilities to numbers. AI technology is supposed to make care more personalized for everyone. Instead, it’s doing the exact opposite.

This was the most important health care business story of the past 12 months, surfacing the dangers of the rapid introduction of AI into medicine at the same time that Medicare is undergoing a massive transformation — it is now mostly run by private health insurers rather than the government. This has dramatic implications not just for the more than 15 million people whose care is affected by the companies we investigated, but for all of us who will age into the program. Everyone turning 65 must now choose between traditional Medicare and a plan offered by a Medicare Advantage insurer. Ross and Herman documented that choosing the latter means you may not receive rehab care you’re entitled to when you need it most — sometimes with life-or-death consequences. Their stories gave voice to family members who had to watch sick relatives suffer at the hands of a faceless and unresponsive bureaucracy, whose decisions were bent by bias, profit motives, and bad math.

Each installment contained new revelations: The first uncovered the very existence of the algorithm, developed by a UnitedHealth subsidiary called NaviHealth. That led to a story about NaviHealth whistleblowers who voiced concerns about how the technology was usurping their medical discretion, especially in cases where patients obviously still needed rehab care.

UnitedHealth and NaviHealth repeatedly told us the algorithm was merely a guide — but the third story exposed how that was not the case at all. In fact, NaviHealth employees were pressured to adhere almost exactly to the algorithm’s predicted nursing home stays for patients, or face termination. The company set an overall target to keep patient stays within 1% of the days projected by the algorithm. Our final story, based on employee tips about what they deemed an unethical practice, exposed UnitedHealth’s use of secret rules that prevented some patients, including those with cognitive impairment, from getting rehab in the first place.

At a time when the hype surrounding AI is at its highest, STAT’s series revealed the technology’s hidden dangers in health care, where its use didn’t help patients, but deprived them of their health, safety, and life savings at their most desperate hours.