2024 Excellence in Technology Reporting, Medium Newsroom winner

Denied by AI: Consequences for Sick and Vulnerable Americans

About the Project

In an explosive, four-part investigative series, STAT revealed that health insurance companies are using a computer algorithm to deny care to seriously ill older and disabled patients — overriding the advice of their own doctors. Reporters Casey Ross and Bob Herman uncovered hidden details surrounding the use of an unregulated predictive algorithm to prematurely cut off rehab care for sick and injured Medicare Advantage beneficiaries. Relying on internal sources, confidential company documents, and court records, they exposed an explicit strategy by UnitedHealth Group, the nation’s largest health insurer, to increase profits in its Medicare Advantage business at the expense of vulnerable Americans. It’s a business strategy that may run afoul of Medicare’s coverage rules, and one that reduces older adults and people with disabilities to numbers. AI technology is supposed to make care more personalized for everyone. Instead, it’s doing the exact opposite.

This was the most important health technology story of 2023, surfacing the dangers of the rapid introduction of AI into medicine. Reporting it required gaining an understanding of the technical design of a proprietary algorithm whose training and application were opaque not only to regulators, but also to the patients on whom it was being used.

In addition to getting sources to provide documents and internal communications, the reporters also got them to demonstrate the use of the algorithm in real time, and on real patients. Patients’ names were hidden to avoid exposing private health information. But these demonstrations allowed STAT to report in detail how the algorithm was configured, what data it relied on, and how employees were using it in their workflows.

Each installment of the series contained new revelations: The first uncovered the very existence of the algorithm, developed by a UnitedHealth subsidiary called NaviHealth. That led to a story about NaviHealth whistleblowers who voiced concerns about how technology was usurping their medical discretion, especially in cases when patients obviously still needed rehab care.

UnitedHealth and NaviHealth repeatedly told us the algorithm was merely a guide — but the third story exposed that this was not the case at all. In fact, NaviHealth employees were pressured to adhere almost exactly to the algorithm’s predicted nursing home stays for patients, or face termination. The company set an overall target of keeping patient stays within 1% of the days projected by the algorithm. Our final story, based on employee tips about what they deemed an unethical practice, exposed UnitedHealth’s use of secret rules that prevented some patients, including those with cognitive impairment, from getting rehab in the first place.

At a time when the hype surrounding AI is at its highest, STAT’s series revealed the technology’s hidden dangers in health care, where its use didn’t help patients, but deprived them of their health, safety, and life savings at their most desperate hours.