Will 2023 Be the Year AI Disrupts Healthcare?

January 23, 2023

Before we get to 2023, let’s talk about 2022—specifically about a technology that DIDN’T fulfill its promise last year: cryptocurrency.

It was a genuinely cataclysmic year for crypto. Read crypto blogger Molly White’s jaw-dropping list for the details on just how badly things have gone, but the latest disaster, the FTX collapse, is instructive. A young man widely seen as brilliant sold his vision of how cryptocurrency could be better (and safer) than banks. He sold the vision to investors, to crypto traders, and almost to Congress. Along the way, he told everyone he talked to that his goal was to make the world a better place (although it now appears the main person he was making it better for was himself).

Watching this debacle unfold, I was struck by some uncomfortable parallels. Take out some of the more lurid elements, like the polyamory at a Bahamian estate and the Ponzi-scheme vibe, and the story of crypto has more than a little in common with the story of AI in medicine.

To be clear: by most current accounts, Sam Bankman-Fried is a grifter and a con man, as are many other people selling crypto and its near relatives. I don’t think most people developing AI systems for healthcare are con artists. But, like the blockchain’s promise of “trustless banking” and “smart contracts”, the promises of most healthcare-related AI systems seem overhyped and always just over the horizon.

AI companies, like crypto companies, aren’t shy about making big promises. Take Viz.ai, for example. It has a brilliant founder (a literal brain surgeon) who is proposing an innovative way to apply AI (“intelligent care coordination”) and touting remarkable benefits, including that the product will “save time…[provide] better outcomes…increase access to care… [and] reduce length of stay.” Or BostonGene, valued at more than $2 billion, which says it uses “artificial intelligence…to personalize your therapy” and that its AI can “dramatically improve clinical outcomes”. Or Sensely, whose technology, it says, is capable of “increasing access…lowering costs…[and] improving health.”

Now, I’m not saying that these companies are committing fraud. But they provide zero evidence to back up their claims. Those of us in the industry know how high the standards for claims in healthcare are, and we understand why: healthcare is not an iPhone. Just ask Elizabeth Holmes. Standards are high because we need to be able to trust a claim that a new cancer immunotherapy extends life or that a novel treatment for multiple sclerosis reduces relapses.

AI has solved some real-world problems in healthcare. Transcribing a dictated patient note, which used to require a human typist and take days, now happens instantaneously. No need for an RCT to be sure that’s an improvement. But if you’re going to claim AI can, right now, today, improve health, I think you should have to prove it. For some tasks, like interpreting pathology slides, there is emerging evidence that a physician assisted by AI is better than a physician alone. Still, that evidence is uneven. A recent, sobering GAO report puts this in perspective, using the words “promise” or “promising” on almost every one of its 103 pages and giving a “20-30 year” timeframe for the fulfillment of most clinical benefits.

Innovation is important, so applying AI to a diverse set of problems is a good idea. And we live in a capitalist society, so trying to stop companies from making money with inflated promises is hard. But we do have laws, regulations, and agencies designed to protect us from the unfettered marketplace. A company that claims its AI will improve clinical outcomes is begging for action from the FDA.

Even more importantly, when these AI companies dangle the false hope of transformative change through software, they distract us from more important and well-established ways of improving health. The leading causes of death and disability in America are heart disease, cancer, COVID, and accidents (including drug overdoses), all of which are caused or exacerbated by poor access to care, inequality, and poverty. “Actual intelligence” hasn’t solved these problems, and I doubt artificial intelligence can. Social problems aren’t puzzles with solutions that can be discovered. These problems can only be “solved” collectively, when enough people see that the status quo is unacceptable and that dramatic (and often painful) change is the only way forward. That’s how we got the civil rights movement, Social Security, and the Affordable Care Act.

So, I’ll leave you with two thoughts: 2023 will not be the year AI fixes healthcare. And smart people selling something too good to be true are generally grifters, not geniuses.

I hope the FDA is more up to the challenge of regulating AI-based software than the SEC was to the challenge of regulating crypto. As for me, I will continue to believe that the hardest part of healthcare is the human part.

Dr. Michael Broder, a board-certified obstetrician and gynecologist, has 30 years’ experience in health economic and outcomes research. He received his research training in the Robert Wood Johnson Clinical Scholars Program at UCLA and RAND, attended medical school at Case Western Reserve University, and received his undergraduate degree from Harvard University.

In 2004, Dr. Broder founded PHAR, a clinically focused health economics and outcomes research consultancy. PHAR is a team of dedicated, highly trained researchers: individuals who are singularly focused on delivering high-quality health economics and outcomes research insights to the life science industry. PHAR has successfully conducted hundreds of studies resulting in more than 800 publications across a wide variety of therapeutic areas, and maintains an expansive network of collaborators, including 8 of the top 10 academic institutions in the US, as measured by NIH funding.

Unencumbered by corporate bureaucracy, PHAR can efficiently execute contracts and complete projects on time and on budget. PHAR prides itself on being reliable and responsive to clients’ changing needs, and on welcoming the challenge of tackling problems others can’t.
