How to avoid malpractice suits stemming from AI utilization—or lack thereof

Physicians who reject an AI recommendation that follows established care guidelines in order to provide more personalized treatment are at heightened risk of being sued for malpractice should the patient be harmed.

So are physicians who go along with AI when it recommends a nonstandard treatment and a bad patient outcome follows.   

The authors of an analysis published in JAMA break down the scenarios that practitioners using AI would do well to consider before hypotheticals become actual cases.

Because tort law is inherently conservative, note W. Nicholson Price II, JD, PhD, of the University of Michigan, and colleagues, “reliance on medical AI to deviate from the otherwise known standard of care will likely be a defense to liability well before physicians are held liable for rejecting AI recommendations.”

But physicians should watch for developments because the legal environment may change quickly, they add.

The authors offer several steps physicians should take now to cut their risk of legal exposure later, including:

1. Learn how to use and interpret AI algorithms in your practice. In the process, consider situations in which an available medical AI might be applied and how much confidence you might place in an algorithmic recommendation, Price and colleagues advise.

“This is a challenge, and evaluation tools are still very much under development,” they write.

At the same time, they point out, physicians can play a major role in shaping the discussion around liability and medical AI.

2. Encourage your professional organization(s) to take active steps to evaluate practice-specific algorithms. “Review by the FDA will provide some quality assurance, but societies will be well placed to provide additional guidelines to evaluate AI products at implementation and to evaluate AI recommendations for individual patients,” the authors write.

3. Push for high-level administrative efforts within your affiliated hospitals and health systems to guide AI deployments that align with clinical needs. “When external AI products are procured, physicians should advocate for safeguards to ensure that such products are rigorously vetted before procurement, just as with other novel medical devices,” Price and co-authors write.

4. Get your malpractice insurer to detail its coverage policies around the use of medical AI in practice. The authors suggest asking: Is care that relies on AI recommendations covered the same way as care provided without such recommendations, or does the insurer treat the two differently? Do its practices differ for more opaque algorithms that provide little or no reasoning?

“Collectively, physicians and their hospital systems may be able to make demands for changes in terms of insurance coverage to better accommodate the needs of a future of AI-enabled medicine,” they write.

As AI enters medical practice, physicians “need to know how law will assign liability for injuries that arise from interaction between algorithms and practitioners,” Price et al. conclude. “These issues are likely to arise sooner rather than later.”

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
