Reality-checking 3 assumptions about healthcare AI

AI enthusiasts have varying aims and incentives for pushing the technology into healthcare, but many parrot a common set of justifications:

  • AI will take care of repetitive tasks so physicians can spend more time with patients.
  • By augmenting diagnostics and guiding treatments, AI will paradoxically personalize medicine.
  • Machine learning will increase efficiencies and cut costs.

That chorus gets checked for tone deafness in an opinion piece running in the May edition of the AMA Journal of Ethics.

“If these assumptions are not acknowledged and addressed now,” write Mathew Nagy, MPH, and Bryan Sisk, MD, “then novel technologies might exacerbate, rather than mitigate, current challenges” to “the healing patient-clinician relationship.”

Nagy is a second-year medical student at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University. Sisk is a third-year clinical fellow in pediatric hematology and oncology at Washington University School of Medicine in St. Louis.

Here are synopses of the three assumptions they put forth and flesh out in some detail:

Assumption 1: AI will relieve physicians of tedious work. Reality check: “[A]lthough many current tasks of clinical care might be offloaded to an algorithm in the future,” Nagy and Sisk write, “the time demand and intentional effort required to provide high-quality clinical care might not decrease and could in fact increase.”

Assumption 2: AI will increase administrative as well as clinical efficiencies, benefiting all stakeholders. Reality check: “If AI decreases the time required for a patient visit, the healthcare system might respond by increasing the volume of patients seen per day rather than allowing time for relationship development and shared decision making.”

Assumption 3: Equipped with more time and richer data, clinicians will be able to engage meaningfully in relationship-building activities with patients. Reality check: “[M]any clinicians report low confidence in their ability to engage in difficult or emotionally charged conversations as a reason for not engaging in shared decision making. Similarly, some clinicians avoid discussing their patient’s psychosocial concerns because they are unsure how to respond.”

“Without forethought and planning, the implementation of new technologies might diminish the patient-clinician relationship in the name of efficiency, accuracy or cost reduction,” Nagy and Sisk conclude. “As such, clinicians, technology developers, administrators and patient advocates should take steps to maintain the centrality of the healing relationship in medical care as AI technologies are developed and further integrated into the healthcare system.”

The AMA has posted the piece in full for free.

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
