Medical researchers must understand risks associated with AI

AI is poised to change the healthcare industry forever—but risks remain that researchers must take seriously. The need for such caution is one of the key takeaways from a new commentary piece published in The Hill on Jan. 16.

“AI can augment and improve the healthcare system to serve more patients with fewer doctors,” wrote author Enid Montague, PhD, an associate professor of computing at DePaul University and adjunct associate professor of general internal medicine at Northwestern University. “However, health innovators need to be careful to design a system that enhances doctors’ capabilities, rather than replace them with technology and also to avoid reproducing human biases.”

Due to physician shortages throughout the world and rising rates of burnout, Montague wrote, healthcare providers desperately need the help AI technologies can offer. However, she added, “AI systems can also cause problems.”

“Increased medical error is a real potential consequence of poorly designed AI in medicine,” she wrote.

Other potential problems related to AI include “complacency from physicians” and patients who are less engaged with their own healthcare, Montague added. She also emphasized the importance of AI being developed by a diverse workforce, noting that a group of all white men “may not be trained to think about bias comprehensively.”

Montague’s full commentary is available in The Hill.

Michael Walter, Managing Editor

Michael has more than 18 years of experience as a professional writer and editor. He has written at length about cardiology, radiology, artificial intelligence and other key healthcare topics.
