AI needs more diversity to avoid data bias

AI has the potential to disrupt the healthcare industry and improve patient outcomes through faster diagnosis and more accurate, targeted treatment. But how AI algorithms are trained needs improvement, according to Naga Rayapati, founder and CEO/CTO of online marketplace GoGetter, who wrote about the issue for Forbes.

Namely, the data that AI systems are trained on needs to be more diverse to avoid bias in the algorithms. In healthcare, biased algorithms can inadvertently harm patients through discrimination.

“AI companies have a moral obligation to their customers, and to themselves, to actively address data bias,” Rayapati wrote.

Failing to address bias could have detrimental impacts on the AI space, including possible rejection of the technology and subpar products. Bias could also carry legal implications in the future, according to Rayapati.

While machine learning systems aren't biased in themselves, the data used to build their algorithms can carry built-in bias. For example, an AI system used to assist with sentencing recommendations disproportionately suggested stricter sentences for minorities, Rayapati wrote. To keep AI systems unbiased, the problem must be addressed when the data is collected and curated. Above all, the data must be diverse.
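
The article names no specific tools or methods, but one concrete way to act on that advice is to audit a training set's demographic makeup before a model is built. The sketch below is purely illustrative: it assumes a hypothetical tabular dataset with a "group" column standing in for a demographic attribute, and simply flags groups whose share of the data falls below a chosen threshold.

```python
# Illustrative sketch only -- the article does not prescribe any tooling.
# Assumes a hypothetical tabular training set with a "group" column
# standing in for a demographic attribute.
import pandas as pd


def audit_group_representation(df: pd.DataFrame,
                               group_col: str,
                               min_share: float = 0.10) -> pd.DataFrame:
    """Report each group's share of the dataset and flag under-represented groups."""
    shares = df[group_col].value_counts(normalize=True).rename("share").to_frame()
    shares["under_represented"] = shares["share"] < min_share
    return shares


if __name__ == "__main__":
    # Hypothetical example data; a real audit would run on the curated training set.
    data = pd.DataFrame({
        "group": ["A"] * 80 + ["B"] * 15 + ["C"] * 5,
        "outcome": [1, 0] * 50,
    })
    print(audit_group_representation(data, "group"))
```

A representation count is only one part of a bias audit, but it illustrates where in the pipeline the check belongs: at data collection and curation, before training begins.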
