3 ways to solve the bias problem in AI

As AI systems leave the laboratory and are implemented in real-world settings, bias will continue to be “an increasingly widespread problem.” So how can it be solved?

Researchers at IBM are currently working on automated bias detection algorithms to combat the problem, but the solution may not lie in AI alone. According to a report published in Forbes, the problem likely runs deeper: societal bias may be the actual problem.

Across the healthcare space, bias is already a well-documented issue.

Rumman Chowdhury, PhD, artificial intelligence lead at Accenture, noted that societal bias can still throw a wrench into situations where the data and algorithms themselves are clean. She listed three specific steps organizations can take to minimize the impact of societal biases.

  1. Examine the algorithms and ensure they are not coded in a way that extends bias.
  2. Consider whether AI itself can help reduce the risk of biased data—similar to what IBM is trying to accomplish (a minimal sketch of such a check follows this list).
  3. Regulate AI and design the proper parameters for it to operate within, teaching algorithms which data is valid, valuable and ethical to learn from.
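To make the second step concrete, the Python sketch below compares a model's positive-prediction rates across demographic groups and flags large gaps. The function names, toy data and the 0.8 "four-fifths" threshold are illustrative assumptions for this article, not IBM's method or any particular toolkit's API.

```python
# Minimal sketch of an automated bias check on model outputs.
# Assumptions: predictions are 1 (favorable) or 0 (unfavorable), group labels
# are known, and a disparate impact ratio below 0.8 is worth reviewing.
from collections import defaultdict


def selection_rates(predictions, groups):
    """Return the positive-prediction rate for each demographic group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}


def disparate_impact_ratio(predictions, groups, reference_group):
    """Ratio of each group's selection rate to the reference group's rate.

    Values well below 1.0 suggest the model favors the reference group.
    """
    rates = selection_rates(predictions, groups)
    ref_rate = rates[reference_group]
    return {g: rate / ref_rate for g, rate in rates.items()}


if __name__ == "__main__":
    # Toy example: model predictions (1 = approved) and group membership.
    preds = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
    groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

    for group, ratio in disparate_impact_ratio(preds, groups, "A").items():
        flag = "REVIEW" if ratio < 0.8 else "ok"
        print(f"group {group}: disparate impact ratio {ratio:.2f} ({flag})")
```

Running the toy example prints a ratio of 1.00 for group A and 0.67 for group B, flagging group B for review—the kind of automated signal such a check is meant to surface before a model is deployed.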


