Researchers are trying to make AI a force for good

A team of researchers from New York University is working to ensure that AI remains a force for good as the technology becomes more prevalent across many industries, including healthcare.

One of the biggest issues with AI as the technology continues to emerge is bias, and many in the industry have pointed to a need for greater diversity among the teams developing AI products and solutions to avoid it.

However, AI has other pitfalls, Time reported, including its abuse by disinformation campaigns and violations of privacy.

These dangers are why NYU's AI Now team of researchers is focusing on four challenges: rights and liberties, labor and automation, bias and inclusion, and safety and critical infrastructure. The group is studying the ethical issues of AI to better inform policymakers and regulators about the technology. While the researchers aren't the only ones looking into the area, the field of AI ethics "remains limited," Time reported, and efforts may be stymied by trade secrecy protections and laws.


Amy Baxter

Amy joined TriMed Media as a Senior Writer for HealthExec after covering home care for three years. When not writing about all things healthcare, she fulfills her lifelong dream of becoming a pirate by sailing in regattas and enjoying rum. Fun fact: she sailed 333 miles across Lake Michigan in the Chicago Yacht Club "Race to Mackinac."

Around the web

The American College of Cardiology has shared its perspective on new CMS payment policies, highlighting revenue concerns while providing key details for cardiologists and other cardiology professionals. 

As debate simmers over how best to regulate AI, experts continue to offer guidance on where to start, how to proceed and what to emphasize. A new resource models its recommendations on what its authors call the “SETO Loop.”

FDA Commissioner Robert Califf, MD, said the clinical community needs to combat health misinformation at a grassroots level. He warned that patients are immersed in a "sea of misinformation without a compass."
