How well do we really want AI to know our minds?

AI is enabling healthcare workers to understand people’s moods with a level of accuracy that may be as unnerving as it is exciting.

It’s undeniably positive when technology can warn a nurse or a loved one of an impending but secret suicide attempt. But would you want a stranger working with a high-tech tool to get deeper inside your head than you can go yourself?

New York Times opinion columnist David Brooks takes up the quandary in a piece posted Monday.

“When people suffering from depression speak, the range and pitch of their voice tends to be lower,” Brooks writes. “There are more pauses, starts and stops between words. People whose voice has a breathy quality are more likely to reattempt suicide. Machines can detect this stuff better than humans.”

Computers can also pick up body-language cues to depression that people wouldn’t recognize—everything from heads moving less than usual to smiles falling faster into poker faces.

“The upshot is that we are entering a world in which people we don’t know will be able to understand the most intimate details of our emotional life by observing the ways we communicate,” Brooks warns. “You can imagine how problematic this could be if the information gets used by employers or the state.”

Read the whole thing:

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.

Around the web

The American College of Cardiology has shared its perspective on new CMS payment policies, highlighting revenue concerns while providing key details for cardiologists and other cardiology professionals.

As debate simmers over how best to regulate AI, experts continue to offer guidance on where to start, how to proceed and what to emphasize. A new resource models its recommendations on what its authors call the “SETO Loop.”

FDA Commissioner Robert Califf, MD, said the clinical community needs to combat health misinformation at a grassroots level. He warned that patients are immersed in a "sea of misinformation without a compass."
