Google AI getting better at heading off youth suicide

Following its entry into algorithmic suicide prevention through Google search this past spring, the Trevor Project is fine-tuning Google's AI technology by training algorithms on both initial conversations with counselors and the counselors' post-session risk assessments.

And while the California-based nonprofit targets self-destructive behaviors specifically among LGBTQ youths, the logic behind its platforms—voice, text and instant messaging—is probably generalizable to other population segments.

The Atlantic posted an update on the work July 15.

The Trevor Project’s leaders are aiming to build “an AI system that will predict what resources youths will need—housing, coming-out help, therapy—all by scanning the first few messages in a chat,” reports Atlantic staff writer Sidney Fussell. “Long term, they hope to evolve the AI so it can recognize patterns in metadata.”

Fussell notes that, in May, Google injected $1.5 million into the project.

Read his report:

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
