Vanderbilt leveraging AI to integrate voice assistants into EHRs

Vanderbilt University Medical Center (VUMC) is looking to “transform” electronic health records (EHRs) by using voice and AI technology that lets the EHR interact naturally with physicians, surface relevant information and, the center hopes, enhance patient care.

“Voice is the most natural way of communicating that we have as human beings. Before there were EHRs, before paper and pencil, we would communicate with our voices,” Yaa Kumah-Crystal, MD, PhD, an assistant professor of biomedical informatics and pediatric endocrinology, said during the HIMSS conference. “It’s almost surprising that we’re not leveraging it more as a communication modality to find out things from the EHR.”

Kumah-Crystal and VUMC software engineer Timothy Coffman discussed the medical center’s newly developed voice user interface prototype, the Vanderbilt EHR Voice Assistant (VEVA), and the effort to let clinicians communicate with EHRs by voice, during the conference last week in Orlando, Florida.

VEVA was designed to integrate naturally into EHRs. It uses natural language processing (NLP), a branch of AI that helps computers understand, interpret and manipulate human language, to listen to a user’s voice query, convert the speech to text, run the request through a language model, retrieve the relevant information from the record and relay it back to the physician.
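In broad strokes, that kind of voice-query pipeline looks something like the sketch below. The function names, the intent label and the sample lab value are illustrative assumptions for this article, not details of VUMC’s actual implementation; a real system would call a speech-recognition service and a trained language model where the stubs sit.

    from dataclasses import dataclass


    @dataclass
    class Patient:
        name: str
        last_hba1c: float  # hypothetical lab value pulled from the EHR


    def speech_to_text(audio: bytes) -> str:
        """Stub: a real system would stream audio to a speech-recognition service."""
        return "what was the last hemoglobin a1c"


    def classify_intent(query_text: str) -> str:
        """Stub: a real system would run the text through a trained language model."""
        return "get_last_hba1c" if "a1c" in query_text.lower() else "unknown"


    def answer_from_ehr(intent: str, patient: Patient) -> str:
        """Look up the requested value in the record and phrase a short spoken reply."""
        if intent == "get_last_hba1c":
            return f"{patient.name}'s most recent hemoglobin A1c was {patient.last_hba1c} percent."
        return "Sorry, I didn't catch that. Could you rephrase the question?"


    if __name__ == "__main__":
        patient = Patient(name="Example Patient", last_hba1c=7.2)
        text = speech_to_text(b"")               # audio captured from the microphone
        intent = classify_intent(text)           # natural-language understanding
        print(answer_from_ehr(intent, patient))  # response read back to the clinician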

“Very much like Siri or Alexa, the role of a tool like this is to understand your request, the things that you want to know and serve it up back to you using natural voice,” Kumah-Crystal said.

“We embedded our application directly in the EHR so (physicians) can pull it up directly,” Coffman said. “As long as it’s listening, it will pass that information back to the library and the cloud, then process into our language recognition model.”

In a usability study with 14 VUMC pediatric endocrinology providers, VEVA was rated “highly usable,” earning an average System Usability Scale (SUS) score of 80.7. Of those providers, 64 percent said they were willing to use the system, while 36 percent said they weren’t.
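For context, the SUS is a standard 10-item questionnaire scored from 0 to 100; a score around 68 is generally considered average, which is why 80.7 puts VEVA in “highly usable” territory. A quick sketch of how a single SUS response is scored (the ratings below are made-up, illustrative values):

    def sus_score(ratings: list[int]) -> float:
        """Score one 10-item SUS response, where each rating is 1-5.

        Odd-numbered items contribute (rating - 1) and even-numbered items
        contribute (5 - rating); the sum is multiplied by 2.5 to give 0-100.
        """
        assert len(ratings) == 10
        total = sum((r - 1) if i % 2 == 1 else (5 - r)
                    for i, r in enumerate(ratings, start=1))
        return total * 2.5


    # One respondent's (made-up) ratings for the 10 SUS statements:
    print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 5, 1]))  # -> 87.5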

Next steps for the prototype include incorporating user feedback, adding support for clinical alerts and making the tool scalable to other hospital systems. Despite that progress, the developers still face several challenges and design considerations in integrating voice responses into EHRs, including latency, automatic connections and existing hardware such as laptop and dictation microphones.

Kumah-Crystal also noted that VEVA’s responses must be brief and clear so the system can analyze, interpret and deliver information as accurately as possible, and in a way that sounds and feels right to the user.

“That’s the whole point about a voice user interface. If you don’t get it right, the person is too busy thinking about why it doesn’t sound right to hear the really important clinical information you’re trying to convey,” Kumah-Crystal said.

For other healthcare organizations looking to incorporate a voice assistant into their EHRs, Kumah-Crystal and Coffman offered several recommendations for getting started:

  • Understand and measure provider frustration with EHRs
  • Create the business case for optimizing providers' experience and saving them time within their EHR workflow
  • Assemble a cross-functional team to build an overarching model and take providers’ pain points, workflow and information needs into consideration
  • Build and iterate a voice assistant prototype while users test and provide feedback
  • Understand information theory and map queries and content to satisfy user needs

“There’s something very special and magical about the way we respond to voice that we can really take advantage, and all the tools exist," Kumah-Crystal said. "All the tools have been there for awhile, we just need to think it through and figure out how to use it, where to plug it in and how to make it work.

“We want lots of people doing this, we want lots of people building this, so that it can be the norm so that the paradigm shift isn’t even something that we talk about. It’s just the new standard.”

""

Danielle covers Clinical Innovation & Technology as a senior news writer for TriMed Media. Previously, she worked as a news reporter in northeast Missouri and earned a journalism degree from the University of Illinois at Urbana-Champaign. She's also a huge fan of the Chicago Cubs, Bears and Bulls. 
