Duke researchers use AI chatbots to bring maternal, mental health tools to Kenya

Two Duke research teams are tapping into the rapid rise of AI in healthcare, using chatbots to deliver maternal health information and mental health support to women in Kenya.

The Duke Global Health Institute in Durham, North Carolina, recently profiled two ongoing research projects aimed at improving healthcare access through the use of AI and chatbots, which are computer programs designed to simulate conversations with human users. Several healthcare organizations are turning to AI chatbots and robots to improve efficiency and engage with patients, easing the strain on their systems.

“Just as technology helps all complex systems operate more efficiently and effectively, it seems reasonable to assume that technology can help scale up traditional and task-sharing models of care,” Duke assistant professor Eric Green said in the release.

Green’s research focuses on using AI chatbots to help treat perinatal depression in women, with his team designing, building and testing their own SMS-based chatbot. Their chatbot uses an AI program called Tess to deliver Thinking Healthy, a cognitive behavioral therapy-based curriculum, according to the release. The team also created a series of conversation prompts for the chatbot to use when interacting with depressed women.
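The release doesn't include the team's implementation details, but the basic pattern of scripted conversation prompts delivered over SMS can be sketched as follows. The step names, prompt wording and the `send_sms` stub are illustrative assumptions, not the team's actual Tess or Thinking Healthy content.

```python
# Minimal sketch of a prompt-scripted SMS session, assuming a simple
# step-by-step flow. The prompts, step names and the send_sms stub are
# illustrative placeholders, not the study team's material.

PROMPTS = {
    "greeting": "Hello, this is your health companion. How are you feeling today?",
    "mood_check": "On a scale of 1 (low) to 5 (good), how would you rate your mood?",
    "reflection": "Thank you for sharing. What is one thought that has been on your mind?",
    "closing": "You did well today. I'll check in with you again tomorrow.",
}

SESSION_ORDER = ["greeting", "mood_check", "reflection", "closing"]


def send_sms(phone_number: str, text: str) -> None:
    """Stand-in for an SMS gateway call; a real deployment would use a provider API."""
    print(f"SMS to {phone_number}: {text}")


def run_session(phone_number: str, get_reply) -> dict:
    """Walk one participant through a scripted session, recording her replies."""
    answers = {}
    for step in SESSION_ORDER:
        send_sms(phone_number, PROMPTS[step])
        if step != "closing":
            answers[step] = get_reply()  # a real bot would wait for the inbound SMS
    return answers


if __name__ == "__main__":
    scripted_replies = iter(["Okay", "3", "I have been worried about the baby"])
    print(run_session("+254700000000", lambda: next(scripted_replies)))
```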

The chatbot was tested at a maternity hospital in Nairobi, Kenya, and received positive feedback. After completing additional testing and improvements, the research team plans to pilot the chatbot with 20 depressed women.

“The biggest and most important takeaway from our user testing sessions is that women liked talking with our Tess mock-up. Their enthusiasm for the idea really motivated us to learn about how to make it better,” Green said.

The second project, led by global health graduate student Mary Brannock, focuses on understanding how women feel about chatbots that provide pregnancy support in Kenya. Brannock’s team built and tested a chatbot that provided information and reminders about prenatal care visits, breastfeeding and postnatal family planning options.

The chatbot was built on the Facebook Messenger platform, making it widely accessible without requiring users to spend additional storage space or money. Additionally, it was designed to hold a structured conversation and featured buttons with specific questions women could tap.
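The post doesn't share the prototype's code, but Messenger's Send API supports this kind of structured, button-driven exchange through quick replies. The sketch below assumes a configured Facebook page, a page access token and a recipient's page-scoped ID (all placeholders here), and the Graph API version may need updating; it is not the study team's implementation.

```python
import requests  # third-party: pip install requests

# Sketch of sending a structured Messenger message with quick-reply buttons,
# in the spirit of the prototype described above. Token, recipient PSID,
# question text and button labels are placeholders.
GRAPH_URL = "https://graph.facebook.com/v19.0/me/messages"
PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"  # placeholder


def send_quick_replies(recipient_psid: str, question: str, options: list[str]) -> dict:
    """Send one question with tappable quick-reply buttons via the Send API."""
    payload = {
        "recipient": {"id": recipient_psid},
        "messaging_type": "RESPONSE",
        "message": {
            "text": question,
            "quick_replies": [
                {
                    "content_type": "text",
                    "title": option,
                    "payload": option.upper().replace(" ", "_"),
                }
                for option in options
            ],
        },
    }
    resp = requests.post(GRAPH_URL, params={"access_token": PAGE_ACCESS_TOKEN}, json=payload)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Example question a prenatal-care reminder bot might ask.
    send_quick_replies(
        "RECIPIENT_PSID",  # placeholder page-scoped user ID
        "Would you like a reminder for your next prenatal visit?",
        ["Yes, remind me", "Not now"],
    )
```

Quick replies keep the conversation on a fixed set of tappable answers, which matches the structured, button-based design described in the post and avoids free-text input that would need heavier language processing.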

Twenty-six women tested a prototype of the chatbot and provided feedback on its ideal characteristics. Data from the research haven't been fully analyzed yet, but researchers have gained some insight into what women prefer when it comes to chatbots.

“For example, the team found that women liked icons best when they didn’t represent race or stage of pregnancy and were neither fully human nor fully robot. Focus group participants also appreciated prompts that reminded them to schedule timely prenatal care appointments or keep on track with family planning services,” the post stated. “Overall, Brannock found that women were most receptive to a compassionate, friendly voice that gave them physician-approved advice on practical matters like diet and family planning.”

""

Danielle covers Clinical Innovation & Technology as a senior news writer for TriMed Media. Previously, she worked as a news reporter in northeast Missouri and earned a journalism degree from the University of Illinois at Urbana-Champaign. She's also a huge fan of the Chicago Cubs, Bears and Bulls. 

