Room for doubt in the m-health app discussion

Mary Stevens, Editor
The race is on to develop mobile applications that empower patients to take more responsibility for their own care. Although there is a big opportunity to build new devices that improve care through self-management, it's important to consider human behavior in all of this, said Dan Feinberg, director of the health informatics graduate program at Northeastern University in Boston.

He's not anti-tech, far from it, but he believes everyone must understand that these apps by themselves can't change patient behavior. Feinberg made his comments during a Massachusetts Health Data Consortium panel discussion on mobile health this week, injecting a bit of doubt into the conversation. His remarks point to a necessary part of any mobile health discussion: human nature and the unintended consequences of "whiz-bang" innovation.

If a patient gets a device with impressive capabilities, that patient might try it out and get impressive results, then try the same app on his or her friends, and possibly pets. In that case, the multiple patient-supplied data points won't produce a valid picture of how the patient is actually doing. More insidious, perhaps, is the diabetic patient who knows a monitor will accurately measure and record his blood sugar level, but decides to eat a donut anyway and simply skips taking a reading, Feinberg said.

A weight-scale app might give a congestive heart failure patient an early warning of rapid weight gain that could signal a dangerous change, allowing the patient to alert the physician and potentially avoid hospitalization. Or it could help a patient with an eating disorder discover new ways to keep losing weight.

And what happens when "the cool factor" wears off? Some studies show that after about eight to 10 months, people start slipping back into old, bad habits, he said. Interestingly, other data show that patients comply with treatment more when they're confident their doctor is watching, according to Feinberg.

Vendors and application developers need to think like patients who don't want to take more responsibility for their care and might be inclined to use those cool new m-health tools to game their health data. No one can plan for every vagary in self-monitoring technology, and no algorithm can measure intent. But a little consideration of the dark side of human nature can go a long way.


Mary Stevens
Editor of CMIO
mstevens@trimedmedia.com
 

