Factor in human behavior when designing for humans

BOSTON—“There’s a culture in security where there’s almost no training about humans in the system except that maybe they’re a weakness,” said Jennifer Golbeck, speaking at the Privacy and Security Forum. 

As a result of that culture, “we don’t know how to design good systems for people,” said Golbeck, who is an associate professor at the College of Information Studies at the University of Maryland. That’s a problem because good systems require that the designer understand users, their environment and the context in which they’re interacting with the system, among other factors. Good systems should be invisible to users, and “if you do have to interrupt them, the most secure thing for them to do should also be the easiest thing for them to do.”

Humans are part of the security system. “The revolution I would call for is for cybersecurity professionals to start seeing human beings as a core part of the system. People can’t be upgraded, so we need to design around that and throw security on top of that. If we design around people, systems become more secure.”

Golbeck talked about how passwords get beaten up, and with good reason. “They suck but they don’t need to suck as bad as they do.” People pick bad passwords because it’s not their job to pick passwords, she said. The average person can remember seven things, but we make them create eight-character random passwords, which already exceeds what most people can remember. “Should security people be asking humans to create passwords that, by their nature, are impossible for humans to remember? Maybe that’s not the best premise. If you try to tell security people they should change the way they require people to do passwords, they get very angry. But regularly changing passwords makes systems less secure.”
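
To put numbers on that trade-off, here is a minimal back-of-the-envelope sketch (not from Golbeck’s talk; the 94-character pool and the 7,776-word Diceware-style list are illustrative assumptions). It suggests a four-word passphrase people can actually remember carries roughly the same entropy as an eight-character random string they can’t.

    import math

    def entropy_bits(pool_size: int, length: int) -> float:
        # Entropy in bits of `length` symbols drawn uniformly at random
        # from a pool of `pool_size` possibilities.
        return length * math.log2(pool_size)

    # Assumed policy: 8 random characters from the 94 printable ASCII symbols.
    print(f"8-char random password: {entropy_bits(94, 8):.1f} bits")    # ~52.4

    # Assumed alternative: 4 words from a 7,776-word Diceware-style list.
    print(f"4-word passphrase:      {entropy_bits(7776, 4):.1f} bits")  # ~51.7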

People will do insecure things to get their jobs done, she warned. The solution is not telling people how to be more secure because that’s dictating and won’t be effective. The key is to design the systems around the people—what they are capable of, how they’re interacting, their jobs—to “end up with systems that are just as secure but maybe even more so because people won’t go around it to do their work.” 

Beth Walsh, Editor

