Factor in human behavior when designing for humans
BOSTON—“There’s a culture in security where there’s almost no training about humans in the system except that maybe they’re a weakness,” said Jennifer Golbeck, speaking at the Privacy and Security Forum.
As a result of that culture, “we don’t know how to design good systems for people,” said Golbeck, an associate professor at the College of Information Studies at the University of Maryland. That’s a problem because good systems require that the designer understand users, their environment and the context in which they interact with the system, among other factors. Good systems should be invisible to users, and “if you do have to interrupt them, the most secure thing for them to do should also be the easiest thing for them to do.”
Humans are part of the security system. “The revolution I would call for is for cybersecurity professionals to start seeing human beings as a core part of the system. People can’t be upgraded, so we need to design around that and throw security on top of that. If we design around people, systems become more secure.”
Golbeck talked about how passwords take a beating, and with good reason. “They suck, but they don’t need to suck as bad as they do.” People pick bad passwords because it’s not their job to pick passwords, she said. The average person can hold about seven items in memory, yet we make them create random, eight-character passwords, which already exceeds what most people can remember. “Should security people be asking humans to create passwords that, by their nature, are impossible for humans to remember? Maybe that’s not the best premise. If you try to tell security people they should change the way they require people to do passwords, they get very angry. But regularly changing passwords makes systems less secure.”
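Her point that memorable and secure need not conflict can be illustrated with a rough entropy comparison. This is a minimal sketch, not something presented in her talk; the uniformly random generation and the Diceware-style 7,776-word list are assumptions made for illustration. A four-word passphrase from such a list is about as hard to guess as a random eight-character password, yet far easier to remember.

```python
import math

def entropy_bits(pool_size: int, length: int) -> float:
    """Bits of entropy for `length` symbols drawn uniformly at random
    from a pool of `pool_size` options (an upper bound; real
    user-chosen passwords carry far less entropy than this)."""
    return length * math.log2(pool_size)

# An 8-character random password over the 94 printable ASCII characters.
random_password = entropy_bits(pool_size=94, length=8)

# A 4-word passphrase from a 7,776-word Diceware-style list.
passphrase = entropy_bits(pool_size=7776, length=4)

print(f"8-char random password: {random_password:.1f} bits")  # ~52.4 bits
print(f"4-word passphrase:      {passphrase:.1f} bits")       # ~51.7 bits
```

Both land near 52 bits, which captures the argument in miniature: the option that is easier for the user is not necessarily the weaker one.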
People will do insecure things to get their jobs done, she warned. The solution is not to tell people how to be more secure; that’s dictating, and it won’t be effective. The key is to design the systems around the people (what they are capable of, how they interact, what their jobs require) to “end up with systems that are just as secure but maybe even more so because people won’t go around it to do their work.”