It’s long been said that humans are the weak link when it comes to cyber security and privacy. At RSAC 2019 in San Francisco on Wednesday, three associate professors from leading universities leveraged their recent research findings to facilitate a discussion on common human behaviors and mental models, together with thoughts on how to solve for “the human factor.”
Emilee Rader, Associate Professor and AT&T Scholar, Department of Media and Information, Michigan State University, spoke about how people tend to create what are known as “folk theories” based on their own experience or others’ experiences they hear second-hand. For example, in her research, Rader discovered that some companies automatically install security updates without users’ knowledge – under the premise that removing the human element entirely would be a better way to secure systems.
But as it happens, many users, upon discovering their systems have been updated without their consent, blame the security updates for unrelated problems or odd behavior with their computers. A folk theory is born: security updates screw up your system. Now, you have some portion of your user base who is against security updates.
Rader recommends including humans at key decision points, from consent through authentication, software updates, and the like. Excluding them has too much potential for unintended consequences, doing more harm than good.
Apu Kapadia, Associate Professor of Computer Science and Associate Director, Security Program, Indiana University Bloomington, discussed his research focusing on how photographs posted online can compromise a person's privacy. As cameras and digital photos have proliferated, far too many of the photos posted online contain sensitive user information such as location. Kapadia demonstrated how photos taken throughout a home or place of work could be used to determine the location and the layout of the place, along with the habits of the people living and working there. It would not take long for an attacker or thief to piece together enough information to act in a harmful way.
Kapadia does not believe humans can or will edit their photos as effectively as a programmed system, so he is working on a camera design that uses machine learning to obscure things like computer monitors in a picture. The hard part, he says, is obscuring a part of a photo in a way that doesn't compromise the aesthetic quality of the entire picture. While he works to figure that out, Kapadia recommends that everyone educate themselves on some of the basics of good digital photo hygiene.
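One basic of photo hygiene worth knowing: in JPEG files, location data (GPS coordinates) lives in the Exif metadata, which is stored in APP1 segments that can be removed without touching the image itself. The sketch below, a simplified illustration using only the Python standard library (real tools such as exiftool handle more edge cases), drops APP1 segments from a JPEG byte stream:

```python
def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove APP1 (Exif) segments, where GPS metadata lives, from a JPEG.

    A minimal sketch: it walks the JPEG marker segments and copies
    everything except APP1. Not a substitute for a full metadata scrubber.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":  # SOI marker must open the file
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            out += jpeg_bytes[i:]  # unexpected data: copy verbatim and stop
            break
        marker = jpeg_bytes[i + 1]
        if marker in (0xD9, 0xDA):  # EOI, or SOS (scan data follows)
            out += jpeg_bytes[i:]
            break
        # Segment length includes its own two length bytes.
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker != 0xE1:  # keep everything except APP1 (Exif)
            out += jpeg_bytes[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

Running a geotagged photo through a function like this before posting removes the coordinates while leaving the picture intact.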
The third to present was Lujo Bauer, Associate Professor, Electrical & Computer Engineering + Computer Science, Carnegie Mellon University. His research has him on a quest for usable and secure passwords. Ultimately, Bauer determined using a password manager to auto-generate passwords is better than letting humans set passwords for themselves. However, it isn't likely that humans will be eliminated from the act of setting passwords anytime soon, so he created guidelines for Information Security Officers and end users alike, based on the findings in his research.
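Auto-generation of the kind Bauer favors is straightforward to sketch. The snippet below is an illustrative example (the alphabet and default length are my choices, not Bauer's) of how a password manager might draw a password from a cryptographically secure random source:

```python
import secrets
import string

# Illustrative character set: letters, digits, and a handful of symbols.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def generate_password(length: int = 16) -> str:
    """Draw each character independently from a CSPRNG via the secrets module.

    Unlike random.choice, secrets.choice is suitable for security-sensitive
    use; digits and symbols land anywhere in the string, not just at the end.
    """
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

Because every position is sampled uniformly, the result avoids the human habits (dictionary words, trailing digits) that Bauer's research flags.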
For Information Security Officers, Bauer's research found that length is better than complexity when setting a password. (Though a little complexity can help, it's better to relax some rules around password setting so they aren't too strict or too complicated.) Blacklisting weak passwords is a must, but the reason for blacklisting should be explained to users. When offering feedback to users, remember that they have a hundred other accounts that are just as important to them.
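Those guidelines translate naturally into a policy check. Below is a minimal sketch in that spirit; the blacklist entries, the 12-character minimum, and the feedback wording are illustrative assumptions, not figures from Bauer's research. Note that the rejection messages explain the reason, as the research recommends:

```python
# Hypothetical sample blacklist; real deployments use large breached-password lists.
BLACKLIST = {"password", "123456", "qwerty", "letmein"}

def check_password(pw: str, min_length: int = 12) -> tuple[bool, str]:
    """Return (ok, feedback), prioritizing length over complexity rules.

    Feedback strings tell the user *why* a password was rejected rather
    than silently refusing it.
    """
    if pw.lower() in BLACKLIST:
        return False, ("This password is on a list of commonly guessed "
                       "passwords, so attackers will try it first.")
    if len(pw) < min_length:
        return False, (f"Use at least {min_length} characters; "
                       "length helps more than complexity rules.")
    return True, "OK"
```

A long passphrase passes without any composition rules, while a short or blacklisted choice gets an explanation instead of a bare error.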
For end users, Bauer's recommendations are ones we've all heard before, but they bear repeating: don't use the same password for multiple accounts, don't use your pet's name in a password, and include symbols and numbers throughout the password, not just at the end.
Each of the presenters had their own ideas on how to address the human factor when it comes to security and privacy – but all agreed that for now, it’s best to find ways to work with or build on how humans actually behave.