Inside the NIST team working to make cybersecurity more user-friendly
Cybersecurity is usually not users’ primary duty, yet they bear an increasing burden: responding to security warnings, maintaining many complex passwords, and making security decisions for which they are not equipped.
This is the main reason why security needs to be usable and why the National Institute of Standards and Technology (NIST) has a team of researchers working on projects aimed at understanding and improving the usability of cybersecurity software, hardware, systems, and processes.
“Our team works towards influencing cybersecurity standards and guidelines. For example, we were responsible for the inclusion of usability considerations in the NIST Special Publication 800-63 Digital Identity Guidelines,” Mary Theofanos, the leader of the NIST Usable Cybersecurity team, explained to Help Net Security.
“We have also increased efforts to actively share NIST’s usable cybersecurity research with security practitioners, managers, end users, and other researchers who can apply our findings and offer feedback on the value and direction of our projects. For example, our phishing research has been a popular topic that we’ve presented at academic research conferences, security practitioner forums, and organization-wide security days.”
Pointing out the problems
With an academic background in mathematics and computer science and many years of work at the Oak Ridge National Laboratory and US federal agencies under her belt, Theofanos moved to NIST around 2003 to develop standards for usability. She is a convenor of an ISO Working Group developing user-centered design standards and has worked to apply user-centered design and usability principles to many domains, including cloud computing, public safety communications, and biometrics.
Julie Haney, the other co-lead of the Usable Cybersecurity program, came to NIST after earning degrees in computer science and spending over 20 years at the US Department of Defense as a cybersecurity professional and technical leader, primarily in the cyber defense mission. Over that time, she grew increasingly interested in the intersection of people and security, and in the factors that affect people’s willingness and ability to adopt security best practices and technologies.
“Several years ago, I returned to school to more formally study this area through the Human-centered Computing program at University of Maryland, Baltimore County, where I obtained a master’s degree and am now close to completing my PhD. I began working at NIST so that I could focus specifically on research in the usable cybersecurity area while being afforded the opportunity for my research to have a real-world impact on NIST’s standards, guidelines, and community partners,” she shared.
This vantage point gives them singular insight into what all participants in the wider cybersecurity sphere could do to raise the security bar.
Here’s an example: even though people are increasingly exposed to cybersecurity and privacy issues in the news, and profess to be concerned about their online privacy, they often take no action to protect it.
This is partly because most people don’t know how to protect themselves, partly because usable interfaces for changing security and privacy settings and informing users of privacy issues are not readily available, and partly because they suffer from “security fatigue.”
“We found that the security fatigue users experience contributes to their cost-benefit analyses in how to incorporate security practices and reinforces their ideas of lack of benefit for following security advice,” Theofanos noted. “With respect to security, people expressed a sense of resignation, loss of control, fatalism, risk minimization, and decision avoidance.”
In certain circumstances, many users are willing to take mental shortcuts that let them avoid giving security a second thought. In a workplace setting, for example, many rely too heavily on organizational safeguards (e.g., email servers, firewalls) to protect them.
Making users care and be careful is not easy
Too many users still choose convenience over security. Security measures mean unwelcome friction when you just want to quickly set up an account to buy or do something online, and users have only a nebulous idea of the consequences of poor security.
Even if personal information is disclosed in a data breach, individuals might not experience any immediate, tangible harm, Haney explained. For the same reason, they care little about companies tracking their web activity or about the security of consumer Internet of Things (IoT) devices.
“We’re currently conducting a study to understand people’s experiences with and perceptions of smart home technologies. What we’re finding is that most people are not concerned about the privacy of the information being collected by their smart home devices. They feel that, because they’re not doing anything illegal, they have nothing to hide. They are also numb to their private information already being out there on the internet. Or, if they are concerned, they’re willing to accept the risks for the convenience of the devices,” Haney shared.
“Even fewer are concerned about smart home security. To them, the security threat consists of a nebulous group of hackers who likely would not be interested in targeting them directly. Therefore, they are not motivated to take action or learn about what they can do to protect themselves.”
In fact, the threat of immediate financial repercussions is just about the only thing that gets users interested in security.
The financial sector is aware of that and is investing in usable security to take the burden off users while increasing the underlying security of its systems and online services. However, Haney noted, many smaller businesses remain oblivious to the risks to their businesses and customers, and/or lack the resources and skills to do anything about them.
Advice for security advocates
While there’s a dire need for those who design security technologies, interfaces, and processes to consider user needs and context and make it easy for them to do the right thing, security advocates are also needed to promote, educate about, and encourage security adoption.
As Haney and a colleague discovered after polling cybersecurity advocates from industry, higher education, government, and non-profits, non-technical audiences find security scary, confusing, and dull.
To motivate their audiences to engage in beneficial security behaviors, cybersecurity advocates first have to establish trust with their audience. They should do so by demonstrating technical knowledge and by flexing their interpersonal skills to build relationships.
Next comes the task of overcoming those three negative perceptions of security.
“Advocates must strike a careful balance between being candid about security risks while being hopeful and encouraging. The latter are essential for developing a sense of empowerment in the audience. Too much fear can be debilitating,” Haney noted.
To make security less confusing and complex, security advocates should avoid technical jargon and reframe highly technical concepts in terms their audience can understand. The security message must be tailored to the audience’s context – their environment, constraints, concerns, and skill level – and include simple, practical recommendations commensurate with it.
Finally, advocates must overcome perceptions that security is irrelevant and boring by exhibiting enthusiasm, making security relatable, and incentivizing security behaviors (i.e., motivating people to take action). “They should not hesitate to think out-of-the-box and try novel awareness and education approaches,” she concluded.
Organizational cybersecurity problems
Organizations have their own specific cybersecurity blind spots, Theofanos and Haney noted.
Some don’t fully understand or relate to security risk, so they don’t prioritize security; others go to the opposite end of the spectrum, pushing for the most secure and restrictive solutions without regard for the impact on users. The result is user frustration in the best case, or users circumventing security to cope with the additional burden in the worst.
“We also have observed a compliance mentality among many organizations, where compliance to a security directive or guideline is seen as success without regard for effectiveness,” they added.
“This is especially problematic for security awareness training. Many organizations have an annual security awareness training mandate for which success is measured by the number of people in the workforce who have taken the training. However, this number tells us nothing about how effective the training is in teaching and changing behavior.”
Security awareness training should also be revamped, they feel.
“The current model of ‘death by PowerPoint’ or computer-based annual security training is not working. Scare tactics without actionable guidance are also not working. People need to first understand how security is applicable to them and then be provided simple, actionable guidance,” they pointed out.
Finally, one thing Theofanos and Haney would like to see less of is the “us vs. them” mentality between security professionals and users: security professionals should be more aware of the human element and work with users as partners in facing cybersecurity problems, not treat them as enemies or burdens.