How can we harness human bias to have a more positive impact on cybersecurity awareness?
Dr. Jessica Barker, Co-CEO of Cygenta, follows her passion for positively influencing cybersecurity awareness, behaviours and culture in organisations around the world.
Dr. Barker will be speaking about the psychology of fear and cybersecurity at RSA Conference 2020, and in this interview she discusses the human nature of cybersecurity.
What are some of the most important things you’ve learned over time when it comes to security culture? How important is it and why?
A positive and robust security culture is absolutely fundamental to the overall security maturity of an organisation. An organisation’s culture sets the tone for what is normal and accepted; it’s not what is written in a policy, it is what influences how people actually behave. From a security point of view, this is absolutely crucial and extremely influential.
Different cultures will influence whether people do what they should when it comes to security. For example, a culture in which leadership demonstrate a strong commitment to, and respect for, security is much more likely to result in positive security behaviours than one in which leadership are dismissive of security.
The phenomenon of social proof, in which people model their behaviour on how others act (especially those in positions of authority or those they particularly admire), means that the role of leadership in security culture is vital. People in an organisation look to those in leadership to see how they should behave.
If leaders are seen to follow security policies and good practices, such as wearing identity badges and challenging tailgating, then others throughout the organisation are more likely to follow suit.

A culture of fear in an organisation, on the other hand, is very destructive. If people feel they are going to be blamed for clicking a link in an email they then suspect was phishing, for example, they are less likely to report such incidents when they happen. A culture of fear does not reduce the number of incidents; it just drives them underground and reduces the likelihood of people reporting them.
When someone mentions security awareness training, there’s always a big split – some say it’s essential, others claim it’s a waste of money. What’s your take on this? Does it depend on the type of training?
Great security awareness training, delivered as part of a healthy cyber security culture and aimed at encouraging positive security behaviours, is essential. The problem is that awareness-raising training has a history of being dry, dull, technically focused and ineffective. Such training is not engaging, and it will not only fail to make a positive difference, it is actually likely to have a negative impact. Too often, training has been designed by people with technical expertise who may know what they want to say, but not how best to deliver it, or indeed what messaging is going to be most relevant and effective for the people they are communicating with.
For awareness training to be effective, it needs to be relevant to the people it is aimed at, it needs to be engaging, interesting and it needs to feel useful. Talking with people about security in their personal lives, for example, can be really powerful because it is something that everyone can relate to and when people engage with the content in relation to their home lives, they absorb it in terms of their working lives, too.
Awareness-raising that feels like an experience, for example a table top exercise or a live demonstration of a hack, is memorable and fun – people go away from experiences telling their colleagues, friends and family about them, which has a positive ripple effect. Using emotion in a constructive way is really powerful, for example by telling stories. I say “constructive” because it is most important that awareness-raising is empowering, and this is something that is overlooked way too often.
Eliciting fear has been one of the most used marketing strategies in the cybersecurity industry since its inception. Can scaring employees actually make an organization more secure?
Using fear, uncertainty and doubt (FUD) is generally a classic example of awareness-raising that engages with emotion in a destructive way. When we deliver cyber security awareness, we are often talking about the threats, which inevitably will scare a lot of people, so we need to be really responsible in how we do that.
Unfortunately, people often use fear as a blunt instrument, without an understanding of the effect it has. For years, sociologists and psychologists have been studying fear and what happens when we talk about something scary as a means of promoting behavioural change. My keynote at RSA Conference 2020 will cover some of this work and the lessons we can learn in cyber security.
Many CISOs prefer to spend money on technology instead of educating employees. What’s your take on this? Can they really solve their security problems with tech purchases?
It’s been encouraging to see, in recent years, that more and more CISOs and security teams understand that security can’t be solved with technology alone. I understand the tendency to want to “fix” security with a piece of shiny kit, because if that worked it would be simple and very comforting. Unfortunately, security is not simply about technology, it’s about how people engage with technology, and for this we need to focus on people at least as much as we focus on tech.
What are the biggest misconceptions about security culture and what can security leaders do in order to make sure their employees are more security conscious?
One of the biggest misconceptions about security culture is the belief that it can’t be measured and tracked in the way that other elements of security are. This is something I have been working on for my whole career in security: there are very effective ways to measure security culture, and there are lots of metrics you can use to check progress. What’s more, it’s vital that leaders put these in place. When awareness-raising is not part of a strategy and there are no metrics to see whether it is having the desired impact, it is usually not very effective. How can you know if something is working if you don’t have any way of measuring success?