Dear CSO, do you know how to build security culture?
What do you really know about security culture? I am going out on a limb here and claiming that you know very little, if anything at all.
Your day job is about security, and like most CSOs out there, you have an IT background. Most likely, you are still quite handy with the tech, and if forced to, you could set some firewall rules and possibly even change a routing table or two.
You have likely picked up on the trend that people are the weakest link in your security chain, and you most probably have some sort of user awareness training in place. You know it is important, and everybody does it; at least, that is what your training supplier tells you. And you can tick that box off on your compliance sheet.
Like many other CSOs, you have probably not reached the level of user awareness you imagined and hoped for, and you may share the frustration of Dave Aitel, who last year went all out and said: “You should not train employees for security awareness”.
The human mind has many flaws. Yours does, and mine does too. We jump to conclusions without considering all the relevant information. We construct facts from fiction, because doing so lets us do what we want, not what is right. We are extremely vulnerable to peer pressure. We are blind to things we do not know about.
This implies that even if you know a lot about security, you probably do not know much about people: how they function, and how groups form and interact. You may (and probably do) think that you know a lot about people. Consider this, then: do you have a minor, or a major, in a social science? Do you know what social science is, anyway?
Social sciences is a collective term for the disciplines that study humans, human interaction and groups, including (but not limited to):
- Psychology
- Sociology
- Social anthropology
- Organizational theory
One of the things we have learned from social science research is that humans come pre-wired with biases. These biases shape how we perceive the world, how we respond to it, and how we interact with it. Let’s take a look at the blind spot bias I mentioned above.
The blind spot bias helps you focus on what you need to focus on, without being interrupted by irrelevant thoughts. Its flip side is that it makes it very hard to accept things you do not know. For example, if you grew up in the Western world, you are likely to consider insects in any form inedible. Traveling to a part of the world where insects are part of the human diet, the blind spot bias may have you respond with disbelief and decide that the locals are crazy, wrong and plain stupid. If, however, you grew up in an environment where insects are a regular part of the diet, you would consider such a response from a visitor equally strange and stupid.
The blind spot bias works against you by making it very hard to recognize and accept other possible solutions, and the further a solution is from your “known environment”, the harder the bias will resist it.
Another interesting bias is the confirmation bias: the urge to find evidence that confirms our theories and beliefs, which makes us disregard information that contradicts them. If we use Dave Aitel as an example (sorry, Dave), confirmation bias made him see only the faults of and problems with user awareness training. The more proof he found that he was right, the less likely he was to look for contradictory evidence.
By theorizing that you have no knowledge of culture and the social sciences, I’m making the same mistake right now. Instead of doing serious research, I simply look at the CSOs I know to confirm my theory. Then I apply another bias to my somewhat limited sample of evidence: I generalize. By generalizing, I take whatever information I have and scale it up until it applies to whatever I set out to prove.
As a writer, I’m allowed such errors to make a point. For a scientist, doing the same is, and should be, a deadly sin. As a human, I’m always going to make these errors; according to science, they are hardwired into our brains. My responsibility is to exercise strong self-control, and to stay humble about the errors I make.
“What does this have to do with security culture?” you may ask. Let us define culture. According to the Oxford dictionary, culture is “the ideas, customs and social behaviors of a particular people or society”. By this definition, culture is about the things we do together as a group of people. Security culture, then, may be defined as “the ideas, customs and behaviors that impact security, both positively and negatively, in a particular group of people”.
In that definition, security is only a part of the whole, just as it is in most organizations around the world. That part is yours, true. But as I demonstrated above, you are likely the expert on security, not on human behavior. Setting out to create and maintain security culture in your organization is not a job you should be doing alone.
Consider this instead: if you know security, who in your organization knows culture? And why not build your security culture together with those who know culture and human behavior?