
Transformational Security Awareness: What Neuroscientists, Storytellers, and Marketers can Teach Us About Driving Secure Behaviors

The first highlight I have for the book is “Just because I’m aware doesn’t mean that I care.” It’s a truth we first get exposed to around the age of three, when our theory of mind begins to accept that others think differently than we do – or at least have different information. However, it’s a key challenge to remember when creating a security awareness program that works, as Perry Carpenter explains in Transformational Security Awareness: What Neuroscientists, Storytellers, and Marketers can Teach Us About Driving Secure Behaviors.

One of the things that I do from time to time is help some of my friends who work in information security. I’ll spend a few months helping them with some aspect of their systems or their programs, and then I’ll go do other things again. I’ve found that it’s hard for me to personally stay in information security for long, because I don’t want to remain vigilant for threats all the time.

Social Engineering

What’s the most vulnerable part of any security program? The answer is always humans. We can plan the best systems, implement the best hardware and the best software, only to find that a human is responsible for letting people through the back door. How many physical security plans have been thwarted by someone propping open a door, letting someone “official-looking” into the building, or failing to make sure the door is latched behind them? I can say that I’ve walked into buildings that I should have been escorted through simply by grabbing a door before it latched.

I learned about social engineering years ago through the book Social Engineering. Most of what people would consider social engineering, I’d put in the consultant’s essential toolkit for getting things done without authority. It’s about ways of getting folks to break – or ignore – the rules, and it works.

The Gaps

There are two critical gaps in security awareness that we’re always trying to close. The first is the gap between knowledge and intention. We all know that we should exercise, get proper sleep, hydrate, and eat healthy. That’s our knowledge. However, few of us would say that it is our intention to do all these things. That is the first gap: between what we know and what we intend. It goes further than the distance between knowing and caring – it’s more than caring; it’s deciding that we want to take action.

They say the road to hell is paved with good intentions. Note that it’s not good actions – it’s good intentions. Even if we’ve decided we want to take action towards something, will we actually do it? Many have had great intentions and set their New Year’s resolutions only to decide that their resolution was too much work – some before they even start.

Given these two gaps, there’s often a huge gulf between what you’ve taught people and how they actually behave. Some of that may be due to the forgetting curve (they’ve simply forgotten since they were taught), but forgetting alone doesn’t account for the wide discrepancies.
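For readers who want the math behind that parenthetical, the forgetting curve is commonly approximated (in the tradition of Ebbinghaus) as an exponential decay of retention over time. The specific form and the stability parameter below are a standard textbook formulation, not something Carpenter gives:

$$ R(t) = e^{-t/S} $$

Here R(t) is the fraction of material still retained after time t, and S is the stability of the memory. Spaced, reinforced training raises S and slows the decay – which is part of the argument for the continuous environment Carpenter describes later – but even generous retention doesn’t close the whole gap.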

Speaking of Security

When we’re speaking about security, we need to make sure that the messages are personal, emotional, and relevant. I often explain that everyone listens to one radio station – WIII-FM or “What is in it for me?” The result is that if the message isn’t tuned to them and their personal circumstances, it may not be heard.

One of the common challenges when working with messaging in business is that people assume the messages must be dry and devoid of any emotion. Organizations are starting to realize that you have to accept emotions in the workplace if you want a high-performance, high-functioning team. However, the idea that emotions are a useful tool hasn’t made its way to the folks who teach communications at most organizations.

Finally, messages about security must be relevant. There’s a signal-to-noise ratio that people use to decide how much attention they’re going to pay. The more irrelevant things they experience from you and your team, the less they’ll pay attention to your messages. For the program to be sustainable, people must perceive your communications as relevant to them.

Events vs. Environment

Carpenter speaks about security awareness needing to be more than an event. Instead of the annual security awareness training, effective security awareness is continuous. It is, perhaps, punctuated by big events, but, overall, it’s an environment where everyone is regularly reminded why they need to stay vigilant against threats. When Carpenter describes the difference between what most people do and what is effective, he uses the word “campaign.” However, even campaigns have ends. I prefer to think of it as a continuous addition of information into the environment to reinforce the learning.

Lazy Brains

Carpenter points out that we use lots and lots of mental shortcuts for the work we do on a daily basis. Our brains are calorie hogs. They consume 20-30% of the glucose (or sugar, our bodies’ fuel) but represent only about 2-3% of our body mass. (See The Tell-Tale Brain for more.) Because our brains are lazy, we use lots of heuristics (shortcuts) that hackers exploit to get us to do things that we wouldn’t do if we fully considered them.

Motivation

Carpenter uses the Fogg Behavior Model – which basically states that you need enough motivation and ability to respond to a prompt. The greater the ability, the less motivation is required for a prompt to be effective. We all prompt people to do something. We ask them to do something, and they decide whether they will do it based on their motivation and their ability. This reminds me of how The Psychology of Hope speaks about hope being composed of two pieces – willpower (motivation) and waypower (ability). Kurt Lewin said that our behavior is a function of both person and environment – often written B = f(P, E) – what we bring with us and the pressures or motivators of the environment. (See Leading Successful Change, Moral Disengagement, and many more for Lewin’s equation.)

The Fogg model does have the benefit of clarity around a triggering event. A latent ability and a motivation will not activate without some sort of triggering event – that could be an internal thought or an external prompt, but something has to start the reaction. Think of it this way: gasoline has a great deal of potential – but that potential is only activated with some sort of a spark.
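To make the model concrete, here’s a minimal sketch of that idea. It isn’t code from the book; the 0-to-1 scales, the multiplicative combination, and the ACTION_LINE threshold are illustrative assumptions on my part. The point it illustrates is Fogg’s: a prompt only converts into behavior when motivation and ability together clear the action line.

```python
from dataclasses import dataclass

# Minimal sketch of the Fogg Behavior Model (B = MAP): behavior happens when
# motivation and ability, taken together, clear an "action line" at the moment
# a prompt arrives. The scales and threshold here are illustrative assumptions.

@dataclass
class Person:
    motivation: float  # 0.0 (none) to 1.0 (highly motivated)
    ability: float     # 0.0 (very hard for them) to 1.0 (trivially easy)

ACTION_LINE = 0.25  # assumed threshold; higher ability lets lower motivation suffice

def behavior_occurs(person: Person, prompted: bool) -> bool:
    """Return True when a prompt lands on someone above the action line."""
    if not prompted:
        # Latent motivation and ability never activate without a trigger (the "spark").
        return False
    return person.motivation * person.ability >= ACTION_LINE

# The same person responds to a prompt but does nothing without one,
# and a motivated person still fails when the behavior is too hard.
print(behavior_occurs(Person(motivation=0.3, ability=0.9), prompted=True))   # True
print(behavior_occurs(Person(motivation=0.3, ability=0.9), prompted=False))  # False
print(behavior_occurs(Person(motivation=0.6, ability=0.2), prompted=True))   # False
```

Playing with the numbers shows why Carpenter (and Fogg) lean on ability: making the secure behavior easier lifts people over the line without having to manufacture motivation.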

Facts and Frames

Carpenter aptly points out that one of the biases we all suffer from is that we’ll tend to ignore those facts that don’t fit our frame of thinking. When facts and frames meet, frames will almost always win. Our brains love consistency and abhor inconsistency. When something comes across our awareness, and it doesn’t fit our frame, we seek to find a consistent answer. Given the investment in our frame, the fact is often quickly jettisoned.

The frame further shapes the way we process our environment and how we’re motivated. It’s like walking around with rose-colored – or green-tinted – glasses. We see the world through those glasses – whether we want to or not.

Working with Users

Users may feel like a frustration from time to time, but the reality is that, without them, there would be no need for security or security awareness. We can think that they should know better, that they should keep separate passwords for separate sites and systems, but the reality is that we’ve exceeded our users’ reasonable capacity to do the behaviors we think they should do. A typical user might have 100 sites they use with varying frequency. There’s no practical way to remember that many passwords without the aid of a system.

I don’t know any corporate infrastructure team that doesn’t have some sort of password management system. Why would we deny users the same tools when we know, as professionals, that we can’t manage passwords without one?

In our work with our users – I like to call them business clients – we must work with their limitations, or we’ll be constantly disappointed in them and in the success of our security awareness programs.

Emotions

It’s not supposed to be about emotions. It’s black and white, on and off, true or false. Or at least that’s what we want to believe. As Daniel Kahneman aptly points out in Thinking, Fast and Slow, our emotional, automatic, first response can lie to our thoughtful, rational brain – without us even realizing it. It’s this lying that we do to ourselves that malicious individuals can take advantage of, and it all comes down to emotions.

So, don’t get angry when users let the next malware into the organization; read Transformational Security Awareness so it doesn’t happen again.