IT Security and the Normalization of Deviance

by Bruce Schneier

Professional pilot Ron Rapp has written a fascinating article on a 2014 Gulfstream plane that crashed on takeoff. The accident was 100% human error and entirely preventable — the pilots ignored procedures and checklists and warning signs again and again. Rapp uses it as an example of what systems theorists call the “normalization of deviance,” a term coined by sociologist Diane Vaughan:

Social normalization of deviance means that people within the organization become so accustomed to a deviant behaviour that they no longer consider it deviant, despite the fact that they far exceed their own rules for elementary safety. But it is a complex process, with some kind of organizational acceptance. The people outside see the situation as deviant, whereas the people inside get accustomed to it and do not. The more they do it, the more they get accustomed. For instance, in the Challenger case there were design flaws in the famous “O-rings,” although it was considered that by design the O-rings would not be damaged. In fact, they suffered recurrent damage. The first time the O-rings were damaged, the engineers found a solution and decided that the space transportation system could keep flying with “acceptable risk.” The second time damage occurred, they thought the trouble came from something else. Because they believed they had fixed the newest trouble, they again defined it as an acceptable risk and just kept monitoring the problem. And as they repeatedly observed the problem with no consequences, they reached the point where flying with the flaw was normal and acceptable. Of course, after the accident, they were shocked and horrified as they saw what they had done.

The point is that normalization of deviance is a gradual process that leads to a situation where unacceptable practices or standards become acceptable, and flagrant violations of procedure become normal — despite the fact that everyone involved knows better.

I think this is a useful term for IT security professionals. I have long said that the fundamental problems in computer security are not about technology; they’re about how we use technology. We have lots of technical tools at our disposal, and if technology alone could secure networks, we’d all be in great shape. But, of course, it can’t. Security is fundamentally a human problem, and people are involved every step of the way. We know that people are regularly the weakest link. We have trouble getting them to follow good security practices, and to not undermine those practices as soon as they become inconvenient. Rules are ignored.

As long as the organizational culture turns a blind eye to these practices, the predictable result is insecurity.

None of this is unique to IT. Looking at the healthcare field, John Banja identifies seven factors that contribute to the normalization of deviance:

1. The rules are stupid and inefficient!
2. Knowledge is imperfect and uneven.
3. The work itself, along with new technology, can disrupt work behaviors and rule compliance.
4. I’m breaking the rule for the good of my patient!
5. The rules don’t apply to me/you can trust me.
6. Workers are afraid to speak up.
7. Leadership withholding or diluting findings on system problems.

Dan Luu has written about this, too.

I see these same factors again and again in IT, especially in large organizations. We constantly battle this culture, and we’re regularly cleaning up the aftermath of people getting things wrong. The culture of IT relies on single expert individuals, with all the problems that come along with that. And false positives can wear down a team’s diligence, bringing about complacency.

I don’t have any magic solutions here. Banja’s suggestions are good, but general:

1. Pay attention to weak signals.
2. Resist the urge to be unreasonably optimistic.
3. Teach employees how to conduct emotionally uncomfortable conversations.
4. System operators need to feel safe in speaking up.
5. Realize that oversight and monitoring are never-ending.

The normalization of deviance is something we have to face, especially in areas like incident response, where we can’t get people out of the loop. People believe they know better, deliberately ignore procedures, and invariably forget things. Recognizing the problem is the first step toward solving it.
