
The Lucifer Effect: Understanding How Good People Turn Evil – The Devil Made Me Do It

Young children can say things that adults could never get away with. Ask a child why they did something wrong, and one answer you may get is, “The devil made me do it.” The personification of evil, they proclaim, can override their free will and cause them to take one more cookie after they’ve been told no more. We laugh at this childish idea. Of course, no one can make you do something against your will. Hypnotists reportedly can’t get you to do something you don’t want to do. So how silly is it that “the devil made me do it”? The Lucifer Effect: Understanding How Good People Turn Evil tries to help us understand that this may not be as far-fetched as we’d like to believe, but the devil isn’t in the details – the devil is in the system.

This is the first of three posts about The Lucifer Effect. The second post will address constructing a prison, and the third will address “normal evil.”

Studies at Stanford

The linchpin of The Lucifer Effect is the study that Philip Zimbardo ran at Stanford University. The study randomly assigned healthy students to either a guard or a prisoner role. The situation was structured to create anonymity, deindividuation, and dehumanization. The structure worked too well. The experiment had to be terminated prematurely, because it was spinning out of control: the mock guards were abusing the mock prisoners. (As a sidebar, Zimbardo has done other work as well, but none more widely known than this experiment. One of his other books, The Time Paradox, is one I read years ago.)

Somehow, the awareness that this was an experiment was lost, and everyone descended into the belief that the prisoners and the guards were real. They started to act as though the situation wasn’t contrived but was instead a result of misdeeds by the prisoners. The escape hatches (metaphorically speaking) out of the study were easy enough to reach, but, strikingly, no one reached for them, because no one seemed to believe that they could use them.

In this experiment, the power of the situation – or the system – overwhelmed the good senses of the guards and the prisoners and plunged them both into behaviors that weren’t characteristically theirs. Instead, these students’ behavior was shaped, as Kurt Lewin would say, by their environment.

B = f(P, E)

Kurt Lewin was a German-American psychologist who contributed greatly to our ability to understand how people behave. His famous equation is B[ehavior] = f[unction](P[erson], E[nvironment]). Put simply, the behavior of anyone is a function of both their person – their unique traits and personality – and the environment that they’re placed in. The mathematics of the function itself is unknown. The complexity of the person and the complexity of their environment make it difficult to predict how someone will really behave. (See Leading Successful Change for more discussion on Lewin’s equation.)
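
To make the contrast that the next paragraph draws a bit more explicit, here is the same idea in simple notation (a minimal sketch; the one-variable form is my gloss of the legal system’s implicit assumption, not something Lewin wrote):

```latex
% The legal system's implicit model: behavior depends on the person alone.
B = f(P)

% Lewin's model: behavior depends on the person and the environment together.
B = f(P, E)
```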

Our legal system rests on the notion that people are responsible for their behaviors and that the environment has no impact on their behavior. (See Incognito for more on this foundation.) However, Lewin says that this is incorrect. In Incognito, Eagleman explains how our will is far from free. Kahneman shares similar concerns in Thinking, Fast and Slow. He goes so far as to say that System 1 (automatic or emotional processing) lies to System 2 (higher-order reasoning). The result of that deception is that we’re not really in control; we just think we are.

This is the dual-control model that Haidt explains in The Happiness Hypothesis with his metaphor of the rational rider and the emotional elephant. Our laws are constructed for the rational rider without the awareness that the rider isn’t really in control. We make only occasional allowances in our legal system for temporary insanity. This is the slightest acknowledgement that there are times when our emotions get the better of us – and would get the better of anyone.

However, the other variable in the equation is more challenging. Defining the environment is about what courts see as extenuating circumstances – factors worth considering, even if they don’t exonerate people. Zimbardo demonstrated the power of structural influences on the behavior of carefully screened, well-functioning students. However, he’s not alone in raising the alarm about how good people can be made to do bad things.

Shocking Authority

In the post-World War II world, it’s hard to understand how Adolf Hitler and the Nazi party could exterminate so many Jewish people. It’s unthinkable – yet it happened. The question was why people would agree to do such awful things. Stanley Milgram, himself a Jew, was curious how far people would go when an authority figure told them to. How quickly and easily would people bend to the power of authority? The experiment was simple in structure. Two volunteers would be selected and paired so that one was the teacher and the other was the learner. The teacher would assess whether electric shocks improved learning retention.

At least, it looked simple. The real assessment was whether normal people would be willing to administer what they believed to be life-threatening shocks to someone hidden from them. The learner was not a volunteer at all; he was a confederate (an agent) of Milgram’s. The teacher would feel a small sample shock, then the learner and the teacher would be separated and would communicate through audio only. The teacher would administer what they thought were progressively higher-voltage shocks to the learner – while he screamed, voiced concerns about his heart, and generally made his distress known.

In the presence of a researcher who pressured the teacher to press on, about two-thirds of people administered what they thought to be potentially lethal shocks to someone in another room. Of course, there were no shocks after the sample shock the teacher received. However, the actual finding of the research was that it was all too easy to get people to disengage their morals in the presence of a perceived authority. (See Mistakes Were Made for more on this terrifying research.)

Moral Disengagement

Bandura artfully explains the mechanisms that allow for Moral Disengagement. The tools of moral disengagement are the same tools that Zimbardo used to construct his mock prison. The system setup for the Stanford Prison Experiment was designed – effectively – to disengage normal, healthy people’s moral safeguards. Freed of those bonds, they could do anything. The study design in effect created a bubble of reality, of society, of culture that was free to evolve separately from the “real world” outside the walls of the mock prison.

Bandura affirms that morality is relative to the environment that a person is in. In Paul Ekman’s autobiographical book Nonverbal Messages: Cracking the Code of My Life’s Pursuit, he shares how a chief’s statement that he would eat Ekman when Ekman died made Ekman a respected man. In that culture, the prospect of being eaten after death conferred respect, while in most cultures, the idea would be repulsive.

Perhaps the greatest surprise wasn’t that morality was relative to culture; it was the speed with which the prison’s culture evolved on its own. It took hours to start to form and days to take firm hold. By the end of the first week, it was strong enough to have psychologically broken three prisoners and to have clouded Zimbardo’s own awareness of his responsibility to maintain controls.

The Devil is the System

Maybe the childish beliefs aren’t so strange. Maybe the devil really did make them do it. However, maybe it’s the systems that we put in place that are the real devil. Maybe it’s the system that is The Lucifer Effect.