If you want to get people's attention at a change session, mention that you're going to talk about neuroscience, and you've got them. The problem is that the most powerful and fascinating things we know about human behavior have nothing to do with what we've learned from neuroscience.
Functional Magnetic Resonance Imaging
Magnets are cool. Really powerful magnets are really cool. When you can monitor blood flow by detecting minute changes in magnetic fields, you've got the ability to see inside of people's heads. Literally. Functional magnetic resonance imaging (fMRI) uses magnetic fields to create a picture of where blood is flowing in someone's brain. Because increased neural activity causes increased blood flow, you know what parts of the brain are in use at any given time.
It’s great technology, and it creates pretty pictures, but from the point of view of the change manager, it doesn’t do much to explain how people are persuaded or how to make your change successful. For that, we’ve got to look at classic science and explore how people behave.
Imagine yourself as a poor college student signing up for a study for a few extra bucks – enough to buy a pizza on Friday night. You’re invited by Solomon Asch to a study of perception. You won’t be injected with anything. There are no tests to fill out. You just need to sit in a room and answer some questions about the length of lines.
These aren’t optical illusions. You and a few others you don’t know are shown one line on one card and are asked which of the several lines on the second card are the same length. It’s easy enough that you should get 100% on this test, and you silently wish your calculus test could be this easy. However, somewhere along the line, it takes a turn.
Some of the others start indicating a different line than the one you think is a match. When it's just one other person, you brush it off and think about what toppings you'll have on your pizza. However, when two people give a different answer, you change your answer to theirs, assuming you must be seeing things wrong. More startling, the next time everyone in the group disagrees with your initial assessment, you blink, and suddenly their answer seems right to you. You're no longer consciously adjusting your perception to match theirs – it happens automatically.
The others in the group, you discover later, are confederates, and the real study was on the impact of group pressure on perception. You've just demonstrated that a lie, repeated often enough, becomes believed no matter how much of a lie it is. That leads us to the atrocities carried out by the Nazis and our attempts to understand how they could have happened. (See Nudge for more on Asch's experiments.)
It was on the Yale campus that Stanley Milgram ran another study. This time, you join another person in his office, where you're both told that this will be an experiment on the impact of negative reinforcement on memory. You're divided into the roles of teacher and learner. It's decided that you'll play the teacher, and you're instructed in how to use the device that will administer shocks to the learner after incorrect answers. The device, you're told, can administer harmful – but not fatal – levels of shock to the learner, who is hooked up to the device in a separate room out of sight. You get a chance to try it out on yourself, and at low settings, it's painful but tolerable.
The experiment proceeds, and you're instructed to provide progressively higher levels of shock to the unseen learner. If you're like most, even when the learner indicates that they have a heart condition and are afraid, you'll go to the very top of the scale. The learner was a confederate of Milgram's and was never hooked up to the device or in danger.
The real study is on obedience to authority, and if you're like most, you'll obey. However, there's a twist: when the same experiment was run off-campus in a nondescript office, far fewer participants issued the highest levels of shock.
The message from classic psychology is that our obedience to authority is shaped by our perception of that authority. (See The Lucifer Effect and Moral Disengagement for more on Milgram’s experiments.)
Two key learnings emerge from classic psychological research: messages can come to be believed if they're repeated often enough, and people can do awful things that would seem to violate their values if those things are presented with enough authority. It doesn't matter which part of the brain was or wasn't engaged when people saw things differently or were willing to harm others.