
Cognitive Dissonance: Reexamining a Pivotal Theory in Psychology

There’s a lot of research that has been done on cognitive dissonance.  Cognitive Dissonance: Reexamining a Pivotal Theory in Psychology is a guide to evaluating that research and what we’ve learned in the 60+ years since the theory was first proposed.  (See A Theory of Cognitive Dissonance for Leon Festinger’s original work.)  The editor’s approach is to provide differing points of view and allow readers to draw their own conclusions.  Contrast this a bit with Joel Cooper’s views in Cognitive Dissonance: Fifty Years of a Classic Theory, which doesn’t include criticisms of his proposed revisions to the model – some of which are relatively serious concerns.

The Cases

It’s convenient to speak of cognitive dissonance as a single unified theory as Festinger originally formulated it.  However, since his initial formulation, several special cases have appeared:

  • Free-Choice – People reevaluate their choices after making them, and often show a greater preference for the option they chose.
  • Belief-Disconfirmation – Rather than changing a belief that has been contradicted by the evidence, a person works hard to protect that belief.
  • Effort Justification – The more effort a person puts into something, the more likely they are to like it.
  • Induced (Forced) Compliance – People can justify their behavior by claiming that their compliance was forced, thereby eliminating the dissonance between their actions and their beliefs.

The Theories

There are several variations of the theory that have emerged as well:

  • Self-Perception – Proposed by Daryl Bem, this revision states that people form their attitudes from their behavior. Bem proposed that people have only limited access to their own moods and cognitions, so they infer their attitudes from observing what they do.
  • Impression Management – The intersection of Erving Goffman’s work (see Stigma) on impression management and cognitive dissonance, this theory proposes that dissonance arises from threats to our ability to manage the impressions others have of us.
  • Self-Affirmation – People want to affirm their overall sense of self-worth, and dissonance can be reduced by affirming the self – even in an unrelated domain.
  • Self-Consistency – People want their experiences to be consistent with their self-view.
  • Aversive-Consequences – People experience distress when their choices lead to adverse consequences. This is Joel Cooper’s perspective as fully described in Cognitive Dissonance: Fifty Years of a Classic Theory.

Misattribution

One of the ways that developing cognitive dissonance can go awry is when the person has another way to attribute their discomfort.  If they believe that a situation is inherently uncomfortable, they may misattribute the dissonance created by their conflicting cognitions to the situation.  This “resolves” the dissonance, but only in the short term, as the discrepancy is likely to arise again in different circumstances.

When trying to make people aware of their inconsistencies, it’s important not to give them an easy “out.”  If you do, they may take that easy exit and neutralize the power the dissonance has to help them change.

The Role of Commitment

The power of dissonance comes from the difference between ideas and the degree to which those ideas are difficult to move.  Cognitions that are not difficult to move will not arouse much dissonance, because they’ll simply be changed before much dissonance registers.  The commitment to an idea can come from having communicated the belief to others (see Change or Die) or from the belief seeming fundamental to who the person is.

Lack of Choice

Another danger when trying to rely on the effects of cognitive dissonance for behavior change is the risk that you may accidentally trigger a reinforcement effect.  Cognitive dissonance only occurs when the person believes that they have a choice.  If they feel that they are directly or indirectly being coerced into a behavior, they’ll generate no dissonance.  Instead, there may be a reinforcement of their opposition to the behavior.

The tricky bit is visible when we look at some of the classic experiments in psychology which failed to replicate.  Walter Mischel’s marshmallow test hasn’t replicated well.  (See The Marshmallow Test for the core experiment.)  Maybe there’s something to the fact that the nursery school was connected to Stanford.  Solomon Asch has said that his conformity effect may have been a sign of the times.  (See The Upswing for more.)  Stanley Milgram’s experiments on people’s willingness to inflict seemingly lethal electric shocks were powerful on the Yale campus (even in an unimportant basement room) but failed to replicate when moved down the street to a strip mall.  (See The Lucifer Effect and Moral Disengagement for more on these experiments.)  Even the Stanford Prison Experiment performed by Philip Zimbardo has controversy about the degree to which Zimbardo coached the bad behavior he wanted.

What these experiments have in common is a subtle variable that influenced the results and couldn’t be reproduced, so the experiments failed to replicate.  The introduction of a lab coat, a title, or something about the environment led people to believe that they didn’t have a choice – or led them to a different choice.

Justification for Behaviors

At the time, the prediction was heretical.  Classic learning theory said that the stronger the reinforcement, the greater the learning.  Economists predicted that if you gave people $20, they’d change their behavior more than if you gave them $1.  Greater rewards should lead to greater results, but cognitive dissonance predicted something else.  It predicted that people would use the $20 they received as justification for their behavior.  The money was enough to make most people decide they weren’t doing the behavior – they were earning money.

Those who received the paltry offering often couldn’t complete the mental gymnastics needed to believe they hadn’t acted of their own accord.  The result was greater cognitive dissonance and therefore greater long-term behavior change.  This counterintuitive hypothesis was confirmed: people who were offered little for a variety of behaviors seemed to have more dissonance and more attitude and behavior change.

Another pitfall when using cognitive dissonance to motivate change is that if we offer too much in the way of incentives, we’ll break the effect.  This is like Edward Deci’s observation that extrinsic rewards often undermine intrinsic motivation.  (See Why We Do What We Do.)

Organizing Stories

The work of James Pennebaker makes clear that allowing people to write down their stories has a positive effect on their ability to process trauma.  (See Opening Up.)  Lisa Feldman Barrett speaks of her own challenges with decoding what she was feeling in How Emotions Are Made – including how she confused illness with love.  What this says is that our grasp of how we feel and what our bodies are doing is more tenuous than we’d like to believe.

When people are asked about their feelings before they’ve had a chance to build a narrative, we see amplification of feelings – and distorted perceptions.  That is to say, if we want the effects of cognitive dissonance to work in the right direction, we have to create space for people to process their experiences into their autobiography before asking them how they feel.

Discrepancies Without Consequences

As was mentioned earlier, Joel Cooper proposed a revision in which cognitive dissonance occurs only when there’s a negative outcome.  The outcome didn’t need to be foreseen, only foreseeable.  The requirement for free choice remains in Cooper’s revision.  The real challenge to this is that cognitive dissonance effects seem to occur even when there are no discernible consequences.

While this is problematic for Cooper’s theory, it’s important to remember that all models are wrong – though some are useful.  Cooper’s revision captures a non-trivial portion of the space of cognitive dissonance and provides a useful framework for trying to trigger it.  It may be wrong, as the evidence implies, but it may still be a good framework for creating action.

Belief Intensification

What if, rather than your beliefs changing to match the reality you’ve observed, they morph into a more virulent and intense form?  That’s what seems to happen when cults that make predictions are confronted with the reality that their predictions are wrong.  Their beliefs become even more extreme.  The dates move, and the reasoning becomes more complex.

Sometimes, people react to disconfirming evidence by strengthening their resolve that they’re right – and that can be problematic if your goal is to change their perspectives.

Weak and Strong Reinforcement

The forces that drive change through cognitive dissonance are like the slingshot effect that NASA uses to fling spacecraft deeper into space.  The slingshot effect works because of the proximity of the spacecraft to the source of gravity.  It’s the proximity that does the work.  Similarly, it’s the degree to which two options are of nearly equal strength that powers cognitive dissonance-driven change.  As a result, the more even the options are, the more they’ll be driven apart by cognitive dissonance.

Sometimes, learning behaviors that should be more strongly coupled to bigger rewards show the reverse result, with reinforcement working better when the reward is small.  Even pigeons seem to favor the treats that require more work.

Forced Reevaluation

In the end, dissonance forces the reevaluation of beliefs.  This is something that we rarely do as humans.  While we know that the Earth isn’t flat and that it rotates, we speak of sunrise as if the sun is rising above a flat earth.  While this is a simple linguistic aspect of our world, it surfaces a deeper awareness that the things we learned as children – or that our ancestors learned as children – are rarely tested.  The need for consistency and the awareness of this inconsistency can force us to reevaluate our beliefs in ways that allow us to more accurately represent reality in our minds.

Many speak of meditation and downtime to allow for reevaluating life’s priorities, but nothing has a focusing effect like knowing that two beliefs that you hold dear are utterly incompatible.

Dissonance Reduction Strategies

As we’ve seen above, there are alternatives to resolving the discrepancy.  Here are some of them:

  • Ignore – Simply failing to recognize the discordant belief.
  • Discount – Providing a reason why the belief isn’t that important.
  • Provide Alternate Explanations – Providing alternative explanations that don’t require the beliefs to be in conflict.
  • Exception to the Rule – Viewing the data as a fluke or exception, thereby discounting it or limiting the degree to which it should be considered discordant.
  • Blame – Blame someone else so that the results can be explained away without internally discordant beliefs.
  • Numbing (e.g., alcohol) – Temporarily delaying awareness by numbing, often but not exclusively through alcohol or drugs. Binge-watching television, surfing the internet, and chronic busyness can all be forms of numbing.

Protecting Beliefs

Often, we protect our beliefs so that we don’t notice conflicting beliefs, or we process conflicts in ways that prevent falsification of our cherished beliefs.  Some of the strategies for protecting our beliefs are:

  • avoiding exposure to such information,
  • reducing negative feelings arising from inconsistency,
  • actively discounting the inconsistent information,
  • generating alternative explanations for the contradictory information,
  • deeming it as an exception to the rule, or
  • reinterpreting the status of one’s beliefs in a manner that makes them unfalsifiable.

Cognitive Dissonance Comes from Sense Making

As humans, we’re constantly trying to make sense of the world around us.  (See The Righteous Mind.)  We experience cognitive dissonance because we can’t find a way to consistently represent reality to ourselves.  We’re forced to find new ways to think and new beliefs to form.  Maybe it’s time to make sense of Cognitive Dissonance.