Paper Summaries
Cognitive Psychology

August 6, 2025 | 5 minute read

Cognitive Dissonance

by Leon Festinger

What I read

In this paper, the author describes cognitive dissonance and offers three sets of experiments to illustrate how the phenomenon works.

First, the author introduces the theory: “the idea that if a person knows various things that are not psychologically consistent with one another, he will, in a variety of ways, try to make them more consistent.” Dissonance occurs primarily when our expectations of what things go together, and what things do not, are not fulfilled.

Three studies are then described—related to decision making, lying, and temptation—to illustrate cognitive dissonance in action.

Immediately after making an irrevocable decision, dissonance arises between the knowledge that a choice was made and the appealing aspects of the options that were not selected. There are two ways people reduce the dissonance. One is to persuade themselves that any positive parts of the non-selected choices were not actually positive; another is to exaggerate the good parts of the selection that was made. Experiments are described to show that dissonance occurs only after the decision is made.

Next, the author describes dissonance that appears as a result of lying. Saying something inconsistent with what one actually believes creates dissonance, and people try to resolve that inconsistency. The greater the justification for making the statement, the less “bothersome” the dissonance will be, and one of the most common ways to increase that justification is to change one’s private beliefs to match what was said. The less pressure there was to make the original statement, the greater the dissonance, and the greater the resulting opinion change.

Dissonance also occurs when people resist temptation. When someone is tempted to do something but does not do it, the desire and the restraint are in dissonance. One way to minimize the dissonance is by “derogating or devaluing” the activity towards which one was tempted; but this occurs only when there was “insufficient original justification for the behavior.” When there is a strong reason to abstain from the temptation, the dissonance is small.

The paper ends abruptly, seemingly because the author ran out of allocated space.

What I learned and what I think

I don’t know too much about cognitive dissonance, but I did get into it in the context of stories; the stories we tell after conducting ethnographic research are intended to persuade our clients to change their perspectives, and so we provoke dissonance by highlighting things that are at odds with what they know to be “true” (usually the way they view their own business, or the overall market). I know how I can produce more dissonance, but I don’t know how I can provide ways for people to reduce their own dissonance on my “terms” (by design, rather than left to their own devices). A client starts to resolve their dissonance right away, or at least I would assume so, typically by rejecting what they hear: by throwing stones at it. But since I’m trying to urge a “decision,” at least as much as decisions are ever actually made during a strategy project, it seems that highlighting a large risk in pursuing a very surprising set of insights is actually easier for someone to reconcile than if the risk of pursuing something surprising is perceived to be small.

I feel a little wrapped around the axle on that one. If I present something really surprising to a client, and then draw some insight-style conclusions and recommendations, those can be equally surprising, or we can temper them. As I’m understanding this, if they are equally surprising, there’s a big gap between what the client first thought and then heard, but little gap between what they heard and a decision they then make. Eh; that’s probably not right. I think I’m overbuilding this.

Whatever; that was a total side trip.

The bigger question this prompts for me is the speed, frequency, and style with which the phenomenon shows up, and whether it shows up during in-action designing. I come to a place during drawing where I can make a point of departure; the thing I just made provokes some sort of gap between what I want and what I see. At that point, I can resolve that dissonance by changing what I want, changing what I see, or by convincing myself that what I see actually is what I want. There’s judgment interwoven into the feeling of discomfort. But the “what I want” is in flux, because I don’t know it yet.

I wonder what it means to treat my iterations as “lies.” I tell a lie by sketching something I don’t believe in, but there was little to no external pressure to tell it. I don’t believe in it because I treat my iterative work as throw-away; I’m giving it permanence, but I don’t actually want it to be permanent. The lie is out in the world, and I know the truth (? I don’t, unless the truth is the goal state, not the actual goal manifestation) and it doesn’t match. So I work to resolve the dissonance, either by convincing myself that it’s a good idea, or by drawing something else.

This feels dumb and wrong.

I like the topic a lot.

Also. The best part of this article is that it is in Scientific American, and from 1962, and in 1962 Scientific American ran ads for the world’s first fully mobile nuclear power plant.