October 30, 2015

It’s a no-brainer to understand why we lie to others when we’ve been caught making a mistake or doing something wrong: to avoid losing a job, a spouse, a reputation; to avoid a fight, a fine, a prison term; to pass responsibility to someone else. But self-justification occurs when people lie to themselves to avoid the realization they did anything wrong in the first place. It’s the reason that many people justify sticking with a mistaken belief or a disastrous course of action even when evidence shows they are dead wrong.

The motivational mechanism that underlies the reluctance to be wrong, or to change our ways of doing things, is called cognitive dissonance: the discomfort we feel when two beliefs or actions contradict each other. Like hunger, dissonance is uncomfortable, and like hunger, we are motivated to reduce it. For example, the awareness that “smoking is bad for me” is dissonant with “I’m a heavy smoker,” so smokers have all kinds of dissonance-reducing rationalizations for persisting in that unhealthy practice (“it reduces stress”; “it keeps me thin”). The brain likes consonance, and provides a number of biases in perception that foster it: One is the “confirmation bias,” which sees to it that we notice and remember information that confirms what we believe, and ignore, forget, or minimize information that disconfirms it.

Dissonance is most painful when information crashes into our view of ourselves as being competent, kind, smart, and ethical—when we have to face the evidence that we have made a bad mistake. We have a choice: either admit the mistake and learn from it (“Yes, that was a foolish/incompetent/unethical thing to do” or “Boy, was I ever wrong”) or justify the mistake and keep doing it (“Everyone cheats a little. Besides, that study was flawed. Besides, it was their fault”). Guess which course of action is most popular? Dissonance reduction is a largely unconscious mechanism that allows us to lie to ourselves so that we can preserve self-esteem and our positive self-images. “I’m kind; you’re telling me I hurt you? You started this fight, so you deserve whatever I did to you.” “I’m a devoted parent; you’re telling me that not vaccinating my child was a mistake? You think I’m stupid? OK, I agree vaccines don’t cause autism, but they are bad for other reasons.”

The implications of dissonance theory are immense, because they show how many problems arise not just from bad people who do bad things, but from good people who justify the bad things they do, in order to preserve their belief that they are good people. For example: physicians who don’t want to accept evidence that it’s time to change outdated medical practices; district attorneys who can’t accept evidence that they put an innocent person in prison; quarreling couples who wound each other under the banner of self-righteousness. The more time, effort, money and reputation we have put into maintaining a belief or practice, the harder it will be to admit to ourselves that we were wrong, and the harder we will work to justify our mistakes.

The need to reduce dissonance can set us on a course of action that takes us far from our original intentions. Imagine a person facing a major decision: stay in a troubled relationship or leave; blow the whistle on unethical practices at work or don’t rock the boat; believe a sensational rape allegation in the news or wait for the full story. As soon as the person makes a decision, he or she will justify it to make the behavior consonant with the belief—and then stop noticing, let alone actively seeking, disconfirming evidence. That first small step thereby starts a process of action, justification, and further action that increases commitment to the original decision, no matter how much information later piles up that it was a mistake.

As citizens and professionals, we all face many situations in which we will make a decision or take an action, justify it—and then shut our mental doors. It’s good to hold an informed opinion and not change it with every fad or study that comes along; but it is also essential to be able to let go of an opinion or an accepted practice when the weight of the evidence dictates. Dissonance reduction may be built into our mental wiring, but how we think about our mistakes is not. We can learn to become more open-minded and less self-righteous. It’s not easy, but that is no justification for not trying.

Carol Tavris and Elliot Aronson are social psychologists and authors of Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (revised edition, Mariner Books, 2015).

Contact us at editors@time.com.
