Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts

Paperback (Reprint)

$12.55 (list price $15.00; save 16%)

Overview

by Carol Tavris and Elliot Aronson

Why do people dodge responsibility when things fall apart? Why the parade of public figures unable to own up when they screw up? Why the endless marital quarrels over who is right? Why can we see hypocrisy in others but not in ourselves? Are we all liars? Or do we really believe the stories we tell?

Backed by years of research and delivered in lively, energetic prose, Mistakes Were Made (But Not by Me) offers a fascinating explanation of self-deception—how it works, the harm it can cause, and how we can overcome it.

Product Details

ISBN-13: 9780156033909
Publisher: Houghton Mifflin Harcourt
Publication date: 05/05/2008
Edition description: Reprint
Pages: 304
Product dimensions: 5.31 (w) x 8.00 (h) x 0.87 (d) inches

About the Author

CAROL TAVRIS is a social psychologist and author of Anger and The Mismeasure of Woman. She has written for the Los Angeles Times, the New York Times, Scientific American, and many other publications. She lives in Los Angeles.

ELLIOT ARONSON is a social psychologist and author of The Social Animal. The recipient of many awards for teaching, scientific research, writing, and contributions to society, he is a professor emeritus at the University of California, Santa Cruz.

Read an Excerpt

CHAPTER 1
 
Cognitive Dissonance:
The Engine of Self-justification

 
Press release date: November 1, 1993
 
           we didn’t make a mistake when we wrote in our previous releases that New York would be destroyed on September 4 and October 14, 1993. We didn’t make a mistake, not even a teeny eeny one!
 
Press release date: April 4, 1994
 
           All the dates we have given in our past releases are correct dates given by God as contained in Holy Scriptures. Not one of these dates was wrong . . . Ezekiel gives a total of 430 days for the siege of the city . . . [which] brings us exactly to May 2, 1994. By now, all the people have been forewarned. We have done our job. . . .
 
           We are the only ones in the entire world guiding the people to their safety, security, and salvation!
 
           We have a 100 percent track record![1]
 
 It’s fascinating, and sometimes funny, to read doomsday predictions, but it’s even more fascinating to watch what happens to the reasoning of true believers when the prediction flops and the world keeps muddling along. Notice that hardly anyone ever says, “I blew it! I can’t believe how stupid I was to believe that nonsense”? On the contrary, most of the time they become even more deeply convinced of their powers of prediction. The people who believe that the Bible’s book of Revelation or the writings of the sixteenth-century self-proclaimed prophet Nostradamus have predicted every disaster from the bubonic plague to 9/11 cling to their convictions, unfazed by the small problem that their vague and murky predictions were intelligible only after the event occurred.
 
           Half a century ago, a young social psychologist named Leon Festinger and two associates infiltrated a group of people who believed the world would end on December 21.[2] They wanted to know what would happen to the group when (they hoped!) the prophecy failed. The group’s leader, whom the researchers called Marian Keech, promised that the faithful would be picked up by a flying saucer and elevated to safety at midnight on December 20. Many of her followers quit their jobs, gave away their homes, and dispersed their savings, waiting for the end. Who needs money in outer space? Others waited in fear or resignation in their homes. (Mrs. Keech’s own husband, a nonbeliever, went to bed early and slept soundly through the night as his wife and her followers prayed in the living room.) Festinger made his own prediction: The believers who had not made a strong commitment to the prophecy—who awaited the end of the world by themselves at home, hoping they weren’t going to die at midnight—would quietly lose their faith in Mrs. Keech. But those who had given away their possessions and were waiting with the others for the spaceship would increase their belief in her mystical abilities. In fact, they would now do everything they could to get others to join them.
 
           At midnight, with no sign of a spaceship in the yard, the group felt a little nervous. By 2 a.m., they were getting seriously worried. At 4:45 a.m., Mrs. Keech had a new vision: The world had been spared, she said, because of the impressive faith of her little band. “And mighty is the word of God,” she told her followers, “and by his word have ye been saved—for from the mouth of death have ye been delivered and at no time has there been such a force loosed upon the Earth. Not since the beginning of time upon this Earth has there been such a force of Good and light as now floods this room.”
 
           The group’s mood shifted from despair to exhilaration. Many of the group’s members, who had not felt the need to proselytize before December 21, began calling the press to report the miracle, and soon they were out on the streets, buttonholing passersby, trying to convert them. Mrs. Keech’s prediction had failed, but not Leon Festinger’s.
 
* * *
 
The engine that drives self-justification, the energy that produces the need to justify our actions and decisions—especially the wrong ones—is an unpleasant feeling that Festinger called “cognitive dissonance.” Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.” Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn’t really so harmful, or that smoking is worth the risk because it helps her relax or prevents her from gaining weight (and after all, obesity is a health risk, too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways.
 
           Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity and, as Albert Camus observed, we humans are creatures who spend our lives trying to convince ourselves that our existence is not absurd. At the heart of it, Festinger’s theory is about how people strive to make sense out of contradictory ideas and lead lives that are, at least in their own minds, consistent and meaningful. The theory inspired more than 3,000 experiments that, taken together, have transformed psychologists’ understanding of how the human mind works. Cognitive dissonance has even escaped academia and entered popular culture. The term is everywhere. The two of us have heard it in TV newscasts, political columns, magazine articles, bumper stickers, even on a soap opera. Alex Trebek used it on Jeopardy, Jon Stewart on The Daily Show, and President Bartlet on The West Wing. Although the expression has been thrown around a lot, few people fully understand its meaning or appreciate its enormous motivational power.
 
           In 1956, one of us (Elliot) arrived at Stanford University as a graduate student in psychology. Festinger had arrived that same year as a young professor, and they immediately began working together, designing experiments to test and expand dissonance theory.[3] Their thinking challenged many notions that were gospel in psychology and among the general public, such as the behaviorist’s view that people do things primarily for the rewards they bring, the economist’s view that human beings generally make rational decisions, and the psychoanalyst’s view that acting aggressively gets rid of aggressive impulses.
 
           Consider how dissonance theory challenged behaviorism. At the time, most scientific psychologists were convinced that people’s actions are governed by reward and punishment. It is certainly true that if you feed a rat at the end of a maze, he will learn the maze faster than if you don’t feed him; if you give your dog a biscuit when she gives you her paw, she will learn that trick faster than if you sit around hoping she will do it on her own. Conversely, if you punish your pup when you catch her peeing on the carpet, she will soon stop doing it. Behaviorists further argued that anything that was merely associated with reward would become more attractive—your puppy will like you because you give her biscuits—and anything associated with pain would become noxious and undesirable.
 
           Behavioral laws do apply to human beings, too, of course; no one would stay in a boring job without pay, and if you give your toddler a cookie to stop him from having a tantrum, you have taught him to have another tantrum when he wants a cookie. But, for better or worse, the human mind is more complex than the brain of a rat or a puppy. A dog may appear contrite for having been caught peeing on the carpet, but she will not try to think up justifications for her misbehavior. Humans think; and because we think, dissonance theory demonstrated that our behavior transcends the effects of rewards and punishments and often contradicts them.
 
           For example, Elliot predicted that if people go through a great deal of pain, discomfort, effort, or embarrassment to get something, they will be happier with that “something” than if it came to them easily. For behaviorists, this was a preposterous prediction. Why would people like anything associated with pain? But for Elliot, the answer was obvious: self-justification. The cognition that I am a sensible, competent person is dissonant with the cognition that I went through a painful procedure to achieve something—say, joining a group that turned out to be boring and worthless. Therefore, I would distort my perceptions of the group in a positive direction, trying to find good things about them and ignoring the downside.

Copyright © 2007 by Carol Tavris and Elliot Aronson
 
All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher.
 
Requests for permission to make copies of any part of the work should be submitted online at www.harcourt.com/contact or mailed to the following address: Permissions Department, Harcourt, Inc., 6277 Sea Harbor Drive, Orlando, Florida 32887-6777.

Table of Contents

Introduction
Knaves, Fools, Villains, and Hypocrites: How Do They Live with Themselves?   1
Chapter 1
Cognitive Dissonance: The Engine of Self-justification          11
Chapter 2
Pride and Prejudice . . . and Other Blind Spots      40
Chapter 3
Memory, the Self-justifying Historian     68
Chapter 4
Good Intentions, Bad Science: The Closed Loop of Clinical Judgment  97
Chapter 5
Law and Disorder  127
Chapter 6
Love’s Assassin: Self-justification in Marriage      158
Chapter 7
Wounds, Rifts, and Wars        185
Chapter 8
Letting Go and Owning Up    213
Afterword               237
Endnotes 239
Index       277

Customer Reviews

Most Helpful Customer Reviews

3.9 out of 5 based on 10 reviews.
Guest More than 1 year ago
The authors intended to explain our behavior when we justify, rationalize, and insist we were right when faced with the embarrassment of public exposure of unintentional harm [often backed up by intentional harm]. The phenomenon of "acting out of a need to protect their egos" is ubiquitous, and even a poorly written book on the topic will be beneficial by reminding us how corruptible we are. Nevertheless, the authors have selected an explanation called "dissonance theory" and insist that it can explain just about any corrupt behavior. The problems with this overzealous application of the theory are:

1. The book lacks evidence in the form of references to research on dissonance theory. The few experiments described were not originally about dissonance theory, but the authors nevertheless cavalierly rework their conclusions to make them seem like proof of the theory; scientific evidence thus becomes merely anecdotal evidence. One example is explaining the behavior of subjects in Milgram's experiment as justifying to themselves each progressive shock they administered. The actual research shows no such thing; in fact, the subjects self-reported that they believed they lacked the necessary authority to determine whether or not to administer shocks, and therefore that the burden was on the mock researcher, not them, to justify those decisions.

2. The book ignores conventional explanations for corrupt behavior, providing no evidence against them nor for its own explanation. An example is saying that people continue to cheat on tests in order to justify the initial decision to cheat on a previous test, rather than that people continue to cheat because they have discovered that the consequences weren't as bad as they had feared.

3. The book neglects differentiating factors among the various behaviors and their self-justifications: whether they were followed up with narratives of the behavior or with more cases of similar behavior, whether the initial behavior or its consequences were intentional or unintentional, whether the error or wrongdoing was ever publicized, whether the justification was of oneself or of other members of one's group, and so on.

In comparing experimental psychologists [presumably the authors themselves] with therapists behaving unethically, they write that science, because it depends on research, is "a form of arrogance control," which is ironic, because this is an arrogant self-excuse meant to convince the reader that the authors would not overapply an explanation of symptoms--the very thing they accuse repressed-memory theorists of doing. Nevertheless, dissonance theory likely can be used to understand certain behaviors, and the class of events comprising malicious behavior motivated by ego preservation is very important to study.
Skeptical-DoDo More than 1 year ago
I was wonderfully impressed by the clarity and relevance of the material to actual life situations. As a student of errors, I find this book has a lot to recommend it, as it gives insights that become relevant in many aspects of life and profession. I heartily recommend this book to anyone who wishes to understand human nature and how easy it is for us, as human beings, to make mistakes and then twist facts to protect our own self-images. It was great.
Anonymous More than 1 year ago
This is my favorite non-fiction book so far. I've owned (and lost to friends) many copies of this book. I keep reading and re-reading it and never get bored. It gives insight into how people justify things (beliefs, decisions, "mistakes") in an attempt to convince themselves, and with time they just believe the lie ... till you buy it too. A must-read.
Anonymous More than 1 year ago
Brilliant insights into a tendency we all have to deflect blame, with great examples. The chapter on criminal justice is particularly enlightening.