5 Must-Read Books on the Psychology of Being Wrong
What Ronald Reagan has to do with gorilla costumes, Shakespeare and fake pennies.
By Maria Popova
The intricate mechanisms of the human mind are endlessly fascinating. We’ve previously explored various facets of how the mind works — from how we decide to what makes us happy to why music affects us so deeply — and today we’re turning to when it doesn’t: Here are five fantastic reads on why we err, what it means to be wrong, and how to make cognitive lemonade out of wrongness’s lemons.
The pleasure of being right is one of the most universal human addictions, and most of us spend an extraordinary amount of effort on avoiding or concealing wrongness. But error, it turns out, isn’t wrong. In fact, it’s not only what makes us human but also what enhances our capacity for empathy, optimism, courage and conviction. In Being Wrong: Adventures in the Margin of Error, which we featured as one of the 5 must-read books by TED 2011 speakers, Kathryn Schulz examines wrongology with the rigorous lens of a researcher and the cunning wit of a cultural commentator, revealing how the mind works through an eloquent convergence of cognitive science, social psychology and philosophical inquiry.
“However disorienting, difficult, or humbling our mistakes might be, it is ultimately wrongness, not rightness, that can teach us who we are.” ~ Kathryn Schulz
From Shakespeare to Freud, Schulz examines some of history’s greatest thinkers’ perspectives on being wrong and emerges with a compelling counterpoint to our collective cultural aversion to wrongness, arguing instead that error is a precious gift that fuels everything from art to humor to scientific discovery and, perhaps most importantly, a transformative force of personal growth that is to be embraced, not extinguished.
“To err is to wander, and wandering is the way we discover the world; and, lost in thought, it is also the way we discover ourselves. Being right might be gratifying, but in the end it is static, a mere statement. Being wrong is hard and humbling, and sometimes even dangerous, but in the end it is a journey, and a story.” ~ Kathryn Schulz
WHY WE MAKE MISTAKES
In 2005, Joseph Hallinan wrote a front-page story for The Wall Street Journal investigating the safety record of anesthesiologists with a dreadful track record in the operating room, letting patients turn blue and suffocate before their eyes. These mistakes, Hallinan found, were often attributed to “human error,” a phrase that assumes inevitability. Yet a closer analysis of these anesthesiologists’ process and practice revealed that much could be done to avoid these deadliest of errors. So Hallinan spent nearly three years translating the insight from this particular story into the general world of human psychology, where error abounds in a multitude of realms.
Why We Make Mistakes: How We Look Without Seeing, Forget Things in Seconds, and Are All Pretty Sure We Are Way Above Average explores the cognitive mechanisms behind everything from forgetting our passwords to believing we can multitask (which we already know we can’t) to overestimating the impact of various environmental factors on our happiness. It’s essentially a study of human design flaws, examining our propensity for mistakes through a fascinating cross-section of psychology, neuroscience and behavioral economics.
“We don’t think our perception is economical; we think it’s perfect. When we look at something, we think we see everything. But we don’t. Same with memory: we might think we remember everything, especially commonly encountered things like the words to the National Anthem, or the details on the surface of a penny—but we don’t. Our brains are wired to give us the most bang for the buck; they strip out all sorts of stuff that seems unimportant at the time. But we don’t know what’s been stripped out. One of the consequences of this is that we tend to be overconfident about the things we think we do know. And overconfidence is a huge cause of human error.” ~ Joseph Hallinan
Can you pick out the real penny?
In 1999, Harvard researchers Christopher Chabris and Daniel Simons conducted a now-iconic selective attention experiment. Chances are, you’ve seen it, as the video made the viral rounds 10 years after the original experiment, but on the off-chance you haven’t, we won’t spoil it for you: Just watch this video in which 6 people — 3 in white shirts and 3 in black — pass basketballs around; you must keep a silent count of the number of passes made by the people in white shirts. Ready?
Now, be honest: Did you notice the gorilla that nonchalantly strolled through the middle of the action at one point? If you answered “yes,” you’re pretty exceptional. Chabris and Simons found that more than half of people didn’t notice it. Astounded, they set out to investigate the curious cognitive glitches that made the gorilla invisible — what is it that makes us so tragicomically susceptible to missing valuable information and misperceiving reality?
Published 11 years after the original experiment, The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us encapsulates Chabris and Simons’ findings on the mechanisms behind this “inattentional blindness” and how they translate into fundamental human behavior. Through six compelling everyday illusions of perception, they swiftly and eloquently debunk conventional wisdom on everything from the accuracy of memory to the correlation between confidence and competence. The book, much to our delight, is written with the subtext of being an antidote to Malcolm Gladwell’s Blink: The Power of Thinking Without Thinking which, for all the praise it has garnered, is tragically plagued by out-of-context “research,” wishful dot-connecting and other classic Gladwellisms.
MISTAKES WERE MADE (BUT NOT BY ME)
In 1987, Ronald Reagan stood up in front of the nation in the wake of the Iran-Contra scandal to deliver his State of the Union address, in which he famously declared, “Mistakes were made.” The phrase became an infamous hallmark of diffusion of responsibility and the failure to own our mistakes, and inspired the title of social psychologists Carol Tavris and Elliot Aronson‘s excellent Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts — an ambitious quest to unravel the underpinnings of self-justification and, in the process, make us better human beings.
“As fallible human beings, all of us share the impulse to justify ourselves and avoid taking responsibility for any actions that turn out to be harmful, immoral or stupid. Most of us will never be in a position to make decisions affecting the lives and deaths of millions of people, but whether the consequences of our mistakes are trivial or tragic, on a small scale or a national canvas, most of us find it difficult, if not impossible, to say, ‘I was wrong; I made a terrible mistake.’ The higher the stakes — emotional, financial, moral — the greater the difficulty.” ~ Carol Tavris and Elliot Aronson
Tavris and Aronson examine the root cause of these self-righteous yet erroneous behaviors: cognitive dissonance — the mental anguish that results from trying to reconcile two conflicting ideas, such as a belief we hold and a circumstantial fact that contradicts it. In our deep-seated need to see ourselves as honorable, competent and consistent, we often bend reality to confirm this self-perception, which in turn results in a domino effect of errors. Mistakes Were Made (But Not by Me) holds up an uncomfortable but profoundly illuminating mirror that not only exposes the engine of self-justification but also offers rich insight into the behavioral tactics that prevent and mediate it.
HOW WE KNOW WHAT ISN’T SO
Written 20 years ago, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life by Cornell psychologist Thomas Gilovich is arguably the most important critique of the biases of human reason ever published. It’s as much a thoroughly researched investigation into the science of mind as it is a compelling — and increasingly timely — treatise on the importance of not letting superstition and sloppy thinking cloud our judgment on a cultural and sociopolitical level.
Gilovich uses classic psychology experiments to extract practical insight and offer a recipe for using logical principles to predict and avoid our natural biases, from seeking confirmatory information to misattributing causality to random events, and a wealth of others in between.
“People do not hold questionable beliefs simply because they have not been exposed to the relevant evidence. Nor do people hold questionable beliefs simply because they are stupid or gullible. Quite the contrary. Evolution has given us powerful intellectual tools for processing vast amounts of information with accuracy and dispatch, and our questionable beliefs derive primarily from the misapplication or overutilization of generally valid and effective strategies for knowing. Just as we are subject to perceptual illusions in spite of, and largely because of, our extraordinary perceptual capacities, so too are many of our cognitive shortcomings closely related to, or even an unavoidable cost of, our greatest strengths.” ~ Thomas Gilovich
If this isn’t enough wrongology for you, we’ve compiled a complementary list of additional reading — take a look.
Published April 4, 2011