By Molly Rose Teuke

“It infuriates me to be wrong when I know I’m right.”
— Jean-Baptiste Poquelin, aka Molière, French playwright, actor and poet

Being right feels very real. Next to primal biological urges, being right may be the strongest human imperative. Yet, our rightness can be just as illusory as a fleeting mirage.

When someone tells you, or even hints, that you’re wrong, chances are you leap to resistance, defensiveness and self-justification. Acknowledgment, acceptance or responsibility probably don’t come into play quickly. It’s unlikely you’re inclined to say, “I was wrong.”

If and when you concede that you might not have been 100% right, you feel embarrassed, awkward, even humiliated, and maybe berate yourself for being foolish.

It turns out there is a scapegoat — your brain. Your brain wants life to be predictable and easily handled, and it has strategies to make it so. One of those strategies is cognitive bias, a kind of mental rule of thumb your brain can use to make judgments in the moment. Your mental rules of thumb are uniquely yours.

The brain has about 100 billion neurons. In “A User’s Guide to the Brain: Perception, Attention, and the Four Theaters of the Brain,” author John Ratey of Harvard Medical School says, “There are more possible ways to connect the brain’s neurons than there are atoms in the universe. The connections guide our bodies and behaviors [and our beliefs] even as every thought and action we take physically modifies their pattern.”

In other words, no two brains are alike. An anonymous author, writing in a London newspaper in 1831, expressed it this way: “We cannot see things as they are, for we are compelled by a necessity of nature to see things as we are. We never can get rid of ourselves.”

In “The 7 Habits of Highly Effective People,” motivational speaker Stephen Covey offers a 20th century spin: “We see the world, not as it is, but as we are — or as we are conditioned to see it. When we open our mouths to describe what we see, we in effect describe ourselves, our perceptions, our paradigms. When other people disagree with us, we immediately think something is wrong with them.”

How you perceive the truth of the world depends on your experiences and how they have been cataloged by your brain — how the hundred billion neurons you have upstairs connect and reconnect and reconnect again to form your world view.

When you teach children to mind their manners, you’re teaching them to adopt and adhere to acceptable truths about polite behavior, especially those that are driven by cultural norms. Chances are you were admonished as a child that it’s rude to slurp your noodles; in Japan, children learn a different truth.

Yet it’s more than just your experiences and how your brain catalogs them that’s to blame when you get things wrong. Your brain, like everyone else’s, relies on shortcuts. One of the most common is known to science as cognitive bias.

You have a hunch, you get a gut feeling, you feel it in your bones. That's your brain taking the bias shortcut. Bias operates beneath your awareness; it's an involuntary mechanism, and it has a big impact on how you respond in the moment. Bias saves you from effortful thinking, but it leaves you vulnerable to systematic errors in your thinking.

COGNITIVE BIAS

Researchers Amos Tversky and Daniel Kahneman first introduced the concept of cognitive bias in 1972 (see “Thinking, Fast and Slow” in resources). Since then, researchers have described a number of different types of bias that occur so predictably that cognitive science has named them:

  • Confirmation bias is your predisposition to pay attention to information that supports the way you already think, and ignore or discard information that feels contrary to your beliefs. You seek out information that confirms you’re right.
  • Availability bias is your tendency to give more credence to information you see more often. If you come across several news stories about car thefts, availability bias may lead you to insist car theft is common in your city — even when your neighbor says you’re wrong.
  • The halo effect is a bias by which you assume traits about a person based on a single feature. You might be seeking a keynote speaker and find yourself hiring the most physically attractive applicant because your brain unconsciously assumes that, because someone is attractive, he or she must also be smart and persuasive and funny. Likewise, someone wearing a wrinkled shirt must be an ignorant slob. One study found that jurors were less likely to find an attractive person guilty of a crime. Judging a book by its cover could leave you very wrong about what’s inside.
  • False consensus bias is your tendency to erroneously see your own beliefs and behaviors as common and appropriate, and to view alternate beliefs and behaviors as uncommon and inappropriate, sometimes even as deviant (and, of course, just plain wrong). You’re right and everyone knows you’re right, period.
  • The Dunning-Kruger effect, named for the psychologists David Dunning and Justin Kruger, who first described it, is your tendency to greatly overestimate your own competence, especially when you have limited knowledge or skill. The worst drivers tend to think they’re better than the average driver. Entry-level chess players are more inclined to overestimate their own ability than players with more experience. As long ago as 1871, Charles Darwin wrote, “Ignorance more frequently begets confidence than does knowledge.”
  • Anchoring bias causes you to rely heavily on the first piece of information you encounter about anything. If your first encounter with a job applicant is that he showed up late, that’s likely to outweigh any later evidence of his sense of responsibility. Anchoring bias is what’s behind the time-honored adage, “First impressions matter most.”
  • Hindsight bias is your penchant for looking back at an event you could not have predicted and believing you knew all along that it would happen. You scratch off a lottery ticket, win $100 and say, “I knew I was going to win today.” That’s hindsight bias. It’s just another way of demonstrating to others that you are right about … well, everything.

Any of these cognitive biases can lead you to declare you’re right when you judge a person, situation or event. The way to combat cognitive bias is to slow down your thinking. When you find yourself reacting quickly, back up and think through your judgment step by step. Take time to get past your brain’s assumptions and knee-jerk reactions.

Your brain is a wonder, making sense of the vast complexity of data you pour into it every minute of every day, and it has evolved adaptations to do the job well and efficiently. The downside of that efficiency is your impulse to believe your way is the best and sometimes only way, which can prevent you from expanding your perspective, broadening your own thinking and learning from others. Your brain’s habit of taking shortcuts can get you into trouble when bias shrouds your ability to look at the world more objectively.

It’s a safe bet that one thing you’re consistently wrong about is how often you’re right, but it helps to know it’s not you, it’s your brain.


Molly Rose Teuke has an enduring curiosity about what makes our brains tick. Pre-COVID-19, she offered a program for Nicolet College called Getting Your Brain on Your Side and delivered brain-based leadership training for the NeuroLeadership Institute, a global leader in the performance arena. She also hosts a weekly classical music program on WXPR-FM. You can reach her at [email protected].

RESOURCES

  • “Being Wrong: Adventures in the Margin of Error,” Kathryn Schulz, ©2010; TED Talk: https://ed.ted.com/lessons/on-being-wrong-kathryn-schulz
  • “Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts,” Dr. Carol Tavris and Elliot Aronson, ©2007
  • The Skeptics Society’s Distinguished Science Lecture Series, Dr. Carol Tavris, https://bit.ly/3Ev5xIf
  • “Thinking, Fast and Slow,” Daniel Kahneman, ©2011