So You Think You Know Your Biases?

Everyone has biases. The best of us try to correct them. But take a minute—a literal minute—and look at the following questions:

  1. In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
  2. If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?
  3. A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the ball cost?

How do you think you did on this little quiz? Easiest questions you’ve answered all week? Or do you feel you missed something?

What if I told you that only one in six people gets all three right? And that one-third of people get all three wrong?

Most of us feel good about spotting trick questions. But although we all have the cognitive ability to solve problems like these, many intelligent leaders still make crucial mistakes. This is especially true when they’re under pressure.

You’ll find answers to this quiz at the bottom of this article. But don’t skip there. Read on.

Your Biases on Auto-Pilot

The above questions are part of Yale professor Shane Frederick’s Cognitive Reflection Test, a tool that tests for dysrationalia. This is a fancy word for a common problem: behaving irrationally—even falling for a “trick question”—despite having ample intelligence.

Perhaps “tilt” is a more familiar example. Someone who is tilting keeps betting at the blackjack table, or keeps trying to land the perfect half-court shot, even though multiple failures have overloaded their emotions. They may know what to do in theory, but they’re no longer learning from their experiences or making the rational choice to step away and get themselves back under control.

Persistent dysrationalia in otherwise brilliant leaders stands as evidence that knowing about our biases isn’t everything. We may even correct for our biases most of the time. But if we don’t do it consistently, that should give us pause.

In his book Impact, author Paul Gibbons breaks down dysrationalia by distinguishing between System I thinking and System II thinking. System I thinking involves our instincts and gut reactions (“thinking fast”). System II thinking brings critical problem-solving and rational puzzling to the table (“thinking slow”). Good decision-making requires overriding System I and then applying the cognitive tools in System II. Dysrationalia, however, often results from overconfidence in System II’s cognitive tools: the smarter we are, the less critical we are of our snap judgments and the quality of our “gut reactions”. When running on auto-pilot, we never really engage the System II thinking that keeps our biases in check.

Automatic Biases and Risk to Your Business

The problem is, teams run on auto-pilot a lot. Individuals may know about their own cognitive biases and assume that this awareness will improve their analytical thinking on the job. But most companies are so entrenched in “business as usual” that System II thinking doesn’t come online during projects as often as we think it does. The rational “net” meant to stop our biases from infecting our thinking never gets deployed. When this happens—all while we think we’ve been monitoring our biases—we look for explanations. We tilt. And our biases may actually get worse.

Consider, for example, the problem of the “information deficit”. Today’s leaders feel overwhelmed by information, wondering how in the world they might sift through it all. When things go differently than they’d expected, or a team member makes a mistake, they might bemoan a lack of information, or even blame someone else for not doing their homework. But if an “information deficit” were really the problem, then the only logical conclusion would be to never make a decision again: the growth rate of information is exponential, and we will never have all we need. In reality, people make excellent decisions with limited information all the time, and some choices made with all available data still turn out to be the wrong ones. The “information deficit” is a false dilemma, but we are biased towards it all the same.

As Gibbons puts it, “Knowing that you, your staff, your team, and your organization have cognitive biases is a booby prize if you don’t know what to do about it.” If there’s a difference between being aware of our biases and actually doing something about them—what Gibbons calls “debiasing”—then where are we supposed to start? Much like with habit formation, it’s crucial that we monitor our System I thinking with easy, frequent checks, even drawing on our environment for help.

Killing Common Biases

Gibbons breaks cognitive biases into three categories: perception, problem-solving, and solution-selection. It’s easy for biases to color how we see a situation, how we name a problem, and how we choose a way to deal with it. According to the Decision Lab, three biases stand out from the rest and have an outsized impact on our decisions:

Optimism bias

People like to believe their plans will work out. We favor information that supports our existing opinions, so we give added weight to data that suggests our plans are on the right track. Debiasing our optimism requires commitment and accountability: make a rule that you will consider one cautionary piece of data for every two or three supportive pieces you use to make your decisions. Set aside time to talk with your teams about what you’ll do if everything goes wrong. Make a list of triggers and agree, “If these things happen, we’ll pull the plug”. Don’t let optimism lead you to over-commit.

Present bias

Present bias involves focusing on immediate pleasure and comfort while ignoring how our future self will have to deal with our present decisions. In her book Thinking in Bets, Annie Duke gives this bias a name and a character: Night Jerry. Night Jerry makes decisions while he’s tired and irritable, and Morning Jerry always regrets them.

But just as the Greek hero Ulysses had his sailors tie him to the mast so he wouldn’t do something stupid while under the Sirens’ spell, Night Jerry can tie himself up so that Morning Jerry doesn’t have to deal with his mess. Pack healthy snacks, set multiple alarms, and so on, to stave off Night Jerry’s bad habits. If you’re struggling with motivation, reconnaissance and “premortem” meetings will help you map different pathways to the future and course-correct if things go badly. Start with possibility, and narrow to probability.

Social norms

Comparing our opinions to social norms is a good way to test for bias. This one may be controversial, and for good reason: it provokes us into examining our environments. If everyone around us believes one thing, but the outside majority believes something else, we may be guilty of creating an echo chamber for our own views. But if we jump right onto the latest trends and conclusions without inspecting them, our bandwagoning can cause just as many problems. If you find yourself in either total agreement or total disagreement with public norms and opinions, that’s an opportunity to check your biases.

From Bias-Aware to Debiased

Still wondering about the questions at the beginning of this article? Here are the answers:

  1. 47 days
  2. 5 minutes
  3. $0.05
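
If any of those surprised you, the arithmetic is quick to check. The lily-pad patch doubles every day, so if it covers the whole lake on day 48, it must have covered half the lake one day earlier, on day 47 (not day 24). Each machine makes one widget every 5 minutes, so 100 machines make 100 widgets in the same 5 minutes it takes 5 machines to make 5 (not 100 minutes). And if the ball costs x, the bat costs x + $1.00, so x + (x + $1.00) = $1.10, which gives 2x = $0.10 and a ball price of $0.05 (not the intuitive $0.10).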

If you got any of these wrong, take some time to think about how dysrationalia might rear its head in your business. Think about where and when your biases express themselves and then build in easy checks to make sure they don’t take over your critical thinking.

Ultimately, though, we are most motivated to think more slowly when we see that kind of thinking in others. Think of someone you admire who thinks slowly, who has better questions than answers and is a master of the “lukewarm take”. Commit to imitating their approach to information and decisions, and see how it affects your own biases.
