In our last post, we hoped to get you thinking about your biases. Given the ways our System I and System II thinking tend to work, it’s entirely possible that you are less aware of your System I biases precisely because your System II thinking is so sophisticated. If you’re a leader, we certainly hope this is the case! But that seems to place us in an impossible situation: if our instinctive biases become more entrenched as our back-end critical thinking gets better and better, how are we ever supposed to shed the problem of bias?
Harnessing Habits
Paul Gibbons addresses the question of debiasing in his book Impact. There, he recommends turning to behavioral science for some answers.
Insights from behavioral science also undergird our executive coaching process at Leadership Reality. Leaders are sometimes puzzled when we start by addressing their daily habits and dividing their personal lives into 12 different dimensions. But our focus on personal behavior comes from the same place as Gibbons’s: intercepting our basic habits focuses our attention more effectively than anything else. And any attempt to debias our System I thinking requires that kind of intense attention.
It helps to think of System I thinking, our gut reactions, as a set of habits. Our instincts respond to triggers in our environment and set off patterns of thought that have worked for us in the past. When we decide on a course of action, we trigger the reward systems in our brain because we have been fast, decisive, and efficient.
Of course, we all know that bad habits are as rewarding as good ones, if not more so. Bad habits wouldn’t be hard to break if they weren’t so darn pleasant. The same goes for our habits of System I thinking. And, as with bad habits, the only method for retraining our instincts involves interrupting the cycle of Trigger, Routine, and Reward.
Bring on the De-Biasing
Effective habit change usually occurs between the Trigger and Routine stages; the Reward comes from the satisfaction of doing things differently. When dealing with System I thinking, you can post a watchman at the threshold between your Triggers and Routines by asking pointed questions and responding to your instincts with different practices.
Debiasing Your Individual Instincts
In Impact, Gibbons recommends three major ways you can stand in the way of your own System I thinking and keep it from running away with you:
- Deliberate practice. Our worst biases come out in our assessments of our own work. The paradox of improvement involves wanting to get better but not knowing how. Add in an all-too-common resistance to feedback and criticism, and the quality of our work plateaus. Deliberate practice hijacks our optimism and perfectionism biases by making feedback an inarguable part of the creative process. When you seek consistent feedback after every project and commit to recording what you learn, you start to wrangle your gut reactions to feedback.
- Emotional management. An intense response to critical feedback is usually a symptom of another System I bias: we trust too readily in our emotions. Mindfulness practices can put distance between our thoughts and our feelings, lessening the impact intense emotions have on our reactions and our decision-making.
- Cultivating mental habits. Some mental habits are smaller and more practical than the larger tasks of cultivating mindfulness and deliberate practice. These are habits you can adopt right now, slowing your System I thinking and giving you a more objective relationship to your work. A prioritized list of your 3 Most Important Tasks can help you decide what really needs your attention. Implementing a Distraction Page can help you maintain focus on what you’re doing, arresting your bias for urgency when that latest email pings your inbox. Scheduling reflective time to brainstorm and get yourself centered helps defend against a bias for action that might keep you busy but leave you unproductive.
Debiasing Team Instincts
Teams come with inherent benefits, and their defense against biases is a big one. Teams make fewer intuitive decisions because there are more brains at the table. And because team members need to communicate their rationales to one another, they’re less likely to leave anyone out of the decision-making process. But when teams get comfortable with one another, they can create even more entrenched System I problems. These might include biases toward optimism, action, deadlines, conformity, or fear.
Teams can check their collective System I thinking by electing a “bias-checker” for their meetings. The whole point of a bias-checker is to speak up when they see System I thinking guiding the team too much. A dedicated Devil’s Advocate can also stop groupthink in its tracks. Team leaders can guard against the “anchoring effect” by expanding options for feedback and question-asking; without such measures, some team members are likely to silence themselves and defer to the boss. In all these situations, teams need to remain aware of how much their mood affects the work they do and the biases they struggle with from day to day.
Bias “Killer” Questions
There’s probably no better way to start killing your individual and team System I biases than simply asking more questions. Provided on p. 174 of Gibbons’s Impact, the following questions can help you halt common biases. Consider building these “bias killers” into your team meetings:
1. Overconfidence Bias
This bias gives a person more subjective confidence in his or her judgments than their objective accuracy warrants. Guard your team against Overconfidence by asking, How confident are we in this decision? Are we at least 90% sure? If so, should we double-check and see if we’ve missed anything?
2. The Deterministic Fallacy
This fallacy leads us to assume inevitable connections between causes and effects. For example, an estimating team might draft an estimate that checks all the boxes, and then get completely blindsided when the project budget exceeds it. They assumed the budget would follow the estimate without variance. Under-budgeting is quite common, and there’s no reason for it to catch you completely off guard. This team could have guarded against their bias by asking, This project estimate meets our criteria, but how likely is it to be wrong by 25%? 50%? 100%? How often are we over- or under-budget?
3. The Halo Effect
This is the tendency for good or bad impressions in one area to replicate themselves in unrelated areas. That can be a big problem when, for example, one company is looking to acquire another. A board in such a situation might ask, This company looks perfect. Are we allowing ourselves to be swayed by appearances and being insufficiently diligent in our research?
4. The Ludic Fallacy
We overestimate the ability of games to model real-life situations. Most commonly, this means that we expect consequences in proportion to our actions. Life doesn’t work this way, however, which is why Nassim Taleb coined the term Extremistan to describe a domain in which a single mistake can have disastrous consequences for a long, long time. Always be asking yourself whether you’re truly aware of the implications of your mistakes.
5. The Framing Effect
This effect leads people to decide between options based on whether those options come with positive or negative connotations, e.g., as losses or as gains. Guard against the Framing Effect by asking yourself, Have we narrowed our options prematurely? Should we be asking ourselves bigger questions? Should we be measuring success in something other than losses or gains?
6. Ostrich Biases
These describe attempts made by leaders and stakeholders to avoid negative information by metaphorically sticking their heads in the sand. Avoid being an ostrich by asking yourselves, Have we been sufficiently diligent in learning from what we find embarrassing?
7. The Zero-Risk Bias
This is the tendency to prefer the complete elimination of a risk in one area to an overall reduction in risk for an entire project. If you’re spending too much time de-risking one small part of a project, ask your team, I know the possibility is remote, but are we saying the risk is nil or merely negligible? If those minor risks have great consequences, how can we protect against them?
Test Yourself: Diversity and Inclusion
At present, there is perhaps no better or more urgent example of the need for debiasing than Diversity & Inclusion initiatives. Here, the rubber of debiasing meets the road of a healthy society. Beyond conscious prejudices, System I gut reactions can affect the livelihoods of other human beings. Diversity & Inclusion is a way of bringing our biases to the forefront in a critical way. Ignoring D&I initiatives is tantamount to admitting that we are likely ignoring our other biases as well, at the expense of strong operations.
To not take D&I seriously is to not take any of our biases seriously. D&I is a testing ground for resisting some of our strongest biases, including:
- Affinity bias, or the desire to recruit only people who already fit our cultures.
- Confirmation bias, or the tendency for us to draw conclusions about others based on pre-existing beliefs.
- Attribution bias, where we allow our personal experiences to color all our interactions with people from different cultures.
… And many, many more.
Organizations that treat D&I as an opportunity to debias their System I thinking, using the strategies presented here, set themselves up for success. By testing your debiasing in this important area, where your efforts have a real, positive impact on real people, you move your company culture forward. Leadership Reality even offers proprietary tools to help you get started.
Debiasing takes a lot of time, effort, and humility. Start building your momentum in those places where it matters most!