Introduction
In 2011, Daniel Kahneman, a psychologist and Nobel laureate in economic sciences, published a book titled "Thinking, Fast and Slow". Without going too much into the details of the book, he theorized the brain as working with two distinct modes of thought. The first is what you would naturally associate with instinct: fast, effortless and unconscious, relying on heuristics and stereotypes to keep up the pace. The second system is the exact opposite. Linked to the more reflective part of our brain, it allows you to make decisions and reason more consciously and logically, but it is a much slower process and demands considerable effort. It is when using the first system, precisely because of its pace, that you are most likely to make decisions heavily influenced by cognitive biases. A bias is a logical fallacy your brain commits unconsciously, making you perceive reality in a distorted way. Biases mislead us in a predictable, repeatable manner: they are systematic.
By now, you might be telling yourself that the first system of thought is to be avoided where possible and that the second is far better suited to making executive decisions within your team or organisation. This is where you would be mistaken. As a leader or manager, you will often find yourself in positions where a decision must be taken quickly. The second system is useful, absolutely, but you cannot consciously think your way through every step before taking every decision; that would condemn you to considerable inertia. The first system is thus not wrong per se, but how can we use it to the full extent of its capacities while avoiding some of its biases? This month, we dive into this topic by exploring a few biases we are all subject to and how we can best prevent them from distorting our decision-making.
Still unconvinced that biases exist? A quick test: think of a tool from your toolbox. What is the first one that comes to mind? Now think of its colour. Once you have visualised the tool and its colour, go to the footnote at the end of this entry[i].
Biases
Since our focus here is on managers and leaders, we have chosen to discuss a few biases that are specific to the decision-making process within a team or an organisation.
A. Survivorship bias:
The first bias we would like to reflect upon is the survivorship bias. In a nutshell, this bias makes us base decisions on a sample that has already passed a selection filter. Because we only see the successful cases, our sense of what actually works is skewed.
The bias takes its name from an episode said to have taken place during WWII. The story goes that American aircraft were being shot down by the enemy, and the Allies, quite logically, wanted to reinforce their planes where they were most vulnerable. They asked a mathematician, the statistician Abraham Wald, to inspect the returning aircraft, see where they had been hit by bullets, and propose ways to strengthen those parts. Wald refused: reinforcing the damaged areas made no sense. If anything, he argued, the bullet holes on returning planes marked the places where an aircraft could be hit and still make it home; the planes that never returned had been hit elsewhere, so those were the areas to reinforce. On a perhaps more relatable note, this bias is extremely present in the private sector. We tend to steer a company based on what has worked splendidly in billion-dollar companies, overlooking the fact that hundreds of other companies tried the same thing and were nowhere near as successful, because many more (exogenous) factors are at play in explaining the success of a small start-up now selling spacecraft to NASA.
B. Experience bias:
Similarly, the experience bias makes us take decisions based on what we know and on what we think is comparable to the context we find ourselves in. An example we found quite enlightening is the different reactions governments had to the COVID-19 pandemic. Asia and Europe, for instance, took highly different approaches: Asia had already faced major epidemics of SARS-type viruses, whereas Europe's experience was mostly with influenza, so the two regions initially chose different paths.
On a more managerial level, this bias can manifest itself as putting your (or a senior colleague's) past experiences on a pedestal, without seeing how things could be done differently, and perhaps more efficiently, just because it is not "the way we've always done this". Again, experience bias is not bad in itself; it simply has to be acknowledged. Making decisions based solely on paths already taken considerably limits your field of action and distorts the reality you are operating in.
C. Confirmation bias:
Maybe the best known of all biases, and yet the hardest to prevent. Confirmation bias refers to our brain's tendency to focus on elements that confirm our initial opinion on a matter and to dismiss information that would prove us wrong.
In a managerial setting, this can manifest itself during a job interview. Studies suggest that the first two minutes are the crucial window in which we unconsciously make up our minds about a candidate. Afterwards, depending on the judgement we have formed, the questions we ask are oriented towards confirming that original impression. For example, if you are not interested in the profile, you will tend to ask the candidate slightly harder or trickier questions. This matters all the more because those first two minutes are themselves heavily influenced by whatever stereotypes we hold, which can leave you with a less inclusive, diverse and committed team later on.
D. Biases in probability assessment (see also: overconfidence and underconfidence)
These biases stem mainly from the fact that our brain hates statistics. That is just it: it cannot quickly and correctly grasp what a statistic means, and it will misuse it during our decision-making process.
For instance, if you were told that the likelihood of a plane crash (planes again, we know) had increased in the past year from 0.01% to 0.02%, you certainly would not think much of it.
Let's do the same exercise with different numbers: the likelihood of a plane crashing has increased from 40% to 80%. Granted, a 40% probability of crashing is already gigantic and most people would not risk it; with the probability doubled to 80%, you would be seriously worried about boarding. Now look again at the numbers in the first example. The relative increase there is just as dramatic: the risk has doubled, a 100% increase. The absolute risk is still tiny, yet from the first example to the second, our brain perceived and interpreted the change in a completely different way.
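To make the arithmetic explicit, here is a minimal sketch (in Python, using the illustrative numbers above) of how the same relative increase, a doubling in both cases, reads very differently depending on the absolute baseline:

```python
# A minimal sketch: the same relative increase (a doubling) can feel
# negligible or alarming depending on the absolute baseline.

def describe_change(old: float, new: float) -> str:
    """Report a probability change in absolute and relative terms."""
    absolute = new - old                 # change in percentage points
    relative = (new - old) / old * 100   # relative change, in percent
    return (f"{old}% -> {new}%: +{absolute:.2f} percentage points, "
            f"+{relative:.0f}% relative increase")

print(describe_change(0.01, 0.02))  # +0.01 points, +100% relative
print(describe_change(40.0, 80.0))  # +40.00 points, +100% relative
```

Both changes come out as a +100% relative increase; only the absolute figures differ, and the absolute baseline is precisely the part our fast system struggles to weigh.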
Similarly, since you started reading this entry, you might have been telling yourself: "this is an interesting read, but it doesn't really apply to my case, because I already knew about this kind of stuff. Besides, I always reflect on my decisions and/or I'm a pragmatist at heart, so this couldn't happen to me." Well, chances are that most other people reading these lines have thought exactly the same, even though it is statistically impossible for a majority of people to be better than the majority at processing information: by definition, at most half of a population can sit above its median.
This is due both to a denial bias and to an overconfidence bias that we, as human beings, naturally tend to have. Try the test at home: ask your relatives whether they think they drive better than most people, about as well as anyone else, or worse than most people. You would be surprised how many unsuspected prodigious drivers live amongst us.
Finally, many other biases could be cited; commonly used catalogues list more than 180 entries. We have selected a few of the most recurring ones when leading a team or a project, but we also advise you to read up on the anchoring bias, the framing bias, the Dunning-Kruger effect and many others. The space we have in this entry is limited, but training courses on detecting biases are in the making.
How can we prevent our brains from being tricked?
So, discussing biases is interesting, but once we have talked about them, what exactly can we do to reduce their influence on our decision-making process? We would like to think that merely being aware of their existence would help us get rid of them. While awareness is certainly a good first step, it cannot stop there, because by now you know that your brain likes to trick both you and itself. For instance, look at the following optical illusion:
Figure 1: Checker Shadow Illusion
Original by Edward H. Adelson; file created by Adrian Pingstone. Copyrighted free use. https://commons.wikimedia.org/w/index.php?curid=45737683
In this optical illusion, squares A and B are the same colour. You can check it by connecting the two squares, like this:
Figure 2: Checker Shadow Illusion (squares connected)
Original by Edward H. Adelson; file created by Adrian Pingstone. Copyrighted free use. https://commons.wikimedia.org/w/index.php?curid=45737683
Now, if you look back at the first figure, you will see that your brain still perceives two different colours. It knows it is under the influence of an optical illusion, but it is unable to learn from that knowledge and see the same colours. The exact same thing happens with cognitive biases: knowing about them does not exempt you from them. So, what can you do? Developing strategy tables with explicit criteria, or behaviours to adopt should situation X happen, is a good way to make sure your first system of thought will not run amok when it has to take a decision in an emergency. We would also advise you to challenge yourself and your team, whilst paying attention to the "groupthink" bias, and to allow room for doubt: why does this person hold this opinion, and why do we disagree?
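As an illustration, here is a minimal sketch of such a strategy table, rendered as a weighted decision matrix in Python. The criteria, weights and options are hypothetical placeholders; the point is that committing to explicit criteria and weights before scoring the options forces the slow, deliberate system to do the work:

```python
# A minimal sketch of a strategy table as a weighted decision matrix.
# The criteria, weights and options below are hypothetical placeholders.

CRITERIA = {           # criterion -> weight, agreed on BEFORE scoring
    "cost": 0.3,
    "risk": 0.3,
    "team_capacity": 0.2,
    "strategic_fit": 0.2,
}

# Each option is scored from 1 to 5 per criterion, ideally by several
# people independently, to limit groupthink.
OPTIONS = {
    "option_a": {"cost": 4, "risk": 2, "team_capacity": 3, "strategic_fit": 5},
    "option_b": {"cost": 2, "risk": 4, "team_capacity": 4, "strategic_fit": 3},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores using the pre-committed weights."""
    return sum(CRITERIA[c] * s for c, s in scores.items())

# Rank the options by their explicit, pre-agreed criteria.
for name, scores in sorted(OPTIONS.items(),
                           key=lambda kv: weighted_score(kv[1]),
                           reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The design choice that matters here is the ordering: criteria and weights are fixed before any option is examined, so the fast system's first impressions cannot quietly rewrite the rules of the comparison.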
In sum, finding a balance between method (strategic grids with explicit, thought-out criteria, protocols, norms, etc.) and team spirit (trust, contradiction and challenge) is an excellent second step towards reducing the biases in your decision-making process. Interested in learning more about this topic? We are working hard at EIPA to provide you with a tailored programme to prevent biases and master behavioural techniques that will guide you in making optimal decisions for you and your team. Please feel free to reach out.
See you next month!
[i] Most people think of a blue hammer. Did you?
Do you have questions? Do not hesitate to contact the Negotiation Team at negotiationteam@eipa.eu
The views expressed in this blog are those of the authors and not necessarily those of EIPA.