
We are not always right, and we all know that. Yet sometimes, even when our reactions seem excessive or irrational, the brain's shortcuts behind them are doing exactly what they evolved to do: keep us safe.
Late one evening, a young woman waited for the elevator in her apartment building. She was tired after a long day in the office. When the elevator doors opened, she saw a man inside whom she did not recognize. Nothing seemed overtly threatening, but something in her body tensed. Suddenly, all her senses were alert. His expression, clothes, posture, and the way he stood a little too close to the buttons, while refusing to make eye contact, triggered a quiet alarm in her mind. She hesitated, smiled politely, and decided to wait for the next elevator. Moments later, she heard shouting from a neighbour who had entered after her: the man had tried to force his way onto her floor. Her decision, guided not by logic but by instinct, might have saved her life. Was that bias, and if so, was it bad?
What Is a Bias, Really?
When most people hear “bias,” they think of mistakes or unfair judgments. In psychology, however, bias is more accurately understood as a cognitive shortcut, a tool our brain uses to make decisions faster and conserve mental energy.
These shortcuts are part of what Nobel laureate Daniel Kahneman calls System 1 thinking. System 1 is fast, automatic, and intuitive. It helps us respond quickly to everyday situations: crossing the street when we see a car, recognizing a friend in a crowd, or choosing a meal without overthinking. System 2, by contrast, is slow, deliberate, and analytical, reserved for complex reasoning.
As Kahneman explains in Thinking, Fast and Slow (2011), “If your predictions are completely free of bias, you will never experience the satisfaction of correctly anticipating an extreme case. You will never be able to say, ‘I knew it!’”
He illustrates this with bankers evaluating potential investments. While careful analysis (System 2) might focus on past failures or market trends, intuitive judgments (System 1) can sometimes allow them to spot opportunities others overlook. An investor’s quick, heuristic-based assessment can lead to profitable investments, or, conversely, prevent huge losses, not despite bias, but because the brain’s shortcuts highlighted possibilities that slow, analytical reasoning might miss.
The Bias Blind Spot: Why We Think We Are Different
We all like to believe we are rational and fair-minded, but our brains have a trick. Psychologists call it the bias blind spot: the tendency to see biases in others but fail to notice them in ourselves.
Imagine a group of bankers and investors evaluating a new investment. They might criticize competitors for ignoring risks or making emotionally driven decisions, yet each one may be unaware of their own cognitive shortcuts: favouring familiar industries, overestimating past successes, or underestimating risks. This is the bias blind spot in action: we notice errors in others but assume our own reasoning is objective.
This happens because our intuitive, fast-thinking System 1 works behind the scenes. We see patterns, shortcuts, and errors in other people, but we assume our own reasoning comes from careful, rational thought. As a result, we underestimate our own cognitive shortcuts while overestimating those of others.
Recognizing the bias blind spot is the first step toward more reflective thinking. When we admit, “I might be biased too,” we open the door to better decisions, fairer judgments, and a more realistic understanding of ourselves and others.
Rethinking Bias
Bias often gets a bad reputation, as if it were something that needs to be eradicated entirely. Yet our cognitive shortcuts are essential tools. They allow us to navigate daily life, make quick decisions, and even spot opportunities that careful analysis alone might miss. The key is not to eliminate bias, which would be an impossible task, but to understand, refine, and manage it.
By becoming aware of our biases, we gain the power to let them work for us rather than against us. Intuitive judgments can guide us toward a smart investment or a quick safety decision, but unchecked, the same instincts can lead to errors or unfair judgments. Recognizing these patterns helps us engage System 2 thinking when stakes are high, and trust System 1 when speed matters.
Bias is not a flaw; it is a feature of human cognition. Our challenge is to teach these shortcuts when to step aside and when to take the lead.
And next time your instinct tells you something is off, remember: your biases are not always wrong. Sometimes, they might just be saving your life.
Reference
Kahneman, D. (2015). Thinking, Fast and Slow. Bucharest: Publica.
Further Reading
“The heuristics and biases of top managers: Past, present and future.” Journal of Management Studies, 2023. DOI: 10.1111/joms.12937
Gigerenzer, G., & Todd, P. M. (1999). Simple Heuristics That Make Us Smart. Oxford University Press.
Klein, G. (1999). Sources of Power: How People Make Decisions. MIT Press.