One Bias to Rule Them All
The core of our cognitive biases and the roots of our brains' tomfoolery
Quick note: you can still help me fine-tune the newsletter by taking this short survey (7 questions, a few minutes at most).
You, me, everybody
If you’ve read some of my other posts, you know I firmly believe everyone is biased1. I’ve written about availability bias, anti-introversion bias (or the extrovert ideal), halo effects, signal detection theory and partisan bias, face space theory and our love for the average, and algorithmic bias, to name a few.
The collection of cognitive biases, however, is much larger than that handful. Here’s a list. We are not nearly as rational as we think we are and what our brains tell us is often a skewed perception of objective reality.
In 1973, evolutionary biologist Theodosius Dobzhansky wrote an essay titled ‘Nothing in Biology Makes Sense Except in the Light of Evolution’. Since then, the title has evolved into one of the favorite sayings of biologists across the globe. And it’s evolution that will help us make sense of our biases. After all, we’ve known for almost a century that even pigeons develop superstitions…
But that sounds strange, doesn’t it? Why would evolution lead to a biased brain? Wouldn’t it be more useful to have gray matter that accurately tracks reality?
Natural selection, however, favors accuracy only as much as it helps us survive and reproduce. And for that, delusions might be helpful.
Boink in the bush
Our brains like heuristics: mental shortcuts that work well enough in most conditions, even if they're not fully accurate. To use a cliché example: when a bush rustled in the ancient days, you could stick around to observe and assess, or you could assume it hid an animal that might eat you, and run away.
Or, to take an example more relevant to today's world: if you saw a stranger in those same ancient days, chances were they would attack you or infect you with a bug you hadn't encountered before. Therefore, stranger = treat with suspicion. Sounds familiar, doesn't it? Those ancient heuristics still rule us. Even if the advantage they offered our distant ancestors was very small, tiny advantages add up over evolutionary time, until the skewed perceptions behind them become entrenched mental shortcuts.
A recent study tallies several common cognitive biases and asks whether a single core mechanism underlies them all. The researchers, psychologists Aileen Oeberst and Roland Imhoff, start by proposing six fundamental beliefs that (almost?) everybody holds:
My experience is a reasonable reference.
I make correct assessments of the world.
I am good.
My group is a reasonable reference.
My group members are good.
People’s attributes (not context) shape outcomes.
Our brains, Oeberst and Imhoff claim, can’t help but form those beliefs. Or:
…they are an indispensable part of human cognition.
Why? I think it's because they've proven to be heuristics that improve (or, once upon a time, improved) the odds of survival and reproduction.
The core bias
The funny thing is, for those beliefs to work, they don’t need to be true.
How do those fundamental beliefs then lead to a bunch of biases? That’s the interesting part. Oeberst and Imhoff suggest that, deep down, all biases are confirmation biases. Our brains selectively process the information they receive to confirm one or several of our core beliefs.
Put differently:
Cognitive bias = prior belief + belief-consistent information processing2
For example: My experience is a reasonable reference + feels like everyone is looking at me = spotlight effect.
Another one: I am good + looks like I did a great job = Dunning-Kruger effect.
One more to drive home the point: I make correct assessments of the world + the media is biased = hostile media effect.
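For the coders among us: here's a toy simulation of what that formula does to a belief over time. It's my own illustration, not from Oeberst and Imhoff's paper, and the update rules and `discount` factor are assumptions made for the sketch. Both agents see a perfectly balanced stream of evidence for and against a belief; the honest Bayesian ends where it started, while the agent that discounts counter-evidence drifts toward certainty.

```python
def bayes(prior, supports, like_true=0.7, like_false=0.3):
    """Textbook Bayesian update: the same piece of evidence moves any
    prior by the same likelihood ratio, whatever we already believe."""
    if not supports:  # disconfirming evidence: swap the likelihoods
        like_true, like_false = like_false, like_true
    p = prior * like_true
    return p / (p + (1 - prior) * like_false)

def belief_consistent(prior, supports, discount=0.25):
    """Sketch of belief-consistent processing: evidence that contradicts
    the prior is only partially taken in (weighted by `discount`),
    so on balance the belief keeps getting 'confirmed'."""
    if supports:
        return bayes(prior, supports=True)
    return discount * bayes(prior, supports=False) + (1 - discount) * prior

# Feed both agents a perfectly balanced evidence stream.
evidence = [True, False] * 50
p_bayes = p_biased = 0.6  # both start out mildly believing "I am good"
for e in evidence:
    p_bayes = bayes(p_bayes, e)
    p_biased = belief_consistent(p_biased, e)

print(f"Bayesian agent: {p_bayes:.2f}")   # stays at its prior, 0.60
print(f"Biased agent:   {p_biased:.2f}")  # near 1.00: all but certain
```

The point of the sketch: nothing in the evidence favors the belief. The drift toward certainty comes entirely from *how* the information is processed.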
Take several of our cognitive biases, smash 'em together, and you end up with various -isms and -phobias. But if this core belief + selective processing framework reflects something real about our cognitive mechanisms, it also gives us a short checklist for addressing our biases:
Does my conclusion/feeling confirm one of my core beliefs (which might require some uncomfortable psychological digging)?
If so, did I accurately assess the information that led to my conclusion? Did I miss something or did I attach too much importance to some bits of info? Do I need more information from different sources?
So… what did I miss?
Everyone else, of course. Hello, confirmation bias.
For the biology/psychology aficionados: this is an interesting difference from the Bayesian approach, which states that we update our prior beliefs with new information. In the view we’re talking about here, how we process that new information is already affected by those beliefs. In other words, Bayesian approaches might overestimate the objectivity with which we process information.