Narrow-Minded: What bias tells us about how we think

Brains can do the most amazing things. They help us think about ideas, have interesting conversations, learn new dance moves, stay alive, and do all those other everyday activities we generally take for granted. For the most part, brains are really good at what they do. Of course, it would be impossible for them to do everything well, which might explain why we have so many cognitive biases.

Since Daniel Kahneman and Amos Tversky published their 1972 study “Subjective Probability: A Judgment of Representativeness,” about 200 types of cognitive bias have been identified. There’s ingroup bias, the tendency to “give preferential treatment to others [we] perceive to be members of [our] own groups.” Then there’s consistency bias, the tendency to “[remember] one’s past attitudes and behaviors as resembling present attitudes and behaviors.” There’s also the framing bias, the tendency to “[draw] different conclusions from the same information, depending on how that information is presented.” The list goes on. But where are all these biases coming from? And are they really all that different?

It would be useful if we could travel back in time and study exactly how cognitive biases came into existence. Even though we can’t literally time travel, evolutionary psychologists recommend thinking about cognitive biases in terms of the problems they evolved to address. So instead of asking, “Why isn’t the brain better at doing what we think it’s supposed to do?” we should be asking, “What evolutionary pressures might have selected for cognitive biases?” Fortunately, someone ended up having a similar idea.

In his Cognitive Bias Cheat Sheet, Buster Benson identifies four categories of problems he believes our biases evolved to help us navigate:
Cognitive bias codex

TOO MUCH INFORMATION

In order to avoid drowning in information overload, our brains need to skim and filter insane amounts of information and quickly, almost effortlessly, decide which few things … are actually important and call those out …

[Downside:] We don’t see everything. Some of the information we filter out is actually useful and important.

NOT ENOUGH MEANING

In order to construct meaning out of the bits and pieces of information that come to our attention, we need to fill in the gaps, and map it all to our existing mental models …

[Downside:] Our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions, and construct meaning and stories that aren’t really there.

NEED TO ACT FAST

In order to act fast, our brains need to make split-second decisions that could impact our chances for survival, security, or success, and feel confident that we can make things happen …

[Downside:] Quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, and counter-productive.

WHAT SHOULD WE REMEMBER?

[I]n order to keep doing all of this as efficiently as possible, our brains need to remember the most important and useful bits of new information and inform the other systems so they can adapt and improve over time …

[Downside:] Our memory reinforces errors. Some of the stuff we remember for later just makes all of the above systems more biased[.]

These problems may sound familiar. If so, it might be because Benson is essentially describing how our cognitive biases map onto the Ladder of Inference. Developed by Harvard Business School professor Chris Argyris, the Ladder of Inference illustrates thinking as a step-by-step process from data, meanings, and assumptions to conclusions, beliefs, and actions. When we look at these two frameworks together, each of Benson’s four sets of cognitive challenges corresponds to a rung on the ladder, marking where something might go wrong in our thought processes.

Adapting to the world moment by moment requires our brains to continuously make inferences about what happened in the past, what’s happening now, and what will happen in the future. All of those inferences are subject to the problems of dealing with “too much information,” “not enough meaning,” urgency, and importance. If we’re constantly using our biases to make sense of the world, maybe it shouldn’t be surprising that we have so many of them.

REFERENCE
Cognitive Bias Cheat Sheet | Buster Benson • Better Humans

RELATED
Subjective Probability: A Judgment of Representativeness | Daniel Kahneman • Amos Tversky
List of Cognitive Biases | Wikipedia
The Evolution of Cognitive Bias | Martie G. Haselton • Daniel Nettle • Damian R. Murray
The Fifth Discipline Fieldbook | Peter M. Senge • Art Kleiner • Charlotte Roberts • Bryan J. Smith
