Well-being: Fuzzy thinking
Know your blind spots… and use them to your advantage
I recently listened to a talk that the late author William S. Burroughs delivered to creative writing students at Naropa University in 1979. He touches on many themes during this lecture, but the one that struck me most was the idea that we're only equipped to learn about ideas that resonate with our current world view. "You can't tell anybody anything they don't already know," he says.
At first I put this curious philosophy down to one of Burroughs' metaphysical flourishes. Then I remembered a phenomenon that's known as Confirmation Bias. This well-documented reasoning flaw shows that we often seek information that supports our world view, or else we interpret it in such a way that it confirms our preconceived ideas.
According to researchers, confirmation bias is particularly prevalent in those with rigid beliefs: conspiracy theorists, religious fundamentalists, CrossFit enthusiasts... It's important to bear this in mind if you're engaged in a debate with a person of this ilk. Are they actually hearing what you're saying, or are they unconsciously filtering out anything that doesn't resonate with their viewpoint?
Confirmation bias is one of hundreds of cognitive biases that impede truly objective analysis. It's very hard to overcome them (researchers have discovered that we're even biased about how biased we actually are). However, we can become more mindful of them, and maybe even use them to our advantage.
Many cognitive biases show that we often judge others unfairly. Out-group Homogeneity Bias is the perception that those outside our group or tribe are more similar to each other than they really are. In other words, if you've ever looked at a group of friends and sneered 'hipsters', you might be surprised to discover that this group has made similar snap judgements about you and your group. This is related to Essentialism, a cognitive bias that emerges in early childhood and causes us to categorise people despite their individual variations.
We have an equally simplistic view of other people's shortcomings. The Actor/Observer Difference is a cognitive bias that makes us attribute our own actions to situational factors and other people's actions to dispositional factors. In other words, if you make a mistake, it's because you're overworked. If somebody else makes a mistake, it's because they're incompetent.
Some cognitive biases are manipulated by those in the know. Take Availability Cascade, which is the tendency for information to become more plausible the more it is discussed in the public domain. This particular cognitive bias is the bedrock of public relations. I read it in a magazine.
Elsewhere, the Availability Heuristic is a phenomenon that makes us rely on readily available examples when making a decision. Those in the legal profession manipulate this bias by using vivid, evocative language when mounting their case. Meanwhile, politicians and lobbyists take advantage of what is known as the Identifiable Victim Effect, which is our tendency to empathise with individuals rather than larger groups.
Other cognitive biases aren't capitalised on as much as they ought to be. The Peak-End Rule is one such example. This cognitive bias shows that we judge an experience almost entirely on how it was at its peak and its end. Those in the service industry could use this to their advantage by putting more weight on the end experience: a late check-out or a complimentary shot of limoncello goes a long way.
We can also learn to manipulate Frequency Illusion, aka the Baader-Meinhof phenomenon. This is the tendency for us to notice newly discovered information with increased frequency - car brands, billboards, etc. Frequency illusion occurs because the brain is a pattern recognition machine. Hence, if we want to change our patterns, we must first expose ourselves to new stimuli. (This is my rationale for trying on jewellery that I can't afford.)
Cognitive biases can also help us understand our inherent self-centredness. The Lake Wobegon Effect is our tendency to overestimate our achievements; the False Consensus Effect is our tendency to overestimate the degree to which people agree with our opinions; the Curse of Knowledge is the inability of more knowledgeable parties to think about problems from the perspective of less knowledgeable parties. Mother Teresa said "humility is the mother of all virtues". When you consider the evidence, she was probably right.
Cognitive biases colour positive/negative perspectives too. The Negativity Bias shows that we have a greater recall of unpleasant memories than of positive ones. The Impact Bias shows that we overestimate the duration and intensity of our emotional reactions to future events. In short: it wasn't as bad as you remember, and it won't be as bad as you think.
It's also important to take note of the Planning Fallacy (our tendency to underestimate the time required to finish a task); Unit Bias (the compulsion to finish an item or task, whether it's the food on your plate or the items on your to-do list); and the Denomination Effect (our tendency to spend money more readily when it comes in smaller denominations, such as coins, rather than larger ones, such as notes).
Our perspectives will always be skewed by the filters of cognitive biases, but we can learn to adapt accordingly.