Blindspots. You know what they are. We’re all taught when we learn to drive a car to look back when changing lanes because there’s a blindspot. The literal definition is “an area where a person’s view is obstructed.” In other words, blindspots are things that are there but we can’t see them. Either they’re outside of our field of vision or they’re overlooked due to inattention. One type of blindspot is cognitive bias, where our mind has been conditioned to see in a certain way based on previous experiences, rather than seeing what’s right in front of us.

“A cognitive bias is a systematic error in thinking that occurs when people are processing and interpreting information in the world around them and affects the decisions and judgments that they make.” ~ (VeryWellMind)

So, cognitive biases are errors in thinking that prevent us from seeing clearly. Part of the Seeing Clearly project in 2021 will focus on practices around noticing your own mind and becoming familiar with your errors of judgment. We all have them. And, we can practice making our blindspots conscious.

A good foundational book that outlines these biases and offers debiasing strategies is The Blindspots Between Us: How to Overcome Unconscious Cognitive Bias and Build Better Relationships by Gleb Tsipursky. According to Tsipursky, unconscious cognitive biases or “blindspots” can wreak havoc on our relationships and overall experience of life. It’s important to become aware of how these biases might be causing problems and to come up with ways to address them. Below you’ll find a summary of the different types of cognitive biases from the book. But first, Tsipursky explains the difference between the autopilot system (our gut reaction) and the intentional system (the mind and reason). I wrote about this in a previous post, The Difference between Perception and Intuition, based on a similar book, Thinking, Fast and Slow, by Daniel Kahneman.

** Books mentioned have Amazon affiliate links, meaning I make a few cents if you purchase through my link. I only recommend books that I’ve read.

Autopilot and Intentional Systems

The autopilot system corresponds to our emotions and intuition. It activates the fight, freeze, or flight response, as well as other instinctive reactions. This system is a form of wisdom in the body; it provides important information and often “feels true.” According to Tsipursky and Kahneman, however, this system shouldn’t be the sole source of our decision-making. It often leads to snap judgments that may be incorrect. Of course, if a car is barrelling towards you, your autopilot system will tell you to get out of the way, and you should do exactly that, quickly. Intuition is a different story; it’s based on pattern recognition from previous experiences and can lead to cognitive biases. The autopilot system comes in handy when doing simple tasks. Tsipursky warns that we can’t always tell lies from truths. He says that “on average we detect only 54 percent of lies; shocking when you consider that we’d get 50 percent if we used random chance.” In an age with so much misinformation and disinformation out there, this is important to keep in mind. Something that “feels true” may not be. This is where the intentional system comes in.

The intentional system centers around the prefrontal cortex and reflects rational thinking. You can use this system as a check on the autopilot system, to spot situations where you might be making a mistake due to cognitive biases and to correct any errors. We can only perceive the intentional parts of ourselves, so we assume that’s all there is, but unconscious biases may be determining our decisions. We can make mistakes by going only with our gut.

Cognitive biases are judgments we make based on misinformation. These are different from logical fallacies, which are errors in reasoning that people make during disagreements, usually with the intention of winning an argument. This is disinformation. One logical fallacy is “cherry-picking,” where someone selects a small sample of evidence that supports their side of an argument while conveniently leaving out what may be a larger body of evidence that opposes their perspective. You see this all over social media right now, where people use cherry-picked articles to counter each other and to promote what they believe to be true. Our cognitive biases make us vulnerable to these logical fallacies. By learning about your own cognitive biases, you can avoid being manipulated.

Cognitive Biases

Attribution Errors: These are also called correspondence biases. They occur when you attribute someone’s behaviour or actions to who they are as a person rather than to the situation at hand. For example, someone cuts you off in traffic, so you conclude they’re rude, rather than considering that they might be rushing to an emergency. The same happens with groups, when the behaviour of one individual or a few individuals is attributed to the group as a whole. On the flip side, ideas about a group can be attributed to individuals within the group, even when these individuals vary widely. What Tsipursky calls the Ultimate Attribution Error is when we misattribute problematic behaviours to groups we don’t like and “good” behaviours to groups we like. Here’s a short video from Khan Academy that describes attribution errors.

Individual Thinking Errors: Your autopilot system believes the world revolves around you, but it actually doesn’t! Your perception of self is a mental construct, a story that your mind tells you to help you function. One thinking error is that we tend to overestimate our positive qualities and discount our negative ones, which can cause harm in our relationships. We also love to compare ourselves to others and compete with those in our social and work circles. This can alienate us from others, yet what makes us truly happy are our social connections and relationships. And finally, we tend to give ourselves more credit for our successes than is deserved, while blaming others for our failures. If this isn’t you, congratulations!

Group Thinking Errors: The way our community or social circle thinks about “others” can very much influence our thinking and cause us to make errors that damage relationships with those not like us. These errors can lead to discrimination around ethnicity, sex, gender, religion, age, politics, disability, and geographic origin. The Horns Effect is when you dislike some aspect of a person (their accent, skin colour, political affiliation, etc.) and evaluate that individual too harshly. This can cause damage in the workplace especially. On the opposite end, the Halo Effect is when you form a strong liking for or positive opinion of someone due to a corresponding tribal affiliation. Who do we make into gurus?

Judgment Errors: We make many judgment errors about people because we can’t read their nonverbal signals or because they’re holding something back. The illusion of transparency is when we overestimate the extent to which others understand our feelings and thoughts; we’re overconfident about how well other people read us. It’s better to overcommunicate, both when exploring others’ perspectives, needs, and wants and when sharing your own. Another interesting judgment error is the “just-world fallacy,” a false expectation that the world is just, with those who do good being rewarded and evildoers punished. The curse of knowledge happens when you know so much about a topic that you find it difficult to explain to someone new to it. And finally, the false consensus effect is when you overestimate the extent to which others agree with you, creating a sense of false alignment. This effect damages society by exacerbating social polarization.

The Impact of Emotions: We tend to perceive ourselves as moved by logic, not emotions. Yet in reality, Tsipursky says, we’re more emotional than logical. The Empathy Gap is when we underestimate the role emotions play in others’ behaviour, as well as our own. When emotion enters a debate, our autopilot system activates and we rely on our gut more than on the more deliberative intentional system. In a cool state, we would never say some of the things we do in emotional or heated states, just as we sometimes say things online that we would never say to a person’s face. Those in positions of privilege can easily dismiss the experiences of those who don’t share their privilege. We could go a long way towards healing social ills by acknowledging experiences that might be very different from our own. The Bystander Effect is about how we behave when we see a stranger in trouble. We are much more willing to help someone in a critical situation if we are the only one available to help; when others are around, we tend to rely on someone else to take the lead. We’re reluctant to behave in a way that’s outside the norm because others might judge us negatively and reject us.

Assessing Risks and Rewards: Optimism Bias occurs when we underestimate the likelihood of negative future events. Tsipursky says that 80 percent of us, across race, ethnicity, and gender lines, have an optimism bias, which can be beneficial: it helps decrease depression and anxiety, improves health and longevity, and increases productivity. But there are dangers too, such as engaging in harmful behaviours, like smoking, overeating, and excessive spending, because we downplay their costs. On the flip side, Pessimism Bias occurs when we overestimate future dangers. Risk-averse people conserve resources and let the optimists take risks. Pessimism tends to be less healthy than optimism, with pessimists suffering more from depression and having worse physical health outcomes. In American society, pessimism is often looked down upon, yet it can also be beneficial, leading to self-reflection and progress. Tsipursky recommends being a realistic optimist or a realistic pessimist.

To Conform or Not: Reactance is the negative emotion we feel when someone or something limits our freedom of behaviour or range of choices. Sound familiar? Questioning mainstream trends can be good in that it leads to innovation, both by improving existing ways of doing things and by adapting to changes in the environment. Authority Bias is the opposite of reactance and is also called “obedience bias.” It is the tendency to give those we perceive as authorities more weight and more obedience than we objectively should. Much of the trouble with authority bias stems from trusting authority figures too much and overestimating how much complying with them benefits you. There has to be a happy medium between the two. Be a reasonable skeptic?

Debiasing Strategies

Just knowing about a cognitive bias is a good start but doesn’t solve the problem. You have to intentionally practice strategies for debiasing.

One of the simplest ways to shift from the autopilot system to the intentional one involves delaying decisions and reactions: taking a mindful pause. When our sympathetic nervous system (the fight, freeze, or flight response) kicks in, it’s time to activate the parasympathetic nervous system, also called the rest-and-digest system. Take three deep in-breaths and out-breaths immediately, and then delay your response.

Another debiasing strategy is to think in terms of the probabilities of what reality looks like and to update your beliefs about the world when new information becomes available. This year, 2020, is a great time to practice, as so much is uncertain and new information arrives daily. Are you updating to the new reality or staying set in your judgments? You can also practice testing your predictions about the future. For example, you predict how someone will respond to a situation based on your previous experiences of them; if your prediction doesn’t come true, maybe you need to update your mental model. We’re all changing all the time, so this is crucial to healthy relationships. Other strategies include considering alternative explanations, identifying problematic habits in yourself, considering other people’s points of view, and getting curious about other perspectives.
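
For readers who like to see the mechanics, Bayes’ rule is the standard formal model of this kind of belief updating. Here’s a minimal worked example with illustrative numbers of my own (Tsipursky doesn’t give this calculation in the book): suppose you’re 80 percent sure a friend will agree to a plan, and then you get a hesitant reply, the kind you’d expect far more often if they were going to decline.

```latex
% Bayes' rule: revise a prior belief P(H) after observing evidence E.
% Illustrative numbers (my own assumptions, not from the book):
%   P(H) = 0.8          -- prior: "my friend will agree"
%   P(E | H) = 0.3      -- chance of a hesitant reply if they will agree
%   P(E | not H) = 0.9  -- chance of a hesitant reply if they won't
\[
P(H \mid E)
  = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
  = \frac{0.3 \times 0.8}{0.3 \times 0.8 + 0.9 \times 0.2}
  = \frac{0.24}{0.42}
  \approx 0.57
\]
```

In plain terms: the new evidence doesn’t flip the belief outright, but an honest update drops your confidence from 80 percent to roughly 57 percent instead of clinging to the original judgment.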

When it comes to communicating with others, especially those who hold irrational beliefs (ones that go against the facts), don’t try to correct their false beliefs directly; this almost never leads to people changing their minds. Instead, Tsipursky suggests reaching them emotionally in a way that gets to the heart of their beliefs. He suggests trying EGRIP, which stands for Emotions (identify emotional blocks), Goals (establish shared goals), Rapport (use empathetic listening and share stories), Information (share facts), and Positive reinforcement (reinforce any movement toward the facts). EGRIP only works when the person holds false beliefs that contradict their own goals. You can use your intentional system to help those you speak to accept reality.

Our autopilot system adapts based on what helped us survive in the past, yet our present situation can require different responses. The ever-intensifying pace of change means our gut reactions will be less and less suited to what we face, so relying on the autopilot system will become less helpful for making decisions. Focus on getting to know your own cognitive biases rather than pointing them out to others. It’s an ongoing practice that’s not easy, and you’ll never have it all down!

Body, mind, and heart must work together to see clearly.

Click on the button below to see what Seeing Clearly 2021 is all about. We start on January 3rd.
