Putting it all together, we are most vulnerable to biases when we jump to conclusions.
In day-to-day life, this shortcut is acceptable if the conclusion is likely to be correct, the cost of an occasional mistake is tolerable, and the jump saves time and effort. You don't question whether to brush your teeth each day, for example.
In contrast, this shortcut in thinking is risky when the stakes are high and there's no time to collect more information, like when serving on a jury, deciding which job applicant to hire, or responding to a weather emergency.
We'll end Part 1 with a collection of biases.
When presented with evidence, especially evidence that confirms your mental model, you do not question what might be missing. System 1 seeks to build the most coherent story it can; it does not stop to examine the quality and quantity of the information.
In an experiment, three groups were given background to a legal case. Then one group was given just the plaintiff’s argument, another the defendant’s argument, and the last both arguments.
Those given only one side gave more skewed judgments and were more confident in them than those given both sides, even though they were fully aware of the setup.
We often fail to account for critical evidence that is missing.
If you think positively about one aspect of something, that positive feeling extends to everything else you think about it.
Say you find someone physically attractive and you like this person for that reason. As a result, you are more likely to find her intelligent or capable, even if you have no evidence of this. Going further, you tend to like intelligent people, and now that you think she's intelligent, you like her even better than before, creating a feedback loop.
In other words, your emotional response fills in the blanks for what’s cognitively missing from your understanding.
The Halo Effect forms a simpler, more coherent story by generalizing one attribute to the entire person. Inconsistencies, such as liking one thing about a person but disliking another, are harder to process. That "Hitler loved dogs and little children" is troubling for many to comprehend.
First impressions matter. They form the “trunk of the tree” to which later impressions are attached like branches. It takes a lot of work to reorder the impressions to form a new trunk.
Consider two people who are described as follows:
Most likely you viewed Amos as the more likable person, even though the five words used to describe each are identical, just ordered differently. The initial traits change how you interpret the traits that appear later.
This explains a number of effects:
The antidote to the ordering effect:
Confirmation bias is the tendency to find and interpret information in a way that confirms your prior beliefs.
This materializes in a few ways:
Exposing someone to an input repeatedly makes them like it more. Having a memory of a word, phrase, or idea makes it easier to perceive the next time you encounter it.
See the discussion in Part 1.2.
This is explained more in Part 2, but it deals with System 1 thinking.
People want to believe a story and will seek cause-and-effect explanations in times of uncertainty. This helps explain the following:
Once a story is established, it becomes difficult to overwrite. (Shortform note: this helps explain why frauds like Theranos and Enron persisted for so long; observers believed the story they wanted to hear.)
Your likes and dislikes determine your beliefs about the world.
For example, say you’re making a decision with two options. If you like one particular option, you’ll believe the benefits are better and the costs/risks more manageable than those of alternatives. The inverse is true of options you dislike.
Interestingly, if you get a new piece of information about an option’s benefits, you will also decrease your assessment of the risks, even though you haven’t gotten any new information about the risks. You just feel better about the option, which makes you downplay the risks.
We’re more vulnerable to biases when System 2 is taxed.
To explain this, psychologist Daniel Gilbert proposed a model of how we come to believe ideas: to understand a statement, System 1 must first automatically believe it; System 2 is then responsible for "unbelieving" it if it turns out to be false.
When System 2 is taxed, it does not scrutinize System 1's initial belief as closely. Thus, we're more likely to accept whatever System 1 believes.
Experiments show that when System 2 is taxed (for example, when forced to hold digits in memory), you become more likely to believe false sentences. You'll believe almost anything.
This might explain why infomercials are effective late at night. It may also explain why societies in turmoil might apply less logical thinking to persuasive arguments, such as Germany during Hitler’s rise.