Part 1-5: Biases of System 1

Putting it all together, we are most vulnerable to biases when:

  • System 1 forms a narrative that conveniently connects the dots and doesn’t express surprise.
  • Because of the cognitive ease produced by System 1, System 2 is not invoked to question the data. It merely accepts System 1’s conclusions.

In day-to-day life, this is acceptable if the conclusions are likely to be correct, the costs of a mistake are acceptable, and if the jump saves time and effort. You don’t question whether to brush your teeth each day, for example.

In contrast, this shortcut in thinking is risky when the stakes are high and there’s no time to collect more information, like when serving on a jury, deciding which job applicant to hire, or deciding how to act in a weather emergency.

We’ll end Part 1 with a collection of biases.

What You See Is All There Is: WYSIATI

When presented with evidence, especially evidence that confirms your mental model, you do not question what evidence might be missing. System 1 seeks to build the most coherent story it can - it does not stop to examine the quality or the quantity of the information.

In an experiment, three groups were given background to a legal case. Then one group was given just the plaintiff’s argument, another the defendant’s argument, and the last both arguments.

Those given only one side gave more skewed judgments and were more confident in their judgments than those given both sides, even though all participants were fully aware of the setup.

We often fail to account for critical evidence that is missing.

Halo Effect

If you think positively about one aspect of something, that positive feeling extends to everything else you associate with it.

Say you find someone visually attractive and you like this person for that reason. As a result, you are more likely to find her intelligent or capable, even if you have no evidence of this. Going further, you tend to like intelligent people, and now that you think she’s intelligent, you like her better than you did before - a feedback loop.

In other words, your emotional response fills in the blanks for what’s cognitively missing from your understanding.

The Halo Effect forms a simpler, more coherent story by generalizing one attribute to the entire person. Inconsistencies, such as liking one thing about a person but disliking another, are harder to process. That “Hitler loved dogs and little children” is troubling for many to comprehend.

Ordering Effect

First impressions matter. They form the “trunk of the tree” to which later impressions are attached like branches. It takes a lot of work to reorder the impressions to form a new trunk.

Consider two people who are described as follows:

  • Amos: intelligent, hard-working, strategic, suspicious, selfish
  • Barry: selfish, suspicious, strategic, hard-working, intelligent

Most likely you viewed Amos as the more likable person, even though the five words used are identical, just differently ordered. The initial traits change your interpretation of the traits that appear later.

This explains a number of effects:

  • Pygmalion effect: A person’s expectation of a target person affects the target person’s performance. Have higher expectations of a person, and they will tend to do better.
    • In one experiment, teachers were given a report that randomly rated some students as more academically promising than others. The students who were randomly rated as more competent ended the year with better academic scores, even though the groups started the school year with no average difference.
  • Kahneman previously graded exams by reading each student’s entire test before moving on to the next student’s. He found that a student’s first essay dramatically influenced his interpretation of the later essays - an excellent first essay earned the student the benefit of the doubt on a poor second essay, while a poor first essay cast doubt on later effective essays. He countered this by batching by essay question, grading every student’s answer to one question before moving to the next.
  • Work meetings often polarize around the first and most vocal people to speak. Meetings would surface better ideas if participants wrote down their opinions beforehand.
  • Witnesses in a trial are not allowed to discuss events with one another before testifying, so that one account doesn’t anchor the others.

The antidote to the ordering effect:

  • Before having a public discussion on a topic, elicit opinions from the group confidentially first. This avoids bias in favor of the first speakers.

Confirmation Bias

Confirmation bias is the tendency to find and interpret information in a way that confirms your prior beliefs.

This materializes in a few ways:

  • We selectively pay attention to data that fit our prior beliefs and discard data that don’t.
  • We seek out sources that tend to give us confirmatory data, and reject sources that contradict our beliefs.
  • We recall information that confirms our beliefs more readily than contradictory information.

Mere Exposure Effect

Exposing someone to an input repeatedly makes them like it more. Having a memory of a word, phrase, or idea makes it easier to process when you encounter it again.

See the discussion in Part 1.2.

Narrative Fallacy

The narrative fallacy is explained in more depth in Part 2, but it stems from System 1 thinking.

People want to believe a story and will seek cause-and-effect explanations in times of uncertainty. This helps explain the following:

  • Stock market movements are explained like horoscopes, where the same explanation can be used to justify both rises and drops (for instance, the capture of Saddam Hussein was used to explain both the rise and subsequent fall of bond prices).
  • Most religions explain the creation of the earth and of humans, and what happens in the afterlife.
  • Famous people are given origin stories - Steve Jobs supposedly reached his success because his birth parents abandoned him. Sports stars who lose a championship have the loss attributed to a host of after-the-fact reasons.

Once a story is established, it becomes difficult to overwrite. (Shortform note: this helps explain why frauds like Theranos and Enron were allowed to persist - observers believed the story they wanted to hear.)

Affect Heuristic

Whether you like or dislike something determines your beliefs about the world.

For example, say you’re making a decision with two options. If you like one particular option, you’ll believe the benefits are better and the costs/risks more manageable than those of alternatives. The inverse is true of options you dislike.

Interestingly, if you get a new piece of information about an option’s benefits, you will also lower your assessment of its risks, even though you haven’t gotten any new information about the risks. You simply feel better about the option, which makes you downplay the risks.

Vulnerability to Bias

We’re more vulnerable to biases when System 2 is taxed.

To explain this, psychologist Daniel Gilbert has a model of how we come to believe ideas:

  • System 1 automatically constructs the best possible interpretation of the idea - if it were true, what would it mean? Believing is the default.
  • System 2 then evaluates whether to accept the idea - “unbelieving” a false idea takes deliberate effort.

When System 2 is taxed, it does not scrutinize System 1’s belief as rigorously, so we’re more likely to accept whatever System 1 proposes.

Experiments show that when System 2 is taxed (for example, when you’re forced to hold digits in memory), you become more likely to believe false sentences. You’ll believe almost anything.

This might explain why infomercials are effective late at night. It may also explain why societies in turmoil might apply less logical thinking to persuasive arguments, such as Germany during Hitler’s rise.