We are often better at analyzing external situations (the “outside view”) than our own. When you look inward at yourself (the “inside view”), it’s too tempting to consider yourself exceptional— “the average rules and statistics don’t apply to me!” And even when you do get statistics, it’s easy to discard them, especially when they conflict with your personal impressions of the truth.
In general, when you have information about an individual case, it’s tempting to believe the case is exceptional, and to disregard statistics of the class to which the case belongs.
Here are examples of situations where people ignore base statistics and hope for the exceptional:
The planning fallacy is a related phenomenon where you habitually underestimate the amount of time and resources required to finish a project.
When estimating for a project, you tend to give “best case scenario” estimates, rather than confidence ranges. You don’t know what you don’t know about what will happen—the emergencies, loss of motivation, and obstacles that will pop up—and you don’t factor in buffer time for this.
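To make the contrast concrete, here is a toy sketch of replacing a single best-case number with a padded range. The task names, week counts, and 30% buffer are made-up illustrative values, not a method from the book.

```python
# Toy illustration: turn a best-case estimate into a range with buffer time.
# All numbers below are hypothetical.

best_case_weeks = {"design": 2, "build": 6, "test": 2}  # hypothetical tasks
buffer_for_unknowns = 0.30                               # padding for surprises

optimistic_total = sum(best_case_weeks.values())              # 10 weeks
padded_total = optimistic_total * (1 + buffer_for_unknowns)   # 13 weeks

print(f"Best-case estimate: {optimistic_total} weeks")
print(f"Range to report:    {optimistic_total}-{padded_total:.0f} weeks")
```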
Kahneman gives an example of a curriculum committee meeting to plan a book. They happily estimate 2 years to complete it. Kahneman then asks the editor how long other teams have taken. The answer: 7-10 years, with 40% of teams failing to finish at all. Kahneman then asks how their team's skills compare to those of the other teams. The answer: below average.
This is an astounding example of how a person can have the relevant statistics in her head yet completely fail to recall them as relevant to the situation at hand. (The book did indeed take 8 years.)
Furthermore, before Kahneman asked his questions, the team didn't even feel they needed information about other teams to make their guess! They looked only at their own case.
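A minimal sketch of the inside view versus the outside view, using the numbers from Kahneman's anecdote above. Taking the midpoint of the reference range as the outside-view estimate is an illustrative choice, not Kahneman's prescribed method.

```python
# Numbers from the curriculum-committee anecdote; the midpoint rule is
# only an illustrative way to turn the reference class into an estimate.

inside_view_years = 2             # the committee's own optimistic estimate
reference_range_years = (7, 10)   # how long comparable teams actually took
failure_rate = 0.40               # fraction of comparable teams that never finished

outside_view_years = sum(reference_range_years) / 2   # 8.5, close to the actual 8

print(f"Inside view:  {inside_view_years} years")
print(f"Outside view: {outside_view_years} years, "
      f"plus a {failure_rate:.0%} chance of never finishing")
```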
Government projects have a funny pattern of being universally over budget and delayed. (Though there may be an underlying incentive at play here, since lower cost and time estimates make projects easier to get approved.)
The antidote is similar to the correction for heuristics in the last section:
Another technique is to write a premortem: “Imagine that it’s a year from now. We implemented the plan. It was a disaster. Write a history of the disaster.” The premortem has a few advantages:
Finally, when evaluating how well a project was executed, reward people who finish according to their original deadlines, not those who finish much earlier or later than planned.
Optimism has a lot of advantages. Optimistic people are happier, recover from setbacks more easily, have greater self-confidence, feel healthier, and live longer. Research suggests optimism is largely genetic (though some psychologists believe it can be learned).
Optimistic people play a disproportionate role in shaping the world. They are the inventors, entrepreneurs, and political leaders. They take risks and seek challenges. They’re talented but are also lucky (luckier than they acknowledge). Their success confirms their faith in their judgment and their ability to control events.
As described above, most founders know the statistics: most startups fail, and the path with the higher expected value is to sell their services to an employer. To forsake that path and start a company, you need to be overoptimistic or deluded.
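A toy expected-value comparison makes the trade-off explicit. The salary, success probability, and payout below are hypothetical numbers chosen for illustration, not figures from the book.

```python
# Hypothetical numbers: a toy expected-value comparison between steady
# employment and founding a startup.

years = 5
salary_per_year = 120_000       # hypothetical pay from selling services to an employer
success_probability = 0.10      # hypothetical odds the startup succeeds
payout_on_success = 3_000_000   # hypothetical founder payout if it does

ev_employment = years * salary_per_year                # 600,000
ev_startup = success_probability * payout_on_success   # 300,000

print(f"Expected value, employment over {years} years: ${ev_employment:,.0f}")
print(f"Expected value, founding the startup:          ${ev_startup:,.0f}")
```

With numbers like these, the rational move is to take the job; founders press on anyway because they overweight their own chances.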
One drawback to optimism is that it encourages people to take outsized risks because they overestimate their chances of success. Data points that support this:
Despite the humbling statistics on failure, Kahneman notes that there is still value in the legions of entrepreneurs who try and fail. They perform a market discovery service, finding pockets of opportunity that larger companies can later serve. Many companies die as "optimistic martyrs."
And overall, mixing high optimism with good implementation is a positive trait: it sustains belief in what one is doing and endurance through setbacks.