Basic theory suggests that people have indifference curves when relating two dimensions, like salary and number of vacation days. Say that you value one day’s salary at about the same as one vacation day.
Theoretically, you should be willing to trade for any other point on the indifference curve at any time. So when, at the end of the year, your boss says you’re getting a raise and offers the choice of 5 extra days of vacation or a salary increase equivalent to 5 days of pay, you see the two options as pretty equivalent.
But say you get presented with another scenario. Your boss proposes a new compensation package: you can get 5 extra days of vacation per year, but you have to take a salary cut equivalent to 5 days of pay. How would you feel about this?
Likely, the feeling of loss aversion kicked in. Even though theoretically you were on your indifference curve (exchanging 5 days of pay for 5 vacation days), you didn’t see this as an even exchange.
As prospect theory points out, the idea of indifference curves ignores the reference point at which you start. In general, people have inertia against change.
This is called the endowment effect. Before you have something, you might have a certain indifference curve. But once you get it, a new reference point is set, and from that point loss aversion sets in: giving the item up registers as a loss, which hurts more than the equivalent gain felt good.
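(Shortform note: the following sketch is our addition. It uses a loss-aversion coefficient of about 2, a common estimate from the prospect theory literature rather than a figure from this section, to show why the "even" swap above feels like a net loss once the reference point has shifted.)

```python
# Sketch of why trading 5 days' pay for 5 vacation days feels bad once a new
# reference point is set. The coefficient of 2 is an assumed, commonly cited
# estimate: losses weigh roughly twice as much as equivalent gains.

LOSS_AVERSION = 2.0

def felt_value(change):
    """Subjective value of a change, measured from the current reference point."""
    if change >= 0:
        return change                   # gains count at face value
    return LOSS_AVERSION * change       # losses are amplified

# The trade: gain 5 vacation days, lose 5 days of pay (1 unit each).
net_feeling = felt_value(+5) + felt_value(-5)
print(net_feeling)  # -5.0: the "even" swap nets out as a loss
```

With losses weighing roughly double, the exchange that looked neutral on the old indifference curve now stings, which is why the repackaged offer provokes resistance.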
Here are a few examples of when people overvalue things once they own them:
The endowment effect doesn’t occur in all cases: people are willing to exchange a $5 bill for five $1 bills, and furniture vendors are happy to exchange a table for money. When the asset in question is held for exchange, the endowment effect doesn’t apply.
You only feel endowed with items that are planned for consumption or use, like a bottle of wine or vacation days.
As with prospect theory and loss aversion, experienced financial traders show less attachment to the endowment effect.
We are driven more to avoid failing a goal than to exceed it. Failing a goal is perceived as a loss; exceeding the goal is a gain.
Examples:
In another reframing of loss aversion, we are biased toward keeping the status quo. Two effects are at play here: 1) the endowment effect exaggerates the value of what you have, warping your prior indifference curve, and 2) loss aversion makes you hesitant to take on risky bets, since losses are more painful than gains.
Examples:
The normal price of a good in a store is the reference point. Generally, it’s considered unfair to exploit market power by raising prices to match increased demand, unless the store must raise prices to protect its profits. This is why there’s outrage when a supermarket suddenly raises the price of shovels during a blizzard.
Consider this scenario about pay: an employee is hired at $9 per hour, but the market rate drops to $7 per hour. It’s considered unfair to change the employee’s rate. But if that employee leaves, it’s acceptable to pay the new employee $7 per hour. Intuitively this sounds right, but rationally it sounds odd. The market rate should be the market rate.
Employers who violate rules of fairness are punished: employees reduce their productivity, and customers reduce their purchases.
(Shortform note: the rise of on-demand services like Uber and real-time pricing may blunt this effect over time, gradually educating the public about supply, demand, and pricing. Soon it might seem normal for shovels to be more expensive during snowstorms.)
In another twist, the feeling of regret depends on your default action and whether you deviate from it. If you do something uncharacteristic and fail, you’re more likely to feel regret, but others are less likely to blame you.
Consider: Alan never picks up hitchhikers. Yesterday he gave a man a ride and was robbed.
Barry frequently picks up hitchhikers. Yesterday he gave a man a ride and was robbed.
Who will experience greater regret?
Who will be criticized most severely by others?
Common answer: Alan will experience greater regret, but Barry will be criticized most severely. Alan will wish he had stayed on his normal path and will want to undo the event. Barry will be seen as habitually taking unreasonable risks, and thus “deserving of his fate.”
(Shortform note: sadly, this might drive some people to blame victims of rape, who allegedly were “asking for it” through their typical dress or behavior.)
In some cases, the default option is to do nothing to avoid regret (e.g. not sell your stock), while the alternative unusual action is to do something.
If you do the normal thing and get a bad outcome, this feels better than doing the unusual thing and getting a bad outcome.
If you do an unusual thing and get a good outcome, this feels better than doing a normal thing and getting a good outcome.
If you anticipate a bad outcome and regret, you will tend to do the normal thing.
If you anticipate a good outcome, you will tend to do the unusual thing.
Taboo tradeoff: there are certain things that people hold as sacrosanct (notably, health and safety) and refuse to trade off against anything else. This refusal is driven by the prediction of strong regret if they made the trade and harm resulted.
The risk of regret is that it causes inappropriate loss aversion. (Shortform note: this is aggravated by two factors we’ve previously discussed: 1) because probabilities are overweighted at the edges, we overestimate low chances of harm, and 2) loss aversion makes losses feel more painful than equivalent gains feel good.)
As you journal your decisions, note the possibility of regret before deciding. Then if a bad outcome happens, remember that you considered the possibility of regret before you made your decision. This avoids the hindsight bias and the feeling of “I almost made a better choice and I should have known better.”
Downweight your future regret—in practice, even in the case of a bad outcome, you will deploy psychological defenses to soothe this pain.
When an event is made specific or vivid, people become less sensitive to probability (lower chances are overestimated and higher chances are underestimated).
When an event is specifically defined, your mind constructs a plausible scenario in which it can happen. And because the judgment of probability depends on the fluency with which the scenario comes to mind, you overestimate the probability. At the other end of the spectrum, the possibility of the event not occurring is also vivid, and thus also overweighted, which is why high probabilities are underestimated.
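(Shortform note: this pattern can be made precise with Tversky and Kahneman’s 1992 probability-weighting function; the exponent γ ≈ 0.61 is their estimate for gains, a detail from outside this section.)

```python
# The Tversky-Kahneman (1992) probability-weighting function, which formalizes
# "low chances are overestimated, high chances are underestimated."
# GAMMA = 0.61 is their published estimate for gains (an outside detail).

GAMMA = 0.61

def weight(p):
    """Decision weight actually applied to a stated probability p (0 < p < 1)."""
    return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA) ** (1 / GAMMA)

print(round(weight(0.01), 3))  # 0.055: a 1% chance is felt as about 5.5%
print(round(weight(0.99), 3))  # 0.912: a 99% chance is felt as about 91%
```

A 1% chance gets a decision weight of roughly 5.5%, while a 99% chance is felt as only about 91%, matching the overweighting and underweighting at the edges described above.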
Here are a few examples of how specificity decreases sensitivity to probability:
(Shortform note: this effect could be the foundation of the common advice to visualize your success. Doing so helps you overcome the statistical knowledge of the low rate of success, which would otherwise drag you down psychologically.)
Each rate or probability is a fraction, consisting of a numerator on top (the events you are estimating) and a denominator on the bottom (all the possible events). Denominator neglect occurs when people focus on the size of the numerator rather than on the size of the denominator, ignoring the full ratio.
Here’s an example:
You have the choice of drawing a marble from one of two urns. If you draw a red marble, you win a prize.
Urn A contains 10 marbles, of which 1 is red and the rest are white.
Urn B contains 100 marbles, of which 8 are red and the rest are white.
Which do you choose?
30% of subjects choose Urn B, since it has a larger number of red marbles, even though Urn A offers a 10% chance of winning versus Urn B’s 8%. Before you dismiss this as silly, picture both urns. Urn A has just 1 winning marble among 9 white ones. In contrast, Urn B has 8 winning chances, standing out among the white marbles. Urn B conveys a more hopeful feeling.
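(Shortform note: the arithmetic behind the urns, as a quick sketch of our own:)

```python
# Denominator neglect: compare the full ratios, not just the red-marble counts.
urn_a = {"red": 1, "total": 10}
urn_b = {"red": 8, "total": 100}

p_a = urn_a["red"] / urn_a["total"]   # 0.10
p_b = urn_b["red"] / urn_b["total"]   # 0.08

print(p_a > p_b)  # True: Urn A is the better bet despite having fewer red marbles
```

Choosing Urn B means accepting strictly worse odds; the pull it exerts comes entirely from the more numerous, more vivid winning marbles in the numerator.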
Here are more examples of denominator neglect:
(Shortform note: the following are our additions and not explicitly described in the book.)
In some cases, you might exploit these biases for your own gain to overcome your hesitation: