On Heuristics, Errors and Biases

Human beings are imperfectly rational actors. We are led astray in our decision making by cognitive miserliness, which entails the use of one-size-fits-all heuristics that inevitably produce errors and biases. Here are some of the most common ways we are prone to error (inspired by the summary provided by Richard Thaler and Cass Sunstein in their bestselling book, Nudge).

Anchoring

Imperfect information means we often have to make decisions on the basis of estimated values. For example, if you're buying a used car, you probably have some idea of how much you ought to be paying. The trouble is that we are frequently led into error by the initial value we use to anchor this sort of estimate. If your friends have all paid $30,000 for their used Corvettes, you'll likely jump at the chance of paying $28,000 for the same model, even if you know you really ought to check out the local auto market. Similarly, a charity can nudge prospective donors towards a higher donation if it shifts the range of available donation options upwards (e.g., $100, $200 and $500, rather than $35, $50 and $100).

Our tendency to anchor extends far beyond purely financial estimates. If you were brought up in a family of heavy smokers, then probably your ten-a-day habit won't seem particularly serious. If your classmates average 500 Facebook friends, you're going to feel unpopular if you're stuck at 300. And if you train with international athletes, you'll likely believe you're a mediocre talent if you're "only" operating at a national standard.

Availability

If you're asked to assess the probability of some event or occurrence, the chances are you'll judge it more probable if you can easily bring examples of it to mind. As a rule of thumb this makes sense, because instances of a class of common events will be recalled more easily than instances of a class of less common events. However, it can also lead us into error. For example, murder attracts more media attention than suicide, which probably explains why we wrongly believe murder is the more common of the two. If you're an American woman, you're more likely to die in childbirth than in a terrorist attack, but most people likely believe otherwise.

Salience is clearly part of the story here. Recent events, for example - perhaps a series of aircraft crashes - are likely to affect perceptions of risk; and if you have experienced an event yourself, you'll probably overestimate its chances of occurring.

Representativeness

In a situation of uncertainty, if you're asked to judge how likely it is that an object belongs to a particular category, or an event has a particular cause, probably you'll make use of a representativeness heuristic. Amos Tversky and Daniel Kahneman illustrate representativeness by asking us to consider the case of Steve:

Steve is very shy and withdrawn, invariably helpful, but with little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.

If we're given a list of occupations - for example, barista, forensic scientist, boxer, or physician - we will tend to rank the likelihood that Steve is working in one of these occupations according to whether he is representative of the stereotype of that occupation (hence we'll judge forensic scientist as more likely than barista). Unfortunately, this way of proceeding will quickly lead us to make serious mistakes. Consider, for instance, that similarity to a stereotype tells us nothing at all about the prior probability of a particular job. It might be that Steve fits the stereotype of a forensic scientist perfectly, but there are comparatively so few forensic scientists, and so many baristas, that it would be reckless to suppose he is more likely to be a forensic scientist than a barista. (For another example of how this sort of thinking can lead us astray, check out the curious case of Mary.)
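
The base-rate point here can be made precise with Bayes' rule. The sketch below uses made-up illustrative numbers - the base rates and "fit" probabilities are assumptions for the sake of the example, not figures from Tversky and Kahneman - to show how a strong stereotype fit can still lose out to a large difference in base rates.

```python
# Why base rates matter in the Steve example (Bayes' rule).
# All numbers below are hypothetical, chosen only to illustrate the point.

def posterior(prior_a, fit_a, prior_b, fit_b):
    """Probability that Steve is in occupation A rather than B,
    given how well his description 'fits' each stereotype."""
    joint_a = prior_a * fit_a  # P(A) * P(description | A)
    joint_b = prior_b * fit_b  # P(B) * P(description | B)
    return joint_a / (joint_a + joint_b)

# Suppose forensic scientists are vastly outnumbered by baristas:
prior_forensic = 0.001  # hypothetical share of the workforce
prior_barista = 0.05

# ...and Steve's description fits the forensic-scientist stereotype far better:
fit_forensic = 0.8      # P(description | forensic scientist)
fit_barista = 0.1       # P(description | barista)

p = posterior(prior_forensic, fit_forensic, prior_barista, fit_barista)
print(f"P(forensic scientist | description) = {p:.2f}")
# → P(forensic scientist | description) = 0.14
```

Even with a stereotype fit eight times stronger, the fifty-fold difference in base rates means Steve is still far more likely to be a barista.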

Overconfidence

Thaler and Sunstein relate that, before the start of Thaler's Managerial Decision Making class, students are asked to estimate where they will fall in the distribution of grades awarded for the class. The responses are indicative of pronounced overconfidence:

Typically less than 5 percent of the class expects their performance to be below the median (the 50th percentile) and more than half the class expects to perform in one of the top two deciles. Invariably, the largest group of students put themselves in the second decile. We think this is most likely explained by modesty. They really think they will end up in the top decile, but are too modest to say so.

Similarly, the vast majority of college professors think their teaching ability is above average; nine out of ten drivers in the United States think they have better than average driving skills; nearly all psychology students believe they wouldn't have obeyed in a Milgram experiment situation; smokers tend to underestimate the risks associated with their habit; and most people think their lifestyles are healthier than those of other people.

Loss Aversion

People dislike losses much more than they like gains, even if the overall outcome is the same. If students are given a coffee mug illustrated with their university insignia, and then asked how much they would sell it for, it turns out that the price they demand is roughly twice what they would be willing to pay for such a mug had they not received it.

Loss aversion opens up the possibility that people's behavior can be nudged in particular directions. For example, if you want to get people to sign up for a conference, you're going to do better if you tell them late registration will result in a penalty fee rather than that early registration will get them a discount.

Status Quo Bias

There is an inertia effect in play such that we tend to stick with our current situation. Banks, cell phone companies, and gyms put a premium on attracting new clients, because they know that once somebody has signed up, they'll probably stay signed up. Cass Sunstein reports that for years he has subscribed to magazines that he rarely reads, simply because he responded to a clever American Express promotion promising free subscriptions for a limited time period, and then never got around to cancelling the subscriptions after the free period ended.

Status quo bias makes default options very powerful. If you want people to participate in a voluntary pension scheme, for example, make participation the default, and provide an opt out.

Framing

Framing, which was first explored in detail by the psychologists Daniel Kahneman and Amos Tversky, refers to the process of making a choice appear more or less attractive depending on how it is set up. For example, if you emphasize the need for action in order to avoid a loss, you're going to do better than if you stress the gains that will flow from the action (see loss aversion, above). This is the case even if the overall outcome is identical. Thus, Thaler and Sunstein note that energy conservation campaigns work best not if you tell people they can save money by using energy saving methods, but rather if you tell them they'll lose money if they don't.

Thaler and Sunstein conclude their discussion of heuristics, errors and biases by noting that people are subject to influences that are not readily explicable in terms of the standard economic framework.

The picture that emerges is one of busy people trying to cope in a complex world in which they cannot afford to think deeply about every choice they have to make. People adopt sensible rules of thumb that sometimes lead them astray.

The fact that people can be led astray also means that they are nudgeable - they can be steered in directions whose consequences they themselves would favour. This, of course, is Thaler and Sunstein's central thesis.