The Anecdotal Fallacy
Alias: The Volvo Fallacy
Last year, tens of millions of people bought life insurance for scheduled flights of airlines in the United States. Not one of those insured passengers died in a crash…. [T]ravel insurance…is now purchased by half of American leisure travelers―a fivefold increase since 2001, according to the United States Travel Insurance Association. As a purely economic investment, some of this insurance can be dubious, particularly the flight insurance policies. … Because calamities are so vivid and easily brought to mind, we tend to overestimate their probability when we intuitively judge what will happen….
Source: John Tierney, "Appeasing the Gods, With Insurance", The New York Times, 5/6/2008
The Availability Heuristic: The easier it is to remember, or to imagine, a type of event, the more likely it is that an event of that type will occur.
So, it's the "availability" to memory or imagination that gives this rule of thumb its name. It's not a very memorable or descriptive name for a simple idea: if we can easily remember an event of a certain type or imagine it happening, then we tend to think it more likely than if we cannot do so. Like all rules of thumb, the ease of remembering or imagining ("representing") a type of event is evidence of degree of likelihood in ordinary circumstances. Instances of a type of event which we frequently experience will be easily remembered, so that that type will be correctly judged to be likely. Similarly, if there are many ways that a kind of event can come about, then it will be easy to imagine and also likely to happen. Moreover, if one has a hard time remembering an event of a given sort, then it is probably rare and unlikely. Also, if we cannot even imagine it, then there may be almost no way for it to occur. However, unusual events do happen, and if they happen to us then we tend to overestimate their likelihood.
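The bias described above can be illustrated with a small simulation. The event types, probabilities, and "vividness" weights below are invented placeholders, not real statistics; the sketch only shows how a memory biased toward vivid events distorts probability judgments that would otherwise track frequency.

```python
import random

random.seed(0)

# Hypothetical event types: (name, true probability, vividness weight).
# All numbers are illustrative assumptions, not real-world statistics.
# Vividness models how easily an instance of the event comes to mind.
events = [
    ("fender bender", 0.90, 1.0),
    ("plane crash",   0.01, 50.0),
]
total_true = sum(p for _, p, _ in events)

# Simulate 10,000 experienced events, drawn at their true frequencies.
experienced = random.choices(
    [name for name, _, _ in events],
    weights=[p for _, p, _ in events],
    k=10_000,
)

# "Recall" each experienced event with probability proportional to its
# vividness: vivid events are almost always recalled, mundane ones rarely.
vividness = {name: v for name, _, v in events}
recalled = [e for e in experienced
            if random.random() < min(1.0, vividness[e] / 50.0)]

# Judging probability from what is recalled overweights the vivid, rare event.
for name, p, _ in events:
    true_share = p / total_true
    recalled_share = recalled.count(name) / len(recalled)
    print(f"{name}: true share {true_share:.3f}, "
          f"judged from recall {recalled_share:.3f}")
```

Under these assumed numbers, plane crashes make up about 1% of experienced events but a large fraction of recalled ones, so a frequency estimate based on recall alone inflates their probability many times over.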
You may have had the experience of seeing an accident on the road, then slowing down and driving more carefully afterwards. Of course, it's a good idea to slow down in the immediate vicinity of an accident scene, since there may be wreckage on the road. Also, it's possible that the accident took place where it did because the area is an unusually dangerous one. However, the vivid memory of the accident and your heightened caution may have lasted after you were away from the accident scene. The experience of seeing one makes accidents more vivid in your memory, thus making them seem more probable. However, the probability of getting in an accident in one place is not increased by seeing one in another.
You commit the Anecdotal Fallacy when a recent memory, a striking anecdote, or news of an unusual event leads you to overestimate the probability of that type of event, especially if you have access to better evidence of the frequency of such events.
Relying on memories or imagination when judging the probability of events may have worked well for our ancestors, but in the modern world we are exposed to vivid vicarious experiences through the communications media. Unfortunately, common events make for uninteresting stories, and we're more interested in the out-of-the-ordinary. There's an old saying: "When a dog bites a man, that is not news, but when a man bites a dog, that is news." In other words, relatively common events are not newsworthy; it's the unusual that makes news. So, the news media frequently expose us to uncommon events as "news", and we acquire a mistaken impression of how common such events are. As a result, many people are fearful of highly unlikely events, such as terrorism, shark attacks, and strangers kidnapping their children. Such exaggerated fears can lead people to take unnecessary and even harmful actions, such as buying expensive flight insurance, or driving instead of flying, which is statistically safer. We often fear most those things that are least likely to happen, and fail to take precautions against more probable risks. In this way, the Anecdotal Fallacy may well have done more damage to the human race than any other mistake in our thinking.
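The driving-versus-flying comparison comes down to simple arithmetic on per-mile risk. The fatality rates below are assumed order-of-magnitude placeholders, not official statistics; the point of the sketch is how the comparison is computed, not the exact figures.

```python
# Back-of-the-envelope comparison of driving vs. flying risk for one trip.
# Both rates are illustrative assumptions chosen only to show the arithmetic.
DRIVING_DEATHS_PER_MILE = 1.5e-8   # assumed order of magnitude
FLYING_DEATHS_PER_MILE = 1.0e-10   # assumed order of magnitude

trip_miles = 1_000

# Expected fatality risk for the trip is rate times distance.
drive_risk = DRIVING_DEATHS_PER_MILE * trip_miles
fly_risk = FLYING_DEATHS_PER_MILE * trip_miles

print(f"Driving {trip_miles} miles: ~{drive_risk:.1e} fatality risk")
print(f"Flying {trip_miles} miles: ~{fly_risk:.1e} fatality risk")
print(f"Driving is ~{drive_risk / fly_risk:.0f}x riskier under these assumptions")
```

Whatever the precise rates, so long as the per-mile risk of driving exceeds that of flying by a wide margin, choosing to drive out of fear of a crash increases the very risk the traveler is trying to avoid.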
- Daniel Kahneman, Paul Slovic & Amos Tversky (editors), Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press, 1982), Part IV.
- Nisbett, R. E., et al., "Popular Induction: Information is Not Always Informative", in J. S. Carroll & J. W. Payne (editors), Cognition and Social Behavior (Halsted, 1976). The source of the "Volvo fallacy" alias.
- Massimo Piattelli-Palmarini, Inevitable Illusions: How Mistakes of Reason Rule Our Minds (John Wiley & Sons, 1994), p. 190.
- Michael Shermer, "How Anecdotal Evidence Can Undermine Scientific Results", Scientific American, 7/2008
Thanks to Stephen Rowe for a criticism of the auto accident example which led me to revise it.