“If I Can Imagine It, It Must Be Likely” – The Availability Heuristic

We live in a world filled with news stories about frightening tragedies and potential risks. In just the past six months, we have witnessed a horrifying school shooting and a sickening bombing at a marathon. We’ve been threatened by a rogue regime with nuclear annihilation. On a more mundane level, we’ve been warned about new influenza threats and informed that horse meat has been surreptitiously used in certain food products.

All of these stories give rise to a certain level of predictable fear, in turn driving an internal monologue. Are my children safe at their school? Should I attend that concert next week? Did everyone in our family get a flu shot this year? That hamburger I ate last night tasted kind of strange.

While fear is an unpleasant emotion, it’s a dominant and important one for all higher-order animals. In a prehistoric world filled with danger, fear was essential to avoiding harm. And an oversensitivity to fear wasn’t necessarily a bad quality. Is that rustling in the grass a venomous snake or the wind? Better to assume the former and quickly run away. Is that a dangerous and angry rival in the distance? I’ll guess that it is and quickly hide.

But this same dynamic that made us safer in a prehistoric world can be counterproductive and even crippling in a modern society. A 24-hour news cycle, along with pervasive social media, bombards us with graphic images of tragedies and threats. We’re tortured by the horrifying photos of the bloody victims of the marathon bombing. We’re haunted by the picture of small children being led away from the Newtown school shooting. We fearfully watch videos of military parades in North Korea that are filled with columns of dangerous-looking missiles.

Unlike the prehistoric world, where fear was appropriately calibrated to prevailing threats, in a modern environment it can lead to inappropriate risk calculations and counterproductive behaviors. Psychologists have conducted significant research into the way people perceive risks and estimate the likelihood of events. Much of this research centers on a phenomenon known as the availability heuristic.

The psychologists Daniel Kahneman and Amos Tversky coined the term availability heuristic (or bias) as part of seminal research they conducted in 1973. In this research, they demonstrated that people will overestimate probabilities if a scenario is easy to recall or envision. Rather than the frightening risks and dangerous situations discussed above, Kahneman and Tversky initially demonstrated availability bias in some mundane settings.

In one experiment, subjects were presented with several letters from the alphabet. They were asked to judge whether each of these letters appeared more frequently as the first or third letter in a word. The letters K, L, N, R and V all occur more frequently as the third letter in English words. Yet in every case, subjects believed that the letters occurred more frequently in the first position, by a ratio of approximately 2:1. Kahneman and Tversky believed that the effect occurred because it is far easier to recall words that start with a particular letter than words with that letter in the third position. That is, these “first position” words were more “available” to the subjects.
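The letter-position claim is one of the few in this literature you can check yourself: count how often a letter appears in each position across a word list. A minimal sketch in Python, using a tiny made-up sample (a real check would feed in a full dictionary file, e.g. a system word list):

```python
def position_counts(words, letter, positions=(0, 2)):
    """Count how often `letter` appears at each 0-based position in `words`.

    Positions 0 and 2 correspond to the "first letter" and "third letter"
    conditions in the Kahneman and Tversky experiment.
    """
    letter = letter.lower()
    return {p: sum(1 for w in words if len(w) > p and w[p].lower() == letter)
            for p in positions}

# Tiny illustrative sample, invented for this sketch -- not a real dictionary.
sample = ["kite", "keep", "lake", "make", "walk", "milk", "bake", "silk"]
print(position_counts(sample, "k"))  # → {0: 2, 2: 3}
```

Even in this toy sample, "k" turns up more often in the third position than the first, yet the first-position words ("kite", "keep") are the ones that spring to mind when you try to recall k-words.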

As part of this same research, Kahneman and Tversky conducted an additional experiment. Subjects were broken into two groups and asked to listen to a tape-recorded list of 39 names. For one group, the list contained 19 famous male names and 20 less famous female names. For the other group, the list contained 19 famous female names and 20 less famous male names. Out of 86 participants, only 13 recalled a greater number of less famous names. Out of 99 subjects, 80 believed that their tape-recorded list contained more famous than less famous names. Again, Kahneman and Tversky concluded that ease of recall led people to miscalculate the frequency of a scenario.

In additional research in 1978, Lichtenstein et al. examined how people assess the probability of lethal events. They presented subjects with a list of 41 causes of death with widely varying levels of frequency. As predicted, subjects were inclined to overrate the frequency of death under particular conditions. Specifically, disproportionate exposure, memorability or imaginability were linked to higher judgments of frequency. As an example, subjects believed that homicide was more common than stomach cancer, even though the latter was five times as common! Sensational and vivid events such as tornadoes and floods were perceived as more common than their actual occurrence, while the likelihood of less dramatic causes of death such as asthma and diabetes was underestimated. Lichtenstein et al. also found a strong relationship between newspaper coverage of deadly events and perception of frequency.

Additional research conducted by Sherman et al. in 1985 looked at risk assessment in a novel way. They showed that simply imagining a hypothetical scenario could increase an individual’s perception of the likelihood of the event. In the study, subjects were told that a new disease was becoming more prevalent in their area. They read a list of symptoms of this new disease and were asked to rate their likelihood of contracting the illness on a scale of 1 to 10. The subjects were broken into four groups, each presented with a variation on this task. In one condition, the subjects simply read the list of symptoms. In a contrasting condition, the subjects were asked to actually imagine themselves with the symptoms. Additionally, the type of symptoms presented differed. In one case, the symptoms were common and vivid (e.g. low energy, headaches). In the alternative case, the symptoms were abstract and difficult to imagine (e.g. a poorly functioning nervous system, an inflamed liver).

Group 1 – Symptoms: Easy to imagine / Exercise: Imagine

Group 2 – Symptoms: Easy to imagine / Exercise: Read list of symptoms

Group 3 – Symptoms: Difficult to imagine / Exercise: Read list of symptoms

Group 4 – Symptoms: Difficult to imagine / Exercise: Imagine

As the researchers had hypothesized, the subjects in Group 1 predicted the greatest likelihood of contracting the disease. The combination of easy-to-imagine symptoms and an exercise of visualizing the condition significantly elevated their perception of risk. At the opposite end of the spectrum, the folks in Group 4 predicted the least likelihood of contracting the disease. Struggling to imagine symptoms that were difficult to visualize left their perception of risk at its lowest.

Personal Takeaways – Understanding availability bias, and recognizing when it could influence your behavior, is a very valuable capability. Our tendency to overestimate the dangers of terrorism, crime and severe weather can cause us to live in unwarranted fear and take unnecessary precautions. Our bias toward underestimating the dangers of common diseases can lead us to keep undesirable dietary habits, to avoid medical exams and to be noncompliant with prescription drug regimens. Conditions that are asymptomatic, for example hypertension or colon cancer, will be viewed as unlikely when they are, in fact, common killers.

Professional Takeaways – Availability bias can have unfortunate influences in the workplace as well. Often, as we look to make judgments and decisions, we are captivated by vivid imagery and memories. Just as in our personal lives, our sense of risk and probability can be distorted by availability bias. Any decisions that require calculations of risk are prone to this problem.

A new data center will be meticulously planned to withstand the effects of adverse weather events. While these events are extremely rare, their destructive effects are vivid and easy to imagine. Far more common threats are the outages resulting from process failures and operational errors. Although these are bigger contributors to downtime in most data centers, they are difficult to envision and don’t evoke compelling imagery.

We often judge co-workers based on vivid memories. An individual who made significant contributions to a highly visible effort, receiving an award, will be well remembered. The individual who asked the uncomfortable question at the Town Hall will be closely associated with that moment. In either case, that one episode represents a minor part of their total body of work. But it will typically influence a large portion of your opinion of that individual. And it may also be used as a rationalization (or confirmation) for your existing opinion. If you had thought favorably of our award recipient, this will confirm your belief that he is a star. If you originally had negative feelings, this will reinforce your thoughts that he is a showboat, taking an unfair share of the credit. Our Town Hall questioner will either be viewed as a nuisance or a maverick.

In summary, on a routine basis, we need to consider the odds of an event occurring. Most often, we do so in an ad hoc fashion, relying on memory and instincts. This leads to an availability bias where we miscalculate the probability of the event. The key to avoiding availability bias is to use actual statistical odds as opposed to intuition when attempting to determine the likelihood of an event.
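One way to make that advice concrete is to write the comparison down: put your gut estimate next to the published base rate and see how far off you are. The sketch below does exactly that, with rates and estimates invented purely for illustration (they are not real statistics):

```python
# Hypothetical annual death rates per 100,000 (stand-ins for published
# statistics) versus invented gut estimates -- illustration only.
base_rates = {"tornado": 0.02, "diabetes": 25.0}
gut_estimates = {"tornado": 5.0, "diabetes": 2.0}

def bias_factor(cause):
    """Ratio of intuition to base rate: >1 means overestimate, <1 underestimate."""
    return gut_estimates[cause] / base_rates[cause]

for cause in base_rates:
    direction = "over" if bias_factor(cause) > 1 else "under"
    print(f"{cause}: intuition {direction}estimates by a factor of {bias_factor(cause):g}")
```

In this made-up example, the vivid, newsworthy cause (tornadoes) is wildly overestimated while the mundane one (diabetes) is underestimated, which is exactly the pattern Lichtenstein et al. reported. The discipline is simply to look the base rate up before trusting the gut number.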
