In his bestselling book *Blink*, Malcolm Gladwell describes the power of rapid, instinctive thinking. He explains how medical doctors, counterintuitively, can make improved diagnoses with less information and analysis. While there may be instances where intuitive thinking beats analytical thinking, there are many situations where just the opposite is true. A phenomenon known as base rate neglect illustrates how people can sometimes jump to inappropriate conclusions, with significant consequences.

Base rate neglect is the tendency to misjudge the likelihood of an event by failing to take relevant background data (the base rates) into account. It is a well-established concept in cognitive science, confirmed through extensive research. A quick example of base rate neglect can be demonstrated through the following problem:

Tom is an opera buff who enjoys touring art museums when on vacation. Growing up, he enjoyed playing chess with family members and friends. Which situation is more likely?

- Tom plays trumpet for a major symphony orchestra
- Tom is a farmer

Most people are inclined to judge that Tom is much more likely to be the musician. The characteristics outlined in his description are more representative of people employed in the arts. However, a quick look at some underlying data shows base rate neglect in action.

Let’s calculate the total number of people who could possibly fit the first conclusion. There are 117 symphony orchestras in the United States with budgets over $2.5 million per year, and each orchestra typically has 3 trumpet players. Even if every one of these individuals met the qualifying characteristics in the example, they would number only 351.

Now let’s look at the number of people who could possibly fit the second conclusion. There are 2 million family farms in the United States (to simplify things, I am not including corporate farms). Let’s assume a conservative figure of 2 adult males per farm. If only 1 in 10,000 of these farmers fits the description, they would number 400.
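The two back-of-the-envelope counts above can be checked in a few lines of Python. The orchestra and farm figures are the ones quoted in the text; the 1-in-10,000 match rate is, as above, an assumption rather than measured data:

```python
# Rough counts for the two conclusions, using the figures from the text.
orchestras = 117                 # U.S. orchestras with budgets over $2.5M/year
trumpeters_per_orchestra = 3
max_musicians = orchestras * trumpeters_per_orchestra

family_farms = 2_000_000
adult_males_per_farm = 2         # conservative assumption
match_rate = 1 / 10_000          # assumed fraction fitting Tom's description
matching_farmers = round(family_farms * adult_males_per_farm * match_rate)

print(max_musicians, matching_farmers)  # 351 400
```

Even under these deliberately lopsided assumptions, the farmers outnumber the trumpet players.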

In this example of base rate neglect, people focus on the similarity between characteristics and occupation, without taking into account the underlying statistics. That is, they ignore the relative rarity of each occupation. Researchers who study this phenomenon believe it stems from a related cognitive bias known as the representativeness heuristic.

While incorrectly guessing someone’s profession may be harmless, there are situations where base rate neglect can cause serious problems. One example is the misjudgment of reliability of eyewitness testimony. Another is the incorrect assessment of the likelihood of false positive results from medical tests.

Let’s look at the example of courtroom testimony. There has been a hit-and-run accident at night involving a taxi cab. The only eyewitness identified the taxi cab as being blue. There are two taxi companies operating in the city. One company has green cabs, which make up 85% of the total cabs in the city. The other company has blue cabs, comprising the remaining 15% of taxis. The eyewitness was tested by the court under similar nighttime conditions and was able to accurately distinguish the color of the cab 80% of the time. What is the likelihood that the eyewitness correctly identified the cab as blue?

This exact problem was used in a famous 1980 study by Amos Tversky and Daniel Kahneman (the latter went on to win the Nobel Prize in Economics). What they found was that people frequently estimated the likelihood of proper identification of the cab at 80%, consistent with the eyewitness’s demonstrated reliability. Even those who attempted to calibrate a more accurate estimate judged that the eyewitness would be correct over 50% of the time. The reality is that, given the underlying percentages of cab colors (only 15% are blue), the eyewitness is likely to be incorrect in their testimony nearly 60% of the time. That is, they are more likely to be wrong than right! (For a breakdown of the math behind this calculation, see this Wikipedia article)
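The courtroom figure falls straight out of Bayes’ rule. Here is a minimal sketch using only the numbers given in the problem:

```python
# P(cab is blue | witness says "blue"), via Bayes' rule.
p_blue, p_green = 0.15, 0.85        # base rates of cab colors in the city
p_id_blue_given_blue = 0.80         # witness correctly calls a blue cab blue
p_id_blue_given_green = 0.20        # witness miscalls a green cab as blue

# Total probability the witness reports "blue" at all:
p_id_blue = p_blue * p_id_blue_given_blue + p_green * p_id_blue_given_green

# Probability the cab really was blue, given the "blue" report:
posterior = p_blue * p_id_blue_given_blue / p_id_blue

print(round(posterior, 3))  # 0.414
```

The witness is right only about 41% of the time, i.e. wrong nearly 60% of the time, because miscalled green cabs (17% of all sightings) outnumber correctly called blue ones (12%).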

An even more troubling example of base rate neglect involves assessing the reliability of medical tests. Let’s say you have just met with your doctor, who has informed you that you have tested positive for a typically fatal disease. To make matters worse, you are told that the test is accurate 95% of the time. Most people would sadly conclude that there is a 95% chance they have the disease, a virtual death sentence. In fact, one would need to know the prevalence of the disease in the general population to determine the actual likelihood that the test result was correct. If the prevalence of the disease is 1 in 1,000, the likelihood that you actually have the disease (based on this test) is less than 2%. (See notes at the bottom of this post for the mathematical explanation)
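That under-2% figure can be verified with a quick frequency calculation. One assumption is doing work here: “accurate 95% of the time” is taken to mean the test is right for 95% of sick patients *and* 95% of healthy ones:

```python
# P(disease | positive test) with 1-in-1,000 prevalence and 95% accuracy.
population = 100_000
prevalence = 1 / 1_000
accuracy = 0.95                   # assumed to apply to sick and healthy alike

sick = population * prevalence               # 100 people have the disease
healthy = population - sick                  # 99,900 do not
true_positives = sick * accuracy             # 95 correct positive results
false_positives = healthy * (1 - accuracy)   # 4,995 false positive results

p_disease = true_positives / (true_positives + false_positives)
print(f"{p_disease:.1%}")  # 1.9%
```

The false positives from the huge healthy majority swamp the true positives from the tiny sick minority, which is exactly what base rate neglect overlooks.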

When this exact problem was given to students at Harvard Medical School, almost half of the students computed the likelihood that the patient had the disease to be 95%. The average response was 56%. Similar studies with practicing physicians have produced comparable findings; medical experts are prone to base rate neglect.

Base rate neglect is a fundamental flaw in human reasoning, resulting from our innate weakness in analyzing probability problems. It is an example of how our intuitive judgements or instincts can lead us astray. Having an understanding of base rate neglect, along with the supporting math, can help you arrive at more accurate judgements, conclusions and decisions.

Solution for Medical Test Problem:

Let’s assume a model where 100,000 people are tested for the disease. Since the disease affects 1 in 1,000 people, we would expect 100 people to have the disease, which means 99,900 people would not. Of those 99,900, 5% (4,995 people) would receive a false positive diagnosis. That compares to only 95 people receiving a correct positive diagnosis (95% of the 100 who are actually sick). Therefore, the probability that one actually has the disease upon receiving a positive test is 95/(95 + 4,995) = 95/5,090, or roughly 1.9%. The table below is a little easier to follow: