In the classic short story “Silver Blaze,” Sir Arthur Conan Doyle chronicles the mysterious disappearance of a prize racehorse. Doyle’s iconic detective, Sherlock Holmes, is summoned to investigate. Holmes famously solves the case by focusing on a critical piece of evidence: a guard dog that doesn’t bark during the commission of the crime. He concludes that “the midnight visitor was someone whom the dog knew well,” ultimately leading to the determination that the horse’s trainer was the guilty party. The story is often cited as an example of the importance of expanding the search for clues beyond the obvious and visible.
Let’s fast-forward a half-century from the world of fiction to the very real world of military strategy. During World War II, Allied bombers were being shot down with great frequency. A mathematician named Abraham Wald was handed a challenging problem by military officials: how could these planes be optimally reinforced with armor plating? There were tradeoffs to consider. Every addition of plating added to the weight of the plane, degrading its performance. Reinforcements therefore needed to be added only to the most vulnerable areas of the planes.
Wald was presented with data from planes returning from bombing missions. The data showed the pattern and frequency of hits from enemy gunfire. Conventional thinking led many to believe that Wald would recommend reinforcing the areas that received the greatest number of hits. Counter-intuitively, Wald recommended just the opposite: reinforcing the areas with the fewest bullet holes. Wald’s profound insight, in the spirit of Sherlock Holmes, was to focus on the unseen. He grasped that the returning aircraft were the survivors; they had received enemy fire in non-vulnerable areas. The downed aircraft (which were not part of the data set) were the ones that had taken fatal hits, most likely in the remaining areas. It was those remaining areas (the ones free of bullet holes in the surviving planes) that needed the extra armor plating.
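Wald’s reasoning can be demonstrated with a small simulation. The sketch below (in Python, with invented aircraft sections, hit counts, and fatality probabilities) spreads hits evenly across a fleet, downs the planes whose hits land in vulnerable spots, and then tallies only the survivors, reproducing the misleading pattern the analysts saw:

```python
import random
from collections import Counter

def simulate_sorties(n_planes=10000, hits_per_plane=5, seed=42):
    """Toy model of Wald's problem (illustrative only; all parameters invented).

    Each plane takes several hits spread uniformly over four sections.
    Hits to the engine or cockpit are often fatal, so those planes
    rarely make it home and never enter the observed data set.
    """
    rng = random.Random(seed)
    sections = ["fuselage", "wings", "engine", "cockpit"]
    fatal_prob = {"fuselage": 0.05, "wings": 0.05, "engine": 0.6, "cockpit": 0.6}

    all_hits = Counter()       # ground truth: every hit on every plane
    survivor_hits = Counter()  # what the analysts actually get to see

    for _ in range(n_planes):
        hits = [rng.choice(sections) for _ in range(hits_per_plane)]
        all_hits.update(hits)
        downed = any(rng.random() < fatal_prob[s] for s in hits)
        if not downed:
            survivor_hits.update(hits)
    return all_hits, survivor_hits

all_hits, survivor_hits = simulate_sorties()
total_observed = sum(survivor_hits.values())
for s in ["fuselage", "wings", "engine", "cockpit"]:
    share = survivor_hits[s] / total_observed
    print(f"{s:9s} share of observed hits: {share:.2%}")
```

Because engine and cockpit hits tend to down a plane, they are sharply underrepresented among the survivors even though, across the whole fleet, hits land roughly evenly everywhere.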
Silver Blaze and Wald’s bomber puzzle offer some important lessons for the world of management and professional decision making. Often when we analyze data, we use an incomplete or biased data set that leads us to draw inappropriate conclusions. Scientists have given this mode of thinking a name: survivorship bias. It refers to our propensity to analyze the survivors of a situation (in Wald’s case, the returning planes) without considering the failures that never made it into the data set. Let’s explore a few practical examples.
Highly successful entrepreneurs are often the subject of articles and books that attempt to determine the critical factors that led to their wealth and fame. Was it their drive and ambition, their ability to function on four hours of sleep, or their creativity that led to their success? For each of these business superstars, there were thousands of run-of-the-mill entrepreneurs who achieved either modest success or complete failure. Without studying them as well, how would we know that they didn’t share some of the characteristics of the highly successful folks? How could we tease out the role of luck and circumstance from genuine success factors?
Many firms attempt to validate their hiring methods by looking at the job performance of the individuals that they have recruited. Missing from this practice is the performance data for the folks they chose not to hire. Without this information, it’s difficult to determine if changing hiring methods would produce a higher performing workforce.
In the world of marketing, insurance companies frequently advertise that customers who switch to their service enjoy significant average savings. Once again, this only takes into account the “survivors,” that is, the individuals who actually make the decision to switch. Numerous customers may have assessed a switch and decided to stay with their incumbent provider. Furthermore, many folks may already be happy with low rates and would discover no savings if they investigated the new firm’s rates.
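This selection effect is easy to reproduce. In the hypothetical sketch below (all premiums and distributions are invented), the new insurer’s quotes come from the same distribution as customers’ current premiums, so the new firm is no cheaper on average, yet the average savings among those who choose to switch is strongly positive:

```python
import random

def average_savings(n_customers=100000, seed=7):
    """Illustrative sketch of switcher bias (numbers invented).

    Each customer compares their current premium with a quote from the
    new insurer. Only those who would save bother to switch, so the
    advertised 'average savings of switchers' is positive even when the
    new insurer offers no savings on average.
    """
    rng = random.Random(seed)
    switcher_savings, everyone_savings = [], []
    for _ in range(n_customers):
        current = rng.gauss(1000, 150)  # current annual premium
        quote = rng.gauss(1000, 150)    # new quote, same distribution
        savings = current - quote
        everyone_savings.append(savings)
        if savings > 0:                 # only people who would save switch
            switcher_savings.append(savings)
    return (sum(switcher_savings) / len(switcher_savings),
            sum(everyone_savings) / len(everyone_savings))

switchers, everyone = average_savings()
print(f"Average savings among switchers: ${switchers:,.0f}")
print(f"Average savings across everyone: ${everyone:,.0f}")
```

The switchers report savings in the hundreds of dollars while the population-wide average is approximately zero, exactly the gap the advertisement exploits.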
Another example of survivorship bias comes from the education space. Let’s say a school district serves two very different populations. One area of town is wealthy, with high-performing students. The other area has significant socioeconomic challenges, and its students are, on average, low achievers. To present itself in a better light, the school could focus on a key statistic: the share of graduating seniors going on to a four-year educational institution. In doing so, the school would not account for a high dropout rate. A more accurate picture of overall student achievement would be the graduation rate.
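A quick back-of-the-envelope calculation (with invented numbers) shows how far apart the two statistics can sit:

```python
# Hypothetical figures for illustration only.
entering_class = 1000
graduates = 650      # 350 students dropped out along the way
to_four_year = 520   # graduates heading to a four-year institution

# The headline stat counts only the "survivors" (graduates)...
headline = to_four_year / graduates
# ...while the graduation rate counts the whole entering class.
graduation_rate = graduates / entering_class

print(f"Graduates going to a four-year school: {headline:.0%}")  # 80%
print(f"Graduation rate: {graduation_rate:.0%}")                 # 65%
```

The same school can truthfully advertise that 80% of its graduates go on to a four-year school while only 65% of its students graduate at all.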
In the corporate world, an analogous situation can occur when companies look to gauge their progress in a particular area. For example, they may see improvements in morale based on increased scores on an employee engagement survey. However, the most disgruntled individuals may have chosen to leave the firm since the last survey. Once again, only the survivors’ scores would be tallied.
People have an inherent tendency to focus on visible and obvious factors when analyzing a situation. This can result in biased data sets and inaccurate conclusions and decisions. When analyzing a situation, ask yourself the following question: Are there dogs that aren’t barking?