Two Types of Thinking

The Nobel Prize-winning psychologist Danny Kahneman has just released an excellent book entitled Thinking, Fast and Slow. Kahneman is the grandfather of the cognitive science revolution, with significant contributions dating back to the early 1970s. In his new book, he presents decades of research in a style that is accessible to the layperson.

The key theme of the book is that the human mind has two distinct systems of thought.  The first, which Kahneman labels System 1, is fast, reflexive, intuitive and automatic.  It is the primitive part of thinking that evolved to allow us to survive in a dangerous world.  It is essential for rapid assessment and reaction to threatening situations.  Thinking processes associated with System 1 happen below our level of conscious awareness.

System 2 is slow, rational and deliberative. It is the part of our thinking that we can consciously “observe” and is used for analysis, logical reasoning and deliberate calculation. System 2 can sometimes act as a rational governor, overriding judgements or decisions made by System 1. However, the main takeaway from Kahneman’s life work is that System 2 can be “lazy”, allowing System 1 to lead us to irrational conclusions.

Let’s look at a quick visual example of System 1 and System 2 in action. Take a quick look at the following emoticon:

>:-(

You are instantly able to sense an emotion of anger from this character. It requires no deliberation or conscious analysis. You know immediately that the character is mad, and you might even sense that he is offended or ready to yell.

Let’s contrast that with a visual item that immediately engages the skills of System 2. Take a look at the following math problem:

17 × 24 = ?

You were not able to immediately and intuitively solve this relatively simple math problem. Given some time and focus, you could compute the answer in your head. But the answer would not “hit you” the way the sense of anger from the emoticon did.
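
Working the problem out deliberately (using the illustrative 17 × 24 above) shows System 2’s step-by-step character:

17 × 24 = (17 × 20) + (17 × 4) = 340 + 68 = 408

Each intermediate product must be held in working memory while the next one is computed, which is why the answer arrives with effort rather than instantly.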

System 1 is a valuable ally in a number of critical domains. When driving a car, System 1 can enable us to instantly sense and avoid a dangerous situation. On a dark, isolated city street, it lets us rapidly assess people and their intentions. On an athletic field, it allows us to intuitively read an opponent. Many athletes describe sensing a play unfolding instantly, without being able to explain what they saw.

While System 1 is crucial for surviving in a real-time, danger-driven world, modern society increasingly requires the skills associated with System 2. Avoiding tigers or angry rivals may have been paramount in a primitive world, but modern life requires analytical thinking, deliberate calculation and accurate predictions. Consider the following important questions that are typical of the modern world:

  • What strategy should I use for investing my money?
  • Should I refinance my mortgage?
  • Should I buy flood insurance for my house?
  • Should I use that new supplement that is said to prevent certain diseases?
  • Do I hire person A or person B for the job?
  • Does the report I am reviewing tell me definitively that Department A is outperforming Department B?

These are all questions that require the deliberate, logic-based thought of System 2. However, as Kahneman describes, System 1 is a powerful, hidden force that often disproportionately influences the overall thought process. This, in turn, leads to suboptimal judgements, decisions and predictions.

These irrational tendencies are known as cognitive biases. Kahneman and other researchers have spent the last 40 years demonstrating these tendencies through clever experimental techniques, and they have identified dozens of such biases. For this post, I’ll describe one, known as availability bias, that illustrates the unconscious power System 1 can have over our behavior.

One of the important skills needed for success in the modern world is assessing the probability of various risks. Over- or underweighting the likelihood of an event can cause us to make suboptimal decisions. System 1 reacts strongly to vivid, emotionally provocative stories, so it tends to inflate the probability of an event when a similar, stirring event can be recalled. Avoiding the watering hole after watching a tiger maul a neighbor is a good survival strategy. Unfortunately, this same dynamic leads to availability bias, handicapping our ability to accurately assess risks.

A landmark study by the research team of Slovic, Lichtenstein and Fischhoff showed availability bias in action. Subjects were asked to judge the relative likelihood of two different causes of death. Here are some samples of their findings:

  • Although asthma kills 20 times more people than tornadoes do, tornadoes were judged the more likely cause of death.
  • Death from accidents was judged to be about as likely as death from disease. In reality, disease kills 18 times as many people as all accidents combined.
  • Death from accidents was judged to be 300 times as likely as death from diabetes. In reality, it is less than twice as likely.

In each case, the subjects grossly overweighted the probability of the more vivid and familiar cause. Accidents and tornadoes are vivid, emotionally provocative events. They receive significant news coverage, complete with graphic, fear-inducing video. Unless you personally know someone who has died from asthma or diabetes, the occurrence is difficult to visualize. System 1 is driven by the sensational stories covered on the 11 o’clock news. System 2, unless forcefully activated, tends to let System 1 “call the shots”.
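
To make the size of these distortions concrete, here is a minimal sketch in Python using the ratios quoted above. The judged ratios for the first two pairs are assumed placeholders, since only directional findings are reported above:

    # Sketch: quantifying availability-bias distortion from the figures above.
    # Judged ratios marked "assumed" are illustrative placeholders.
    comparisons = [
        # (vivid cause, mundane cause, judged ratio, actual ratio)
        ("tornadoes", "asthma",   1.5,   1 / 20),  # judged ratio assumed
        ("accidents", "disease",  1.0,   1 / 18),  # judged "about as likely"
        ("accidents", "diabetes", 300.0, 2.0),     # figures as quoted above
    ]

    for vivid, mundane, judged, actual in comparisons:
        overweight = judged / actual  # how many times the vivid cause was overweighted
        print(f"{vivid} vs. {mundane}: judged {judged:.3g}:1, "
              f"actual {actual:.3g}:1 -> overweighted ~{overweight:.0f}x")

Even with generous assumptions, the vivid causes come out overweighted by factors of roughly 18 to 150.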

Unfortunately, leaving System 1 in control of these calculations, and of the behavioral choices that follow, can increase rather than diminish our risk. It can lead to the following unfortunate behaviors:

  • Driving long distances instead of flying
  • Buying insurance that is not cost effective (see the worked example after this list)
  • Overspending on preparations for unlikely events
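
As a simple worked example of the insurance point, with assumed, illustrative numbers: suppose an extended warranty costs $120 per year and covers a $400 repair that has roughly a 10% chance of being needed in a given year. The expected annual payout is 0.10 × $400 = $40, so the coverage costs about three times its expected value. A vivid mental image of a breakdown can make System 1 happy to pay it anyway.
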
The message is clear: whenever you are assessing risky situations, relying on intuition is dangerous. We are very likely to be steered off course by our subconscious friend, System 1. A better approach is to rely on objective risk data compiled from actual occurrences.
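
As one illustration of this data-driven approach, the sketch below compares the expected fatality risk of a long drive against flying the same distance. The per-mile rates are assumed, round-number placeholders, not authoritative statistics; real figures should come from a source such as the NTSB or NHTSA:

    # Sketch: comparing travel risks from base-rate data instead of intuition.
    # Rates are assumed, illustrative placeholders (deaths per 100 million
    # passenger-miles); substitute current published figures before relying on this.
    DEATHS_PER_100M_MILES = {
        "driving": 1.2,   # assumed, roughly in line with historical US highway data
        "flying":  0.01,  # assumed; commercial aviation is far safer per mile
    }

    def expected_fatalities(mode: str, miles: float) -> float:
        """Probability-weighted expected fatalities for a trip of the given length."""
        return DEATHS_PER_100M_MILES[mode] * miles / 100_000_000

    trip_miles = 1_000
    for mode in DEATHS_PER_100M_MILES:
        print(f"{mode}: ~{expected_fatalities(mode, trip_miles):.1e} "
              f"expected fatalities over a {trip_miles}-mile trip")

On these assumed rates, the drive carries roughly a hundred times the fatality risk of the flight, the opposite of what vivid coverage of plane crashes leads System 1 to conclude.
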
Availability bias is but one of dozens of cognitive biases, each of which can have a pernicious effect on our personal and professional decision making.