Human beings are a fantastic lot, set apart from the rest of the animal kingdom. We are inspired to be creative, enjoy solving complex problems and constantly seek to better our lot. As best we know, no other species can review the past, ponder the future and reflect on sophisticated goals and priorities. Through these superior skills, we’ve been able to advance more in the last year than any other species has in the last million.
But despite our unrivaled skills and our drive to improve, we can be a surprisingly reactionary lot, resistant to changing our habits. While technology delivers constant improvements in processor speeds, disk densities and bandwidth, many corporate processes trace their roots to a pre-computer era. Why are we so resistant to changing basic processes? Are the old processes simply tried and true, unworthy of improvement? A walk back through history might provide some clues to this mystery.
In 1846, a young doctor, Ignaz Semmelweis, was working in a maternity hospital in Vienna. Semmelweis observed a disturbing phenomenon: women giving birth at the hospital were prone to dying from a disease, puerperal fever, also known as childbed fever. Perplexingly, the hospital’s two maternity clinics experienced vastly different rates of the disease. In Semmelweis’ clinic, known as the First Clinic, mortality rates from puerperal fever ranged from approximately 7 to 11%, with monthly spikes of up to a shocking 30%. In the Second Clinic, mortality rates were a more moderate 2 to 7.5%.
Semmelweis was extremely disturbed by the poor performance of his clinic and the concomitant human suffering. He set out to analyze the differences between the two clinics, with the goal of isolating a causal factor. Along his journey, Semmelweis utilized a number of scientifically sound practices, including establishing a hypothesis, conducting an experiment and analyzing statistics. He also looked at common and differentiating factors between the two populations he was studying.
Semmelweis started his investigation by looking at differences between the two clinics. They served similar populations, experienced the same climate and used common procedures for childbirth. The main difference was that the First Clinic was a teaching clinic for doctors, while the Second Clinic trained only midwives. But Semmelweis could not initially determine why that would cause such a huge difference in mortality rates.
A breakthrough in his investigation occurred when a physician friend died with symptoms similar to those of childbed fever. The doctor, Jakob Kolletschka, had been accidentally poked by a student’s scalpel during an autopsy. The wound resulted in an infection that led to Kolletschka’s death. Semmelweis now made a bold prediction. He hypothesized that doctors who performed autopsies and then performed deliveries were sickening the mothers. He further surmised that some “cadaverous material” was carried on the hands of the physicians.
His observation would seem obvious with the benefit of our current scientific knowledge. But Semmelweis’ hypothesis predated the germ theory of disease, developed later in the 19th century by Pasteur and others. At the time, conventional ideas of disease were based on the concept of humours, first postulated by the ancient Greek philosophers. Humours were vital essences inside the body that needed to be “in balance” to avoid illness. Disease was thought to be spread by miasmas, described as poisonous vapors or mists.
Semmelweis chose to test his theory by requiring physicians leaving the morgue to wash their hands in a chlorinated lime solution before delivering a baby. The results were immediate and dramatic: mortality from puerperal fever dropped from over 10% to approximately 1–2%. At this point, one would expect the story to have a happy ending, with Semmelweis lauded as a hero and physicians uniformly adopting his practices. Unfortunately, this was not to be.
The reaction of the medical community to Semmelweis’ findings was a combination of skepticism and outright hostility. It was hard for doctors to look past the prevailing theories of humour imbalance and miasmas. Some doctors criticized his methods, feeling that they lacked scientific rigor. Others felt that his results were inconclusive, or could be the result of random fluctuations in mortality. There was a social component as well. Some doctors were offended by the idea that people of high standing could carry significant amounts of disease-producing dirt on their hands. They felt a simple hand washing with soap and water should suffice.
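Today, a simple significance test would settle the “random fluctuation” objection. The sketch below uses a standard two-proportion z-test on hypothetical birth and death counts chosen only to match the percentages in the text (roughly 10% mortality before hand washing, 1.5% after); the actual counts from Semmelweis’ records would differ.

```python
import math

def two_proportion_z(deaths1, births1, deaths2, births2):
    """Pooled two-proportion z-statistic: how many standard errors
    separate the two observed mortality rates."""
    p1 = deaths1 / births1
    p2 = deaths2 / births2
    pooled = (deaths1 + deaths2) / (births1 + births2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / births1 + 1 / births2))
    return (p1 - p2) / se

# Hypothetical counts matching the text's percentages:
# before hand washing ~10% mortality, after ~1.5%
z = two_proportion_z(400, 4000, 60, 4000)
# A |z| above ~3 is already far beyond chance; here it is enormous
```

On numbers of this magnitude, the drop is many standard errors beyond what random variation could produce — exactly the kind of check Semmelweis’ critics had no framework for.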
Semmelweis ultimately left the hospital in Vienna and returned to his hometown of Pest. He took a position as a doctor of obstetrics at a local hospital. There, he again worked his “magic”, dropping the childbed fever rate to below 1%. Unfortunately, suffering from mental illness, he was committed to an institution and died tragically at the age of 47. Ironically, around the time of Semmelweis’ death, Louis Pasteur would formulate the germ theory of disease, lending significant scientific backing to the power of antiseptic hand washing.
How can one explain the resistance of the medical community to Semmelweis’ idea and experimental findings? Even allowing for healthy skepticism, you’d think that the dramatic nature of the results, coupled with the high stakes involved, would provoke further investigation and interest. One can again look to human nature for insight.
People have a strong, inherent tendency to hold onto cherished beliefs, ideas and teachings. Conversely, people are inherently wired to dismiss alternative ideas that conflict with their existing world views. Psychologists who research this phenomenon have a term for it: confirmation bias. Imagine your views on a controversial political topic such as gun control, capital punishment or tax policy. How easily are you persuaded to change your viewpoint when confronted with strong evidence supporting the opposite position? If you’re like most people, you’ll be inclined to dismiss the evidence as biased, inaccurate or irrelevant.
Another factor working against Semmelweis was the guilt he was directing at his fellow doctors. Semmelweis was making a very large and disturbing claim: that physicians were the direct cause of their own patients’ deaths. For physicians, who are sworn by oath to “do no harm”, there could be no more bitter pill to swallow. A well-studied social psychology concept, cognitive dissonance, explains the physicians’ resistance to Semmelweis’ ideas. The theory holds that people have difficulty holding two contradictory ideas at the same time. The physicians saw themselves as learned, helpful professionals, only looking to heal their patients. Accepting Semmelweis’ findings would require them to see themselves as fallible people who had been harming patients through their primitive practices. It was easier to see the findings as false.
Let’s wind the clock forward to the present day. Surely today’s physicians understand the evidence for antiseptic examination conditions. Disappointingly, human nature interferes again, leading to a shocking lack of hand-washing compliance by doctors. Studies as recent as 2010 show compliance rates below 50% among physicians. Attempts to increase compliance through techniques such as checklists have been met with resistance from doctors, who see them as demeaning or unnecessary. Once again, forcing changes to established practices can be a daunting task.
While this post has focused on medical practices, this blog is oriented to management practices in large enterprises. It’s easy to look scornfully at stubborn 19th century physicians, but how should we evaluate the practices of today’s managers? My sense is that those practices are equally lacking in scientific rigor.
Many processes within a large organization are long-standing, unquestioned ideas that more resemble folklore than science. Enterprises are typically uncomfortable replacing these processes, even in the face of contrary evidence. When introducing new processes, there is typically little thought given to sound, scientific measurement of results. In fact, the standard of evidence involved in most corporate decisions would be woefully low when judged by the standards of scientific research.
Let’s look at two processes that go largely unquestioned in the corporate world: job interviews and workplace surveys. Job interviews remain a staple of the corporate recruitment process. Most organizations use an unstructured approach, with individual interviewers determining the questions and style of the process. A significant number of studies have determined that this is the least reliable form of interviewing; that is, it ranks very low in predictive value for employee success. The same studies show that structured interviews with fixed questions produce better hiring results. Some researchers believe that tests combined with simple statistical rules outperform the conventional interview process and minimize hiring biases. Yet how many firms or hiring managers are aware of these studies? How many hiring managers are ready to set aside their personal interviewing style and expertise for a set of predetermined questions?
Workplace surveys are another corporate process typically conducted with little regard for scientific principles. Most are constructed and interpreted by people with little training in proper survey methodology and minimal understanding of statistical analysis. While their intent may be noble, they provide a false confidence in understanding the true opinions and concerns of the organization. This is especially true when the survey attempts to show progress from an earlier study. These surveys rarely apply statistical concepts such as appropriate sample sizes or confidence intervals. Concepts like confounding factors and regression to the mean are rarely considered. Yet the results of these workplace surveys are typically treated as important factual information: the basis for significant decisions.
Our inherent human nature is a double-edged sword when it comes to process improvement. We are interested in improving ourselves and our teams, yet we succumb to an inertia that keeps us clinging to shopworn ideas. There is a deep irony in all of this. While we love technology, in the form of gadgets and productivity software, we tend to ignore the scientific principles needed to create it. Those same principles, applied to management processes, would ultimately produce better results and outcomes.