It seems we can’t go another day without a new story about an air traffic controller falling asleep on the job. Each case provides a new opportunity for people to pile unhelpful commentary onto the story. The mainstream media trumpets headlines of imminent peril and dereliction of duty. The FAA shows its seriousness by taking disciplinary action against the napping controllers. Even President Obama gets in on the action, telling controllers “you’d better do your job.” In fairness, some outlets have provided useful input from sleep experts, explaining the effects of shift work and sleep deprivation on individuals. And on a positive note, the FAA announced new work rules that could provide some relief for sleep-deprived controllers.
Missing from the chorus of hysteria and blame is a voice from the people who make a science of studying these issues. Little known to the public, there is a field called Human Factors Engineering (HFE) that studies how to improve working environments to make them safer and more operationally sound. The use of HFE has led to significant reductions in error rates in fields such as health care, nuclear plant operation and, yes, aviation. In fact, despite the current uproar over air traffic controllers, commercial aviation remains a remarkably safe industry largely because of HFE.
At its core, HFE has a number of basic principles:
- Operational errors (including falling asleep on the job) are normally the result of poorly designed systems, processes or environments.
- When experiencing problems, most organizations resort to the uninformed tactic of “blame, train, discipline.” This does not fix the underlying weaknesses in the system.
- Human beings are fallible and will always make mistakes. Environments must be designed to ensure that these mistakes cannot result in harm. This is the basic concept behind mistake proofing.
- A culture of openness needs to be created where individuals can report issues, system weaknesses and “near misses” without fear of retribution.
At this point, I’m guessing that a few of you are rolling your eyes, thinking that HFE is a touchy-feely concept that absolves workers of their responsibilities. This view couldn’t be further from the truth. In conventional environments, it is managers who typically avoid responsibility by disciplining workers instead of improving systems. In organizations utilizing HFE, managers and workers become teammates, collaborating to create the safest possible environment. Additionally, HFE advocates a concept called Just Culture that differentiates honest mistakes from willful acts of negligence. For example, Just Culture would draw a distinction between a controller who nods off at his post and one who leaves his post to take a nap in his car.
Forward-thinking organizations should utilize the principles of HFE to design safer, mistake-proof environments. These environments should take into account the inherent fallibility of human beings. Instead of expecting perfect execution, processes and systems should be designed to anticipate and compensate for human error. Finally, an open and supportive culture must be put in place. Team members need to know that they can report system weaknesses without fear of retribution, and that they will be disciplined not for simple errors, but only for willful acts of negligence.
In Part II of this post, I’ll examine the psychology behind typical responses to workplace mistakes. Why is it that most organizations focus on blaming the individual as opposed to fixing the system? I’ll also look at how IT organizations can apply techniques from HFE and mistake proofing to improve service levels and employee satisfaction.