A longtime airline captain and chief safety officer at American Airlines, Scott Griffith gained national prominence in 1994 after he created what would become a major airline safety initiative.
Under the Aviation Safety Action Program (ASAP), pilots are encouraged to self-report errors and safety lapses, which are then addressed to make improvements and prevent accidents. ASAP was later embraced by the FAA, unions and the airline industry and remains one of the cornerstones of U.S. airline safety.
In 2000, Griffith was asked to join a think-tank and consulting firm that had developed a strategy for reducing errors by better managing human behavior within the systems people work in. The company, based in Dallas, is now known as Outcome Engenuity.
Griffith spoke with Best Practices in Emergency Services about his company’s approach to reducing risk and what EMS has to learn about safety from the airline industry’s experience. The following excerpted interview can be found in its entirety at emergencybestpractices.com.
What was the main obstacle to improving airline safety when you created ASAP?
For many years in aviation, there was this disconnect between what was really taking place in the cockpit—the mistakes pilots would make and the system errors they would see—and what the regulator would see. All of the precursors to an accident, all of the systems and behaviors that we know can lead to an accident, went unnoticed by the regulator. There was no mechanism for reporting it, and pilots held the regulator at arm's length.
The role I played was to help foster a better, more just environment. If a pilot came forward in good faith and raised his hand and acknowledged a behavior or an event, a team would come together to work collaboratively to resolve the hazard or the risk. The program was an amazing transformation. We started to see pilots coming forward in large numbers to identify risk in the system, including their own behaviors.
How can employers work toward providing a safer work environment for their staff and patients?
They can start by having a more honest conversation with their employees. At times they will be caught between two values. The problem is that most organizations measure and reward outcomes rather than behaviors. If you’re getting good outcomes, you may be rewarding a very risky behavior.
We see outcome bias again and again in organizations. But it’s a dangerous approach, and an accident waiting to happen. What can help this is coaching, peer to peer, so that people can become aware of where they are cutting corners and are brought to an understanding of the risks. More often than not, people cut corners and have a positive outcome, which reinforces the risk-taking behaviors. We break that paradigm. It’s the systems and behaviors that matter most, and then the outcomes will improve.
What would you say to someone who points out that EMS responders don’t work in a controlled environment—they’re doing their best to be safe, but it’s just the nature of the business that there’s risk?
It’s very difficult to write procedures or rules that work in all circumstances.
We have an algorithm we developed that guides organizations in how to respond to events and behaviors when you see an EMT or paramedic who didn’t follow procedure. You hold them accountable for doing the right thing. The rules are subordinate to your values. We will stand in judgment of whether you did the right thing. Sometimes that means following procedures, sometimes it means going beyond the procedures in certain emergency situations.
But that’s not the place organizations typically struggle. Managers usually have a good feel for whether the employee did the right thing. Where we struggle is with the day-to-day cutting of corners, because it wasn’t risky enough to cause an accident at the time. If a paramedic is texting while driving and going 30 mph and there is no crash, how do you manage that behavior? You could try to build a better system around the human to deter them from the behavior. You could put a device in the ambulance that wouldn’t allow the cell phone to operate, but that might not be practical. So you have to raise the humans’ awareness, because humans can circumvent systems, and they do so quite often.