When an error occurs, it's natural to ask who was at fault and to hold that person accountable for the mistake. While some may feel better because the person involved had to "pay" for the error, blame alone is not an effective way to improve the overall safety of the system. To illustrate this concept, consider a common example: Not noticing a stop sign, a woman drove right through a four-way intersection without stopping. By chance, she did not hit anyone. In patient-safety language, this is defined as a near-miss: although there was a situation with the potential to do harm (a hazard), no one was hit. A police officer who observed the event stopped the driver and issued a ticket. In this case, the person was blamed (stopped by the police officer) and shamed (given a ticket). It was expected that this punitive action would not only teach the woman not to run stop signs, but also serve as a deterrent to others. Later that year, at the same intersection, a man ran the stop sign and struck and killed a bicyclist. This was an adverse event, as someone was killed.
The goal of the systems approach is to examine all of the factors that led to an adverse event or near-miss and to change the system to prevent similar events in the future. One important part of this approach is understanding that human error is inevitable and will be repeated. Thus, after an incident occurs, the focus should be on identifying problems in the system and finding changes that could be implemented to minimize the impact of human error. This is done by pursuing two goals. The first is to find a solution that reduces the chance of the same error occurring again; one such solution is a "forcing function." For example, most monitor/defibrillator devices allow an unsynchronized electrical countershock to be delivered even when a patient is in a rhythm, such as supraventricular tachycardia, that requires a synchronized shock. A forcing function might prevent delivery of the shock unless the device is placed in sync mode. Since it is impossible to eliminate human error, the second goal is to buffer the effect of an error after it occurs, that is, to find a solution that will prevent the error from leading to injury (this is why cars have airbags).
Let's use the stop sign case to illustrate the systems approach. If the initial near-miss had been evaluated, it might have been discovered that the driver ran the stop sign because she didn't see it. Further analysis would have revealed that she didn't see the stop sign because it was partially obscured by branches from a nearby tree, which had not been trimmed for the past two years because the city crew responsible for the work was shorthanded due to budget cuts. Thus, further investigation revealed some root causes, as well as a factor that is easily fixed: trimming the branches. Had the city realized this, resources might have been shifted to trim vegetation at stop signs around the city in order to reduce the occurrence of this same error, resulting in an overall improvement in system safety. Other system solutions might minimize the consequences of the error, such as installing four-way stop signs or lowering the speed limit. The safest system solution would be building a bridge to eliminate the intersection, but a cost-benefit analysis might find that this solution is not feasible. This example demonstrates how a simple human error can be due to several latent problems that are identified only when the event is analyzed more deeply. If we stop at punishing the driver for running the stop sign, we miss the opportunity to make changes that will prevent the same thing from happening again. In this case, it would have saved a life.
This approach can be applied in the EMS setting. An adverse event or near-miss should be investigated by asking "why," unlike the blame-and-shame system that asks "who" was at fault. It is important to ask "why" six times, because stopping at one "why" will not identify the real root cause of the error or all of the contributing factors.
Some EMS managers might be concerned that the systems approach could result in a lack of accountability among EMS providers. Although the systems approach holds that it is counterproductive to penalize a medic for a "normal" error, there needs to be a method, such as Just Culture, to make sure the systems approach is not used as an excuse for grossly negligent actions. Just Culture recognizes that competent professionals make mistakes and that even competent professionals will develop unhealthy norms (shortcuts, "routine" rule violations), but it has zero tolerance for reckless behavior. Just Culture might dictate that we issue a formal warning to the driver who ran the stop sign. Issuing a warning rather than a ticket recognizes that there were several contributing factors to the event, but reminds the driver that she also has an active role in ensuring traffic safety. The same would be true for EMS: medics would be warned for minor errors and receive some form of remedial training for more egregious ones. This results in a less punitive environment in which providers are less motivated to hide their mistakes. It also gives EMS leaders greater awareness of the types of near-misses that are occurring and how often. Analysis of these events can lead to system changes that will prevent future injury to patients.