Original Contribution

No More Blame & Shame

September 2008

     As part of a routine skin biopsy procedure at his dermatologist's office, a healthy 22-year-old male receives an injection of lidocaine with epinephrine. Within minutes, his heart starts pounding and he begins to feel anxious. The dermatologist believes the patient is having an anaphylactic reaction and calls 9-1-1.

     When EMS arrives at the scene, the dermatologist informs them that the patient is suffering from an anaphylactic reaction. The patient reports that his heart is pounding, and he feels short of breath and very anxious. En route to the hospital, the paramedic asks his EMT partner to get the diphenhydramine and epinephrine vials out of the drug box. The paramedic later recalls asking the BLS partner to draw up "all of the diphenhydramine," but the EMT recalls hearing the paramedic ask for "all of the epinephrine." The paramedic takes the prepared syringe from the EMT and administers the medication intravenously, without checking the amount or the vial it was drawn from. Within minutes, the patient's rhythm changes to sustained ventricular tachycardia; he develops severe chest pain and diaphoresis, becomes distraught and says, "I think I'm dying." At this point, the paramedic realizes that he has just delivered 1 mg of 1:1,000 epinephrine via rapid intravenous bolus. Later, in the emergency department, it is determined that the patient suffered a myocardial infarction during the event. Lab analysis shows a rise in troponin levels, and a wall motion abnormality is found on echocardiogram, indicating that the patient sustained permanent damage to his heart muscle.

     Adverse events like this are not uncommon. In fact, more deaths occur each year due to medical errors than from motor vehicle crashes, breast cancer or AIDS. Although there are currently no reports that specifically examine EMS error rates, several studies suggest that EMS is no different from the rest of medicine with regard to patient safety. This is especially significant considering that 15,000 EMS systems and upwards of 800,000 EMS personnel respond to more than 16 million transport calls annually.

     The current EMS culture often operates with a "blame-and-shame" mentality. When an adverse event occurs, the common first response is to find out whose fault it is and discipline that individual. Unfortunately, this approach is not effective for improving overall patient safety, for several reasons. First, it ignores the fact that other factors in the system (besides the individual provider) might have contributed to, facilitated or even caused the adverse event. This is important, because if these factors can be identified and modified, the chance of similar events occurring in the future can be reduced. Second, focusing blame on the individual doesn't prevent the same event from happening to another provider. Third, the blame-and-shame mentality creates a culture where EMS providers fear reprisal and may try to hide adverse events and near-misses rather than using them to improve the system. Unless management and system leaders are aware of events, they can't take steps toward reducing them.

     Other high-risk industries, such as aviation and nuclear power, have become highly reliable and safe because they have moved away from this mentality and instead use concepts like the systems approach to maximize safety. The systems approach recognizes that all adverse events have multiple contributing factors, many of which are out of the provider's control. Aviation, for example, utilizes the Aviation Safety Reporting System (ASRS), which documents both adverse events and near-misses. Observing ASRS's success, members of the EMS community developed a similar system: the Medical Error Prevention and Reporting System (MEPARS). A number of agencies around the country have implemented MEPARS or similar systems. Since its inception, MEPARS has not only identified several near-misses and adverse events, but has reduced the recurrence of similar events. The purpose of this article is to illustrate that a systems approach, paired with an event-reporting system like MEPARS, is a better method for reducing adverse events in EMS than the blame-and-shame approach.

BLAME-AND-SHAME
     When an error occurs, it's natural to ask who was at fault and hold them accountable for their mistake. While some may feel better because the person involved had to "pay" for the error, punishment alone isn't an effective way to improve the overall safety of the system. To illustrate this concept, consider a common example: Not noticing a stop sign, a woman drove right through a four-way intersection without stopping. By chance, she did not hit anyone. In patient-safety language, this is defined as a near-miss: although there was a situation with the potential to do harm (a hazard), no one was hit. A police officer who observed the event stopped the driver and issued a ticket. In this case, the person was blamed (stopped by the police officer) and shamed (given a ticket). It was expected that this punitive action would not only teach the woman not to run stop signs, but also serve as a deterrent to others. Later that year, at the same intersection, a man ran the stop sign and struck and killed a bicyclist. This was an adverse event, as someone was killed.

SYSTEMS APPROACH
     The goal of the systems approach is to examine all of the factors that led to an adverse event or near-miss, and to make changes to the system to prevent similar events in the future. One important part of this approach is understanding that human error is inevitable and will be repeated. Thus, after an incident occurs, the approach should focus on identifying problems in the system and finding changes that could be implemented to minimize the impact of human error. This is accomplished through two goals. The first is to find a solution that reduces the chance of the same error occurring again; the strongest version of this is a "forcing function," a design feature that makes the error impossible to commit. For example, most monitor/defibrillator devices allow an unsynchronized electrical countershock to be delivered even when a patient is in a rhythm like supraventricular tachycardia, which requires a synchronized shock. A forcing function would prevent the delivery of a shock unless the device is placed in sync mode. Since it is impossible to eliminate human error, the second goal is to buffer the effect of an error after it occurs, or to find a solution that will prevent the error from leading to injury (this is why cars have airbags).
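
     To make the forcing-function concept concrete, consider the brief Python sketch below of a hypothetical device controller. It is purely illustrative: the class, rhythm names and messages are invented for this example and do not describe any real monitor/defibrillator.

# Hypothetical illustration of a forcing function: the device refuses to
# fire when its state violates the safety precondition, rather than
# relying on the operator to remember the rule.

RHYTHMS_REQUIRING_SYNC = {"supraventricular tachycardia", "atrial fibrillation"}

class DefibrillatorController:
    def __init__(self):
        self.sync_mode = False

    def enable_sync(self):
        self.sync_mode = True

    def deliver_shock(self, interpreted_rhythm: str) -> str:
        # The forcing function: a rhythm that requires synchronized
        # cardioversion cannot receive an unsynchronized countershock,
        # no matter what the operator does.
        if interpreted_rhythm in RHYTHMS_REQUIRING_SYNC and not self.sync_mode:
            return "BLOCKED: sync mode required for this rhythm"
        return "shock delivered"

device = DefibrillatorController()
print(device.deliver_shock("supraventricular tachycardia"))  # BLOCKED
device.enable_sync()
print(device.deliver_shock("supraventricular tachycardia"))  # shock delivered

     The point of the sketch is that the safety check lives in the device rather than in the operator's memory; the error is engineered out instead of trained out.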

     Let's use the stop sign case to illustrate the systems approach. If the initial near-miss had been evaluated, it might have been discovered that the driver ran the stop sign because she didn't see it. Further analysis would have revealed that she didn't see the stop sign because it was partially obscured by branches from a nearby tree, which had not been trimmed for the past two years because the city crew responsible for trimming trees was shorthanded due to budget cuts. Thus, further investigation would have revealed the root causes, as well as a factor that is easily fixed: cutting the branches. Had the city realized this, resources might have been shifted to trim vegetation at stop signs around the city, reducing the chance of this same error recurring and improving overall system safety. Other system solutions might minimize the consequences of the error, such as placing four-way stop signs or lowering the speed limit. The safest solution would be building a bridge to eliminate the intersection, but a cost-benefit analysis might find that it is not feasible. This example demonstrates how a simple human error can be due to several latent problems that are only identified once the event is analyzed more deeply. If we stop at punishing the driver for running the stop sign, we miss the opportunity to make changes that will prevent the same thing from happening again. In this case, such changes would have saved a life.

     This approach can be applied in the EMS setting. An adverse event or near-miss should be investigated by asking "why," rather than "who" was at fault, as the blame-and-shame approach does. It is important to ask "why" six times, because stopping at the first "why" will not identify the real root cause of the error or all of the contributing factors.
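
     As a purely illustrative aid, the repeated "why" can be pictured as a chain walked from the visible error down to an actionable root cause. The short Python sketch below encodes the stop-sign case from the example above; the chain itself comes from that example, while the code structure is simply one way to represent it.

# Each entry answers "why?" about the entry before it; the analysis is
# not finished until the chain reaches a cause the organization can act on.
why_chain = [
    "The driver ran the stop sign.",
    "Why? She did not see the sign.",
    "Why? Branches from a nearby tree obscured it.",
    "Why? The trees at that intersection had not been trimmed in two years.",
    "Why? The city trimming crew was shorthanded.",
    "Why? Budget cuts had reduced the crew's staffing.",
]

for step in why_chain:
    print(step)

# Stopping after the first "why" blames the driver; walking the whole
# chain surfaces fixable system factors (trimming schedules, staffing).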

JUST CULTURE
     Some EMS managers might be concerned that the systems approach could result in a lack of accountability among EMS providers. Although the systems approach suggests that it is counterproductive to penalize a medic for a "normal" error, there needs to be a method, such as Just Culture, to ensure the systems approach is not used as an excuse for grossly negligent actions. Just Culture recognizes that competent professionals make mistakes and may even develop unhealthy norms (shortcuts, "routine" rule violations), but it has zero tolerance for reckless behavior. Just Culture might dictate that we issue a formal warning to the driver who ran the stop sign. Issuing a warning rather than a ticket recognizes that there were several contributing factors to the event, but reminds the driver that she also has an active role in ensuring traffic safety. The same would be true for EMS, suggesting that medics be warned for minor errors and receive some form of remedial training for more egregious ones. This results in a less punitive environment, where providers are less motivated to hide their mistakes. It also gives EMS leaders better awareness of the types of near-misses that are occurring, and how often. Analysis of these events can lead to system changes that will prevent future injury to patients.

     Even in an agency with a true organizational culture of safety, there will occasionally be situations that warrant punitive action, such as reckless behavior or criminal activity. In fact, some agencies with protected reporting systems have chosen to list specific exclusions from protection, including operational issues like being persistently late for work or making inappropriate statements to coworkers or patients. This makes sense, since the goal of event reporting is to prevent adverse medical care events. Cases requiring punitive action are often straightforward, such as criminal activity while on duty. Other times, the decision is more complex, such as with providers whose repeated events raise questions about their ability to improve. In these cases, agencies must be very careful, because the true (but often less apparent) root cause of repeated similar events is often a persistent system fault rather than human error.

     An agency may occasionally conclude that a provider is a risk to patients, or has persistent issues with skills or clinical competence that are refractory to educational interventions. In these cases, the agency must build the paper trail required for termination or other adverse action from events outside the protected reporting system. For protected reporting systems to be successful, the protections must be consistently adhered to. The aviation system (discussed below), which has followed this principle without exception for over 30 years, has no trouble terminating incompetent or reckless pilots. Problem EMS providers usually have ongoing operational issues that are not protected and can be addressed directly through the procedures outlined by employment agreements, unions or civil service systems.

AVIATION'S SUCCESS STORY
     For the systems approach to be effective in improving patient safety, near-misses and adverse events must be closely examined for contributing factors; however, it is not practical to expect that every contributing factor identified can be fixed. Instead, EMS systems must have a way to identify trends and share information across regions. One method that has proved to work in the aviation industry is a sophisticated event-reporting system that collects and analyzes near-misses and adverse events. The Aviation Safety Reporting System is a database of more than 600,000 near-miss and adverse event reports voluntarily submitted by pilots, flight attendants, air traffic controllers and maintenance personnel. The reports are catalogued and analyzed, and when prominent hazards or trends are identified, alerts are sent out to the aviation community. The system is successful for two reasons: it shifts away from a culture of blame toward one that identifies problems and design changes targeting the system rather than the person, and it encourages people to report near-misses and adverse events by offering immunity from punitive action. Recognizing ASRS's success, many argue that the aviation experience might provide a viable solution in medicine.

     Following the success of the aviation system, a national event-reporting system has been developed for EMS. The Medical Error Prevention and Reporting System (MEPARS) follows the concepts of ASRS. It is a voluntary self-reporting system in which each report is sent to and reviewed by EMS patient safety experts with no supervisory or enforcement powers over the EMS providers reporting the event. The experts collect enough information to classify contributing causes, then remove any identifying information from the report so it is entered anonymously into the database. Reviewers periodically analyze the data to identify trends and publish a monthly newsletter to all participating agencies describing trends and emerging hazards. As an incentive to encourage reporting, participating EMS agencies have committed to grant providers who submit a MEPARS report immunity from punitive action.
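
     As a rough illustration of the de-identification step described above, the Python sketch below strips identifying fields from a report before it is stored. The field names and redaction rules are assumptions made for this example, not a description of how MEPARS actually processes reports.

# Hypothetical report de-identification: contributing-cause classifications
# are kept for trend analysis, while fields that could identify the
# provider, agency or patient are removed before database entry.

IDENTIFYING_FIELDS = {"provider_name", "agency_name", "patient_name",
                      "unit_number", "incident_number"}

def deidentify(report: dict) -> dict:
    """Return a copy of the report without identifying fields."""
    return {field: value for field, value in report.items()
            if field not in IDENTIFYING_FIELDS}

raw_report = {
    "provider_name": "J. Smith",
    "agency_name": "Example County EMS",
    "event_type": "medication error",
    "contributing_causes": ["look-alike vials", "verbal miscommunication",
                            "no verification step"],
}

print(deidentify(raw_report))  # safe to enter into the shared database

     In practice, free-text narratives would also need scrubbing, since names and places can hide inside them; the principle, though, is the same: classify first, then anonymize, then analyze in aggregate.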

     Although participation in event-reporting systems like MEPARS is one way to advance an EMS agency toward a culture of safety, EMS providers will not participate in the system unless the culture allows them to feel safe doing so. There are many ways in which agency leadership can help foster this feeling and avoid underreporting of events. The event-reporting system should be integrated into daily operations. For example, when a provider submits an agency incident report describing a near-miss or adverse event, the supervisor should encourage him to submit the event to the adverse event reporting system. Similarly, it is critical for agency leadership to consistently respect the protections to ensure the success of event reporting.

CASE STUDY REVIEW
     Let's reconsider the initial case, reported through the MEPARS system, and see how it is resolved using the systems approach. An analysis of the event revealed that there was no procedure to double-check the medication prior to its administration; the medications were in similar vials with similar labeling; there was miscommunication between the providers regarding the drug needed; and it was normal practice in this EMS agency for BLS providers to draw up medications for the ALS provider. Another contributing factor was that the dermatologist's mistaken diagnosis of anaphylaxis (rather than a normal reaction to inadvertent intravascular injection of epinephrine) influenced the paramedic's assessment of the patient. Based on the lessons learned from this event, several "system fixes" could be implemented to avoid similar events in the future. For example, ALS providers should always prepare their own medications. Look-alike vials could be distinguished with different-colored labels or by purchasing the medications from different manufacturers. In an EMS system where 1:1,000 epinephrine is only administered in doses of 0.3 mg or less, it should not be available in 1 mg vials. If intravenous administration of 1:1,000 epinephrine is not allowed, it could be stocked in prefilled syringes designed for intramuscular use only (such as an EpiPen).

     If this case were resolved using the blame-and-shame approach, the results would be far different. The EMT and paramedic would immediately be identified as culprits. The EMT would be blamed for breaking policy by drawing up the medication (even though doing so was routine practice in the agency); the paramedic would be blamed for allowing the EMT to draw up the medication, as well as for not verifying it before administering it. Both might be terminated, and the agency would feel it had resolved the issue, with no further motivation to find avenues of system improvement and no protection against the same event occurring in the future, there or in a different EMS agency.

CONCLUSION
     The road to creating a safe environment for patients will require a shift away from our current EMS culture, in which near-miss errors are rarely reported, usually due to fear of punitive reaction from peers and supervisors. Rather than blaming individuals for errors, we should look at what system changes can be made to reduce the chance of the same error occurring, or to ensure an error does not result in an adverse event. Through an event-reporting system like MEPARS, a much-needed transition can be made away from the flawed blame-and-shame approach, and system problems can be identified and addressed nationally so EMS agencies everywhere can improve the safety of their systems.

Bibliography
Berwick DM. Errors today and errors tomorrow. N Engl J Med 348(25):2570–2572, 2003.
Billings CE. The NASA Aviation Safety Reporting System: Lessons learned from voluntary incident reporting. In: Proceedings of Enhancing Patient Safety and Reducing Errors in Health Care, p. 97–100, 1999. Chicago: National Patient Safety Foundation.
Fairbanks RJ, Crittenden CN, O'Gara KG, et al. The nature of adult and pediatric adverse events and near misses in EMS (Abstract). Prehosp Emerg Care 9(1):102–103, 2005.
Fairbanks RJ, Caplan SH, Bishop PA, et al. Usability study of two common defibrillators reveals hazards. Ann Emerg Med 50(4):424–432, Oct 2007. Associated editorial: Karsh and Scanlon, 50(4):433–435 Oct 2007.
Helmreich RL. On error management: Lessons from aviation. Br Med J 320(7237):781–785, 2000.
Hobgood C, Bowen JB, Brice JH, et al. Do EMS personnel identify, report, and disclose medical errors? Prehosp Emerg Care 10(1):21–27, 2006.
Kohn LT, Corrigan JM, Donaldson MS [eds]. To Err Is Human: Building a Safer Health System. Committee on Quality of Health Care in America. Washington, DC: Institute of Medicine, National Academy Press, 2000.
Lindstrom AM. 2006 JEMS platinum resource guide. JEMS 31(1):42–56,101, 2006.
McCaig LF, Burt CW. National Hospital Ambulatory Medical Care Survey: 2003 emergency department summary. Advance data from vital and health statistics, No. 358. Hyattsville, MD: National Center for Health Statistics, Centers for Disease Control and Prevention, May 2005.
Mears G. 2003 Survey and Analysis of EMS Scope of Practice and Practice Settings Impacting EMS Services in Rural America: Executive Brief and Recommendations.
NASA Aviation Safety Reporting System. https://asrs.arc.nasa.gov/.
Vilke GM, Tornabene SV, Stepanski B, et al. Paramedic self-reported medication errors. Prehosp Emerg Care 10(4):457–462, Oct–Dec 2006.
Wang HE, Fairbanks RJ, Shah MN, et al. Tort claims from adverse events in emergency medical services (abstract). Prehosp Emerg Care 11(1):96–97, 2007.

www.mepars.com

www.medicalhumanfactors.com

www.justculture.org

Karthik Rajasekaran, BA, EMT, is a second-year medical student at Chicago Medical School of Rosalind Franklin University of Medicine and Science.

Rollin (Terry) Fairbanks, MD, MS, EMT-P, is an assistant professor of emergency medicine at the University of Rochester (Rochester, NY), associate regional EMS medical director and REMAC chair.

Manish N. Shah, MD, MPH, is an associate professor of emergency medicine at the University of Rochester (Rochester, NY), chief of the Division of Prehospital Medicine, and the regional EMS medical director.
