Turning Mistakes Into Learning

Self-reporting of errors helps ensure they’re not repeated


In 1999, a landmark report from the Institute of Medicine, To Err is Human, estimated that at least 44,000 (and perhaps as many as 98,000) Americans die in hospitals each year as a result of medical errors, and that hospital patient-safety incidents account for $6 billion a year in extra costs in the U.S.

That should make EMS leaders wonder about mistakes in their own systems that may harm patients and increase costs.

In a 2002 Prehospital Emergency Care article, authors led by Robert O’Connor, MD, MPH, wrote a consensus statement that represented the views of several respected medical directors regarding the national state of EMS safety. The group identified common EMS errors and concluded that standard operating procedures to prevent and recover from errors in the field were “in their infancy.” Shortly thereafter, researchers surveyed 283 EMS providers attending a North Carolina EMS conference and found 44% of them had committed one or more errors during the previous year. Only half of those errors were reported to a supervisor or medical director.

In 2008, the Richmond Ambulance Authority’s operational medical director, Joseph P. Ornato, MD—an instrument-rated pilot with firsthand experience of the high level of safety achieved in aviation—instituted a successful error self-reporting program patterned after the Aviation Safety Reporting System (ASRS) developed by NASA. The NASA system was designed to detect near-misses and translate lessons learned into operational process changes, rather than blaming individuals for human errors.

A successful self-reporting system requires a high degree of trust and confidence on all sides. Management must trust that providers are highly trained and will always do their best for their patients; staff must trust that a single first-time error or accidental oversight will result in learning and not termination. Built on that foundation of trust, RAA has instituted a successful self-reporting program we consider a major pillar of our culture of safety and delivery of world-class EMS.

In the words of RAA’s own self-reporting standard operational guideline (SOG), “The purpose of self-reporting is a way to tell management when something goes out of the ordinary or [becomes] clinically unacceptable.” Such events could also be called near-misses. Examples include medication errors, missed clinical procedures, driving infractions, etc.

RAA and its medical director acknowledge that self-reporting is nondisciplinary in nature—if a mistake happens, it is considered part of the learning process. That said, RAA’s self-reporting procedure applies only when a crew member promptly notifies RAA of their potential error. A report should be completed immediately after the apparent violation has been discovered and before RAA management learns of it by other means. There is a healthy expectation that reporting immediately after an incident can lead to learning, but making the same mistake more than once is not part of the learning process—it is a pattern and may ultimately be addressed through other means, including disciplinary action. This rarely occurs, as the lessons RAA identifies quickly become lessons learned.

The Process

When a staff member needs to self-report, they must first notify the chief clinical officer. They must then complete an incident report before the end of their shift that includes:

  • A brief description of the apparent violation, as well as how and when it was discovered;
  • Verification that any noncompliance with RAA policies or procedures ceased after the error was identified;
  • A brief description of the actions taken immediately upon discovery of the apparent violation, what was done to terminate the conduct that resulted in it, and the person responsible for taking the immediate action.