Leaders, pundits and advocates (and, yes, writers) in the world of emergency medical services don’t know much. They’ll freely tell you so—the information we lack about EMS in the U.S. could fill many volumes. Even basics like the numbers of services and vehicles and providers can only be pegged to informed estimates.
Striking a blow against such ignorance comes the prodigious National EMS Assessment, the most extensive cataloguing yet done of the facts, figures and features of American EMS. Commissioned by FICEMS (the Federal Interagency Committee on EMS) and funded by NHTSA, the report portrays as fully as currently possible the present-day landscape of emergency medical services in the nation and its states and territories, including 9-1-1 communications and emergency preparedness. Its goal is to help NHTSA better understand the EMS data currently being collected across the U.S.
“The idea was really to try to describe EMS using existing data sources,” says the Assessment project team’s Greg Mears, MD, director of innovation for the University of North Carolina’s Department of Emergency Medicine and that state’s EMS medical director from 1998–2011. “That meant we had to search high and low for data that was already in place, or had already been collected, that described EMS. And we also had to take that information and be able to extrapolate it out to the state or nation.”
For the Assessment team’s purposes, just two data sources were sufficiently comprehensive: the National EMS Database maintained by the NEMSIS Technical Assistance Center, and the National Association of State EMS Officials’ EMS Industry Snapshot. Data from the former came from the 30 states submitting NEMSIS data in 2010; the latter information was collected by NASEMSO in early 2011. Authors also convened four expert panels (two each for EMS and emergency management) to help suss out and characterize trends that eluded more objective measurement.
Other data sources were less helpful, although some information was also derived from the Emergency Medical Services for Children program’s 2010–11 federal reporting and the 2007 EMSC Indian Health Services Tribal EMS Pediatric Assessment. Generally, sources beyond those are still “maturing,” the Assessment’s authors observed, and may be more useful in the future.
For this first effort, though, that left some blanks to be filled in—with more informed opinion from state offices.
“With some of the information, we were truly able to just do number-crunching on data from all 50 states and come up with an answer,” says Mears, who led UNC’s EMS Performance Improvement Center as it spearheaded the project. “But some of it was based on information provided by the state EMS offices—just their current thoughts and opinions. In other words, it wasn’t necessarily objective, but more of a survey-type approach.”
That’s not to call the results into question; authors believe they’re a reliable depiction of EMS in the U.S., if perhaps not perfectly accurate to every count. But it does underscore the holes still plaguing our growing EMS data systems.
“With something like workforce injuries, we’d have loved to say, ‘We identified this many,’” says Mears. “What we ended up having to say is that only one state even collects this information. So some of the questions were almost flipped, and instead of describing the results, we described the lack of results.”
It’s worth noting also that even states that do track certain data may have gaps. Of the states presently contributing data to the national database, only around a quarter capture 100% of their EMS events.
Nonetheless, the Assessment contains a wealth of good information about what EMS in America looks like today. In all, the document presents 200 data points. What investigators found was enlightening and occasionally surprising; some interesting aspects follow.