
Does Healthcare Need an NTSB?

April 18, 2012
IOM has recommended creating a coherent structure for reporting on health IT-related errors

An article in the April 18 Boston Globe noted that seven Massachusetts hospitals plan to offer patients harmed by medical errors a prompt apology and financial settlement before those patients resort to lawsuits. A coalition of physician, hospital, and patient groups predicts the effort at greater transparency will increase reporting of medical mistakes and cut down on the lengthy litigation that drives up health care costs, the Globe reported.

This news item got me thinking about nationwide efforts to cut down on the number and severity of medical errors, especially as they relate to the introduction of new technology. A March 2012 paper published in the Journal of Patient Safety and an upcoming conference both raise the question: Would it make sense to create the equivalent of the National Transportation Safety Board for healthcare?

The article has an unusual cross-section of authors, including Charles Denham, M.D., chairman of the Texas Medical Institute of Technology; the famous pilot Chesley Sullenberger III; the actor Dennis Quaid, whose children were harmed by a hospital medication error; and John Nance, a pilot and attorney who has written a book called “Why Hospitals Should Fly.”

As the authors point out, the NTSB is an independent federal agency established by Congress to investigate all significant transportation accidents, learn lessons from them, and apply those lessons through specific recommendations to prevent repeat occurrences.

Last fall the Institute of Medicine issued a report calling for greater oversight of health information technology as it relates to patient safety. The report asked the secretary of the U.S. Department of Health and Human Services to publish a plan within 12 months to minimize patient safety risks associated with health IT and report annually on the progress being made. One key recommendation involved creating a coherent structure for reporting on those issues to make sure health IT improves health quality and safety.

The NTSB routinely issues so-called “Blue Cover Reports” that detail its findings. Pilots learn about accidents that have occurred and how to avoid such events in their own flying. The authors recommend creating an equivalent “Red Cover Report” for healthcare.

I applaud the authors’ effort on behalf of patient safety. I think their suggestions should be taken very seriously, as should the IOM report. But I wonder whether there is an equivalency between what pilots and mechanics learn from airplane mishaps and what clinical teams would learn from medical error investigations. Creating alerts and decision support tools has proven much more complicated in healthcare than in other fields, and perhaps that will prove to be the case with error reporting and analysis as well. Yet other than cultural inertia, I can’t think of a good reason not to try. And just as increased reporting of data breaches may help chief information security officers recognize patterns and address them, a healthcare NTSB’s reports might highlight common workflow issues that hospitals and clinics could fix.

If you’re interested in this topic, you may want to virtually attend a summit meeting about health IT and patient safety hosted by the Texas Medical Institute of Technology. Scheduled for Friday, April 27, from 10 a.m. to 4 p.m. Eastern time, the program will be broadcast live over the Internet; the Twitter hashtag is #HITTMIT. Register and find a complete list of speakers at

Terrific post. Thank you.

Does healthcare need an NTSB? There's a lot that healthcare needs to learn, and aviation has led the way, albeit over many decades. The story of Air France Flight 447's investigation illustrates some aspects that the NTSB structure got right, in contrast to that other model. Two dimensions were clearly better: 1) the budget to investigate was not determined by those being investigated, and 2) the legal and economic consequences were similarly kept separate.

Do you have recommendations on articles about AHRQ's work disseminating hazard-avoidance, safety, and harm-mitigation practices? There's some great work I've seen presented under their leadership.

[BTW, that article is particularly well written.]