Healthcare Safety Lessons from Air France 447 | Joe Bormel | Healthcare Blogs

A Tragic Air Crash Helps Define HCIT Safety Needs (Part 1)

July 20, 2012


Healthcare Safety Lessons from the Inter-Tropical Convergence Zone

I have long been an advocate of relating aviation safety to our needs in healthcare IT.  In recent years I’ve been gratified to see a number of my colleagues and knowledgeable healthcare IT advocates adopt this approach as well.  We do this to clarify the challenges of improving patient safety, quality of care, and cost control, and to ground the evolution of our technology in real-world needs.

To this end, this post is the first installment of a three-part series drawing on the lessons learned from the tragic crash of Air France Flight 447 (AF447), primarily as they relate to Clinical Decision Support.  I invite your comments along this journey so that together we can evolve this blog.  That said, let’s begin.

There have been literally dozens of articles and thought leader blogs written about safety lessons from aviation for healthcare.  One that I found particularly interesting is an article by Laura Landro on The Wall Street Journal’s Health Blog that considers aviation an “inspiration for improving patient safety.”  

Further, on the Healthcare Informatics website, David Raths posted a blog entitled “Does Healthcare Need an NTSB?”  In it, David points out that “IOM has recommended creating a coherent structure for reporting health IT-related errors,” and raises the question, “... I wonder if there is an equivalency in terms of what pilots and mechanics learn from airplane mishaps and what clinical teams would learn from medical error investigations.”  His blog is a thought-provoking read. 

Earlier this month, the final crash report on AF447 was released.  The implications for HCIT safety, usability and hazard governance are profound.  The crash occurred on June 1, 2009.  All 228 people onboard were killed, and it took three years to unravel the mysterious components of the story.  

Several elements of this tragedy are particularly salient to Clinical Decision Support capabilities and the paired human systems that must work together to improve health, healthcare, and costs.  These factors are:

1.  Sensor failure precipitating a lethal cascade

2.  Sudden autopilot withdrawal

3.  Team competence dynamics

4.  Black box incident reconstruction

5.  Real-time management

6.  Physics and physiology, the Coffin Corner

7.  Safety regulation

8.  Privacy and individual rights

In essence, the sequence of events that led to the crash of AF447 began with a failure of the airspeed sensor system.  Of note is that this simple system—keep that in mind, simple system—uses a set of Pitot tubes, part of the aircraft’s pitot-static system, mounted externally on the fuselage to determine airspeed.  The plane’s forward motion forces air into the tubes, and the pressure it creates, compared against the ambient static pressure, is used to compute airspeed. 
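The pressure-to-airspeed computation described above can be sketched with the textbook Bernoulli relation.  This is a simplified, illustrative model, not the compressibility-corrected math a real air-data computer runs at jet speeds, and the pressure values below are invented for demonstration:

```python
import math

def indicated_airspeed(total_pressure_pa: float, static_pressure_pa: float,
                       air_density_kg_m3: float = 1.225) -> float:
    """Estimate airspeed (m/s) from Pitot (total) and static pressure.

    Incompressible Bernoulli relation: dynamic pressure q = pt - ps = 0.5 * rho * v^2,
    so v = sqrt(2 * q / rho).  Real air-data computers apply compressibility
    corrections at jet speeds; this is the simplified textbook form.
    """
    dynamic_pressure = total_pressure_pa - static_pressure_pa
    if dynamic_pressure <= 0:
        # An iced-over Pitot tube lets total pressure collapse toward static
        # pressure, so the system reads a (near-)zero airspeed even though the
        # aircraft is still moving at cruise speed.
        return 0.0
    return math.sqrt(2.0 * dynamic_pressure / air_density_kg_m3)

# Healthy probe: a real dynamic pressure yields a plausible speed.
v_normal = indicated_airspeed(total_pressure_pa=104_000, static_pressure_pa=101_325)

# Iced probe: total pressure equalizes with static -> indicated speed of zero.
v_iced = indicated_airspeed(total_pressure_pa=101_325, static_pressure_pa=101_325)
```

The key point the sketch makes is that the *sensor*, not the wing, decides what airspeed the downstream automation sees: block the tube and the computed speed collapses, regardless of how fast the aircraft is actually flying.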

This simple process triggered the disaster when all three of the Airbus A330’s Pitot tubes iced over.  The sensor system’s airspeed readings dropped sharply and became unreliable, and in response the autopilot informed the cockpit crew that it could no longer perform its function and disconnected automatically.

All of this occurred at high altitude and high speed, where the aircraft’s stall speed (the speed below which the wings can no longer generate enough lift to keep it flying) is very close to the plane’s actual speed.  Further complicating this deadly situation was an inaccurate report by a ground station of the severe storm AF447 had entered, which likely caused the tubes to ice in the first place.   
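Why the stall speed creeps up toward the cruise speed at altitude can be shown with a short sketch.  Stall happens at a roughly fixed dynamic pressure, so as air density falls with altitude the *true* airspeed needed to keep flying rises, while the speed of sound (and hence the aircraft’s Mach limit) falls with temperature.  The numbers below are illustrative assumptions, not A330 performance data:

```python
import math

def stall_tas(v_stall_sea_level_ms: float, density_kg_m3: float,
              sea_level_density: float = 1.225) -> float:
    """True airspeed at which the wing stalls, for a given air density.

    Stall occurs at an (approximately) fixed dynamic pressure, so:
    v_stall(rho) = v_stall(sea level) * sqrt(rho0 / rho).
    As density drops with altitude, the stall speed in true airspeed rises.
    """
    return v_stall_sea_level_ms * math.sqrt(sea_level_density / density_kg_m3)

def speed_of_sound(temp_k: float) -> float:
    """Speed of sound in air (m/s): a = sqrt(gamma * R * T)."""
    return math.sqrt(1.4 * 287.05 * temp_k)

# Illustrative numbers only (assumed, not certified performance data):
rho_cruise = 0.38                      # kg/m^3, roughly 35,000 ft
v_stall = stall_tas(70.0, rho_cruise)  # assumed 70 m/s sea-level stall speed
v_max = 0.86 * speed_of_sound(216.65)  # assumed Mach limit at -56.5 C cruise temp

# The flyable band narrows with altitude: v_stall climbs while v_max falls.
```

With these assumed figures, the stall speed has nearly doubled relative to sea level while the Mach ceiling has dropped, squeezing the flyable envelope from both sides; that narrowing band is the Coffin Corner discussed below.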

A stall at high speed and altitude is very serious.  In the aircraft industry, this regime is known as the “Coffin Corner.”  Managing such an emergency demands considerably more sophistication than the challenges pilots normally face in typical flight simulators.  Succinctly, the following list, compiled from the official report, provides insight into the deadly chain of events:

     PF = Pilot Flying; PNF = Pilot Not Flying 

The immediate factors leading to the stall that befell AF447 were so severe that the pilots were unable to recover, resulting in a high-impact crash into the ocean below.  In fact, many flight crews faced with the same situation would likely not have been able to recover either.  That conclusion is at the crux of this issue.  However, the broader issues, what took place in the months before and after the event, are perhaps even more pertinent to the grand strategic vision for employing HCIT to improve the pursuit of better health, healthcare, and cost control.

In Part 2 of this blog, we will begin to explore in detail the eight factors of this crash as they relate to healthcare IT issues and challenges.  If you would like to read more about AF447, I recommend a CNN article that captures the most salient points.

Your thoughts are welcome so together we can evolve this blog topic.

Joseph I. Bormel

CMO and Vice President

QuadraMed Corporation


“After seventy-five years,” said Captain Sullenberger in closing, “we [in aviation] have benefited from lessons learned at great cost ... lessons that we now offer up to the medical profession for the taking.”

                                                                        Capt. C.B. "Sully" Sullenberger



