
A Tragic Air Crash Helps Define HCIT Safety Needs (Part 3)

August 9, 2012

Healthcare Safety Lessons from the Inter-Tropical Convergence Zone

In this final installment of the series, we will explore the last four HCIT CDS factors I developed that relate to the causes of the crash of Air France Flight 447.  Let's begin.

5.  Real-Time Management: Technologies such as GPS and real-time remote flight tracking could have added much-needed clarity during the three-minute, uncontrolled vertical descent of AF447.  However, they were not implemented.

These technologies would also have increased the amount of data that needed to be processed, and that data would have needed to be processed more quickly than unaided human interpretation could possibly achieve.  The resulting network could very well have shifted the "decision rights" out of the cockpit during a time of cascading, chaotic failure.

But isn't the "pilot as the ultimate decision maker" (or the doctor, for that matter) an explicit design assumption that is now being called into question?  The philosophy of more pilot and less automation is evolving toward less dependence on pilots.  As this crash illustrates, it may be impossible to staff a cockpit adequately to sort out the confluence of complex, interacting subsystems.  And as Captain "Sully" Sullenberger has noted, cockpits that bring together the pilot's user experience and usability with analytics and related automation can have a huge impact on safety.  Captain Sullenberger has suggested that the lack of all necessary data hurt the survivability of AF447.

This is exactly the argument for much richer support for reviewing information and testing hypotheses, the so-called "Watson for Healthcare" that I elaborated on in a previous blog.  The fact is, at some point, this becomes the only way to review and analyze data within the required time constraints.

6.  "Coffin Corner" Stakes: The physics of flight and, metaphorically, the complexity of human physiology and the pathology of disease.

The "Coffin Corner" is the altitude region where an aircraft's rising stall speed approaches its falling maximum safe speed, leaving only a narrow operating airspeed band that must be maintained.  It is also a technology-created hazard that cannot be avoided: before we attempted to fly so high, flight management was easier.
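
For the technically curious, here is a simplified sketch of the two relations that squeeze the flight envelope; the symbols and the standard-atmosphere simplifications are mine, not drawn from the accident report.  In true airspeed, the stall speed rises as air density falls, while the Mach-limited maximum speed falls as air temperature drops:

    % Stall speed in true airspeed rises as density \rho decreases with altitude:
    V_{\text{stall,TAS}} = V_{\text{stall,EAS}} \sqrt{\rho_0 / \rho}

    % The Mach-limited maximum falls as the speed of sound a drops with temperature T:
    V_{\text{max,TAS}} = M_{\text{MO}} \, a, \qquad a = \sqrt{\gamma R T}

At typical jet cruise altitudes, the gap between these two speeds can shrink to a few tens of knots, which is why the band is so unforgiving.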

The question becomes: with critically ill patients, or simply in light of genomic data, have we become more reliant on HCIT than we are prepared to manage?

7.  Safety Regulation: The U.S. has the National Transportation Safety Board, which has the budget and governance to independently investigate accidents in the public's interest.  

It has been argued that the French Bureau of Investigation and Analysis (BEA) did not have the budget to do its job.  Further, the French government has a financial interest in both Airbus, the manufacturer of the AF447 aircraft, and the airline, Air France.  These factors could influence the nation's judicial system, which is accountable for assigning liability based on available data.

In healthcare, the economic interests and politics of our vendors, device manufacturers, hospitals, health systems, doctors, payers, and pharmaceutical companies would likely be similar to the Airbus and Air France situation, and similarly commingled with government interests.

My take is two-fold.  First, we need an independent agency like the NTSB to produce unbiased reports.  Second, we should expect and plan for something far more nuanced. 

Instead of the rare, publicly visible, simultaneous deaths of more than 200 people in one place at one time (AF447), the safety challenges in healthcare are far larger, far more common, and far less mature in terms of the technologies in routine use.  The current scale of these challenges has been called the equivalent of 20 Boeing 747 airliners crashing per week.  Therefore, improving healthcare delivery by even a small, measurable fraction is huge.  That, of course, starts with significant advances in our current reporting systems, exploiting the relatively new Patient Safety Organization framework and Common Formats.
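
As a rough back-of-the-envelope check on that metaphor (the occupant count is my assumption; a full 747 carries on the order of 400 to 500 people):

    20 \text{ crashes/week} \times 450 \text{ occupants} \times 52 \text{ weeks} \approx 468{,}000 \text{ deaths per year}

That is on the order of the highest published estimates of deaths from preventable medical harm, which is exactly why even a small fractional improvement is huge.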

8.  Privacy and Individual Rights: Once the wreckage of AF447 was located, the question of whether to exhume the bodies was raised.  The families of the victims had polar opposite feelings on what was appropriate.  In healthcare, there is precedent to perform an autopsy to establish cause of death when it is unknown or where foul play is considered a possibility.  

Take, for instance, the excerpt from a speech by Sorrel King describing her experience with her daughter, Josie King.  After reading it, I think you'll agree that it is rare for a patient or family member to place the common good ahead of personal feelings.  What Sorrel King did required tremendous strength of character, as well as forgoing the privacy that could have been invoked after a devastating event.

My wife and I once received some truly bad medical advice for our daughter’s care from multiple professionals.  Had we followed one of these recommendations in particular, certain harm would have resulted.  This was confirmed for me a half dozen times by experienced practitioners in the field in question.  My point is that improving healthcare safety is often less black and white than an aviation crash in terms of reporting and learning.




I'm in full agreement with you that we need an NTSB-like watchdog for HCIT patient safety. Rather than starting from scratch, do you think The Joint Commission could be evolved to handle this?

You've written a very informative series. But I would like to point out that HIM/RCM (health information management/revenue cycle management) are more efficient and better able to cope with the nonstop increases in regulations and care mandates if an effective EMR is implemented.

The clinical side must become more aware of how it affects the financial viability of the institution as a whole. This does add another burden to the load clinicians manage, but nonetheless, it is an integral component of the care continuum.


Jack, Thanks for your comment and question.

The NTSB boasts that it is sufficiently independent to do its job: "We don't operate, regulate or fly aircraft." (Christopher Hart, Vice Chairman of the NTSB)

Clearly, The Joint Commission (TJC), through its accreditation and certification activities, effectively regulates healthcare organizations. It seems that the needed separation and independence would be challenging to evolve toward. You do raise interesting parallels, i.e., both organizations serve to promote safety:

- The NTSB is charged with "determining the probable cause of transportation accidents and promoting transportation safety..." [NTSB home page]

- TJC is similarly focused on the evaluation and promotion of safe and effective care of the highest quality and value [TJC mission statement]

The distinctions are both subtle and important. The NTSB consists of five members, nominated by the President and confirmed by the Senate. No more than three of the five members may belong to the same party as the President, which creates party balance. Members serve staggered five-year terms, which helps ensure independence, since only one member's term expires each year; a new President cannot replace everyone at once. These safeguards, among others, help remove bias and ensure that conclusions are based on the facts, not politics. Also, the NTSB does not consider cost, only impact on safety. Since cost is a factor in value, TJC's focus on quality and value is distinct from the NTSB's single-minded focus on safety.

Thanks again for the comment.

Dr. Joe,
As usual an excellent piece. This was the best part for me…

“My point is that improving healthcare safety is often less black and white than an aviation crash in terms of reporting and learning.”

Goes right back to what I replied in your first post. Checklists work best when we know what the problem is and have a clear goal, à la procedural medicine. Checklists can be helpful but are far from error-proof when we do not know what the 'real' problem is (cognitive medicine).

Keep up the good work,
Frank Poggio
The Kelzon Group

Thanks Frank, for the kind words.

To borrow from Daniel Kahneman's recent book, Thinking, Fast and Slow, quoting Donald Rumsfeld about 'unknown unknowns':

"There are known knowns. There are things we know we know. We also know there are known unknowns. That is to say, we know there are some things we do not know. But there are also unknown unknowns, the ones we don't know we don't know."

Kahneman points out that it's an inherent property of how we think fast that we assume "what we see is all there is." To your point, Frank, that's a real fast way to miss the real problem!

Thanks for your comment and encouragement.

Joe - TERRIFIC summary of this disaster and its relevance to patient safety. The problem remains, however, that aviation knows how to learn from disasters and medicine does not. It's true that "To Err is Human" opened the door to the problem of patient safety and that many institutions have at least started to address the problem. But safety is NOT the first concern of either physicians or health care systems, and as long as safety remains an afterthought, there will never be enough motivation to truly learn the lessons from the safety disasters that are still so common. I would love to see the IT innovations you propose come to be, but they will be trumped by apathy every time.

Mark L Graber MD
Society to Improve Diagnosis in Medicine

Thanks for your comment and kind words, Mark.

Mark's work on cognitive errors in medicine caused me to consider the informational impact of ARRA MU.

As you described in detail, defective synthesis of available data is more common than knowledge deficits. The fast "System One," described by Nobel laureate Daniel Kahneman, a system we all have, is perfectly happy to come to conclusions with inappropriately high confidence using the available data. One of the problems with this, as you pointed out years ago, Mark, is that System One is incapable of statistical thinking. It is also incapable of asking, "What information don't I have that I need to reason safely in this case?" It assumes, as Kahneman states numerous times, that "what [data] I see is all there is," rather than asking what I might need to know.

I agree with your observation and concern that safety is rarely the objective function in design and implementation. It's generally more of a weakly articulated and often poorly understood constraint.

That said, based on my reading, I don't think Kahneman would agree with your framing of the problem as apathy. It is much more closely akin to blindness. The semantics make a difference: blindness is unresponsive to visual training, policy, or financial incentives. That is consistent with your previously stated recommendation for a metacognitive approach. That, along with the other structural changes you have advocated elsewhere (e.g., appropriate use of subspecialists), is critical to improving patient safety, as well as to making HCIT more effective in general.

Interested readers are encouraged to read more at SIDM.

Thanks again, Mark.