Earlier this year, The New England Journal of Medicine published a compelling, research-based commentary in its Perspective section, entitled “Use of Health IT for Higher-Value Critical Care,” by Lena M. Chen, M.D., and several colleagues.
The study that Dr. Chen and her colleagues conducted was a complex one, with complex results; I would urge readers to click on the link above and read the team’s findings and analysis for themselves. But one thing is abundantly clear: there remains a big disconnect right now between the rhetoric of optimized care and the reality of the care-delivery and resourcing choices being made by physicians in hospitals.
As is widely known, there is a major shortage of medical intensivists in hospitals these days, and those specialists are needed more than ever to tend to the immediate care needs of hospitalized patients, who are on average far sicker than they were a decade ago: intensified utilization review on the part of payers means that patients who are in the hospital at all are very sick or injured to begin with. Meanwhile, as Chen and her colleagues write, “In response to the shortage of intensivists, numerous strategies have been proposed. These include remote ICU telemonitoring and—as was recently recommended by the Society of Critical Care Medicine and the Society of Hospital Medicine—the critical care certification of hospitalists… Nevertheless, relatively little effort has been devoted to what could be the most promising approach to the problem: the application of advances in health information technology (HIT) to triage decisions. A few integrated health care systems such as the Veterans Affairs (VA) Healthcare System and Kaiser Permanente Northern California have already drawn on the ability of electronic health records (EHRs) to generate reliable estimates of the risk of death within 30 days for every patient on admission,” they note. “Yet these calculations of risk, which may combine real-time data on laboratory results, demographics, co-existing conditions, and vital signs, are not being used to inform decisions about admission to the ICU. To accelerate progress in this area,” the authors write, “we believe that more targeted incentives for meaningful use of HIT should be considered.”
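To make the concept concrete, here is a toy sketch of the kind of admission-time risk estimate the authors describe: a simple logistic model combining labs, demographics, comorbidities, and vital signs into a 30-day mortality probability that could then inform a triage flag. Every input, coefficient, and threshold below is invented purely for illustration; the actual VA and Kaiser Permanente models are proprietary, institutionally validated tools, and nothing here reflects their design.

```python
import math

# Hypothetical coefficients for illustration only. A real EHR-based model
# (such as those the authors cite at the VA or Kaiser Permanente Northern
# California) would be fit to institutional data and clinically validated.
COEFFS = {
    "age_per_decade": 0.30,    # older patients carry higher baseline risk
    "abnormal_lactate": 0.90,  # elevated lactate on admission labs
    "comorbidity_count": 0.25, # per chronic condition on the problem list
    "systolic_bp_low": 0.70,   # systolic BP below 90 mm Hg on admission
}
INTERCEPT = -5.0  # sets a low baseline 30-day mortality risk


def mortality_risk_30d(age_years, lactate_mmol_l, comorbidities, systolic_bp):
    """Return an illustrative 30-day mortality probability from admission data."""
    score = INTERCEPT
    score += COEFFS["age_per_decade"] * (age_years / 10)
    score += COEFFS["abnormal_lactate"] * (1 if lactate_mmol_l > 2.0 else 0)
    score += COEFFS["comorbidity_count"] * comorbidities
    score += COEFFS["systolic_bp_low"] * (1 if systolic_bp < 90 else 0)
    return 1 / (1 + math.exp(-score))  # logistic link maps score to probability


def triage_flag(risk, icu_threshold=0.20):
    """Flag admissions whose predicted risk exceeds a configurable threshold."""
    return "consider ICU" if risk >= icu_threshold else "ward admission"
```

The point of the sketch is not the arithmetic but the workflow: because every input is already captured in the EHR at admission, a calculation like this can run automatically and surface a triage prompt at exactly the decision point the authors argue is currently neglected.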
Dr. Chen and her colleagues believe that the clinical decision support requirements under meaningful use should be made far more explicit with regard to the triaging of patients at the time of hospital admission and at the time of evaluation by attending physicians or hospitalists. They write that “The triage of patients at the time of hospital admission is one such area ripe for study.” “Triage decisions,” they emphasize, “frame the subsequent course of care for all hospitalized patients, yet in the case of critical care admissions, these decisions vary widely among hospitals, which suggests that there is at least some misallocation of resources. Reliable, individualized EHR-based predictions of risk have the potential to improve our ability to triage—and hence care for—patients.”
One particularly thought-provoking element in Dr. Chen’s team’s findings from their data review is this: overall, patients with a high severity of illness were much more likely to be admitted to the ICU than were patients with a low severity of illness. Yet for common cardiac diagnoses, severity of illness played a negligible role in decisions on whether to admit patients to the ICU, even though the link between those cardiac diagnoses and the potential for death was relatively low compared with the link between high-acuity non-cardiac conditions and the potential for death.
The bottom line? Chen and her colleagues conclude that physicians are simply taking the easy road in many cases, rather than using tools that could easily be embedded in EHRs to carefully and precisely triage patients for more optimal ICU utilization. Since their conclusions may strike some as harsh, I’ll quote Chen and her team directly: “Use of the ICU for providers’ convenience or peace of mind, as a temporizing measure for staffing problems, or as an all-purpose substitute for unavailable procedure or recovery rooms is unlikely to be an efficient use of valuable resources.” OUCH.
More positively, the authors note a couple of paragraphs later that “Data from the EHR offer us the chance to reexamine and improve the value of critical care. Incentives for reaching HIT targets related to patient triage could accelerate the research and collaboration necessary to take full advantage of this opportunity.”
I think the value of research and articles like this is clear: they can stimulate long-needed discussions about how to improve EHRs to provide clinical decision support at the point of care that is far more specific and helpful than what is currently available almost anywhere. Will the obstacles be difficult to overcome, insofar as triaging for potential ICU admission is concerned? Absolutely. But if there are areas in which CMIOs and other medical informaticists can play a crucial role in moving the U.S. healthcare system toward both better outcomes and improved cost-effectiveness, this is surely one of them.
With so much already on their plates, clinical informaticists can be forgiven for feeling exhausted these days. Yet this NEJM article points up yet another area in which we as a healthcare system have yet to fully leverage clinical IT to truly improve clinical outcomes and enhance cost-effectiveness.