Many patient care organizations are operationally focused on improving clinical and financial performance to succeed in a value-based environment. One of the primary ways to drive performance improvement is to leverage data and analytics to address care variations in clinical practice.
Franciscan Health, a 14-hospital health system based in Mishawaka, Indiana, and serving patients in Indiana, Illinois and Michigan, is driving results in this area by using a technology solution to analyze the system’s rich data, assess performance and, ultimately, reduce costs. In late 2012, Franciscan Health executive leaders began a system-wide effort to address clinical quality improvement.
“The leadership really looked at where the direction is headed as far as fee-for-value and trying to identify ways to tackle care variation and clinical quality improvement initiatives, and they wanted it to be an effort across the system,” David Kim, director of strategic and decision support at Franciscan Alliance, says. “They created what we call clinical operations groups, and they are all physician-led committees that are headed by the chief medical officer and/or the vice presidents of medical affairs (VPMAs) for each of the facilities and regions that we have.”
The first key step to this work was top leadership prioritizing the effort, drafting guidance and getting the right people to the table, Kim says. In addition to being physician-led, the clinical operations groups also are multidisciplinary teams, including leaders from nursing, pharmacy, case management and social work, “across the whole patient care continuum,” Kim says. “We were trying to get major departments together to tackle some of the areas, whether it was utilization, patient flow, performance and quality measurement. It’s now been in existence for six or seven years, and it’s been a concerted effort to have everybody focused on an on-going basis.”
Working with Skokie, Ill.-based Kaufman Hall, a provider of enterprise performance management software and consulting services, project leaders used the company’s Peak Software platform to analyze utilization, quality and cost data against internal and external benchmarks. Specifically, the team looked at four key indicators of performance: length of stay (LOS), readmissions, risk-adjusted mortality rates and adjusted direct costs.
“The idea was to tackle care variation, looking at resource utilization, as well as looking at performance improvement for length of stay, readmissions and mortality rates and some of the quality metrics that we get monitored and measured on by CMS (the Centers for Medicare & Medicaid Services) and on pay-for-performance areas,” Kim says.
He continues, “Each region and facility was given some flexibility as to challenges specific to them, so, in other words, they would prioritize different conditions. But across the board, we started off with targeted conditions like heart failure, pneumonia and sepsis. Those were common challenges across all the facilities, so those were some of the early wins, trying to build momentum by targeting a few conditions rather than biting off too much at once.” He adds, “That started to have a halo effect. Improving one condition, especially heart failure, for example, affects a large volume of patients, and it carries over into improving other conditions.”
Kim notes that the Peak software platform includes clinical performance benchmarks at the national, state and hospital level. “The platform was very flexible in terms of giving us an ability to target and customize and provide ‘apples-to-apples’ analysis,” he says. “Their system helps us to group, to customize and profile; having that flexibility was one of the key components in trying to drill into some of these high-level opportunities. Choosing the right content as well as the analytic engine to drill down was really paramount in our process.”
The analytics tool allowed project leaders to integrate data sources, perform custom analytics and access a large library of benchmarks. The IT team was able to leverage the health system’s Epic electronic health record (EHR) to mine data for detailed internal process metrics. “In order to put that into perspective, it was important to have another engine and comparison point with benchmarks,” Kim says. “We can compare ourselves historically; that’s one thing. We may pat ourselves on the back if we improve by half a day or so, but if we’re still a day off the benchmark, that lays the groundwork to push ourselves a little bit further and not just settle for historical improvement.”
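Kim’s point about pairing historical trends with external benchmarks can be sketched as a simple two-part comparison. The function and all figures below are illustrative assumptions for this article, not Franciscan Health data or Kaufman Hall’s actual calculations:

```python
# Illustrative sketch: assess average length of stay (LOS) against both
# the hospital's own history and an external benchmark.
# All numbers are hypothetical examples, not Franciscan Health data.

def los_assessment(current_los, prior_los, benchmark_los):
    """Return (historical improvement, remaining benchmark gap), in days."""
    improvement = prior_los - current_los        # days saved vs. the prior period
    benchmark_gap = current_los - benchmark_los  # days still above the benchmark
    return improvement, benchmark_gap

# Kim's scenario: improved by half a day, but still a full day off the benchmark.
improvement, gap = los_assessment(current_los=5.5, prior_los=6.0, benchmark_los=4.5)
print(f"Improved {improvement:.1f} days; still {gap:.1f} days above benchmark")
```

The two numbers tell different stories: the first supports the “pat ourselves on the back” reading, while the second shows the remaining opportunity that keeps the team pushing further.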
He continues, “Risk adjustments are a part of that too, especially when you work with physicians; they always come up with explanations, such as ‘my patients are sicker’ or ‘I have a more challenging population to work with.’ So, the analytic tool has done some risk adjustment for us. We know that it’s apples to apples, that we’re comparing heart failure patients at various levels of acuity and pneumonia patients at various levels of acuity. Knowing that patients are very different, we had to treat those things condition by condition, rather than trying to roll them up and then have challenges identifying where some of the opportunities are.”
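The “apples-to-apples” comparison Kim describes can be illustrated with a common generic technique, indirect standardization: stratify patients by acuity, apply benchmark event rates to each stratum to get an expected count, and compare observed to expected. This is a hedged sketch with made-up strata and rates; it is not Kaufman Hall’s proprietary risk-adjustment model:

```python
# Illustrative sketch of acuity-stratified risk adjustment via indirect
# standardization. Patient counts and benchmark rates are invented for the
# example; this is not Kaufman Hall's actual methodology.

def observed_expected_ratio(strata):
    """strata: list of (patient_count, observed_events, benchmark_event_rate).

    Returns the observed/expected (O/E) ratio; values below 1.0 mean fewer
    events than the benchmark would predict for this case mix.
    """
    observed = sum(obs for _, obs, _ in strata)
    expected = sum(n * rate for n, _, rate in strata)
    return observed / expected

# Hypothetical heart failure population split into acuity strata.
strata = [
    (200, 4, 0.03),  # low acuity: 200 patients, 4 deaths, 3% benchmark rate
    (100, 6, 0.07),  # medium acuity
    (50, 6, 0.15),   # high acuity
]
print(f"O/E mortality ratio: {observed_expected_ratio(strata):.2f}")
```

Because the expected count is built from each stratum’s own benchmark rate, a physician with genuinely sicker patients is credited for that case mix, which is exactly the “my patients are sicker” objection the risk adjustment is meant to answer.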
From Data Analysis to Actionable Insights