At Franciscan Health, an Analytics-Driven Initiative is Improving Patient Care and Reducing Costs

June 11, 2018
by Heather Landi
The health system identified a $655,000 gap in care costs between its best- and average-performing physicians

Many patient care organizations are operationally focused on improving clinical and financial performance to succeed in a value-based environment. One of the primary ways to drive performance improvement is to leverage data and analytics to address care variations in clinical practice.

Franciscan Health, a 14-hospital health system based in Mishawaka, Indiana, that serves patients in Indiana, Illinois and Michigan, is driving results in this area by using a technology platform to analyze the system’s rich data, assess performance and, ultimately, reduce costs. Back in late 2012, Franciscan Health executive leaders began a system-wide effort to address clinical quality improvement.

“The leadership really looked at where the direction is headed as far as fee-for-value and trying to identify ways to tackle care variation and clinical quality improvement initiatives, and they wanted it to be an effort across the system,” David Kim, director of strategic and decision support at Franciscan Alliance, says. “They created what we call clinical operations groups, and they are all physician-led committees that are headed by the chief medical officer and/or the vice presidents of medical affairs (VPMAs) for each of the facilities and regions that we have.”

The first key step to this work was top leadership prioritizing the effort, drafting guidance and getting the right people to the table, Kim says. In addition to being physician-led, the clinical operations groups also are multidisciplinary teams, including leaders from nursing, pharmacy, case management and social work, “across the whole patient care continuum,” Kim says. “We were trying to get major departments together to tackle some of the areas, whether it was utilization, patient flow, performance and quality measurement. It’s now been in existence for six or seven years, and it’s been a concerted effort to have everybody focused on an on-going basis.”

Working with Skokie, Ill.-based Kaufman Hall, a provider of enterprise performance management software and consulting services, project leaders used the company’s Peak Software platform to analyze utilization, quality and cost data and internal and external benchmarks. Specifically, the team looked at four key pieces of data that indicate performance: lengths of stay (LOS), readmissions, risk-adjusted mortality rates and adjusted direct costs.

“The idea was to tackle care variation, looking at resource utilization, as well as looking at performance improvement for length of stay, readmissions and mortality rates and some of the quality metrics that we get monitored and measured on by CMS (the Centers for Medicare & Medicaid Services) and on pay-for-performance areas,” Kim says.

David Kim

He continues, “Each region and facility was given some flexibility, as to challenges specific to them, so, in other words, they would prioritize different conditions, but across the board, we started off with targeted conditions like heart failure, pneumonia and sepsis. Those were common challenges across all the facilities, so those were some of the early wins of trying to build some momentum by targeting a few conditions rather than biting off too much at once.” He adds, “That started to have a halo effect; improving one condition, especially heart failure, for example, affects a large volume, and it has a halo effect in terms of improving other conditions.”

Kim notes that the Peak software platform includes clinical performance benchmarks at the national, state and hospital level. “The platform was very flexible in terms of giving us an ability to target and customize and provide ‘apples-to-apples’ analysis,” he says. “Their system helps us to group, to customize and profile; having that flexibility was one of the key components in trying to drill into some of these high-level opportunities. Choosing the right content as well as the analytic engine to drill down was really paramount in our process.”

The analytics tool allowed project leaders to integrate data sources, perform custom analytics and access a large library of benchmarks. The IT team was able to leverage the health system’s Epic electronic health record (EHR) to mine data for detailed internal process metrics. “In order to put that into perspective, it was important to have another engine and comparison point with benchmarks,” Kim says. “We can compare ourselves historically, that’s one thing. We may pat ourselves on the back if we improve by half a day or so, but if we’re still a day off the benchmark that lays the groundwork to push ourselves a little bit further and not just settle with historical improvement.”

He continues, “Risk adjustments are a part of that too, especially when you work with physicians; they always come up with explanations, such as ‘my patients are sicker’ or ‘I have a more challenging population to work with.’ So, the analytic tool has done some risk adjustment for us. So, we know that it’s apples to apples that we’re comparing heart failure patients at various levels of acuity, pneumonia patients at various levels of acuity, and knowing that patients are very different, we had to treat those things condition by condition, rather than trying to roll them up and then have some challenges with identifying where some of the opportunities are.”
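The article does not detail Kaufman Hall's risk-adjustment methodology, but the "apples-to-apples" comparison Kim describes is commonly expressed as an observed-to-expected ratio: each physician's actual outcomes are compared, condition by condition, to what a risk model expected given that physician's case mix. A minimal sketch, with hypothetical field names and example numbers:

```python
from collections import defaultdict

def observed_to_expected(cases):
    """Group cases by (physician, condition) and compare observed deaths
    to the risk model's expected deaths, so physicians are compared
    'apples to apples' within each condition and acuity mix."""
    totals = defaultdict(lambda: {"observed": 0.0, "expected": 0.0})
    for case in cases:
        key = (case["physician"], case["condition"])
        totals[key]["observed"] += case["died"]           # 0 or 1
        totals[key]["expected"] += case["expected_risk"]  # from a risk model
    return {
        key: t["observed"] / t["expected"] if t["expected"] else None
        for key, t in totals.items()
    }

# Illustrative cases only; expected_risk would come from an acuity model.
cases = [
    {"physician": "A", "condition": "heart failure", "died": 0, "expected_risk": 0.10},
    {"physician": "A", "condition": "heart failure", "died": 1, "expected_risk": 0.30},
    {"physician": "B", "condition": "heart failure", "died": 1, "expected_risk": 0.05},
]
print(observed_to_expected(cases))
```

A ratio near 1.0 means outcomes match what acuity predicts; a ratio well above 1.0 flags worse-than-expected results even for a physician with "sicker patients."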

From Data Analysis to Actionable Insights

The aim of this “apples-to-apples” analysis was to produce actionable data that could be used to eliminate or decrease performance gaps. As a result of this analytics work, the data helped to identify high-performing physicians, or what the clinical operations groups refer to as “best performers,” and then also revealed dramatic variances between the health system’s best-performing and average physicians.

Specifically, the data indicated that the best-performing doctors had a zero percent mortality rate for heart failure patients, compared to 5.5 percent among the health system’s lowest performers. What’s more, the average LOS was 39 percent lower among the best-performing physicians, and 30-day readmission rates were 42 percent lower. In addition, among the best-performing doctors, direct costs were 25 percent lower.

“These ‘best performers’ were statistically better than the benchmark in length of stay, statistically better than the benchmark in mortality rate, better than our system average for readmissions; that’s an area we didn’t have national benchmarks for, so we used a system average, for example. And, then, after grouping them that way, we then analyzed their cost per patient and realized there were some significant differences in terms of the costs as well,” Kim says.

Upon an even deeper analysis, the team found that when accounting for reduced respiratory treatments, fewer lab tests and shorter time spent in an intensive telemetry bed across 4,996 patient cases over two years, the best-performing physicians’ total cost of care was $654,609 lower than that of average-performing physicians. Franciscan Health has since leveraged these findings to help its lower-performing physicians bring their practice in line with their best-performing colleagues, with the goal of not only improving patient care but also reducing overall costs.

“By segmenting performance and creating tiers of performance by our attending physician groups, we were able to find not just the best performers, but also identify the outliers, and that’s equally impactful because you can then approach those groups of physicians and tackle the ‘why.’ What’s causing them to be significantly worse in each of these areas? And then try to address that aggressively as you would try to identify examples of who you want to emulate and figure out best practices among best performers,” Kim says.
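Franciscan's exact segmentation rules are not published, but the tiering Kim describes, sorting attending physicians into best performers, average performers and outliers by how their metrics compare to benchmarks, can be sketched roughly as follows. The metric names, benchmark values and cutoff logic here are all hypothetical:

```python
def tier_physicians(metrics, benchmarks):
    """Assign each physician a rough performance tier by counting how many
    metrics beat the benchmark. Lower is better for all four metrics
    (length of stay, readmission rate, mortality, direct cost)."""
    tiers = {}
    for doc, m in metrics.items():
        beats = sum(m[k] < benchmarks[k] for k in benchmarks)
        if beats == len(benchmarks):
            tiers[doc] = "best performer"
        elif beats == 0:
            tiers[doc] = "outlier"
        else:
            tiers[doc] = "average"
    return tiers

# Invented benchmark and physician numbers, for illustration only.
benchmarks = {"los_days": 5.0, "readmit_rate": 0.18, "mortality": 0.04, "direct_cost": 9000}
metrics = {
    "Dr. A": {"los_days": 3.8, "readmit_rate": 0.10, "mortality": 0.0,   "direct_cost": 7200},
    "Dr. B": {"los_days": 6.2, "readmit_rate": 0.22, "mortality": 0.055, "direct_cost": 10500},
}
print(tier_physicians(metrics, benchmarks))
```

A real implementation would also test whether each difference is statistically significant, as Kim notes the groups did before labeling anyone a best performer or an outlier.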

Data and analytics played a vital role in bringing these key insights to light so that action could be taken. “Physicians by nature are competitive, so when they know they are below average, or an outlier, that does grab their attention. Once the physicians understand the information and know that it has been risk-adjusted, verified and validated, then they are more eager to engage on, where should I improve?” And, he adds, “Nine times out of 10 they know they can improve, they just need some data to back up that perception.”

Project leaders also recognized the importance of targeting frontline caregivers in this performance improvement work. New positions, called physician advisors, were created, and these positions function as the “right-hand men and women” for the chief medical officers or VPMAs. Physician advisors review the care variation data and consult with the “outliers” to focus on improvement.

The clinical operations groups also created interdisciplinary care coordination rounds (ICCRs), or daily rounding teams. “We have multi-disciplinary frontline staffing rounding on patients, identifying those that were beyond the benchmark in terms of expected length of stay, so we put in a lot of additional practice, if you will, into hard-wiring some more preventative work. The aim was to tackle the issues as they come rather than continually look at these retrospectively. All of that, along with better engagement with the physician advisor position, all helped to take the data and translate it into some meaningful results and response,” Kim says.

As a result of this data-driven performance improvement initiative, Franciscan Health has realized $25 million in cost savings since 2012, driven by reductions in length of stay and utilization. “For example, we reduced the use of red blood cells, as a result of changing protocols, verifications and number of units being implemented so that we reduced a lot of waste and unnecessary transfusion of units of blood,” Kim says. The clinical performance improvements also helped to reduce hospital stays, saving more than 25,000 risk-adjusted days over the past five years, which drove the bulk of the cost savings.

Kim notes that while technology is foundational to this work, getting the right stakeholders to the table to help drive compliance and standardization is vital for success.

“As a data person, it’s important to have strong content, strong engines to help you analyze, risk adjust and drill into the data so you can dispel any anecdotes or confirm anecdotes. To me, it’s an iterative process, starting with the overall high-level metrics and getting feedback from all the stakeholders, nursing, physicians, clinical departments and support staff, to help us identify opportunities to drill into them. You don’t just want analysts in a back room crunching numbers, but having data being presented and having them better engaged in talking about data and why that drives change, in terms of what you measure is what you can change. I think all that became very tangible for us when we created these groups so that they review the information and better understand how to interpret and how to drill down into those opportunities,” he says.

While these efforts have driven significant results so far, the performance improvement work continues, Kim says. “We’re still working on improving and hard wiring changes and processes. Sometimes it takes another system push to reinvigorate efforts. We’re doing the same thing now with identifying new benchmarks and really trying to transform some of the processes that we have today.”

He adds, “It’s always a back and forth in terms of identifying new benchmarks, knowing that, nationally, everybody improves on length of stay and readmissions. Even though we may improve, we know it’s a moving target. Many penalties are based on a curve; if you’re in the bottom quartile, you have to work twice as hard to get out of that area since everybody is trying to improve in the same initiative.”

Moving forward, Franciscan Health leaders are focused on using data to shift from retrospective analysis to real-time care practice, he says. “A buzzword I hear now, and we certainly use it too, is predictive analytics, being able to better manage populations and using data to give us indicators as to, is this patient highly likely to be readmitted?”
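As an illustration only, not Franciscan's model, a readmission-risk indicator of the kind Kim mentions often reduces to a logistic score over patient features: weighted risk factors are summed and squashed into a 0-to-1 probability-like estimate. The features and weights below are invented for the sketch; a real model would be fit on historical cases:

```python
import math

# Hypothetical weights; a production model would learn these from data.
WEIGHTS = {"prior_admissions": 0.45, "age_over_75": 0.6, "heart_failure": 0.8}
BIAS = -2.0

def readmission_risk(patient):
    """Logistic score: a probability-like estimate that this patient
    will be readmitted within 30 days of discharge."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

high = readmission_risk({"prior_admissions": 3, "age_over_75": 1, "heart_failure": 1})
low = readmission_risk({"prior_admissions": 0, "age_over_75": 0, "heart_failure": 0})
print(round(high, 2), round(low, 2))
```

Scores above a chosen threshold would flag patients for extra discharge planning or follow-up, turning the retrospective data Kim describes into a real-time intervention signal.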

“To me, that takes a lot of integration in the future,” he adds, “and that’s an area that we are constantly striving for, that integration of data, both for retrospective as well as predictive analytics. Retrospectively, we’re still connecting the dots on the cost of a complication, the cost of worsening performance, and linking also to patient satisfaction. That’s a key component that a lot of people are realizing—that highly satisfied patients typically are the ones that are hitting those benchmarks as well. Being able to link all of those areas and understand the correlations of each is a continued exploration for us,” he says.






ASCO Picks IBM Watson Exec to Lead CancerLinQ

August 10, 2018
by David Raths
Big data platform collects and analyzes data from cancer patients at practices nationwide

The American Society of Clinical Oncology (ASCO) has named a former IBM Watson executive as the new CEO of its CancerLinQ big data platform.

Cory Wiegert was most recently vice president of product management for IBM Watson Health. Prior to joining IBM, Wiegert held positions with Sterling Commerce, Siebel Systems Inc., Centura Software and Safety-Kleen.

Kevin Fitzpatrick stepped down as the nonprofit CancerLinQ’s CEO in April 2018. Richard Schilsky, M.D., who was serving as interim CEO of CancerLinQ, will continue his role as ASCO's chief medical officer.

CancerLinQ collects and analyzes data from cancer patients at practices nationwide, drawing from electronic health records, to inform and improve the quality of cancer care. Its database contains more than a million cancer patient records. The effort has two major components:

• The CancerLinQ quality improvement and data-sharing platform for oncology practices; and

• CancerLinQ Discovery, which provides access to high-quality, de-identified datasets derived from the patient data to academic researchers, nonprofit organizations, government agencies, industry, and others in the oncology community.

CancerLinQ LLC also has established a number of collaborations with government and nonprofit entities -- including the American Society for Radiation Oncology, the Food and Drug Administration, and the National Cancer Institute -- as well as industry through its collaborators AstraZeneca, Tempus, and Concerto HealthAI.

In a statement, ASCO CEO and CancerLinQ LLC Board of Governors Chair Clifford A. Hudis, M.D., said Wiegert’s arrival “comes at a pivotal time, as we are quickly building on and improving CancerLinQ's core quality improvement platform for oncologists and data analytics services for the broader cancer community."

As CEO, Wiegert will be tasked with developing new solutions to help oncology practices improve the day-to-day care they provide their patients and continuing to serve CancerLinQ collaborators.






A ‘Google’ for Clinical Notes Draws Interest

August 8, 2018
Developed at the University of Michigan, EMERSE allows users to search the EHR’s unstructured clinical notes

Those of us who cover healthcare informatics often hear clinicians and researchers talk about the problems involved in doing analytics or research on unstructured data in clinical notes. That was why I was intrigued when I saw that informatics teams at the University of North Carolina School of Medicine are implementing a tool called EMERSE (Electronic Medical Record Search Engine), which allows users to search free-text clinical notes from the electronic health record (EHR). They describe it as being like "Google" for clinical notes. 

But then I noticed that the tool was actually created quite a while ago, in 2005, at the University of Michigan, and has been in use there ever since. So I reached out to its developer, David Hanauer, M.D., a clinical associate professor of pediatrics and communicable diseases at the University of Michigan Medical School. He also serves as assistant director for clinical informatics in UM’s Comprehensive Cancer Center’s Informatics Core as well as associate chief medical information officer at the UM Medical Center.

Hanauer told me that the developers of EMERSE at Michigan have a grant from the National Cancer Institute to further develop the tool and help disseminate it, with a focus on cancer centers around the country. “We are about one year into the grant,” he said. “We have spent the last year cleaning up the infrastructure to make it even easier for people to adopt. We have been working hard on technical documentation. When we started it, we had almost no documentation; now we have substantial and detailed documentation about how to implement and run it.”  

The five sites implementing EMERSE as part of the grant are the University of North Carolina, University of Kentucky, University of Cincinnati, Case Western Reserve University and Columbia University.

I asked Hanauer if health systems continue to struggle with unstructured data in clinical notes. “They all absolutely struggle with it,” he said. “They have mostly been ignoring it, to tell you the truth. That is why we believe and hope EMERSE will fit well into this environment of people needing different tools.”

I also asked him to describe some of the use cases. Most generically, anybody who needs to look through the chart and doesn’t know exactly where to look can get benefit from it, he said. He described three categories of users: research, clinical care and operations. “For example, in research you could use it for cohort identification. You want to find patients who meet your needs when it comes to a research study. This is important in part because ICD codes, the go-to way people often try to identify a cohort, are often inaccurate and non-specific.”

According to the EMERSE web site, for studies in which eligibility determination is complex and may rely on data only captured within the free text portion of documents, EMERSE can be a rapid way to check for mentions of inclusion/exclusion criteria.

In another example, EMERSE also can be used to help find details about a patient rapidly, even during a clinical visit. “For example, if a patient mentions that a certain medication helped their migraine three years ago but can’t remember the name, just search the chart for 'migraine' and find that note within seconds,” the web site notes. Cancer registrars can use EMERSE for data abstraction tasks, including difficult-to-find information such as genetic and biomarker testing.
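EMERSE's internals are not described in the article, but the core of any "Google for clinical notes" is an inverted index: a map from each term to the set of documents containing it, so a search like "migraine" resolves in milliseconds instead of a linear scan of the chart. A toy sketch, with made-up note IDs and text:

```python
import re
from collections import defaultdict

def build_index(notes):
    """Build an inverted index mapping each lowercased token to the
    set of note IDs whose free text contains it."""
    index = defaultdict(set)
    for note_id, text in notes.items():
        for token in re.findall(r"[a-z0-9']+", text.lower()):
            index[token].add(note_id)
    return index

# Invented notes for illustration; real input would be EHR free text.
notes = {
    "2015-03-01": "Pt reports migraine improved on sumatriptan.",
    "2018-06-12": "Follow-up for hypertension; no headache today.",
}
index = build_index(notes)
print(sorted(index["migraine"]))  # note IDs mentioning the term
```

A production system like EMERSE layers much more on top (synonyms, phrase search, access auditing), but the term-to-document index is the piece that makes searching years of notes fast.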

Hanauer said at Michigan, clinicians have a way to access EMERSE from their Epic EHR. “If you have a patient’s record open, you can click a button, it will log you into EMERSE and bring that patient’s context over, and you can start searching in just a few seconds.”

In 2005, the platform was written to work with a homegrown EHR. When UM transitioned to Epic in 2012, Hanauer and team used that as an opportunity to make it more powerful. “When we went live with Epic, it became clear there were some architectural limitations that were probably going to limit the future power of the software,” he recalled. “We leveraged the design and concepts and rewrote it from scratch. But even though we were going to work with Epic, we designed it specifically so it would not be tied to any particular EHR.”

Because it deals with patient records, security and audit logs have to be taken very seriously. Every time you log into EMERSE, you come to an attestation page. “You have to declare why you are using it for this session,” Hanauer explained. “We have tried to make it as simple as possible. Almost every institution that does research now has an electronic IRB system, so we have a way you can pull a user’s IRB-approved study into the EMERSE database, and a list appears of that user’s studies only. The user can click on it, record that use, and move forward.” There also are quick buttons for common administrative use cases.

I asked Hanauer if other academic medical centers had developed similar search tools. He said some have created local tools. “The main difference with EMERSE is that it is proven it can work elsewhere,” he said. (It was used at the VA in Ann Arbor, Mich., on the VistA system.) “We have a long track record of use and have been working on the infrastructure to disseminate it. We are giving it away at no cost, but it is almost like running a software company, where you have to have a web site, user documentation, and system administrator documentation. To me, it doesn’t make a lot of sense for others to reinvent the wheel when this is something we have invested millions of dollars in at this point.”

He stressed that although the grant project is focused on five cancer centers, they are giving the software away at no cost, and are glad to help anybody interested in getting it up and running. “One of the key challenges is that the users can’t control whether it gets deployed or not,” he said. “Our biggest challenge is not the users, who are contacting us and asking us for it, but getting this through local IT leadership, and that is a big hurdle.”

Why would CIOs be opposed to deploying this tool? “I think their plates are full and a lot of times people are looking for vendor solutions,” he surmised.  “I also think that often people don’t understand what the issues are. Some people think they will just get some off-the-shelf NLP software. But I can assure you that that software will not be able to do the kinds of things that EMERSE can do. That is partly because a lot of medical documents are not in natural language. Medical documents are anything but. They are a mess.”





Anthem Expands $500M Deal with IBM with Focus on IT Automation, AI

July 26, 2018
by Heather Landi

Health insurer Anthem has expanded its services agreement with technology leader IBM with a focus on using artificial intelligence (AI) and automation to improve operational efficiency and modernize technology platforms.

With this collaboration, Armonk, New York-based IBM and Indianapolis-based Anthem, one of the largest U.S. health insurance companies, will work together to help drive Anthem’s digital transformation and deliver an enhanced digital experience for its nearly 40 million consumers, Anthem said in a press release.

In 2015, Anthem entered into a five-year, $500 million strategic technology services partnership with IBM in which the technology giant provided operational services for Anthem’s mainframe and data center server and storage infrastructure. As part of that agreement, Anthem has been able to leverage IBM Cloud solutions to increase the ease, availability and speed of adding infrastructure to support new business requirements, the company said.

Under the expanded agreement, IBM will provide Anthem with enterprise services for its mainframe and data center server and storage infrastructure management. In addition, IBM will work with Anthem toward creating an AI environment that will allow for an automated infrastructure providing 24/7 digital capabilities. This will bring greater value and access to Anthem's consumers, care providers, and employees, Anthem said.

IBM and Anthem will also continue to work together on IT automation. Since 2015, the two companies have implemented over 130 bots, automating over 70 percent of the monthly high-volume repetitive tasks. This includes bots that can identify when a server is reaching capacity and shift workloads to other, less-utilized servers, ensuring that work is not impacted. This capability has improved systems availability as well as freed up resources to work on higher-value projects, Anthem said in a press release.

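Anthem has not published how its bots work, but a capacity-watching bot of the kind described can be sketched as a simple threshold rebalancer: when a server's utilization exceeds a limit, work shifts to the least-loaded peer. The server names, utilization figures and threshold below are invented:

```python
def rebalance(loads, threshold=0.8):
    """Shift load from servers above the utilization threshold to the
    least-utilized server, mimicking a capacity-watching bot."""
    moves = []
    for server, _ in sorted(loads.items(), key=lambda kv: -kv[1]):
        while loads[server] > threshold:
            target = min(loads, key=loads.get)   # least-loaded server
            if target == server:
                break
            # Move only as much as the target can absorb under the threshold.
            shift = min(loads[server] - threshold, threshold - loads[target])
            if shift <= 0:
                break
            loads[server] -= shift
            loads[target] += shift
            moves.append((server, target, round(shift, 2)))
    return moves

loads = {"app1": 0.95, "app2": 0.40, "app3": 0.55}
print(rebalance(loads), loads)
```

In practice such a bot would act on live monitoring metrics and migrate real workloads; the sketch only captures the decision rule of "detect over-threshold, pick an under-utilized target, shift."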
“We are seeing a dynamic change in the healthcare industry, requiring us to be more agile and responsive, utilizing advanced technology like AI to drive better quality and outcomes for consumers,” Tim Skeen, senior vice president and chief information officer, Anthem, Inc., said in a statement. “Our continued strategic partnership with IBM will help establish a stronger foundation for Anthem to respond to the changing demands in the market, deliver greater quality of services for consumers and help accelerate Anthem’s focus on leading the transformation of healthcare to create a more accessible, more affordable, more accountable healthcare system for all Americans.”

“The collaboration between IBM Services and Anthem has already laid the groundwork to improve healthcare processes and quality,” said Martin Jetter, senior vice president, IBM Global Technology Services. “Our latest agreement will accelerate Anthem’s growth strategy and continued leadership as one of the largest healthcare insurance companies and provide a solid path to bringing new efficiencies in driving digital transformation.”

