
The 2017 Healthcare Informatics Innovator Awards: Co-Second-Place Winning Team—Mercy-St. Louis

January 24, 2017
by David Raths
Machine learning bolsters Mercy Health’s care pathways process

As part of a long-term effort to improve operational efficiency, the St. Louis-based Mercy Health system has spent years developing clinical pathways—a way to identify best practices for high-cost procedures such as total knee replacements and systematize them across the organization. Although the nonprofit, 45-hospital group had some success with that approach, Mercy made even greater strides once it turned to a machine learning application that uses advanced analytics to help it identify hidden patterns in its own data.

“We had a great EHR and tons of data,” says Vance Moore, president of business integration at Mercy, speaking of the organization’s electronic health record. “We had tried a couple of different data-mining solutions, and they showed promise, but they weren’t giving us what we were looking for. We had to find the truth within our data.”

The data-driven approach appears to have done that. In one example, an original care pathway developed manually at Mercy reduced the cost of total knee replacements by 7 percent. But the machine-learning approach cut an additional 5 percent off the cost of knee replacement, while improving or maintaining low rates of mortality and morbidity across all cases.

For this big-data breakthrough, the editors of Healthcare Informatics have selected Mercy as the co-second-place winning team in the 2017 Innovator Awards program. 

Mercy, which is the fifth-largest Catholic healthcare system in the United States with operations in four states, has been unified on Epic Systems’ EHR for almost 10 years and has worked to integrate that data with nonclinical data for analytics purposes. “Our clinical data set is extremely rich, so we have been doing multiple projects to try to operationalize the opportunities that come out of that,” says Todd Stewart, M.D., vice president of clinical integrated solutions. Among those efforts were the first steps to standardize care processes and the creation of care pathways, including the establishment of governance structures to operationalize best practices. Each care pathway had a specialty council assigned to work with peers on identifying variances in care and working through common solutions where possible.

Photo: Health IT leadership team at the St. Louis-based Mercy Health system

Stewart also notes that Mercy keeps in touch with clinical leaders at other health systems working on the care pathways concept. “Our specialty council structure is modeled after work Mayo Clinic has been doing for years, and we have worked quite a bit with Intermountain Healthcare as well,” he adds.

Although the early work with care pathways was valuable, the executives noticed a few limitations holding them back. First, there were inefficiencies: the typical pathway took up to six months to develop, and physicians found it difficult to take time away from patient care to attend quality improvement meetings. Second, the pathways were vulnerable to the biases of the clinicians involved. The best practices they identified reflected their own clinical experience, but there was no way to tell whether those practices were backed up by patient data. Finally, at least 20 percent of Mercy clinicians failed to adopt the care pathways because they were skeptical of the process, since no internal data was available to back up the best practices.

It is one thing for an administrative team to look at a best practice or set up an expert panel and develop an optimal way to do something, Stewart says. “But anyone who has worked with a large group of physicians knows it is very difficult to motivate experienced clinicians who are driven by their own best practices and the way they were trained.” He says that they learned early on that they had to take a peer-driven approach. “If your peers are saying this is a better way to do a total knee replacement, and they are doing those procedures all day long, it is a different conversation than hearing it from an administrator who is just looking at data.”

Additionally, they had to show clinicians their own data, not industry-wide benchmark studies. “When you take that peer-to-peer process and combine it with our own data, and benchmark their results and costs against their peers internally, it is a very different discussion,” Stewart says. They can get down to the granular level of whether a scalpel tip that costs $100 more is really worth it.

In mid-2013, Mercy realized it had to find a better way to analyze its own data. Moore happened to be at a meeting in Silicon Valley, where he had a dinner conversation with Amy Chang, who was formerly in charge of Google Analytics. “I told her I have all this information, but I don’t know how to surface the truth out of it,” he recalls. She pointed him to a startup company called Ayasdi, founded by former Stanford University researchers, and explained that Ayasdi doesn’t start out with a theory and try to prove it; it starts with the unknown and surfaces patterns in your data that you should investigate. Moore set up a meeting with Ayasdi executives right away.

Ayasdi, which Healthcare Informatics profiled in 2016 as one of its “Up and Coming” companies, has created clinical variation management tools that leverage both machine learning and what it calls “topological data analysis” (TDA) to extract insights from millions of data points. TDA brings together machine learning with statistical and geometric algorithms to create compressed representations and visual networks that allow organizations to more easily explore critical patterns in their data.
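The article does not describe Ayasdi’s product internals, but the flavor of TDA it references is well documented in the “Mapper” construction published by the company’s Stanford founders: project the records through a lens function, cover that projection with overlapping intervals, cluster the records within each interval, and link clusters that share records to form a compressed network of the data. The sketch below is only an illustration of that general construction, assuming a PCA lens and DBSCAN clustering; the function name mapper_graph, its parameters, and the toy data are hypothetical and are not Mercy’s or Ayasdi’s implementation.

```python
# Illustrative Mapper-style sketch of topological data analysis (TDA).
# Hypothetical example only; Ayasdi's product is proprietary and not
# described in the article. Lens, clustering method, and parameters are
# arbitrary choices for demonstration.
import numpy as np
import networkx as nx
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN


def mapper_graph(X, n_intervals=10, overlap=0.3, eps=0.5, min_samples=5):
    """Build a compressed graph ("network") summary of the point cloud X."""
    # 1. Lens: project each record to one dimension (first principal component).
    lens = PCA(n_components=1).fit_transform(X).ravel()

    # 2. Cover the lens range with overlapping intervals.
    lo, hi = lens.min(), lens.max()
    length = (hi - lo) / n_intervals
    step = length * (1 - overlap)

    graph = nx.Graph()
    clusters = []  # list of (node_id, set of member row indices)
    node_id = 0
    start = lo
    while start < hi:
        end = start + length
        members = np.where((lens >= start) & (lens <= end))[0]
        if len(members) > 0:
            # 3. Cluster only the records whose lens value falls in this interval.
            labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X[members])
            for label in set(labels) - {-1}:  # drop DBSCAN noise points
                idx = set(members[labels == label])
                graph.add_node(node_id, size=len(idx))
                clusters.append((node_id, idx))
                node_id += 1
        start += step

    # 4. Connect clusters that share records; shared membership across the
    #    overlapping intervals stitches local clusters into a global shape.
    for i, (n1, idx1) in enumerate(clusters):
        for n2, idx2 in clusters[i + 1:]:
            if idx1 & idx2:
                graph.add_edge(n1, n2)
    return graph


if __name__ == "__main__":
    # Toy stand-in for de-identified encounter features (e.g., cost, length of stay).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])
    g = mapper_graph(X)
    print(g.number_of_nodes(), "nodes;", g.number_of_edges(), "edges")
```

In a care-pathway setting, each row would plausibly be a de-identified encounter, and densely connected regions of the resulting network would flag cohorts of similar cases worth a closer look; this is the kind of “compressed representation and visual network” the TDA description above refers to, not the specific pipeline Mercy ran.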


