
National Quality Forum Urges Providers Forward on Data and Analytics in Healthcare

August 14, 2015
by Mark Hagland
The National Quality Forum publishes a white paper analyzing how U.S. healthcare organizations can better leverage data to improve care delivery

On Aug. 6, the Washington, D.C.-based National Quality Forum released a white paper, “Data Needed for Systematically Improving Healthcare,” intended to highlight strategies to help make healthcare data and analytics “more meaningful, usable, and available in real time for providers and consumers.”

According to a press release issued on that date, “The report identifies several opportunities to improve data and make it more useful for systematic improvement. Specific stakeholder action could include the government making Medicare data more broadly available in a timely manner, states building an analytic platform for Medicaid, and private payers facilitating open data and public reporting. In addition, electronic health record (EHR) vendors and health information technology policymakers could promote ‘true’ interoperability between different EHR systems and could improve the healthcare delivery system’s ability to retrieve and act on data by preventing recurring high fees for data access.”

The press release noted further that “The report identifies actions that all stakeholders could take to make data more available and usable, including focusing on common metrics, ensuring that the healthcare workforce has the necessary tools to apply health data for improvement, and establishing standards for common data elements that can be collected, exchanged, and reported.”

The report emerged out of an initiative supported by the Peterson Center on Healthcare and the Gordon and Betty Moore Foundation, and spurred by a 2014 report by the President’s Council of Advisors on Science and Technology that called for systems engineering approaches to improve healthcare quality and value.

The press release included a statement by Christine K. Cassel, M.D., president and CEO of NQF. “Data to measure progress is fundamental to improving care provided to patients and their outcomes, but the healthcare industry has yet to fully capture the value of big data to engineer large-scale change,” Dr. Cassel said in the statement. “This report outlines critical strategies to help make data more accessible and useful, for meaningful systemwide improvement.”

Following the publication of the report, Rob Saunders, a senior director at the National Quality Forum and one of the report’s co-authors, spoke with HCI Editor-in-Chief Mark Hagland about the report and its implications for healthcare IT leaders. Below are excerpts from that interview.

What do you see as the most essential barriers to moving forward to capture and correctly use “big data” for clinical transformation and operational improvement in healthcare?

We looked at two buckets through this project. The first was the availability of data, and we’re seeing more availability of electronic data, but interoperability remains a major challenge. And it wasn’t just about interoperability between electronic health records, but also about being able to link in data from elsewhere.

Does that mean data from pharmacies, from medical devices, from wearables?

Some of it may be data from community health centers, or from folks offering home-based and community-based services, so that you get a broader picture of people’s health as they’re living their lives in their communities. And there are exciting things on the horizon, too, like wearable devices. But the first barrier we heard about was simply getting more availability of data. Perhaps the harder problem right now is actually using that data, and turning raw data into meaningful information that people can use. There is so much raw data out there, but so often it is not actionable or immediately usable for clinicians.

So what is the solution?

That is an excellent question. Unfortunately, there’s no silver bullet. We’ve looked at a wide range of possible solutions, but it will take action from healthcare organizations working to improve their internal capacity, for example by creating more training for clinicians to use data in their practices, and it will take action from state governments as well. I think it will require a lot of action from all the stakeholders around healthcare to make progress.


The white paper mentioned barriers involving information systems interoperability, data de-identification and aggregation, feedback cycles, data governance, and data usability issues. Let’s discuss those.

I think one of the challenges is that there are some big strategic issues and large national conversations around all of those, especially interoperability, but there are also a lot of technical details to iron out. And unfortunately, that’s not something we can solve tomorrow. But there’s opportunity with these new delivery system models, and that will hopefully be helpful.

How might all this play out with regard to ACOs, population health, bundled payments, and other new delivery and payment models?

What we’ve heard is that those new models are becoming increasingly common, and because of them, clinicians and hospitals have far more incentive to look holistically at the whole person, to think about improvement, and to really start digging into some of this data.