The ability to exchange and manage data effectively is being hampered by a lack of standards on multiple levels. Traditionally, healthcare organizations have either had to invest in systems from a single vendor that could provide a seamless transfer of information or write interfaces between best-of-breed systems. But the problem is more complex, say industry experts.
“Best-of-breed or single vendor? There is no best-of-breed that does everything,” says Russ Rudish, vice chairman and leader of Deloitte's U.S. Healthcare Provider Practice in New York. “There is no one solution.”
And even if you can get disparate systems to “talk” to each other, the data is essentially useless if it's not available to everyone who needs it — from clinicians at the point of care to those in the billing office. To ensure that information is readily available, Rudish says data warehousing and data mining are becoming more important not only as a way to improve patient care, but also as a way to maximize the financial strength of the organization.
“Through data mining, you get better outcomes and better commercialization on the part of organizations that own the data,” he says. “If you don't have the information, it's hard to run your business.”
Like other corporations, more and more hospitals are beginning to partner with companies that specialize in analyzing clinical and financial data to spot trends, says Rudish. By identifying areas of strength and weakness, a hospital is then able to make whatever changes are necessary to improve patient care or to bolster the bottom line.
However, the standards by which that data was collected may vary from system to system or hospital to hospital.
Most hospital CIOs and clinicians tend to focus on systems integration, believing that data standardization is the answer to better data management. But that's only part of the solution, says Liz Rockowitz, executive consultant with Weymouth, Mass.-based Beacon Partners Inc.
“Two systems can try to interface with HL7 but you could still have an issue with terminology,” she explains. “HL7 is great as an implementation guide. But it doesn't address the message or content.” True interoperability cannot be achieved without standardizing processes. “It's more than just the data,” she continues. “It's how it's being translated.” In addition, HL7 has user-defined fields. “As long as you have that, you don't have a standard,” she says.
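The terminology gap Rockowitz describes can be sketched in a few lines of code. Below, two HL7 v2 OBX (observation) segments are both syntactically valid, yet one identifies a serum glucose result with a site-specific local code and the other with its LOINC code; the local-to-LOINC crosswalk table is hypothetical, a stand-in for the mapping work that real interfaces require.

```python
# Two structurally valid OBX segments reporting the same test:
# one uses a local lab code ("L" coding system), the other LOINC ("LN").
obx_hospital = "OBX|1|NM|GLU^Glucose^L||98|mg/dL|||||F"
obx_commercial = "OBX|1|NM|2345-7^Glucose SerPl^LN||98|mg/dL|||||F"

# Hypothetical crosswalk from this site's local codes to LOINC
LOCAL_TO_LOINC = {"GLU": "2345-7"}

def normalize_code(obx_segment: str) -> str:
    """Return a shared (LOINC) code for the observation identifier."""
    fields = obx_segment.split("|")
    code, _name, system = fields[3].split("^")  # field 3: coded identifier
    if system == "LN":                          # already LOINC-coded
        return code
    return LOCAL_TO_LOINC.get(code, code)       # map local code if known

print(normalize_code(obx_hospital))    # 2345-7
print(normalize_code(obx_commercial))  # 2345-7
```

The point is that HL7 dictates where the code goes, not what the code is — without an agreed vocabulary and a maintained crosswalk, the two segments remain incomparable even though both interfaces "work."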
Marc Holland, research director for Framingham, Mass.-based Health Industry Insights, an IDC company, says the ultimate benefits derived from healthcare IT lie in the successful exchange of information between organizations, and that process standards should be the basis of data standards.
Holland believes that while it will take years for the industry to adopt standards for processes, it won't take long to develop data standards because efforts are already under way on the federal level and among some vendors.
But as an example of how important process standards are in exchanging information between organizations, Holland says, “A lab test done in a commercial lab and a lab test done in the hospital have to be the same. The identification of panels from one lab to another is critical.”
This is especially true for the successful deployment of regional health information organizations (RHIOs) or the proposed National Health Information Network (NHIN).
Rockowitz suggests forming a consortium that would address process standardization. “We have to continue to put pressure on the vendors,” she says.
There is, however, a glimmer of hope that standardizing processes won't take as long as many predict. “People are starting to dump their data into third-party repositories where the data is scrubbed,” she says. “We're seeing this as a trend.” But she adds that while this solution does normalize data, it adds another layer on top of existing applications.
One organization that is at the forefront of developing this type of solution is Salt Lake City, Utah-based Intermountain Healthcare. “We've been running normalized data for 40 years,” says CIO Marc Probst. “We are a very low-cost healthcare organization, and we've been able to achieve that by understanding our data and doing analytics.”
By teaming up with GE Healthcare, Intermountain has been building a home-grown system that stores and normalizes data (See Sidebar).
“There's way too much textual data stored in databases,” Probst says. “Doing on-the-fly decision support is nearly impossible if you don't have coded data. But you have to get it to one standard.”
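Probst's point can be illustrated with a minimal sketch: a decision-support rule is trivial to run against coded data and brittle against free text. The code value and threshold below are illustrative, not Intermountain's.

```python
# The same observation stored two ways
free_text_note = "pt glucose elevated, around one-eighty this am"
coded_result = {"code": "2345-7", "value": 180.0, "unit": "mg/dL"}

def flag_hyperglycemia(result: dict) -> bool:
    """On-the-fly rule: fires only because the data is coded.
    Illustrative threshold; not a clinical guideline."""
    return result["code"] == "2345-7" and result["value"] > 140.0

print(flag_hyperglycemia(coded_result))  # True
# There is no equally reliable check for free_text_note without
# natural-language processing, and even then the answer is uncertain.
```

This is why the textual data Probst mentions resists on-the-fly decision support: the rule has nothing structured to match against.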
The Bush administration's push to get all healthcare organizations to adopt EMRs and to establish the NHIN by 2014 may offer enough incentive to develop data and process standards. Or it may not. “Where's the requirement for adoption?” asks Rockowitz. Unlike the Health Insurance Portability and Accountability Act, which was mandated by law, these initiatives and efforts to develop industry standards are being done voluntarily, she says.
Rudish says the government needs to take a more active role because of Medicare expenditures. “Because the government pays most of the freight, they should have most of the say, as should employers.”
Holland agrees. “If you want standards, mandate them tied to Medicare.”