Talking Semantic Interoperability with the VA’s Keith Campbell | David Raths | Healthcare Blogs

Talking Semantic Interoperability with the VA’s Keith Campbell

December 18, 2017
SOLOR project seeks collaborators with its developer launch
I am one of those who have touted HL7 FHIR’s potential to solve some of the industry’s interoperability woes. But is focusing on FHIR like starting to build a skyscraper on the third floor? That’s what Keith Campbell, director of informatics architecture at the U.S. Veterans Health Administration, argues. His focus — on the ground floor — is semantic interoperability so that health systems don’t have to keep creating maps between different terminologies. 
 
“FHIR provides a level of interoperability for APIs and consumption of transports that is way beyond where HL7 v3 was, so in some regards you can’t sing high enough praises of FHIR,” Campbell said in a recent interview. “But some people feel it solves all the problems. It doesn’t solve the semantic interoperability problem. It solves a problem of how do I open a socket and how do I send a standard resource.”
 
In July 2016 I wrote a news story about the nonprofit Healthcare Services Platform Consortium’s sponsorship of an open source effort to improve semantic interoperability between SNOMED, LOINC and RxNorm with a project called SOLOR (SnOmed LOinc, Rxnorm). Now the project, which Campbell is leading, is mature enough that HSPC is opening the doors to developers to collaborate on SOLOR.
 
I asked Campbell to describe the problem SOLOR is trying to solve. He explained that SNOMED, LOINC and RxNorm are important Meaningful Use standards, but where they overlap with no clean separation of concerns, each health system has to map the data elements defined in its own EHR to these standard terminologies. That makes it very difficult to share things like clinical decision support modules across health systems, such as between the VA and the Department of Defense.
 
“The VA has a desire to try to make things simpler,” Campbell said. “It is just too complicated today. We have seen issues related to how data is managed in the industry that have brought harm to veterans, quite frankly. And yet it just seems to be an accepted status quo. That is one of the things we are trying to push with SOLOR.”
 
He said they sought to come up with a moniker that was acceptable and put everyone on equal standing with regard to recognition of their unique contributions. The goal is not to get LOINC to go away in preference of SNOMED or to get SNOMED to go away in preference of LOINC. “We are really just trying to make it all work,” he said. “These are really important Meaningful Use standards, yet pulling them all together is an exercise left to the end user.”
 
Campbell stressed that the problem is not unique to the VA and DoD. “Let’s say the VA was trying to do a data-sharing initiative with Kaiser Permanente, so veterans could choose to have their care in the private sector,” he said. “If we wanted to exchange data, we would have to go through a very similar mapping exercise with any private care partner. This is a problem in the industry.” 
 
If, through these integrations and mappings, you fail to identify things that should be equivalent, he said, the potential consequences are as serious as any other medical complication.
 
Beyond the consequences for patient safety, the mapping burden hinders the ability to establish an industry, he noted. “If you can’t create a knowledge support or decision support product that can go from organization to organization with fairly low friction of adoption, that is a problem in my book.” If people implement the common foundation of SOLOR, he said, decision support vendors wouldn’t have to depend on a custom one-off integration with each enterprise’s unique interpretation of Meaningful Use standards.
 
Semantic data interoperability is a complicated topic, and how SOLOR works gets pretty technical. For those interested, a white paper on the SOLOR website goes into detail. It describes SOLOR as a way to integrate terminology content in a single model: the integrated terminologies are transformed into a common model. “During transformation, the content from one or more terminologies populate the common model. Each SOLOR data element retains the original identifiers and additionally provides a common UUID (universally unique identifier)-based identifier.” A SOLOR Viewer app will allow a user to import, transform, and view SNOMED, LOINC, and RxNorm in SOLOR’s common model.
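The core idea in that quoted passage can be sketched in a few lines of Python. This is only an illustration of the pattern, not SOLOR's actual implementation: the namespace UUID, function name, and dictionary layout below are invented for the example, and the SNOMED CT and LOINC codes shown are real codes for systolic blood pressure used purely as sample inputs.

```python
import uuid

# Hypothetical namespace for illustration only -- not SOLOR's actual namespace.
EXAMPLE_NAMESPACE = uuid.UUID("00000000-0000-0000-0000-000000000001")

def to_common_model(source_terminology, native_id, description):
    """Wrap a native terminology code in a common-model element that keeps
    the original identifier and adds a deterministic UUID-based identifier."""
    return {
        # uuid5 is deterministic: the same terminology + code always yields
        # the same UUID, so no lookup table is needed to regenerate it.
        "common_uuid": str(uuid.uuid5(EXAMPLE_NAMESPACE,
                                      f"{source_terminology}:{native_id}")),
        # The original identifier is retained alongside the common one.
        "native_ids": [{"terminology": source_terminology, "id": native_id}],
        "description": description,
    }

# Two elements from different terminologies, each keeping its native code
# while gaining a common-model identifier.
snomed_bp = to_common_model("SNOMED CT", "271649006", "Systolic blood pressure")
loinc_bp = to_common_model("LOINC", "8480-6", "Systolic blood pressure")
```

The point of the deterministic identifier is that equivalent elements loaded at different times or by different systems resolve to the same common ID without a proprietary mapping step, which is the friction SOLOR aims to remove.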
 
“We are getting mature enough and getting enough consensus that we are opening the doors for other people to collaborate and work on it,” Campbell said. “Our goal is that within a year we would have something that is of equivalent quality to what people are actually using in practice today. Within the VA, we would be working to push it into production systems and also make it available so that others can contribute to it and make their own choices about whether they want to continue with a proprietary mapping process or whether to go more with an integrated terminology model with a standard extension model.”
 
I asked Campbell if EHR vendors would be leaders or followers in this effort. “If semantic interoperability were really important to the large EHR vendors, you would think they would have solved that problem up front,” he said. “They get no revenue from solving the data interoperability problem.”
 
He believes the big health delivery organizations that are moving to value-based care will ultimately make the difference. “Part of getting better value out of their data,” he said, “is having it be better normalized.”
 
 
 
 


EHR-Compatible Pharmacist Care Plan Standard Opens the Door to Cross-Setting Data Exchange

September 14, 2018
by Zabrina Gonzaga, R.N., Industry Voice
Pharmacists drive information sharing towards quality improvement

Pharmacists work in multiple environments—community, hospital, long-term care, clinics, retail stores, etc.—and consult with other providers to coordinate a patient’s care. They work with patients and caregivers to identify goals of medication therapy and interventions needed, and to evaluate patient outcomes. Too often, pharmacy data is trapped in a silo, unavailable to other members of the care team or duplicated manually in disparate systems, which increases clinical workloads without adding value.

To address these issues, Lantana Consulting Group and Community Care of North Carolina (CCNC) developed an electronic document standard for pharmacist care plans—the HL7 Pharmacist Care Plan (PhCP). The project was launched by a High Impact Pilot (HIP) grant to Lantana from the Office of the National Coordinator for Health Information Technology (ONC).

Before the PhCP, pharmacists shared information through paper care plans or by duplicative entry of medication reconciliation and drug therapy problem information into external systems. This documentation was not aligned with the in-house pharmacy management system (PMS). Integrating the PhCP with pharmacy software allows this data to flow into a shared care plan, letting pharmacists use their local PMS to move beyond simple product reimbursement and compile the information needed for quality assurance, care coordination, and scalable utilization review.

The PhCP standard addresses high-risk patients with co-morbidities and chronic conditions who often take multiple medications that require careful monitoring. Care plans are initiated for patients identified as high risk, with complex medication regimens flagged in a comprehensive medication review. The PhCP is a standardized, interoperable document that allows pharmacists to capture shared decisions related to patient priorities, health concerns, goals, interventions, and outcomes. The care plan may also contain information related to individual health and social risks, planned interventions, expected outcomes, and referrals to other providers. Because the PhCP is integrated into the PMS or adopted by a software vendor (e.g., a care management, chronic disease management, or web-based documentation system), pharmacists can pull this information into the PhCP without redundant data entry.


The PhCP allows pharmacists, for the first time, to share information with care teams and paves the way for them to support value-based payment. The project goals align with the Centers for Medicare & Medicaid Services’ (CMS’) value-based programs, which are part of the Meaningful Measures framework: improved care team collaboration, better health for individuals and populations, and lower costs.

Scott Brewster, Pharm.D., at Brookside Pharmacy in East Tennessee, described the PhCP as a tool that helps them enhance patient care delivery. “From creating coordinated efforts for smoking cessation and medication utilization in heart failure patients, to follow-up on recognized drug therapy problems, the eCare plan gives pharmacists a translatable means to show their value and efforts both in patient-centered dispensing and education that can reduce the total cost of care.” (“eCare plan,” the term Brewster uses, is the local name for their adoption of the PhCP.)

The pilot phase of the project increased interest in exchanging PhCPs within CCNC’s pharmacy community and among pharmacy management system (PMS) vendors. The number of vendors seeking training on the standard rose from two to 22 during the pilot. Approximately 34,000 unique care plans have been shared with CCNC since the pilot launch.

This precedent-setting pilot offered two pharmacy care plan specifications: one based on the Care Plan standard in Clinical Document Architecture (CDA), the other a CDA-on-FHIR (Fast Healthcare Interoperability Resources) specification that directly transforms information shared using the FHIR standard into CDA. FHIR is more straightforward to implement than CDA, so this is an appealing option for facilities not already using CDA. The dual offerings, CDA and CDA-on-FHIR with lossless transforms, provide choice for implementing vendors while giving CCNC consistent utility.
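To make the "transforms FHIR into CDA" idea concrete, here is a deliberately tiny sketch of a one-way mapping from a FHIR-style JSON resource to a CDA-style XML fragment. Everything here is an assumption for illustration: the field pairings, the function name, and the drastically reduced document structure are invented, and real PhCP transforms cover the full HL7 specifications losslessly rather than the two fields shown.

```python
import xml.etree.ElementTree as ET

# Hypothetical, greatly simplified FHIR CarePlan fragment for illustration.
fhir_care_plan = {
    "resourceType": "CarePlan",
    "status": "active",
    "description": "Comprehensive medication review follow-up",
}

def fhir_to_cda_fragment(plan):
    """Sketch of mapping two FHIR CarePlan fields into CDA-style XML.
    Real transforms map every element so no information is lost."""
    doc = ET.Element("ClinicalDocument")
    title = ET.SubElement(doc, "title")
    title.text = plan["description"]          # FHIR description -> CDA title
    status = ET.SubElement(doc, "statusCode")
    status.set("code", plan["status"])        # FHIR status -> CDA statusCode
    return ET.tostring(doc, encoding="unicode")
```

The appeal of the dual-specification approach is visible even in this toy: a vendor can emit the simpler JSON resource, and a lossless transform produces the CDA document that CCNC and CDA-based partners already consume.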

What’s on the horizon for the pharmacy community and vendors? With the support of the National Community Pharmacists Association (NCPA), the draft standards will go through the HL7 ballot process toward eventual publication for widespread implementation and adoption by vendors. The project will make clinical information available to CCNC and provide a new tool for serving patients with long-term needs in the dual Medicare-Medicaid program and the Medicaid-only program. This is a story of a successful Center for Medicare and Medicaid Innovation (CMMI)-funded project that started out as a state-wide pilot and is now rolling out nationwide as Community Pharmacy Enhanced Services Network (CPESN) USA.

The PhCP is based on a CDA Care Plan standard that is part of ONC’s Certified EHR Technology requirements, so it can be readily implemented into EHRs. This makes the pharmacist’s plan an integral part of a patient’s record wherever they receive care. 

Adoption of the PhCP brings pharmacies into the national health information technology (HIT) framework and electronically integrates pharmacists into the care planning team, a necessary precursor to new payment models and health care reform. In addition, receiving consistently structured and coded pharmacy care plans can augment data analysis, going beyond product reimbursement to make data available for utilization review, quality assurance, and care coordination.

Troy Trygstad, vice president for Pharmacy Provider Partnerships at CCNC, described the strategic choice now available to pharmacists and PMS vendors. “Fundamentally, pharmacy will need to become a services model to survive. Absent that transformation, it will become a kiosk next door to the candy aisle. The reason vendors are buying into the PhCP standard for the first time ever is that their clients are demanding it for the first time ever.”

The move to value-based payment will continue to drive the need for pharmacists, as part of care teams, to provide enhanced care including personal therapy goals and outcomes. Sharing a medication-related plan of care with other care team members is critical to the successful coordination of care for complex patients.

Zabrina Gonzaga, R.N., is principal nurse informaticist and director of health informatics at Lantana Consulting Group and led the design and development of the PhCP standard. 

Email:  zabrina.gonzaga@lantanagroup.com

Twitter: @lantana_group

 



Health IT Now Pushes for Information Blocking Regulation, Says Administration “Must Uphold its End of the Bargain”

September 13, 2018
by Rajiv Leventhal, Managing Editor

The executive director of Health IT Now, a coalition of healthcare and technology companies, is again criticizing the Trump administration for not yet publishing any regulation on information blocking, as required by the 21st Century Cures Act legislation.

In an op-ed published recently in STAT, Health IT Now’s Joel White wrote, “More than 600 days after the enactment of the Cures Act, not a single regulation has been issued on information blocking.” White added in frustration, “Health IT Now has met with countless officials in the Trump administration who share our commitment to combat information blocking. But those sentiments must be met with meaningful action.”

The onus to publish the regulation falls on the Office of the National Coordinator for Health IT (ONC), the health IT branch of the federal government that is tasked with carrying out specific duties that are required under the 21st Century Cures Act, which was signed into law in December 2016. Some of the core health IT components of the Cures legislation include encouraging interoperability of electronic health records (EHRs) and patient access to health data, discouraging information blocking, reducing physician documentation burden, as well as creating a reporting system on EHR usability.

The information blocking part of the law has gotten significant attention since many stakeholders believe that true interoperability will not be achieved if vendors and providers act to impede the flow of health data for proprietary reasons.

But ONC has already delayed regulation around information blocking a few times, though during an Aug. 8 episode of the Pulse Check podcast from Politico, National Coordinator for Health IT Donald Rucker, M.D., said that the rule is “deep in the federal clearance process.” More recently, a bipartisan amendment to the U.S. Senate’s Department of Defense and Labor, Health and Human Services, and Education Appropriations Act for Fiscal Year 2019 included a requirement for the Trump administration to provide Congress with an update by September 30.

White, in the STAT piece, noted a June Health Affairs column in which Rucker suggested that implementation of the law’s information blocking provisions would occur “over the next few years.” White wrote that this is “a vague timeline that shows little urgency for combating this pressing threat to consumer safety and stumbling block to interoperability.”

Health IT Now is not alone in its belief that the rule should have been published by now, nor is it the first time the group has raised the issue.
