
National Coordinator Donald Rucker, M.D. Assesses Progress Towards Interoperability and the Development of Open APIs

May 21, 2018
by Mark Hagland
Donald Rucker, M.D. spoke recently about the progress towards interoperability, a key administration policy objective

Donald Rucker, M.D., who was named National Coordinator for Health IT in the spring of 2017, has been focusing his tenure as National Coordinator on a handful of top policy priorities for the Trump administration, among them, interoperability across the U.S. healthcare system, via, among other elements, the Trusted Exchange Framework and Common Agreement (TEFCA), and the overall 21st Century Cures Act of 2016, out of which TEFCA emerged; and reducing the administrative burden on physicians and other providers, as articulated in recent speeches at major healthcare conferences by Alex Azar, Secretary of Health and Human Services, and Seema Verma, Administrator of the Centers for Medicare and Medicaid Services (CMS).

Dr. Rucker continues to speak at all the major industry conferences at which national health IT leaders gather, including at the annual HIMSS Conference. And he recently spoke with Healthcare Informatics Editor-in-Chief Mark Hagland to talk about his current efforts, and how he sees industry evolution towards greater interoperability and healthcare system transformation. Below are excerpts from that interview.

What are the main areas you’ve been working on recently, Dr. Rucker?

We’re still working on the things that I’ve talked about publicly—the rulemaking in process on open APIs, information-blocking, TEFCA; and we’re sorting out what we can do on burden reduction. We’re still pretty much working on the exact same things that we had been working on before.

Donald Rucker, M.D.



What are you putting the most energy into right now?

Obviously, in aggregate, 21st Century Cures has a number of specific provisions. By law, we have to work on all of them, and by law and federal rulemaking processes, they’re sort of going to come out together. That’s in the list of upcoming rules. So they sort of come out as a pack; and they all feed off each other on some level; some feed off each other directly. The focus on open APIs, and preventing information-blocking, sort of mutually reinforce each other. And clearly, that was the explicit intent of Congress. The broader intent, probably of the prior administration and certainly of this administration, is for us to get more for what we pay for in healthcare. It’s no secret that we’re not getting a good deal in American healthcare. And certainly, that idea picked up a lot of steam with Don Berwick and the IHI [Donald M. Berwick, M.D., the former CMS Administrator and president emeritus of, and senior fellow at, the Cambridge, Mass.-based Institute for Healthcare Improvement], and similar efforts. And, I’m not a pollster, but I’m told that’s a major issue in voters’ minds. All of us are seeing our prices rising, as consumers.

And with all of that going on, one of the big opportunities, I think, and the White House thinks—Jared Kushner in the Office of American Innovation, and Secretary Azar and Administrator Verma, all think, and certainly I do as an IT person—everybody sees that IT is one of the potential keys to the kingdom, in terms of rethinking business models, and achieving accountability. Right now, as a provider, there is no broad-based, computational capability around what you do. You may have to provide a narrowly scoped set of quality measures; and private payers may ask for some specific data downloads; but there’s no clear interoperability standard to look at the overall performance of providers, with any of the modern computer science tools we hear about—AI, machine learning, big data. You can talk about big data all you want, but if there’s no computational interface, it remains limited, often right now to individual providers. Ultimately, we’ll need to get data out from all providers simultaneously, to be able to shop intelligently for care, identify disease outbreak threat vectors, etc. So those are some of the things the senior federal healthcare officials are thinking about.

Secretary Azar spoke about hospital pricing transparency, and the potential for direct physician contracting, in his keynote address at the World Health Care Congress at the beginning of this month. And he and Administrator Verma both spoke of the freeing of data to support the new healthcare, particularly to support healthcare consumers.

Yes, and as part of the search for value and empowering consumers to shop for their care, clearly, it’s very hard to shop for anything if you don’t know the costs. There’s a broader desire to empower patients with information, whether it’s through Blue Button 2.0 or anything else. And as I understand it, at CMS, they already use Chargemasters to build up their cost baskets; so there is data already in the public process. So this information is there. And what the CMS news release said is that they want to make this information that’s already on the public record, electronically actionable. It’s one thing to make something ‘available,’ and another thing to make it easily accessible. There are a number of [vendor] companies that are working on price transparency; and the assumption is that they or other groups would combine this with other consumer information. Who will end up doing this or succeeding with this, is to be determined. You’re starting to see some of the major payers explore the world of apps, and fronting those directly to their insureds. So that’s going to be part of the fabric of price transparency.

Physicians will say they don’t have the IT infrastructure to take on direct contracting right now. Do you see that as a potential challenge to direct contracting?

Adam Boehler, the new head of CMMI [the Center for Medicare and Medicaid Innovation] is a very smart guy. My guess is that there will be a lot of things going on. Secretary Azar is bringing on experts in some of these various pricing areas; it’s an ongoing thing. I don’t have any comments on specific payment models. But it’s certainly embedded in the design of CMMI.

Would you agree that physicians will have to step up pretty quickly in terms of upgrading their IT capabilities, in order to participate in the new healthcare? Many feel they simply are not in a position to be fully capable, in terms of their IT infrastructure, of participating in some of the activities they’ll be required to participate in going forward.

It’s an interesting question, and I’m going to put on my MBA hat here. It’s a very interesting question what the natural scale of the business is or should be. Right now, it varies from solo practitioners to dominant IDNs who are hiring primary care docs, right? That’s a pretty broad range of scale. And it sets healthcare apart from many other industries. So I don’t think we really know the range of optimum scaling factors here; part of what determines the optimum can be IT, but, very big caveat here, that does not mean that it needs to be a big enterprise software system, right? It may just need to be on the cloud. Uber and Lyft have massive IT infrastructures, but that doesn’t mean that individual drivers need massive infrastructures. So there’s no simple answer to that question, because I think that the IT can scale to different levels. Now, as a practical matter, in the U.S., various favored layers of scale have probably been created by the government or market, by implication.

When you look at the current trajectory of the development of open APIs, would you say that that development is not moving fast enough? How do you see it?

I think you need to focus on one key element, the phrase “without special effort,” in the Cures Act. The language says, not just open APIs, but “open APIs without special effort.” I can write code and expose my function calls to the world, and call it an open API. And while it is technically open in that I haven’t locked it down, it’s not really usable. If somebody wants to use it, what data is even behind there? Why would I do it? It has no component of operational transparency. So you could imagine that kind of strategy maybe working for one or two of the largest vendors, but it doesn’t have public good to it.
“Without special effort” means you must use industry standards that allow ordinary developers to access your technology using normal tools. There are a bunch of interop tools, such as IHE, DICOM, etc.; but on a go-forward basis—and this is clearly tied to consumer sovereignty, patients getting their chart on a smartphone—Cures says, “open API without special effort.”

So ONC has to figure out, OK, what does that mean, in a rule? How do we determine which APIs meet that criterion? The open API without special effort, when you think about the app world—every app that you have that uses mapping tools, like Google Maps, they’re using a technology called RESTful [the representational state transfer architectural style]. And if you look really fast, you can get the outbound string, but you’ll mostly now see the prettified URL. That API of the app economy is what we want to bring into healthcare. That API has three major components, besides connectivity to the Internet. They’re using the RESTful gets and puts; the data standard, typically JSON [JavaScript Object Notation open-standard file format], a really simple way of representing data; and then, to get these data structures into the language of healthcare, FHIR resources are put on top of that. That’s the modern stack. It’s sort of hard to come up with anything besides that to meet that criterion. So we’re sorting out how best to say that.
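The three-layer stack described here—RESTful verbs, JSON payloads, FHIR resources on top—can be sketched in a few lines. The server URL and patient ID below are invented for illustration; a real FHIR server would answer a GET on a URL of this shape with a JSON body like the trimmed Patient resource shown.

```python
import json

# Hypothetical FHIR server base URL -- illustrative only.
FHIR_BASE = "https://fhir.example.org"

def patient_url(patient_id: str) -> str:
    """Build the RESTful URL for a FHIR Patient read interaction (HTTP GET)."""
    return f"{FHIR_BASE}/Patient/{patient_id}"

# Layer 2 (JSON) + layer 3 (FHIR): the response body is plain JSON whose
# structure is defined by the FHIR Patient resource. Trimmed to a few fields.
sample_response = json.loads("""
{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": "Doe", "given": ["Jane"]}]
}
""")

assert patient_url("123") == "https://fhir.example.org/Patient/123"
assert sample_response["resourceType"] == "Patient"
```

The point of the sketch is that nothing healthcare-specific lives in the transport: an ordinary web developer with ordinary HTTP and JSON tooling only needs the FHIR resource definitions to make sense of the payload.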

In terms of development, there are already small companies using that. The Argonaut Project put the implementation guide together. And you saw a couple of months ago Apple announcing that they’re using the FHIR stack to fuel their health app. There was a very nerdy article on machine learning and neural networks written by 30 authors from Google, UCSF, Stanford, and the University of Chicago recently [“Opportunities and obstacles for deep learning in biology and medicine,” Travers Ching et al, Journal of the Royal Society Interface, published online April 4, 2018], and they were training neural networks on hundreds of thousands of data fields per patient, and representing all of that as FHIR. So I would submit that the first step across the dance floor has been taken. We just heard from someone who said that the morning after Apple made that announcement, probably 300 hospital CIOs were called into their CEOs’ offices to ask what they were doing. Who knows how many it really was? But I think people get it. They obviously know something’s going on. So I think some of this progress may be so fast that it will be even faster than the timelines that we embed in rulemaking.

The recent announcements from ONC and HHS indicate that you’ve moved on from meaningful use—am I correct?

The program has morphed. And again, the meaningful use stuff is a CMS construct rather than an ONC construct, just to be clear. ONC does the certification; CMS incorporates certified EHRs [electronic health records] into their rulemaking, but sets all of the parameters around MU. It’s no secret that the big focus of this is now promoting interoperability. Clearly, we’re trying to reduce some of these burdens. Congress, in the [continuing resolution funding the federal government that was passed in February of this year], passed a rider saying there didn’t need to be an ongoing escalation of certification requirements. And the line in the HITECH Act saying essentially that each version had to be more or less more stringent—I’m not sure the precise wording—was eliminated in the last go-around of budget development, as a line item. So you’ll see a focus on promoting interoperability. Now, the certification act, the HITECH Act, is still there; but our go-forward focus is really on interoperability for patient empowerment and price transparency.

Is there anything you’d like to add?

In the discussions around transparency, it’s also important to understand that we need transparency into the services provided. Transparency is a broad narrative: there’s price transparency, and there’s transparency into what was provided. And this same transparency can help markets lower costs and help develop a learning healthcare system. And at the center of all that is open APIs.




EHR-Compatible Pharmacist Care Plan Standard Opens the Door to Cross-Setting Data Exchange

September 14, 2018
by Zabrina Gonzaga, R.N., Industry Voice
Pharmacists drive information sharing towards quality improvement

Pharmacists work in multiple environments—community, hospital, long-term care, clinics, retail stores, etc.—and consult with other providers to coordinate a patient’s care. They work with patients and caregivers to identify goals of medication therapy and interventions needed, and to evaluate patient outcomes. Too often, pharmacy data is trapped in a silo and unavailable to other members of the care team, or duplicated manually in disparate systems, which increases clinical workloads without adding value.

To address these issues, Lantana Consulting Group and Community Care of North Carolina (CCNC) developed an electronic document standard for pharmacist care plans—the HL7 Pharmacist Care Plan (PhCP). The project was launched by a High Impact Pilot (HIP) grant to Lantana from the Office of the National Coordinator for Health Information Technology (ONC).

Before the PhCP, pharmacists shared information through paper care plans or by duplicative entry into external systems of information related to medication reconciliation and drug therapy problems. This documentation was not aligned with the in-house pharmacy management system (PMS). The integration of the PhCP with the pharmacy software systems allows this data to flow into a shared care plan, allowing pharmacists to use their local PMS to move beyond simple product reimbursement and compile information needed for quality assurance, care coordination, and scalable utilization review.

The PhCP standard addresses high-risk patients with co-morbidities and chronic conditions who often take multiple medications that require careful monitoring. Care plans are initiated for patients identified as high risk, with complex medication regimens identified in a comprehensive medication review. The PhCP is a standardized, interoperable document that allows pharmacists to capture shared decisions related to patient priorities, health concerns, goals, interventions, and outcomes. The care plan may also contain information related to individual health and social risks, planned interventions, expected outcomes, and referrals to other providers. Since the PhCP is integrated into the PMS or adopted by a software vendor (e.g., a care management, chronic disease management, or web-based documentation system), pharmacists can pull this information into the PhCP without redundant data entry.
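As a rough illustration of where those elements land in a FHIR-based representation, here is a heavily trimmed, hypothetical CarePlan resource. The field names follow the FHIR CarePlan resource (health concerns live in `addresses`, interventions in `activity`), but every value is invented and the real PhCP specification defines far more structure than this sketch shows.

```python
import json

# Hypothetical, heavily trimmed FHIR CarePlan showing where a pharmacist
# care plan's core elements would live. All values are invented.
pharmacist_care_plan = {
    "resourceType": "CarePlan",
    "status": "active",
    "intent": "plan",
    "subject": {"reference": "Patient/example"},
    # Health concerns the plan responds to:
    "addresses": [{"display": "Uncontrolled hypertension"}],
    # Shared goals agreed with the patient:
    "goal": [{"display": "Blood pressure below 130/80 within 90 days"}],
    # Planned interventions:
    "activity": [
        {"detail": {"status": "scheduled",
                    "description": "Comprehensive medication review"}}
    ],
}

# Serializing to JSON is all a PMS would need to do to hand the plan
# to another system that speaks FHIR.
print(json.dumps(pharmacist_care_plan, indent=2))
```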



The PhCP allows pharmacists, for the first time, to share information with care teams, and paves the way for them to support value-based payment. The project goals align with the Centers for Medicare & Medicaid Services’ (CMS) value-based programs, which fall under the Meaningful Measures Framework: improved care team collaboration, better health for individuals and populations, and lower costs.

Scott Brewster, Pharm.D., at Brookside Pharmacy in East Tennessee, described the PhCP as a tool that helps them enhance patient care delivery. “From creating coordinated efforts for smoking cessation and medication utilization in heart failure patients, to follow up on recognized drug therapy problems, the eCare plan gives pharmacists a translatable means to show their value and efforts both in patient-centered dispensing and education that can reduce the total cost of care.” (The “eCare plan” referenced by Scott Brewster is the local term used in their adoption of the PhCP.)

The pilot phase of the project increased interest in exchanging PhCPs within CCNC’s pharmacy community and among pharmacy management system (PMS) vendors. The number of vendors seeking training on the standard rose from two to 22 during the pilot. Approximately 34,000 unique care plans have been shared with CCNC since the pilot launch.

This precedent-setting pilot design offered two pharmacy care plan specifications: one based on the Care Plan standard in Clinical Document Architecture (CDA); the other, a CDA-on-FHIR (Fast Healthcare Interoperability Resources) specification. The latter directly transforms information shared using the FHIR standard into CDA. FHIR is more straightforward to implement than CDA, so this is an appealing option for facilities not already using CDA. The dual offerings—CDA and CDA-on-FHIR with lossless transforms—provide choice for implementing vendors while allowing consistent utility to CCNC.
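A toy sketch of the transform idea: carry a single FHIR CarePlan goal into a CDA-style XML fragment. The element names below are simplified stand-ins, not the actual CDA template; the real mapping is defined by the implementation guides, and losslessness requires a defined round-trip for every element, not just this one.

```python
import xml.etree.ElementTree as ET

# Illustrative only: map one FHIR goal (a JSON-style dict) into a
# CDA-flavored XML fragment. Element names are simplified stand-ins.
def goal_to_cda(fhir_goal: dict) -> str:
    entry = ET.Element("entry")
    obs = ET.SubElement(entry, "observation")
    text = ET.SubElement(obs, "text")
    text.text = fhir_goal["display"]  # carry the goal narrative across
    return ET.tostring(entry, encoding="unicode")

xml_fragment = goal_to_cda({"display": "Reduce A1c below 7%"})
assert "Reduce A1c below 7%" in xml_fragment
```

The design point is that because both representations are structured, the transform can be mechanical; a facility can author in whichever format its systems already speak and still exchange an equivalent document.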

What’s on the horizon for the pharmacy community and vendors? With the support of the National Community Pharmacists Association (NCPA), the draft standards will go through the HL7 ballot process for eventual publication for widespread implementation and adoption by vendors. This project will make clinical information available to CCNC and provide a new tool for serving patients with long-term needs in the dual Medicare-Medicaid program and Medicaid-only program. This is a story about a successful Center for Medicare and Medicaid Innovation (CMMI)-funded project that started out as a statewide pilot and is now rolling out nationwide as the Community Pharmacy Enhanced Services Network (CPESN) USA.

The PhCP is based on a CDA Care Plan standard that is part of ONC’s Certified EHR Technology requirements, so it can be readily implemented into EHRs. This makes the pharmacist’s plan an integral part of a patient’s record wherever they receive care. 

Adoption of the PhCP brings pharmacies into the national health information technology (HIT) framework and electronically integrates pharmacists into the care planning team, a necessary precursor to new payment models and health care reform. In addition, receiving consistently structured and coded pharmacy care plans can augment data analysis by going beyond product reimbursement to making data available for utilization review, quality assurance, and care coordination.

Troy Trygstad, vice president for Pharmacy Provider Partnerships at CCNC, described the strategic choice now available to pharmacists and PMS vendors. “Fundamentally, pharmacy will need to become a services model to survive. Absent that transformation, it will become a kiosk next door to the candy aisle. The reason vendors are buying into the PhCP standard for the first time ever is that their clients are demanding it for the first time ever."

The move to value-based payment will continue to drive the need for pharmacists, as part of care teams, to provide enhanced care including personal therapy goals and outcomes. Sharing a medication-related plan of care with other care team members is critical to the successful coordination of care for complex patients.

Zabrina Gonzaga, R.N., is principal nurse informaticist and director of health informatics at Lantana Consulting Group and led the design and development of the PhCP standard. 


Twitter: @lantana_group




Health IT Now Pushes for Information Blocking Regulation, Says Administration “Must Uphold its End of the Bargain”

September 13, 2018
by Rajiv Leventhal, Managing Editor

The executive director of Health IT Now, a coalition of healthcare and technology companies, is again criticizing the Trump administration for not yet publishing any regulation on information blocking, as required by the 21st Century Cures Act legislation.

In an op-ed published recently in STAT, Health IT Now’s Joel White wrote, “More than 600 days after the enactment of the Cures Act, not a single regulation has been issued on information blocking.” White added in frustration, “Health IT Now has met with countless officials in the Trump administration who share our commitment to combat information blocking. But those sentiments must be met with meaningful action.”

The onus to publish the regulation falls on the Office of the National Coordinator for Health IT (ONC), the health IT branch of the federal government that is tasked with carrying out specific duties that are required under the 21st Century Cures Act, which was signed into law in December 2016. Some of the core health IT components of the Cures legislation include encouraging interoperability of electronic health records (EHRs) and patient access to health data, discouraging information blocking, reducing physician documentation burden, as well as creating a reporting system on EHR usability.

The information blocking part of the law has gotten significant attention since many stakeholders believe that true interoperability will not be achieved if vendors and providers act to impede the flow of health data for proprietary reasons.

But ONC has delayed regulation around information blocking a few times already, though during an Aug. 8 episode of the Pulse Check podcast from Politico, National Coordinator for Health IT Donald Rucker, M.D., said that the rule is "deep in the federal clearance process." And even more recently, a bipartisan amendment to the U.S. Senate's Department of Defense and Labor, Health and Human Services, and Education Appropriations Act for Fiscal Year 2019 includes a requirement for the Trump administration to provide Congress with an update by September 30.

White, in the STAT piece, noted a June Health Affairs column in which Rucker suggested that implementation of the law’s information blocking provisions would occur “over the next few years.” White wrote that this is “a vague timeline that shows little urgency for combating this pressing threat to consumer safety and stumbling block to interoperability.”

Health IT Now is not alone in its belief that the rule should have been published by now, nor is it the first time the group is bringing it up. Last month

Related Insights For: Interoperability


Are You a Data Blocker? How to Fit into ONC’s New Interoperability Framework and Regulation


By the end of this year, ONC’s implementation and interpretation of data blocking will also be published and available for comment, as was the case with the TEFCA proposed rule. The TEFCA final rule is also anticipated by the end of 2018.

However, there is still time to prepare for TEFCA and the data blocking regulation. Final rules for both in the coming months will set concrete timelines, and for TEFCA it will be interesting to see how ONC responds to stakeholder comments, internal and external.
