
In San Diego, a Robust Exploration of HIE and Consumer Engagement

January 20, 2015
by Mark Hagland
What will it mean to achieve “next-generation” health data exchange and true patient/consumer engagement and patient health management? A panel of industry leaders explores the topic at iHT2-San Diego

A robust, expansive discussion of health information exchange and patient/consumer engagement helped kick off the Health IT Summit in San Diego, being held this week at the Omni San Diego Hotel and sponsored by the Institute for Health Technology Transformation (iHT2), a sister organization to Healthcare Informatics. Broad questions around standards development, HIE process optimization, and true patient/consumer engagement dominated Tuesday morning’s first panel discussion, entitled “Next-Generation Data Exchange Driving PHM.”

The panel was moderated by John Mattison, M.D., the CMIO and assistant medical director at Kaiser Permanente. The other members of the panel were Wesley Combs, CIO of the Holston Medical Group (Kingsport, Tenn.); Bill Russell, senior vice president and CIO of St. Joseph Health (Orange, Calif.); David Minch, president and COO of HealthShare Bay Area and president and board chair of the California Association of Health Information Exchanges (CAHIE); Daniel Chavez, executive director of the San Diego Regional Health Information Exchange; and Kirk Larson, regional CIO of the Sunnyvale, Calif.-based NetApp.

Panelists: from l. to r.: Chavez, Minch, Larson, Russell, Combs, Mattison

Early on in the discussion, Dr. Mattison asked panelists about some of the technical challenges facing the robust sharing of patient data in a next-generation HIE context. Minch of HealthShare Bay Area said, “We don’t have good reference implementations. And part of the problem with the CCDA [consolidated clinical document architecture] and the continuity of care document [CCD] that preceded it,” he said, “is that everybody actually implements those phenomena differently. There was a study that looked at CCDs from 21 different vendors, and they found over 600 errors or differences in interpretation, many of which would cause the receiving entity not to understand what was in the CCDA. So from the technology standpoint,” he said, “we now know how to move data pretty efficiently, through DIRECT, and through other HIE means, but in terms of content, we’re still pretty far away.”

Minch added that “California, New York, and 16 other states participated in the Interoperability Workgroup, which began in 2012, and one of the major purposes” of the creation of that group “was to create a reference implementation. Everybody signed an MOU [memorandum of understanding] saying that we would all implement that reference implementation, and we’re still waiting for the vendors.”

“And that’s not a technology problem,” said Russell of St. Joseph Health, who bemoaned the lack of standards development among electronic health record (EHR) vendors, with regard to CCDA development industry-wide. “Right now, the doctors don’t even want to look at those CCDAs; they’re just awful. Look at the standards development in banking that allowed for ATMs. We don’t have anything like that” yet in healthcare. But, he added, “One of the things to watch is FHIR [the Fast Healthcare Interoperability Resources standard], via HL7,” which he said offers real promise. “We need to create discrete data elements so that physicians can actually act on that data,” he said.
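To make Russell’s point about “discrete data elements” a bit more concrete: FHIR represents clinical facts as structured resources with typed, coded fields rather than narrative document sections. The following minimal Python sketch (not from the panel; the sample Observation resource is hand-written for illustration) shows how a receiving system could pull actionable values directly out of a FHIR R4 Observation:

```python
import json

# A minimal, hand-written FHIR R4 Observation resource (illustrative only,
# not from any real system). LOINC 8867-4 is the standard code for heart rate.
observation_json = """
{
  "resourceType": "Observation",
  "status": "final",
  "code": {
    "coding": [
      {"system": "http://loinc.org", "code": "8867-4", "display": "Heart rate"}
    ]
  },
  "valueQuantity": {"value": 72, "unit": "beats/minute"}
}
"""

observation = json.loads(observation_json)

# Because FHIR defines discrete, typed elements, the coded concept and the
# numeric value can be read directly -- no free-text parsing required.
coding = observation["code"]["coding"][0]
value = observation["valueQuantity"]

print(coding["display"], coding["code"])  # Heart rate 8867-4
print(value["value"], value["unit"])      # 72 beats/minute
```

Contrast this with a CCDA, where the same fact may be buried in a narrative section that each vendor renders differently — which is precisely the interpretation problem the panelists described.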

“Look at the railroad industry so long ago, where they standardized on a gauge,” Minch offered.

Meanwhile, said Russell, “There is some economic pressure to drive costs out. Well, how are you going to do that? A lot of it has to be done through technology and automation.”

Speaking of one of the major practical obstacles involved, Minch said, “I think this country should be embarrassed that we don’t have a universal patient identifier. With apologies to all the politicians who don’t even want to address the problem,” he said, “we suffer dramatically and profoundly because we lack a universal patient identifier.”

“I think we should be horrified,” Mattison said. “And it actually originates in legislation from Congress in 1998 banning the creation of a universal patient identifier by any federal agency without approval of Congress.” Given that policy/political landscape, Mattison said, “Identity management is more likely to be the long-term solution. And it’s my expectation that what AlphaPay and other initiatives are doing now to establish two-factor identification protocols will probably migrate into healthcare as the solution to identity management, as we’ve failed over and over again to achieve that goal.”

“And,” said NetApp’s Larson, “as a father, I’m not interested in my son having multiple blood draws because we can’t create a universal patient identifier.”

Combs offered that, at his organization anyway, “We can’t wait on the standards. So we hardly ever touch the CCD or CCDA interfaces, because we can’t overcome the problems. So there isn’t much [true] meaningful use of the data right now. We just extract the data from our own interfaces, at a very granular level. So you’ve got to figure out a place to start and get going. We don’t have a huge problem in our region, because we’re not involved in multiple federated HIEs. We’re a little different; we don’t have any federal or state funding in Tennessee. We had an HIE with federal funding, but that went away. So the physicians got together and helped build our model. And we’re private. We just had to move past the standards.”

