Although progress has been made since Stage 1 of the meaningful use program, any expectation that Consolidated Clinical Document Architecture (C-CDA) documents could provide complete and consistently structured patient data is premature, according to new research published in the Journal of the American Medical Informatics Association (JAMIA).
The researchers, from Harvard Medical School, Boston Children’s Hospital, Vermont-based Lantana Consulting Group, and elsewhere, note that Stage 2 of the federal incentive program for EHR adoption, known as meaningful use, requires use of the C-CDA for document exchange. MU2 requires providers to exchange C-CDA documents for 10 percent of care transitions and requires certified EHR technology to be capable of ingesting select data upon receipt. This is a significant advance over MU1, which required only data display and testing of exchange. In an effort to examine and improve C-CDA-based exchange, the SMART (Substitutable Medical Applications and Reusable Technology) C-CDA Collaborative brought together a group of certified EHR and other health information technology vendors.
The researchers examined the machine-readable content of collected sample documents for semantic correctness and consistency. This included parsing with the open-source BlueButton.js tool, testing with a validator used in EHR certification, scoring with an automated open-source tool, and manual inspection. They also conducted group and individual review sessions with participating vendors to understand their interpretation of C-CDA specifications and requirements.
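To give a flavor of what inspecting the machine-readable content of a C-CDA entry involves, here is a minimal Python sketch (not the study's actual BlueButton.js pipeline) that parses a C-CDA-style medication entry and pulls out its coded data. The XML fragment, element paths, and RxNorm code are fabricated for illustration, though the element names and the RxNorm code-system OID follow HL7 CDA conventions.

```python
# Minimal sketch of C-CDA entry inspection (illustrative only; the
# study used BlueButton.js). The fragment below is fabricated.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # CDA R2 namespace

SAMPLE = """<substanceAdministration xmlns="urn:hl7-org:v3">
  <consumable>
    <manufacturedProduct>
      <manufacturedMaterial>
        <code code="197361" codeSystem="2.16.840.1.113883.6.88"
              displayName="Amlodipine 5 MG Oral Tablet"/>
      </manufacturedMaterial>
    </manufacturedProduct>
  </consumable>
  <doseQuantity value="1"/>
</substanceAdministration>"""

def extract_medication(xml_text):
    """Return (code, display name, dose) or None if the entry is uncoded."""
    root = ET.fromstring(xml_text)
    code_el = root.find(".//hl7:manufacturedMaterial/hl7:code", NS)
    dose_el = root.find("hl7:doseQuantity", NS)
    if code_el is None or "code" not in code_el.attrib:
        # Missing or uncoded entries are the kind of omission the
        # study flagged as a barrier to automated ingestion.
        return None
    return (code_el.get("code"),
            code_el.get("displayName"),
            dose_el.get("value") if dose_el is not None else None)

print(extract_medication(SAMPLE))
# → ('197361', 'Amlodipine 5 MG Oral Tablet', '1')
```

A real validator checks far more (template IDs, required sections, vocabulary bindings), but even this sketch shows why uncoded or inconsistently placed data defeats automated processing.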
In all, they contacted 107 health IT organizations (44 of which responded) and collected 91 C-CDA sample documents from 21 distinct technologies. Manual and automated document inspection yielded 615 observations of errors and data expression variation across the represented technologies. Based on their analysis and vendor discussions, the researchers identified 11 specific areas that represent relevant barriers to the interoperability of C-CDA documents.
Although the list is not comprehensive, each of the 11 “trouble spots” represents a relevant, common issue in C-CDA documents. The 11 trouble spots can be seen in the figure below. The researchers noted that “The severity and clinical relevance of these trouble spots vary according to the context of C-CDA document use. Data heterogeneity or omission may impose a minimal burden in cases where humans or computers can normalize or supplement information from other sources.”
[Figure: the 11 C-CDA trouble spots. Source: Journal of the American Medical Informatics Association]
During the study, vendors commented that they did not always know how to represent data within the C-CDA. While the ONC created a website to assist with C-CDA implementation, including a scorecard that tests providers' C-CDA documents and grades them, and HL7 expanded its help desk content, vendors said these resources were inadequate and sometimes unclear. “There is need for a site where public samples and common clinical scenarios of C-CDA documents, sections, and entries can be queried,” the researchers stated.
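The data expression variation the vendors describe can be illustrated with a small sketch. The two XML fragments and the unit-alias table below are fabricated assumptions, but they show the pattern the study observed: two technologies encoding the same kind of observation value differently, forcing the receiver to normalize before the data can be reconciled.

```python
# Illustration of data-expression variation between vendors
# (fabricated fragments): the same style of observation value,
# one with a proper UCUM unit and one with a free-text spelling.
import xml.etree.ElementTree as ET

VENDOR_A = '<value xmlns="urn:hl7-org:v3" value="5.5" unit="mmol/L"/>'
VENDOR_B = '<value xmlns="urn:hl7-org:v3" value="99" unit="MG/DL"/>'

# Hypothetical mapping of non-standard unit spellings to UCUM forms;
# a real system would need a far larger, curated table.
UNIT_ALIASES = {"MG/DL": "mg/dL", "MMOL/L": "mmol/L"}

def normalize(xml_text):
    """Parse a value element and return (numeric value, UCUM unit)."""
    el = ET.fromstring(xml_text)
    unit = el.get("unit", "")
    unit = UNIT_ALIASES.get(unit.upper(), unit)
    return float(el.get("value")), unit

print(normalize(VENDOR_A))  # → (5.5, 'mmol/L')
print(normalize(VENDOR_B))  # → (99.0, 'mg/dL')
```

This is the "minimal burden" case the researchers describe, where a computer can normalize the variation; when the unit, code, or value is simply omitted, no such reconciliation is possible.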
They continued, “A simple and powerful solution would be to require every technology to publish C-CDA documents with standardized fictional data used in EHR certification. While vendors may take different implementation approaches, publication would foster transparent discussion between vendors, standards bodies, and providers.” They added that “C-CDA documents produced from technologies in Stage 2 of MU will omit key clinical information and often require manual data reconciliation during exchange.”
The researchers concluded, “In an industry often faulted for locking down data and stifling interoperability, we were heartened by Collaborative participants who helped identify specific problems and equally specific ways to improve document exchange. This research demonstrated the power of group collaboration and utility of open-source tools to parse documents and identify latent challenges to interoperability.”