Now that the dust has settled from HIMSS 2018, it is clear that interoperability was a hot topic, one with real promise for improving interaction between systems and entities. Many are jumping on the FHIR (Fast Healthcare Interoperability Resources) bandwagon as a methodology for improving interoperability. This is great news for IDNs and for data accessibility across entities, but one hopes the marketplace will consider lessons learned from the imaging space.
Many years ago, there was great enthusiasm for another industry standard and what it would mean for image exchange: the DICOM (Digital Imaging and Communications in Medicine) Standard. DICOM was purported to be the answer to how disparate imaging systems and PACS (Picture Archiving and Communication System) could interoperate.
What those involved in the early DICOM efforts discovered was that interoperability requires more than simply defining a protocol or means for sharing information. For example, there needs to be a common definition of nomenclature so that information can be reliably passed between systems.
I recall an early issue that cropped up between a certain vendor's CT scanner and a different vendor's PACS. CT images are composed of pixels quantified by a measurement known as Hounsfield Units: tissue densities are represented by numerical values that range from -1000 (air) to +1000 (bone). To avoid negative numbers, this particular CT vendor stored values after adding 1000 units, making everything positive (0-2000). Since the PACS was expecting images in standard Hounsfield Units, images displayed on the PACS as blank (black) screens, because normal tissue with a value of zero was now stored as 1000.
Despite the use of the DICOM standard to pass images between these devices, a workable solution required the CT vendor to reinterpret its implementation to make a viable interface. And the kind of information exchanged between a CT scanner and a PACS is child's play compared to the complexity of EHR environments, where there is substantially more opportunity for differing interpretations of data elements. It will take a significant effort on the industry's part to ensure a common interpretation and definition of data elements.
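It is worth noting that DICOM eventually standardized a way to express exactly this kind of vendor-specific storage offset: the Rescale Slope (0028,1053) and Rescale Intercept (0028,1052) attributes, which map stored pixel values to output units via HU = slope × stored + intercept. The sketch below illustrates the mismatch described above; the specific offset values are illustrative, not taken from any actual vendor's product.

```python
def stored_to_hu(stored_value, rescale_slope=1.0, rescale_intercept=0.0):
    """Apply the DICOM rescale equation: HU = slope * stored + intercept."""
    return rescale_slope * stored_value + rescale_intercept

# Hypothetical scanner stores water (0 HU) as 1000 by shifting all values +1000.
stored_water = 1000

# A PACS that honors the scanner's Rescale Intercept of -1000 recovers 0 HU.
hu_correct = stored_to_hu(stored_water, rescale_intercept=-1000.0)

# A PACS that assumes the default intercept of 0 misreads water as 1000 HU,
# throwing off the display window for the entire image.
hu_wrong = stored_to_hu(stored_water)
```

The lesson generalizes: a shared transport format is not enough; both sides must also agree on, and actually honor, the metadata that gives the payload its meaning.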
Perhaps even more challenging is the issue of long-term support. Again drawing on the DICOM experience, each vendor has its own unique development cycle for its products, so there is always the challenge of keeping systems current and in sync. For example, assume that a vendor updates a system that is interfaced to an EHR. Without testing interoperability against every conceivable EHR configuration, there are likely to be situations where that vendor's changes are no longer compatible with the EHR. The reverse may also be true: an EHR change may break an interface to a smaller vendor's system.
One would hope that the industry has learned from prior experiences, and that current initiatives are more inclined to foster sharing among vendors to achieve greater commonality in data formats. Experience with ARRA/Meaningful Use and subsequent Connectathons should help improve collaboration on data compatibility. Vendors need to remember that it's not the data itself that is unique; it's the way they manage and present it that differentiates them in the market.
The significance for IT organizations is to pay special attention, when acquiring new and updated systems, to ensuring that vendors agree to cooperate in maintaining interoperability over the specified life of the product. This is often overlooked, and it can result in considerable unplanned support effort when things break. As the old adage states, "an ounce of prevention is worth a pound of cure!"