Diving into Healthcare’s Data Blocking Enigma | Rajiv Leventhal | Healthcare Blogs

Diving into Healthcare’s Data Blocking Enigma

October 11, 2016
Key healthcare stakeholders have argued both sides. Where does the truth lie?

I recently had a great conversation with Dr. Farzad Mostashari, M.D., one of the leading voices in health IT. A former National Coordinator for Health IT, Mostashari subsequently founded Aledade in 2014—a Bethesda, Md.-based company focused on physician-led accountable care organizations (ACOs).

During our interview, which can be read in full here, we touched on a multitude of industry-wide issues, but there was one thing that Mostashari said that definitely deserved a deeper dive. We were talking about IT challenges that ACOs in the trenches were having, and then I asked him about other physician pain points when it comes to IT. Here is what he said:

 "First, there is the real world data blocking that we’re seeing. The first example is EHR [electronic health record] vendors—in order to fully develop that picture and really know your patient, and to know who needs your help, you need to do predictive modeling with the clinical data. It’s about getting clinical data out of EHRs that the practices have paid for and spent tens of thousands of hours putting data into them. Wanting to get your own data out is way too hard, expensive and slow. It’s neither cheap, easy nor fast; you get zero out of those three, and honestly I would settle for getting two out of those three. So that needs to be fixed."

He continued…

 "The part that galls me the most is that the vendors can’t or won’t do what they pledged to do as part of the certification program for EHRs. These EHRs got tested in a lab to be able to produce batch downloads of patient care summaries, but in the field they either can’t or won’t do it. Some vendors actually implemented their technical solution in order to pass the certification lab test, so it’s as if they 'hardcoded' it to their lab test. It’s like knowing what the questions would be, they hardcoded their answers to that. But you can’t have a conversation with them in the field. They played a compliance game to pass the test, but they knew they didn’t actually have to have it working in production. That needs to have consequences. There needs to be a robust surveillance program response from ONC. If vendors don’t comply with the certification requirements they should be at risk of having their certification revoked. Or the vendors will charge you, say $40,000 for an interface engine that they didn’t originally say was needed as part of the certification program’s transparency requirements. They said it was a complete EHR."

To me, this was as damning a statement as we have heard in quite some time regarding data blocking, a major point of contention ever since the Office of the National Coordinator for Health Information Technology (ONC) produced a report last year, per the request of Congress, that detailed several examples of EHR developers and health systems blocking health information sharing between each other. If data blocking does take place on either the vendor or provider system side, the likely cause is that they are avoiding competition by favoring the services they control. Of course, as healthcare looks to become more interoperable, this blocking of information would have a direct effect on that goal.

Naturally, organizations such as the HIMSS EHR Association (EHRA), which comprises some 40 health IT vendors including the very biggest industry players, called the ONC report into question, saying that “the concept of ‘information blocking’ is still very heterogeneous, mixing perception, descriptive, and normative issues in ways that are not easily untangled.” The EHRA later said that charging for interface software and services should not be considered information blocking.

It should also be noted that the ONC, which also called for Congressional action to put a stop to the data blocking in the report, based much of its findings on anecdotal evidence and accounts of potential information blocking found in various public records and testimony, industry analyses, trade and public news media, and other sources.

It’s not hard to see why the government is trying to figure out if data blocking is a true problem in healthcare, after it has spent $30 billion in healthcare technology investments via the Health Information Technology for Economic and Clinical Health (HITECH) Act alone. But depending on who you ask, it may or may not exist. And if it does, there are plenty of complications about what exactly defines data blocking.

That’s why Mostashari’s comments stuck with me. He is directly claiming that vendors are “working” the system to their advantage: doing just enough to pass the certification test, but not enough to fulfill the spirit of the regulation in terms of functionality in the field, where it matters. This is not just a simple “loophole” without consequences, either; the lack of information sharing can unequivocally have significant healthcare implications.

On the provider side, Mostashari also said that he is seeing data blocking from hospitals. He said in our interview, “It’s a way to keep patients in their own network, to encourage doctors to join their ACO, rather than an external ACO, out of concern there might actually be fewer admissions, maybe? I don’t know what it is, but we are seeing very conscious and active information blocking on the part of hospitals.”

But once again, this accusation has been refuted in healthcare circles. The Medicare Access and CHIP Reauthorization Act (MACRA) proposed rule from this past April called for physicians to attest that they are not engaging in information blocking, and stated that there would be surveillance to ensure that health systems’ EHRs were enabled for information exchange. However, John Halamka, M.D., CIO of Beth Israel Deaconess Medical Center in Boston, in response to that proposal, said on his blog that the surveillance is simply unnecessary. “I’ve never seen a location in Boston where a clinician, in a volitional way, disabled functionality in an EHR to block information flow,” Halamka said.

Meanwhile, last September, Daniel Barchi, senior vice president and CIO of Yale New Haven Health System & the Yale School of Medicine, was so focused on this issue that he penned a piece for Healthcare Informatics titled, “Eclipsing the Perception of Data Blocking.” Barchi wrote:

"There is little evidence that hospitals or physicians are hoarding patient data for their own gain. Quite the opposite is true—after years of building and implementing EMRs, health providers have turned their focus to better data sharing with patients and other providers.”

He gave examples of patient access to data and information sharing at his own health system, also noting that “Other hospitals and physician practices are also using technology and interface standards to share data locally."

Why is there such a disconnect over information blocking? Mostashari said it’s because those who deny it either have strategic reasons to disagree or simply aren’t “walking in the shoes of the people who are in the field trying to get data across networks.”

To me, the answer may lie somewhere in the middle. And as healthcare stakeholders continue to struggle to move data in a free-flowing manner, this debate likely won’t go away. Issues such as lack of progress in technology system upgrades and lack of agreed-upon standards only muddy the waters further. Is it really data blocking when vendors charge money for interface software? And when hospitals point to the many frictions that make data exchange less of a priority for them, does that count as information blocking?

In the end, opinions will vary and there will be misconceptions. At this year’s HIMSS conference in Las Vegas, companies that provide 90 percent of EHRs used by hospitals nationwide, as well as the top five largest private healthcare systems in the country, agreed to sign a pledge that, among other things, commits them not to block electronic health information (defined as knowingly and unreasonably interfering with information sharing).

But perhaps the core issue is in that definition itself. Only those in the trenches know the real truth, and here’s hoping that the future brings with it greater clarity.


HL7 Model Identifies Clinical Genomics Workflows, Use Cases

January 16, 2019
by David Raths, Contributing Editor
Domain Analysis Model covers pre-implantation genetic diagnosis, whole-exome sequencing, RNA sequencing and proteomics

HL7’s Clinical Genomics Work Group has published an HL7 Domain Analysis Model (DAM) to identify common workflows and use cases to facilitate scalable and interoperable data standards for the breadth of clinical genomics scenarios.

The DAM, which has undergone a rigorous ISO/ANSI-compatible balloting process, covers a wide range of use cases, including emerging ones such as pre-implantation genetic diagnosis, whole-exome sequencing, RNA sequencing and proteomics.

The effort “builds on the DAM Clinical Sequencing work that is already being used to design precision medicine workflows at hospitals across the country,” said Gil Alterovitz, Ph.D., an HL7 Clinical Genomics Work Group co-chair, in a prepared statement. He also serves as a Harvard professor with the Computational Health Informatics Program/Boston Children’s Hospital.

The Clinical Sequencing DAM fueled the design of FHIR Genomics, the subset of HL7’s FHIR standard designed to communicate clinical genomic information. “By extending to broader domains, it can serve as a standard going forward to aid in the design of workflows, exchange formats as well as other areas,” Alterovitz added.
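Because FHIR Genomics builds on standard FHIR resources, a genetic finding is typically conveyed as an `Observation` resource with coded components. The sketch below is a minimal, hypothetical illustration of that shape; the patient reference, gene name, and exact code bindings are illustrative assumptions (real profiles in HL7's genomics guidance define the authoritative bindings):

```python
import json

# Minimal, illustrative FHIR Observation for a genetic variant finding.
# The LOINC codes, patient reference, and gene value here are assumptions
# for demonstration, not prescribed by the article.
variant_observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "69548-1",  # Genetic variant assessment (illustrative)
            "display": "Genetic variant assessment",
        }]
    },
    "subject": {"reference": "Patient/example"},  # hypothetical patient id
    "component": [{
        # One coded component per reported attribute, e.g. the gene studied
        "code": {"coding": [{
            "system": "http://loinc.org",
            "code": "48018-6",  # Gene studied (illustrative)
            "display": "Gene studied",
        }]},
        "valueCodeableConcept": {"text": "BRCA1"},
    }],
}

# Serialize as JSON, the wire format a FHIR server would accept
print(json.dumps(variant_observation, indent=2))
```

The point of the DAM is that many such resources, produced at different steps of a workflow (lab, geneticist, EHR), share this common structure so they can be exchanged without bespoke interfaces.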

The document presents narrative context and workflow diagrams to guide readers through the stages of each use case and details steps involving the various stakeholders such as patients, health care providers, laboratories and geneticists. This contextual knowledge aids in the development and implementation of software designed to interpret and communicate the relevant results in a clinical computer system, especially a patient's electronic health record.

The HL7 Clinical Genomics Work Group developed several new applications and refinements in the Domain Analysis Model beyond its original scope of clinical sequencing. One notable addition is the analysis of the common workflows for pre-implantation genetic diagnosis (PGD). For those undergoing in-vitro fertilization, advanced pre-implantation genetic screening has become increasingly popular as it avoids the implantation of embryos carrying chromosomal aneuploidies, a common cause of birth defects. Implementers can follow the workflow diagram and see the context for each transfer of information, including the types of tests performed such as blastocyst biopsy and embryo vitrification.

As the clinical utility of proteomics (detecting, quantifying and characterizing proteins) and RNA-sequencing increases, the DAM also outlines clinical and laboratory workflows to capitalize on these emerging technologies.

HL7 notes that future challenges arise from uncertainty about the specific storage location of genomic data, such as a Genomics Archive and Computer/Communication System (GACS), as well as the structure of a patient’s genomic and other omics data for access on demand, both by clinicians and laboratories. Best practices in handling such considerations are being formulated within HL7 and include international input from across the spectrum of stakeholders. In parallel, the HL7 Clinical Genomics Work Group has been preparing an implementation guide for clinical genomics around many of these use cases, to be leveraged alongside the newly published HL7 FHIR Release 4 standard.


ONC Releases Interoperability Standards Advisory Reference 2019

January 15, 2019
by Heather Landi, Associate Editor

The Office of the National Coordinator for Health IT (ONC) has released the 2019 Interoperability Standards Advisory (ISA) Reference Edition, which serves as a “snapshot” view of the ISA.

The 2019 Interoperability Standards Advisory represents ONC’s current assessment of the health IT standards landscape. According to ONC, this static version of the ISA won’t change throughout the year, while the web version is updated on a regular basis. The ISA contains numerous standards and implementation specifications to meet interoperability needs in healthcare and serves as an open and transparent resource for the industry.

The Interoperability Standards Advisory (ISA) process represents the model by which ONC coordinates the identification, assessment, and public awareness of interoperability standards and implementation specifications that can be used by the healthcare industry to address specific interoperability needs including, but not limited to, interoperability for clinical, public health, research and administrative purposes. ONC encourages all stakeholders to implement and use the standards and implementation specifications identified in the ISA as applicable to the specific interoperability needs they seek to address. Furthermore, ONC encourages further pilot testing and industry experience to be sought with respect to standards and implementation specifications identified as “emerging” in the ISA.

The newest ISA reference edition includes improvements made based on comments provided by industry stakeholders during the public comment period, which ended Oct. 1, according to a blog post written by Steven Posnack, executive director of ONC’s Office of Technology, Chris Muir, standards division director, Office of Technology, and Brett Andriesen, ONC project officer. ONC received 74 comments on the ISA this year, resulting in nearly 400 individual recommendations for revisions.

According to the blog post, the ISA contains “a variety of standards and implementation specifications curated by developers, standards gurus, and other stakeholders to meet interoperability needs (a term we use in the ISA to represent the purpose for use of standards or implementation specifications – similar to a use case) in healthcare.”

“The ISA itself is a dynamic document and is updated throughout the year, reflecting a number of substantive and structural updates based on ongoing dialogue, discussion, and feedback,” Posnack, Muir and Andriesen wrote.

The latest changes to the reference manual include RSS feed functionality to enable users to track ISA revisions in real time; a shift from lettered sub-sections to a simple alphabetized list; and revised interoperability need titles that better reflect their uses and align with overall ISA best practices. According to the ONC blog post, the updates also include several new interoperability needs, including representing the relationship between a patient and another person; several electronic prescribing-related interoperability needs, such as weight-based dosing and requests for refills; and operating rules for claims, enrollment and premium payments.

The latest changes also include more granular updates such as added standards, updated characteristics and additional information about interoperability needs.

The ONC officials wrote that the ISA should be considered as an open and transparent resource for industry and reflects the latest thinking around standards development with an eye toward nationwide interoperability.

The ISA traditionally has reflected recommendations from the Health IT Advisory Committee and its predecessors, the HIT Policy Committee and HIT Standards Committee, and includes an educational section that helps decode key interoperability terminology.


ONC Report: Health IT Progress Stifled by Technical, Financial Barriers

January 15, 2019
by Heather Landi, Associate Editor

While progress has been made in the adoption of health IT across the U.S. healthcare industry, significant interoperability hurdles remain, including technical, financial and trust barriers, according to a report from the Office of the National Coordinator for Health Information Technology (ONC).

Currently, the potential value of health information captured in certified health IT is often limited by a lack of accessibility across systems and across different end users, the ONC report stated.

The annual report from the U.S. Department of Health and Human Services (HHS) and ONC to Congress highlights nationwide health IT infrastructure progress and the use of health data to improve healthcare delivery throughout the U.S.

The report, “2018 Report to Congress: Annual Update on the Adoption of a Nationwide System for the Electronic Use and Exchange of Health Information,” also reflects progress on the implementation of the Federal Health IT Strategic Plan 2015-2020 and the Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap.

In the report, ONC notes that most hospitals and health care providers have a digital footprint. As of 2015, 96 percent of non-federal acute care hospitals and 78 percent of office-based physicians adopted certified health IT. The increase in health IT adoption means most Americans receiving health care services now have their health data recorded electronically.

However, hurdles to progress still remain. For example, ONC notes that many certified health IT products lack capabilities that allow for greater innovation in how health information can be securely accessed and easily shared with appropriate members of the care team. “Such innovation is more common in other industries. Also, lack of transparent expectations for data sharing and burdensome experiences for health care providers limit the return on investment for health care providers and the value patients are able to gain from using certified health IT,” the report authors wrote.

While health information is increasingly recorded in a digital format, rather than on paper, this information is not always accessible across systems and to all end users, such as patients, healthcare providers and payers, the report authors note. Patients often lack access to their own health information; healthcare providers often lack access to patient data at the point of care, particularly when multiple healthcare providers maintain different pieces of data, own different systems or use health IT solutions purchased from different developers; and payers often lack access to clinical data on groups of covered individuals to assess the value of services provided to their customers.

Currently, patients electronically access their health information through patient portals that do not allow them to easily pull records from multiple sources or healthcare providers. Patient access to their electronic health information also requires repeated logins and manual data updates, according to the report. For healthcare providers and payers, interoperable access and exchange of health records is focused on accessing one record at a time. “Without the capability to access multiple records across a population of patients, healthcare providers and payers will not benefit from the value of using modern computing solutions—such as machine learning and artificial intelligence—to inform care decisions and identify trends,” the report authors wrote.

Looking at the future state, the report authors contend that certified health IT includes important upgrades to support interoperability and improve user experience. These capabilities, part of ONC’s most recent 2015 edition of certification criteria and standards, will become evident as hospitals and healthcare provider practices upgrade their technology to the 2015 edition, the report authors state.

“As HHS implements the provisions in the Cures Act, we look forward to continued engagement between government and industry on health IT matters and on the role health IT can play to increase competition in healthcare markets,” the report authors wrote, noting that one particular focus will be open APIs (application programming interfaces). The use of open APIs will support patients’ ability to have more access to information electronically through, for example, smartphones and mobile applications, and will allow payers to receive necessary and appropriate information on a group of members without having to access one record at a time.
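The contrast the report draws between one-record-at-a-time access and group-level access maps directly onto the two kinds of requests an open API such as FHIR exposes: a single-resource read versus a parameterized search. The sketch below builds both request URLs with the Python standard library; the server base URL and query parameters are hypothetical placeholders, not endpoints named in the report:

```python
from urllib.parse import urlencode, urljoin

# Hypothetical FHIR server base URL; real endpoints vary by provider
# and require authorization (e.g., OAuth) that is omitted here.
BASE_URL = "https://fhir.example.org/R4/"

def patient_record_url(patient_id: str) -> str:
    """URL for a single-resource read: GET [base]/Patient/[id].

    This is the one-record-at-a-time pattern the report describes."""
    return urljoin(BASE_URL, f"Patient/{patient_id}")

def population_query_url(resource: str, **params: str) -> str:
    """URL for a parameterized search across many records, the kind of
    group-level access the report says payers currently lack."""
    return urljoin(BASE_URL, resource) + "?" + urlencode(params)

print(patient_record_url("123"))
print(population_query_url("Observation", category="laboratory", _count="50"))
```

A patient-facing smartphone app would issue requests of the first kind against its user's record, while a payer analytics tool would rely on searches of the second kind over its covered population.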

Healthcare industry stakeholders have indicated that many barriers to interoperable access to health information remain, including technical, financial, trust and business practice barriers. “In addition, burden arising from quality reporting, documentation, administrative, and billing requirements that prescribe how health IT systems are designed also hamper the innovative usability of health IT,” the report authors wrote.

The report also outlines actions that HHS is taking to address these issues. Federal agencies, states, and industry have taken steps to address technical, trust, and financial challenges to interoperable health information access, exchange, and use for patients, health care providers, and payers (including insurers). HHS aims to build on these successes through the ONC Health IT Certification Program, HHS rulemaking, health IT innovation projects, and health IT coordination, the report authors wrote.

In accordance with the Cures Act, HHS is actively leading and coordinating a number of key programs and projects, including “continued work to deter and penalize poor business practices that lead to information blocking,” for example.

The report also calls out HHS’ efforts to develop a Trusted Exchange Framework and a Common Agreement (TEFCA) to support enabling trusted health information exchange. “Additional actions to meet statutory requirements within the Cures Act including supporting patient access to personal health information, reducing clinician burden, and engaging health and health IT stakeholders to promote market-based solutions,” the report authors wrote.

Moving forward, collaboration and innovation are critical to the continued progress on the nationwide health IT infrastructure. To that end, the HHS report authors recommend that the agency, and the health IT community overall, focus on a number of key steps to accelerate progress. Namely, health IT stakeholders should focus on improving interoperability and upgrading technical capabilities of health IT, so patients can securely access, aggregate and move their health information using their smartphones, or other devices, and healthcare providers can easily send, receive and analyze patient data.

The health IT community also should focus on increasing transparency in data sharing practices and strengthening the technical capabilities of health IT, so payers can access population-level clinical data to promote economic transparency and operational efficiency, which helps to lower the cost of care and administrative costs, the report authors note.

Health IT developers and industry stakeholders also need to prioritize improving health IT and reducing documentation burden, time inefficiencies and hassle for healthcare providers, so clinicians and physicians can focus on their patients rather than their computers.
