One Consultant’s Take on GDPR and How It Raises the Stakes for U.S. Healthcare Organizations | Healthcare Informatics Magazine

One Consultant’s Take on GDPR and How It Raises the Stakes for U.S. Healthcare Organizations

April 23, 2018
by Heather Landi

The General Data Protection Regulation (GDPR), Europe’s new framework for data protection laws, is set to go into effect in one month, and the new regulation has far-reaching implications for organizations worldwide that collect personal information about European Union residents. In the U.S., physicians and healthcare providers will face new legal requirements for safeguarding the Personally Identifiable Information (PII) of EU patients.

GDPR was adopted in April 2016 and will be fully enforced beginning May 25, 2018, by the supervisory authority in each EU member state (in the UK, the Information Commissioner’s Office, or ICO). GDPR is designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens’ data privacy, and to reshape the way organizations across the region approach data privacy. According to many experts, the regulatory framework pertains to any organization that handles EU residents’ data, whether that organization is in the EU or not. The entire regulation, a frequently asked questions page, and a breakdown of key changes are all available on the EU GDPR website.

Moving forward, U.S. healthcare organizations will need to safeguard EU patients’ data based on the GDPR in addition to the Health Insurance Portability and Accountability Act (HIPAA) regulation and other U.S. regulations. The GDPR will affect when and how a healthcare provider must report breaches, and fundamentally changes how personal and sensitive data can be used, processed, managed, stored, deleted and disclosed.

According to the website of the Spiceworks virtual IT community, in a nutshell, “the regulations affect how companies must handle personal user data commonly tracked online. This includes IP addresses, geographic locations, names, home or work addresses, gender, and a wide range of more sensitive information such as health status, political affiliation, religion, and ethnicity, among other things.”

What’s more, the GDPR imposes stiff fines on data controllers and processors for non-compliance, up to 4 percent of the organization’s global annual revenue or 20 million euros, whichever is higher. According to the EU GDPR website, this is the maximum fine that can be imposed for the most serious infringements, such as not having sufficient customer consent to process data or violating the core of Privacy by Design concepts. There is a tiered approach to fines—a company can be fined 2 percent of global annual revenue for not having its records in order, not notifying the supervising authority and data subject about a breach, or not conducting an impact assessment. It is important to note that these rules apply to both controllers and processors—meaning cloud services will not be exempt from GDPR enforcement, according to the EU GDPR website.
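The tiered fine structure can be illustrated with a short calculation (a hypothetical sketch: the serious-tier amounts come from the article, the 10-million-euro floor for the lower tier comes from the regulation itself, and the revenue figures are invented for illustration):

```python
def max_gdpr_fine(global_annual_revenue_eur: float, serious: bool) -> float:
    """Return the maximum possible fine for an infringement.

    Serious infringements (e.g. insufficient consent): up to 4% of global
    annual revenue or 20 million euros, whichever is higher.
    Lesser infringements (e.g. records not in order): up to 2% of global
    annual revenue or 10 million euros, whichever is higher.
    """
    if serious:
        return max(0.04 * global_annual_revenue_eur, 20_000_000)
    return max(0.02 * global_annual_revenue_eur, 10_000_000)

# A hypothetical health system with 1 billion euros in global annual revenue
# faces a 40-million-euro ceiling for a serious infringement; a smaller
# organization still faces the 20-million-euro floor.
print(max_gdpr_fine(1_000_000_000, serious=True))
print(max_gdpr_fine(100_000_000, serious=True))
```

Note how the "whichever is higher" clause means the flat floor, not the percentage, binds for smaller organizations.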

According to the Spiceworks virtual IT community, the following are some, but not all, of the provisions organizations collecting or processing any personal data on EU residents must comply with if they want to avoid the risk of incurring potentially large financial penalties:

Privacy by design — Organizations that collect personal data on EU residents can only store and process data when it's absolutely necessary. Additionally, they need to limit access to this personal data on a “need to know” basis.

Consent — Under GDPR, organizations cannot collect personal data by default; individuals must explicitly opt in. Additionally, an individual's consent can be withdrawn at any time.

Right to access — Organizations must provide an individual residing in the EU with access to the personal data gathered about them upon request.

Breach notification — Under the regulation, in the event of a data breach, organizations must provide notification to affected parties within 72 hours.

Right to erasure — Sometimes called the right to be forgotten, organizations must honor requests to erase personal user data when asked to do so.

Data portability — Organizations must provide a way for individuals to transmit or move data collected on them from one data collector or data processor to another. 

Data protection officers — Organizations that process personal data on a large scale must assign a data protection officer (DPO).
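Two of these provisions, explicit opt-in consent and withdrawal at any time, map naturally onto how a consent record might be stored. The following is a minimal illustrative sketch (all names are hypothetical, not taken from any real system):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    # Explicit opt-in: a record exists only when the subject actively consented.
    subject_id: str
    purpose: str                             # specific processing purpose consented to
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None  # consent can be removed at any time

    def is_active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        self.withdrawn_at = datetime.now(timezone.utc)

# No consent record means no processing -- opt-in, not opt-out.
consent = ConsentRecord("patient-42", "appointment reminders",
                        granted_at=datetime.now(timezone.utc))
assert consent.is_active()
consent.withdraw()
assert not consent.is_active()
```

Keeping the withdrawal timestamp, rather than deleting the record, leaves an audit trail showing exactly when processing had to stop.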

John Barchie, a security consultant and senior fellow at Phoenix-based Arrakis Consulting, recently spoke with Healthcare Informatics’ associate editor Heather Landi to drill down further into the implications of the GDPR regulation for U.S. healthcare organizations and what steps organizations should be taking now to be compliant with GDPR. Below are excerpts of that interview, edited for length.

What are some of the key requirements of GDPR, and what do healthcare organization leaders need to know about the regulatory framework?

This is a disruptive regulation and it will require organizations to get their legal departments involved. Once it’s understood, it’s fairly straightforward. Healthcare organizations should already be fairly compliant with what GDPR is asking for. If an organization is strongly HIPAA compliant, then it will be much easier for them to absorb GDPR; if they have drifted away from HIPAA for a while, then GDPR is going to come as a shock.

There are some major categories that they need to be aware of—one is the concept of consent, and then the right to access. When we say the right to access, we mean the data subject’s right to access their own data. And then there is the data subject’s ‘right to be forgotten,’ and the data subject’s right to have their data portable. Then, there are the obligations of the processor and the controller—the controller is the one who is collecting the data from the data subject, and the processor is the one processing that data. If you give your data to a health system, that health system might have a sub-contractor that is processing the data. So, the health system would be the controller and the sub-contractor would be the processor. The controller and processor must have something built in called privacy by design.

The regulation requires a new role to be created called the data protection officer. In terms of HIPAA, it’s similar to the chief privacy officer, but it’s a different concept. The chief privacy officer is responsible for determining when data should legitimately be released; the purpose of the data protection officer is to ensure that data is processed in a manner that complies with the regulation. This role is a requirement if you’re going to be GDPR compliant.

There also are implications with regard to breach notification. In the U.S., we’re used to providing breach notification after our investigation, and with GDPR, you’ll need to inform the supervisory authority within 72 hours of identifying that a breach has occurred.
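The point here is that the 72-hour clock starts when the breach is identified, not when the investigation concludes. A minimal sketch of computing the notification deadline (the function name is hypothetical, assuming Python):

```python
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_identified_at: datetime) -> datetime:
    """GDPR requires notifying the supervisory authority within 72 hours
    of becoming aware of the breach -- not after the investigation closes."""
    return breach_identified_at + NOTIFICATION_WINDOW

# A breach identified the morning of May 28 must be reported by May 31.
identified = datetime(2018, 5, 28, 9, 30, tzinfo=timezone.utc)
print(notification_deadline(identified))  # 2018-05-31 09:30:00+00:00
```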

What steps should healthcare organizations be taking now to be GDPR-compliant?

The first thing is, you need to read the regulations; there are 99 articles within the regulation. The biggest thing organizations can do right now is go over their consent forms and evaluate how they collect the data. And then, on the back end, they need to actually diagram out how their data is processed. With regard to their consent form, that’s going to involve the legal department, as there are new requirements above and beyond what they are used to providing for HIPAA. And on the back end, on the clinical side, they really need to take inventory of where their data is and where it is processed. By that I mean, writing it down and having diagrams and data flows; all the things a regulator is going to look for when they come in and start asking questions.
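The inventory exercise described here (writing down where data lives, who controls it, and who processes it) can be captured as structured records rather than prose. A hypothetical sketch, with all organization names invented:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataFlow:
    """One entry in a data-processing inventory: what personal data is held,
    where it lives, and which processor (e.g. a sub-contractor) touches it."""
    data_category: str   # e.g. "patient demographics"
    system: str          # where the data is stored
    controller: str      # who collected it from the data subject
    processor: str       # who processes it on the controller's behalf
    purpose: str         # documented lawful purpose

inventory = [
    DataFlow("patient demographics", "EHR database", "Example Health System",
             "Example Billing Co.", "claims processing"),
    DataFlow("appointment history", "scheduling app", "Example Health System",
             "Example Cloud Host", "hosting"),
]

# The written record a regulator would ask for: who processes what, and why.
for flow in inventory:
    print(f"{flow.data_category}: {flow.controller} -> {flow.processor} ({flow.purpose})")
```

Each entry doubles as a checklist item for the sub-contractor contract updates mentioned below: every distinct processor in the inventory needs an updated agreement.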

The fact is, GDPR will be disruptive, and organizations starting now are not likely, in my opinion, to be compliant by May 25. If you’re not going to be compliant by May 25, you need to show a willingness to comply. To show that willingness, organizations should, at the very least, have their consent forms ready to go by May 25. Organizations also should know who their supervisory authority is (Article 51 of GDPR). They also should have taken inventory of all the uses and data processes. They should know, in writing, where that data is in the organization and to whom they are sub-contracting it. The contracts with those sub-contractors also need to be updated. The goal is, if they have a working plan, even if they are only 60 percent done on the working plan when May 25 rolls around, they are probably in good shape, because they have a working plan and they are showing what’s known as a willingness to comply.

At an organization level, who should be involved in this work?

From an executive point of view, this should be a board-level item that the board discusses in a regular meeting and there should be board minutes as to how they intend to address GDPR. The executive steering committee should provide direction on how they intend to address GDPR. A program manager might need to be assigned the task of breaking out the project. The CIO should definitely be involved, as well as the CISO (chief information security officer). The head of customer service also should be involved, and, obviously, the organization’s chief privacy officer. Within IT, there has to be an understanding of where the data is and how it’s being processed. The database administrator, or the system administrator, will need to be involved. It’s not a one- or two-man show and it’s not just something IT does. There’s a lot that IT can’t do because it has to do with privacy and how information is released and handled. But, there is the technical piece that IT needs to do, such as the ‘right to be forgotten,’ that’s a big deal from an IT perspective.

What are the implications for health IT leaders specifically?

With regard to the data subject rights, the biggest one is the ‘right to be forgotten,’ which is a provision that says a data subject has the right to insist on the total and complete erasure of their data. If a data subject doesn’t want to do business with you, you should not be processing their data anymore. And that’s a technical challenge, as, technically, if you’re in a database, you’re in the database forever. With the ‘right to be forgotten,’ there needs to be a mechanism whereby an organization can guarantee that data is not being processed. That’s a tough row to hoe, especially for healthcare organizations or companies that use that healthcare data as part of their research, because now that piece of research has been taken away. The data subject has the right to say, ‘I’m not doing business with you, and by default, you’re not allowed to use my data anymore.’ That whole concept is foreign to healthcare organizations.
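One way to picture the mechanism being described is a store that deletes the data and keeps only a tombstone, so the subject is never re-processed. This is a minimal illustrative sketch, not a prescription from the regulation; real systems must also purge backups, replicas, and downstream processors, which is the hard part the interview alludes to:

```python
class PatientStore:
    """Toy in-memory store illustrating an erasure ('right to be forgotten')
    mechanism: data is removed, and a tombstone blocks re-processing."""

    def __init__(self):
        self._records = {}   # subject_id -> personal data
        self._erased = set() # tombstones: proof the request was honored

    def add(self, subject_id, data):
        if subject_id in self._erased:
            raise ValueError("subject exercised the right to erasure")
        self._records[subject_id] = data

    def erase(self, subject_id):
        """Honor an erasure request: drop the data, keep only a tombstone."""
        self._records.pop(subject_id, None)
        self._erased.add(subject_id)

store = PatientStore()
store.add("patient-7", {"name": "redacted"})
store.erase("patient-7")
assert "patient-7" not in store._records
```

Note the tombstone holds no personal data, only the identifier needed to refuse future processing, a common compromise between "erase everything" and "never process this person again."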

The key is, compliance officers at healthcare organizations need to read up on the regulations as GDPR changes the way organizations handle people’s data. It will be disruptive and already is disruptive to healthcare organizations that are in the middle of all this work.



Should HIPAA Privacy Rules Change? HHS Seeks Input

December 13, 2018
by Rajiv Leventhal, Managing Editor

The Office for Civil Rights (OCR) has issued an RFI seeking input from the public on how Health Insurance Portability and Accountability Act (HIPAA) Rules, particularly the HIPAA Privacy Rule, could be modified to reflect the administration’s goal of promoting coordinated, value-based care.

As the government noted in a press release on the RFI, “HHS developed the HIPAA Rules to protect individuals’ health information privacy and security interests, while permitting information sharing needed for important purposes. However, in recent years, OCR has heard calls to revisit aspects of the Rules that may limit or discourage information sharing needed for coordinated care or to facilitate the transformation to value-based healthcare.”

Now, the RFI serves to request “information on any provisions of the HIPAA Rules that may present obstacles to these goals without meaningfully contributing to the privacy and security of protected health information (PHI) and/or patients’ ability to exercise their rights with respect to their PHI.”

In addition to requesting broad input on the HIPAA Rules, the RFI also seeks comments on specific areas of the HIPAA Privacy Rule, according to HHS, including:

  • Encouraging information-sharing for treatment and care coordination
  • Facilitating parental involvement in care
  • Addressing the opioid crisis and serious mental illness
  • Accounting for disclosures of PHI for treatment, payment, and health care operations as required by the HITECH Act
  • Changing the current requirement for certain providers to make a good faith effort to obtain an acknowledgment of receipt of the Notice of Privacy Practices

“This RFI is another crucial step in our Regulatory Sprint to Coordinated Care, which is taking a close look at how regulations like HIPAA can be fine-tuned to incentivize care coordination and improve patient care, while ensuring that we fulfill HIPAA’s promise to protect privacy and security,” said Deputy Secretary Eric Hargan.

He added, “In addressing the opioid crisis, we’ve heard stories about how the Privacy Rule can get in the way of patients and families getting the help they need. We’ve also heard how the Rule may impede other forms of care coordination that can drive value. I look forward to hearing from the public on potential improvements to HIPAA, while maintaining the important safeguards for patients’ health information.”

Comments are due by Feb. 11, 2019.


More From Healthcare Informatics


AMIA Calls for Harmonization of Data Privacy Policies

November 16, 2018

As the lines between consumer and clinical data systems continue to blur, there is a need to harmonize health sector data privacy policy, such as the Health Insurance Portability and Accountability Act (HIPAA), with consumer data policy to develop a new era of privacy policy, according to the American Medical Informatics Association (AMIA).

AMIA provided written comments last week in response to the National Telecommunications and Information Administration’s Request for Comment (RFC) on the Administration’s approach to consumer privacy. NTIA, an agency within the Department of Commerce, was seeking feedback on ways it can advance consumer privacy while also protecting innovation. The RFC sought feedback on how certain organizational privacy goals and outcomes can be achieved. These outcomes include organizational transparency, user control over personal information, reasonable minimization of data collection, organizational security practices, user access and correction, organizational risk management, and organizational accountability.

In its written comments, AMIA encouraged the Trump administration to closely examine both HIPAA and the Common Rule and develop an explicit goal to harmonize “health sector” and “consumer sector” data privacy policies. The informatics group cautioned the administration against recreating in the consumer sector the patchwork of privacy policies that is already the norm in the health sector.

Jeff Smith, vice president, public policy at AMIA, notes that given the health sector’s experience with HIPAA and the Common Rule, there is a unique opportunity to accomplish two aims with this executive and legislative branch conversation—harmonize health sector data privacy policy with consumer data privacy policy and develop a national forum and framework to allow states flexibility to address local needs and norms.

In its written comments, AMIA noted that differences in the interpretation of HIPAA have led to wild variations in application. The group thus urged the administration to balance the need for both prescriptive process-oriented policies and outcome-oriented policies, writing that “[a]n over-emphasis on vague or difficult-to-measure outcomes without guidance on process will result in the failings of HIPAA – wide variation in interpretation and inconsistent implementation.”

AMIA went on to not only reiterate its support for patients always having access to their data, but advocated extending this principle to other sectors of the economy and elevating it to “a prerequisite condition and central organizing principle from which other outcomes derive.”

Further, while AMIA broadly supported the RFC’s high-level goals, it recommended that the administration also focus on “closing regulatory gaps” that endanger data privacy. Citing a 2016 ONC report, AMIA pointed out that there are health-related technologies that exist outside the scope of HIPAA, Federal Trade Commission (FTC) regulation, or state law. Thus, a truly comprehensive approach to consumer privacy should address these gaps, AMIA wrote.

Finally, AMIA encouraged the administration to take several steps to address data governance and ethical use. It recommended that FTC “develop a framework for organizations to use that supports trust, safety, efficacy, and transparency across the proliferation of commercial and nonproprietary information resources,” in addition to an “ethical framework around the collection, use, storage, and disclosure of the personal information consumers may provide to organizations.”

“We applaud the administration for initiating this long overdue conversation. As the lines between consumer and clinical devices continues to blur, the need for harmonized federal policy becomes more pronounced,” Douglas B. Fridsma, M.D., Ph.D., AMIA President and CEO, said in a statement. “Just as we strive to ensure that patients have access to and control over their data, we must strive to deliver the same for consumers. The administration should learn from the health sector and develop improved privacy policies across all sectors of the economy.”


Related Insights For: Privacy


Time to End ‘Wild West’ of Health Data Usage in HIPAA-Free Zones

Beyond consent, bioethicists argue for ethical guidelines governing fair use of data

In a recent conversation, a CMIO described the era of Meaningful Use and ICD-10 to me as the “doldrums of regulatory reform” that “sucked up all the oxygen” in the industry, leaving little room for innovation. So I can see why there would be little appetite for more regulation related to health data, and obviously the current administration prefers market-based solutions to regulatory ones.

Yet the Oct. 22 meeting, “Data Min(d)ing: Privacy and Our Digital Identities,” put on by the U.S. Department of Health & Human Services, made it clear to me that as more health data is gathered (and sold) outside the clinical setting, there is a “Wild West” atmosphere in which pretty much anything goes in terms of what companies not covered by HIPAA can do with our health data.

As an example, an April 2018 CNBC article noted that Facebook “has asked several major U.S. hospitals to share anonymized data about their patients, such as illnesses and prescription information, for a proposed research project. Facebook was intending to match it up with user data it had collected in order to help the hospitals figure out which patients might need special care or treatment.” (That project is currently on hiatus, Facebook said.)

The HHS meeting brought together industry leaders and researchers for some thought-provoking presentations about the many ways genetic, wearable and EHR health data is being used. For instance, James Hazel, Ph.D., J.D., a research fellow at the Center for Biomedical Ethics and Society at the Vanderbilt University Medical Center, presented his research, a survey of the privacy policies proffered by U.S. direct-to-consumer genetic testing companies. Hazel noted that there has been huge growth in direct-to-consumer genetic testing, with an estimated 12 million people tested in the United States. Beyond offering services to consumers, the companies doing the testing wish to monetize that data through partnerships with pharmaceutical companies and academic researchers. There is also value to government and law enforcement officials – to solve cold cases, for instance.

There is a patchwork of federal and state laws governing disclosure of secondary data usage to consumers, but the industry is largely left to self-regulate, he said. In his survey of 90 companies offering these genetic data services, “10 percent had no policies whatsoever,” he said. About 55 companies had genetic data policies, but there was tremendous variability in policies about collection and use. Less than half had information on the fate of the sample. In terms of secondary use, the majority of policies refer to internal uses of genetic data. However, very few addressed ownership or commercialization. And although almost all made claims to being good stewards of the data, 95 percent did not provide for notification in case of a data breach. The provisions for sharing de-identified data are even less restrictive. Hazel noted that 75 percent share it without additional consent from the consumer.

Hazel’s take-home message: “We saw variability across the industry. Also, we had a group of law students and law professors read the policies and there was widespread disagreement about what they meant,” he said. “Also, nearly every company reserves the right to change the policy at any time, and hardly any company provided for individual notice in event of a change.” He finished his presentation with a question. “What is the path forward? Additional oversight by the Federal Trade Commission? Or allowing industry efforts to take the lead before stepping in?”

In a separate presentation, Efthimios Parasidis, J.D., a professor of Law and Public Health at the Ohio State University, spoke about the need for an ethical framework for health data.

Parasidis began by noting that beyond data security and privacy, consent and notice are inadequate ethical markers. “If one looks at regulations, whether it is HIPAA, the European Union’s GDPR, or California’s recently enacted consumer privacy law, the regulatory trend has been to emphasize consent, deletion rights and data use notifications,” he said. While these are important regulatory levers, missing is a forum for assessing what is fair use of data. “Interestingly, few areas of data collection require ethics review,” he stressed. HIPAA does not speak to when data use is ethical but rather establishes guidelines for maintaining and sharing certain identifiable health information. Even those protections are limited. HIPAA only applies to covered entities, he noted. It does not apply to identifiable health information held by a wide variety of stakeholders, including social media, health and wellness apps, wearables, life insurers, workers’ compensation insurers, retail stores, credit card companies, Internet searches, and dating companies.

“While the volume of identifiable health information held in HIPAA-free zones engulfs that which is protected by HIPAA and may support more accurate predictions about health than a person’s identifiable medical records,” Parasidis said, “the limits of HIPAA’s protections go beyond scope. For data on either side of the HIPAA divide, an evaluation of ethical implications is only required for human subject research that falls under the Common Rule. Much of data analytics falls outside the Common Rule or any external oversight.”

Citing the Facebook example mentioned above, Parasidis noted that tech giants Amazon, Apple, Google, Microsoft and Uber are entering the digital health space. “The large swathes of identifiable information that these entities hold raise a host of ethical questions,” he added, “including widespread re-identification of de-identified health information, health profiling of individuals or groups and discrimination based on health conditions.”

Policies and guidelines can supplement the small subset of data covered under legally mandated ethics review, he explained. For instance, federal agencies sometimes use internal disclosure review boards to examine ethical implications of data disclosure. But it is not clear this type of review is happening in the private sector.

Parasidis described work he has done with Elizabeth Pike, director of Privacy Policy in the Office of the Chief Information Officer at HHS, and Deven McGraw, who served as deputy director of health information privacy at HHS, on a framework for ethical review of how health data is used.

One way to think about more robust ethics review is the use of data ethics review boards, he said. Their structure can be modeled on institutional review boards or disclosure review boards. “This new administrative entity is necessary because much of contemporary data analytics falls outside existing frameworks,” he said. “We argue that these boards should focus on choice, responsiveness, accountability, fairness and transparency — a CRAFT framework. For instance, choice goes beyond consent. Individuals have an ongoing interest in their health data and should be able to specify how it is collected, analyzed and used.”

Reasonable minds can disagree on the relative weight of ethical principles or how they should be applied in the context of data use deliberations, he said. “We nevertheless believe there remains an urgent need to craft an ethical framework for health data.”



