In a recent conversation, a CMIO described the era of Meaningful Use and ICD-10 to me as the “doldrums of regulatory reform” that “sucked up all the oxygen” in the industry, leaving little room for innovation. So I can see why there would be little appetite for more regulation related to health data, and obviously the current administration prefers market-based solutions to regulatory ones.
Yet the Oct. 22 meeting, “Data Min(d)ing: Privacy and Our Digital Identities,” put on by the U.S. Department of Health & Human Services, made it clear to me that as more health data is gathered (and sold) outside the clinical setting, there is a “Wild West” atmosphere in which pretty much anything goes in terms of what companies not covered by HIPAA can do with our health data.
As an example, an April 2018 CNBC article noted that Facebook “has asked several major U.S. hospitals to share anonymized data about their patients, such as illnesses and prescription information, for a proposed research project. Facebook was intending to match it up with user data it had collected in order to help the hospitals figure out which patients might need special care or treatment.” (That project is currently on hiatus, Facebook said.)
The HHS meeting brought together industry leaders and researchers for some thought-provoking presentations about the many ways genetic, wearable and EHR health data is being used. For instance, James Hazel, Ph.D., J.D., a research fellow at the Center for Biomedical Ethics and Society at Vanderbilt University Medical Center, presented his research surveying the privacy policies of U.S. direct-to-consumer genetic testing companies. Hazel noted that direct-to-consumer genetic testing has grown enormously, with an estimated 12 million people tested in the United States. Beyond offering tests to consumers, these companies wish to monetize the resulting data through partnerships with pharmaceutical companies and academic researchers. The data also has value to government and law enforcement officials – to solve cold cases, for instance.
There is a patchwork of federal and state laws governing disclosure of secondary data usage to consumers, but the industry is largely left to self-regulate, he said. In his survey of 90 companies offering these genetic data services, “10 percent had no policies whatsoever,” he said. About 55 companies had genetic data policies, but there was tremendous variability in their provisions on collection and use. Fewer than half had information on the fate of the sample. In terms of secondary use, the majority of policies referred to internal uses of genetic data, but very few addressed ownership or commercialization. And although almost all made claims to being good stewards of the data, 95 percent did not provide for notification in case of a data breach. The provisions for sharing de-identified data were even less restrictive: Hazel noted that 75 percent of companies share it without additional consent from the consumer.
Hazel’s take-home message: “We saw variability across the industry. Also, we had a group of law students and law professors read the policies and there was widespread disagreement about what they meant,” he said. “Also, nearly every company reserves the right to change the policy at any time, and hardly any company provided for individual notice in event of a change.” He finished his presentation with a question. “What is the path forward? Additional oversight by the Federal Trade Commission? Or allowing industry efforts to take the lead before stepping in?”
In a separate presentation, Efthimios Parasidis, J.D., a professor of Law and Public Health at the Ohio State University, spoke about the need for an ethical framework for health data.
Parasidis began by noting that beyond data security and privacy, consent and notice are inadequate ethical markers. “If one looks at regulations, whether it is HIPAA, the European Union’s GDPR, or California’s recently enacted consumer privacy law, the regulatory trend has been to emphasize consent, deletion rights and data use notifications,” he said. While these are important regulatory levers, what is missing is a forum for assessing what constitutes fair use of data. “Interestingly, few areas of data collection require ethics review,” he stressed. HIPAA does not speak to when data use is ethical but rather establishes guidelines for maintaining and sharing certain identifiable health information. Even those protections are limited. HIPAA only applies to covered entities, he noted. It does not apply to identifiable health information held by a wide variety of stakeholders, including social media companies, health and wellness apps, wearables, life insurers, workers’ compensation insurers, retail stores, credit card companies, search engines, and dating companies.
“While the volume of identifiable health information held in HIPAA-free zones engulfs that which is protected by HIPAA and may support more accurate predictions about health than a person’s identifiable medical records,” Parasidis said, “the limits of HIPAA’s protections go beyond scope. For data on either side of the HIPAA divide, an evaluation of ethical implications is only required for human subject research that falls under the Common Rule. Much of data analytics falls outside the Common Rule or any external oversight.”
Citing the Facebook example mentioned above, Parasidis noted that tech giants Amazon, Apple, Google, Microsoft and Uber are entering the digital health space. “The large swathes of identifiable information that these entities hold raise a host of ethical questions,” he added, “including widespread re-identification of de-identified health information, health profiling of individuals or groups and discrimination based on health conditions.”
Policies and guidelines can supplement the small subset of data covered under legally mandated ethics review, he explained. For instance, federal agencies sometimes use internal disclosure review boards to examine ethical implications of data disclosure. But it is not clear this type of review is happening in the private sector.
One way to think about more robust ethics review is the use of data ethics review boards, he said. Their structure can be modeled on institutional review boards or disclosure review boards. “This new administrative entity is necessary because much of contemporary data analytics falls outside existing frameworks,” he said. “We argue that these boards should focus on choice, responsiveness, accountability, fairness and transparency — a CRAFT framework. For instance, choice goes beyond consent. Individuals have an ongoing interest in their health data and should be able to specify how it is collected, analyzed and used.”
Reasonable minds can disagree on the relative weight of ethical principles or how they should be applied in the context of data use deliberations, he said. “We nevertheless believe there remains an urgent need to craft an ethical framework for health data.”