Is It Time to Encrypt Data Even Inside the EHR? Maybe So, Says Mac McMillan | Healthcare Informatics Magazine | Health IT | Information Technology

Is It Time to Encrypt Data Even Inside the EHR? Maybe So, Says Mac McMillan

July 28, 2015
by Mark Hagland
In the wake of the data breach at UCLA Health and other recent data breaches, data security guru Mac McMillan says it’s time to think about internal data encryption inside the EHR

On Monday, July 20, moments after he had delivered a keynote address to the CHIME Lead Forum-Denver, at the Sheraton Downtown Denver, Mac McMillan, CEO of CynergisTek, the Austin, Tex.-based consulting firm, sat down with HCI Editor Mark Hagland to talk about IT data security.

In his opening keynote address at the Forum, sponsored by the Ann Arbor, Mich.-based College of Healthcare Information Management Executives (CHIME) and by the Institute for Health Technology Transformation (iHT2—a sister organization of Healthcare Informatics through our parent company, the Vendome Group LLC), Mac McMillan had laid out in the clearest possible terms for his audience of IT executives the growing cybersecurity dangers threatening patient care organizations. Among the key areas of concern he had discussed were “increased reliance”; “insider abuse”; “questionable supply chains”; “device-facilitated threats”; “malware”; “mobility”; “identity theft and fraud”; “theft and losses”; “hacking and cyber-criminality”; “challenges emerging out of intensified compliance demands”; and a shortage of chief information security officers, or CISOs.

McMillan did reference the massive data breach at UCLA Health, which had occurred just a few days before the CHIME Lead Forum took place (it was announced on July 17). And that event was a starting point for his post-keynote exclusive interview with Hagland. Below are some excerpts from that interview.

You briefly mentioned the UCLA Health breach in your comments just now. Without over-emphasizing that one breach, what do you think we should all take from that incident, going forward?

While it’s not unusual to not have data encrypted internally (in fact, the majority of patient care organizations still do not encrypt internally), where we’re getting to with these breaches is that perhaps it is becoming time to encrypt data within the EHR [electronic health record] itself. Maybe two or three years ago, that wouldn’t have made sense, but maybe we need to change our thinking about that. Not knowing all the facts of that case, I would think about several things: How are we protecting the data? Are we architecting our environments to segregate patient information from other information? When we communicate information internally, are we encrypting it at rest or not? I think a lot of those historical assumptions do need to change.

The thought before was if you have the network segmented properly and you have your patient information within your data center, with physical protections, then you probably don’t need to encrypt it; you’re relying on other protective measures to compensate.

So you do think patient care organizations should look at encrypting patient records at rest within the EHR, then?

Yes, I do. I think we need to have a serious discussion about that. I think we also need to look at how we architect our environments. Do we need to put those systems inside an internal firewall, so that you only get to that system if you need to? We’re assuming that everyone inside is trusted. But let’s say I’m a hacker, and I compromise you externally and obtain credentials; now the door is open to me. And here’s the kicker: why is it so easy for hackers to acquire credentials? We’re not even encrypting our passwords internally; we’re not protecting the credentials of people with elevated privileges; and we’re not using two-factor authentication for people with elevated credentials.
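McMillan’s point that passwords should never sit unprotected internally can be illustrated with a salted, slow password hash. This is a minimal sketch using only Python’s standard library; the function names and iteration count are illustrative, not from the article:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; current guidance recommends far more

def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest; store (salt, digest), never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time to avoid timing leaks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

A stolen table of such digests forces an attacker into a slow, per-password guessing attack instead of an instant read of everyone’s credentials.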

And let’s say I’m a hacker and I get in and compromise your environment and start looking for passwords. If all your passwords are encrypted, now I’ve got to decrypt your passwords. If I’m able to do that, it’s game over again. But if there’s a second factor associated with that password, a PIN or a soft token on my phone, then you’re still secure. Thinking about what hackers are trying to do: they’re looking to gain elevated privileges that allow them to make changes in the environment—turn things off, turn things on, and so on. If they can’t get those elevated privileges, they’re done. They can’t exploit you, or it’s going to be incredibly hard to do so.
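The second factor McMillan describes (a soft token on a phone) is typically a time-based one-time password, or TOTP. A minimal sketch of the RFC 6238 algorithm in standard-library Python; the article does not prescribe any particular implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                               # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

Checked against the RFC 6238 test vector: with the shared secret `12345678901234567890` (base32: `GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`) and a clock reading of 59 seconds, the expected six-digit code is 287082. Because the code changes every 30 seconds, a stolen password alone no longer opens the door.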

And guess what? If I’m monitoring behaviors and activity, sooner or later I should notice something going on. For instance, registry settings should never change unless someone with elevated privileges allows them to change; if I’m monitoring, I’ll notice that. Often, hackers will disable auditing activity. The minute someone disables auditing, or intrusion detection (IDS) gets disabled, that should be obvious. Whenever there’s a change in the environment, like a security setting, it should be registered, and somebody should be checking it out.
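The monitoring rule McMillan describes (audit or IDS disablement and registry changes must never pass unreviewed) reduces to a filter over an event stream. A sketch with hypothetical event names; a real deployment would map these from Windows Event IDs or syslog messages:

```python
# Hypothetical event type names, used here only for illustration.
SUSPICIOUS_EVENTS = {"audit_disabled", "ids_disabled", "registry_modified"}

def flag_suspicious(events):
    """Return the events that should trigger an immediate human review.

    Each event is a dict with at least a 'type' key; anything that alters
    auditing, intrusion detection, or registry settings is flagged.
    """
    return [e for e in events if e.get("type") in SUSPICIOUS_EVENTS]
```

The value is not in the filter itself but in the operational rule behind it: every flagged event gets a named reviewer, so a disabled audit log is never dismissed as noise.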

And part of that involves making one’s team and organization alert to patterns, of course.

I used to teach this: whenever you see your environment suddenly get better, you’d better check, because when hackers get in, as soon as they break in, they fix something and install their own back door. They don’t want some knucklehead coming up behind them and ruining their party. So you go to your change-control log and look at your monitoring tools, and if you can’t figure out who’s making changes, you’d better go check. With some of those other hacks that have occurred, when most organizations talk about them, they’ll say, “We noticed some activity, but we didn’t think anything of it at the time.” The point is that any anomalous behavior has to be noticed. Computers don’t turn things on or off by themselves. If it happens, it’s because somebody did it.



Although employee negligence and lost/stolen devices continue to be major causes of data breaches, criminal attacks are now the leading cause of breaches in healthcare.

What are these cyber criminals doing to get access to the data, and what is causing the breaches in our healthcare organizations? Ponemon’s report says that 88 percent of these breaches came from phishing to get a foothold into a network. The attackers try to compromise employees who have elevated privileges that will give them access to sensitive systems and critical data.

Stronger technical controls like encryption and bio-access security devices will prevent damage from most of these attacks. These criminals are not looking for gall bladder surgery data; they are looking for financial information they can use to rob unsuspecting patients.

Two things need to take place immediately: we need to begin to encrypt all stored PHI, and we need to improve the security measures that protect access to that data.

Even though data processing speed sometimes suffers as a result of encryption, the justifications for not encrypting data are quickly going away. Safe and secure are better than fast.
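The first measure, encrypting all stored PHI, can be sketched in a few lines. This illustration uses the third-party Python `cryptography` package, which is an assumption on my part; the article names no particular tooling:

```python
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_record(key, plaintext):
    """Encrypt a PHI record for storage at rest; returns an opaque token."""
    return Fernet(key).encrypt(plaintext.encode())

def decrypt_record(key, token):
    """Decrypt a stored token back to the original record text."""
    return Fernet(key).decrypt(token).decode()
```

Fernet bundles the cipher, integrity check, and key handling, so a stolen database file yields only opaque tokens; the performance cost Mattsson mentions is the per-record encrypt/decrypt step on every read and write.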

Secondly, we must take the human element out of PHI data access. Bio-access security systems must be employed that will keep unsuspecting healthcare workers from falling prey to sophisticated “phishing expeditions” by professional hackers.

I agree that “we have to do behavioral monitoring, we have to move towards higher levels of encryption, and we have to act proactively and strategically, going forward.”

We need to detect when sensitive data is accessed in a pattern that is not normal, and we need to be able to block that access before all the data is stolen. Both external and internal actors can misuse data, and our current monitoring and reporting capabilities are not adequate: according to Verizon’s annual Data Breach Investigations Report, less than 14 percent of breaches are detected by internal security tools. We need to apply a security monitoring approach that is data-centric.
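The data-centric detection Mattsson calls for can start with something as simple as counting record accesses per user against a baseline. A sketch; the threshold value is an illustrative assumption, not a published figure:

```python
from collections import Counter

def abnormal_access(access_log, baseline=25):
    """Flag users whose record-access count exceeds a per-shift baseline.

    'access_log' is a sequence of (user, record_id) pairs; 'baseline' is an
    illustrative threshold that a real system would derive from history.
    """
    counts = Counter(user for user, _record in access_log)
    return {user: n for user, n in counts.items() if n > baseline}
```

In production this check would run continuously so that an account suddenly pulling thousands of records can be blocked mid-exfiltration rather than discovered months later.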

We need to lock down the sensitive data itself with modern, granular data security approaches. I found great advice in a Gartner report titled “Market Guide for Data-Centric Audit and Protection,” which analyzes solutions for data protection and data access governance across enterprise and cloud environments. I recently read another interesting Gartner report, “Big Data Needs a Data-Centric Security Focus,” which concludes: “In order to avoid security chaos, Chief Information Security Officers (CISOs) need to approach big data through a data-centric approach.”

Aberdeen Group reported, in a very interesting study titled “Tokenization Gets Traction,” that tokenization users had 50 percent fewer security-related incidents than non-users, and that 47 percent of respondents are using tokenization for something other than cardholder data. Aberdeen has also seen a steady increase in enterprise use of tokenization as an alternative to encryption for protecting sensitive data.
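Tokenization, the alternative Aberdeen describes, substitutes a random surrogate for the sensitive value and keeps the real value only in a hardened vault. A minimal sketch; the `TokenVault` class and the token format are illustrative, not drawn from any product:

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: swap sensitive values for random tokens,
    keeping the mapping in a protected vault instead of the application database."""

    def __init__(self):
        self._by_token = {}
        self._by_value = {}

    def tokenize(self, value):
        """Return a random surrogate for 'value'; the same value maps to the same token."""
        if value in self._by_value:
            return self._by_value[value]
        token = "tok_" + secrets.token_hex(8)
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token):
        """Recover the original value; only vault-authorized code should call this."""
        return self._by_token[token]
```

Because the tokens carry no mathematical relationship to the originals, applications downstream of the vault can store and process them freely; a breach of those systems yields nothing decryptable.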

Ulf Mattsson, CTO Protegrity