If there is one thing that healthcare security professionals across the U.S. can agree on, it’s that if cybersecurity isn’t a top priority in your patient care organization, trouble is ahead—if it hasn’t already reared its ugly head.
Fortunately, though, having to fight to get cybersecurity prioritized organization-wide has become less of an issue for chief information security officers (CISOs). A recent survey from the Darwin Deason Institute for Cyber Security at Southern Methodist University in Dallas found that 88 percent of CISOs and CIOs report that their security budgets have increased in response to high-profile data breaches, while 81 percent reported that their upper-level management is supportive of their cybersecurity efforts, with 85 percent reporting increasing levels of support.
Nonetheless, major data breaches and losses of revenue are still far too common in healthcare, and many experts point to the lack of a multi-layered strategy that evolves well beyond simply “following the rules” as the reason why. One noted data security expert at the core of this belief is Mac McMillan, CEO of the Austin, Texas-based consulting firm CynergisTek. During a recent CHIME Lead Forum-Atlanta event, co-sponsored by the Ann Arbor, Mich.-based College of Healthcare Information Management Executives (CHIME) and the Institute for Health Technology Transformation (iHT2, a sister organization to Healthcare Informatics under the joint umbrella of the Vendome Group, LLC), McMillan focused specifically on the challenge of insider abuse. “More than half of all security incidents involve staff. Folks still think you can do traditional audit methods as a manual process. You will fail if you do that,” McMillan warned. Instead, he said, “behavior modeling, pattern analysis, and anomaly detection are what’s needed. You won’t catch folks based on rules and compliance. We need to do a better job of monitoring our users. The only way to catch bad actions is to monitor the behavior,” he emphasized.
In a more recent interview, McMillan further explains the need to move beyond security tools that are rule-based. The problem with rule-based technology, which works by triggering an alert if a rule is “tripped,” is that rules are essentially focused on compliance, he says. “What are a person’s privileges, and what does he or she have access to? As long as they are looking at what their profile says they have access to, it wouldn’t trip a rule, because nothing is going on in there that would cause a rule-based technology to see that,” McMillan explains.
As such, McMillan says that “when you have people who engage in things such as medical identity theft or snooping, you have to home in on their behavior as opposed to compliance.” For instance, he continues, if a person is given access to the electronic health record (EHR), that person is put into a group of individuals with the same function who are given that same level of access to the system. “If you have access, you have complete access,” he says. “If the body of access you have is associated with pediatric files because you work in the children’s ward, for example, then you have access to all of the children’s records that are in that portion of the EHR. It doesn’t differentiate; if you’re a nurse and you work in pediatrics, you have access to all pediatric patients. So when you talk about behavioral monitoring, it’s not just talking about what the nurse does with respect to what the rules are, but what she does, period. During Nurse Jane’s watch, she looked at 58 patients even though she only treated 12. So what was she doing in the other 46 records?”
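The kind of check McMillan describes can be sketched in a few lines: compare the set of records a user actually opened against the set of patients that user treated, and flag unexplained accesses. This is an illustrative sketch only; the data structures and the `flag_excess_access` helper are assumptions, not any EHR vendor’s audit API.

```python
# Hypothetical sketch: flag users whose EHR record accesses far exceed
# the patients they actually treated. Data structures are illustrative.

def flag_excess_access(access_log, treated, ratio_threshold=2.0):
    """Return, per flagged user, the record IDs accessed but not treated.

    access_log: {user: set of record IDs the user opened}
    treated:    {user: set of record IDs for patients the user treated}
    A user is flagged when accesses exceed ratio_threshold x treated count.
    """
    flagged = {}
    for user, accessed_ids in access_log.items():
        treated_ids = treated.get(user, set())
        if treated_ids and len(accessed_ids) > ratio_threshold * len(treated_ids):
            flagged[user] = sorted(accessed_ids - treated_ids)
    return flagged

# Example mirroring the article's scenario: Nurse Jane viewed 58 records
# but treated only 12, leaving 46 unexplained accesses.
access_log = {"jane": {f"pt{i}" for i in range(58)}}
treated = {"jane": {f"pt{i}" for i in range(12)}}
print(len(flag_excess_access(access_log, treated)["jane"]))  # 46
```

In practice the threshold would come from the user’s peer group rather than a fixed ratio, which is what distinguishes behavioral baselining from a static rule.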
Indeed, CISOs at forward-thinking healthcare organizations are now in agreement that a compliance-driven model isn’t sufficient for cyber defense, says Darren Lacey, CISO at the Baltimore, Md.-based Johns Hopkins University/Johns Hopkins Medicine. “I have been pleasantly surprised how in healthcare recently, compared to 10 years ago, that’s become a consensus truth. It’s not just about compliance with HIPAA anymore—that won’t work,” Lacey asserts.
Hussein Syed, CISO at the West Orange, N.J.-based Barnabas Health, adds that the whole premise of data security has now evolved from IT security to information security. As such, there has been more of an emphasis on collecting trending data, analyzing it, and figuring out where to focus, Syed says. “There are two big areas to look at,” he says. “You have to gain as much visibility as you can into your environment from both the network and the data side. Also, you need to be able to get early warnings for any type of behavior change, whether it’s intentional or unintentional. We try to build on defense in depth in those areas. That might sound cliché, but it’s really where [cybersecurity] is getting to—you have to have a layered defense. Attackers can compromise one layer, and will then get to the next and the next. You need controls in at least a few of those places, and if you have them, the attacker will get caught somewhere, which could prevent a breach or some kind of revenue crisis,” Syed says.
A big part of a multi-layered approach, Lacey says, is the one control he finds works better than any other: multi-factor authentication, principally for administrators and privileged users. Multi-factor authentication, a technology that requires users to provide at least one additional form of identification beyond user name and password to gain electronic access to protected health information (PHI), has been in use for more than five years at Johns Hopkins, Lacey notes; it covers system administrators and is required for remote access. “This far and away has the biggest impact for our cyber defense,” Lacey says.
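To make concrete what that “additional form of identification” typically is, here is a minimal sketch of the one-time-password math behind many hardware and app-based second factors (HOTP, RFC 4226). This is for illustration only; a real deployment would use a vetted authentication product or library, not hand-rolled code.

```python
# Minimal HOTP (RFC 4226) sketch: an HMAC over a moving counter,
# truncated to a short numeric code the user types as a second factor.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Compute the RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector: counter 0 yields 755224.
print(hotp(b"12345678901234567890", 0))  # 755224
```

Time-based variants (TOTP, RFC 6238) replace the counter with the current 30-second time step, which is what most authenticator apps display.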
On the behavioral monitoring front, Lacey notes that the things to look for in sophisticated monitoring tools are anomaly detection and changes in behavior. But there is something else that Johns Hopkins is paying particular attention to beyond general-purpose anomaly detection: factors inside the behavior of the network itself that are specific to the organization. “You can’t just get that from a tool, but you really need to leverage the tools and also have enough flexibility in your own analysis to identify anomalies that are organization-specific. That’s a big deal right now,” he says.
Syed gives an example of web proxy logs. Barnabas proxies all traffic via its automated proxy solution, he notes. “One click on the Internet can bring down 40 to 50 lines of log data of how the communication took place, what information got downloaded, if there were images, and references to other websites. All of that has to be collected and rule sets created to identify normal versus abnormal behavior,” he says.
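The workflow Syed describes—collect verbose proxy log lines, reduce them to per-user features, then apply rule sets separating normal from abnormal behavior—can be sketched as follows. The log format, field names, and thresholds here are assumptions for illustration, not Barnabas Health’s actual configuration.

```python
# Illustrative sketch: reduce web-proxy log lines to per-client features,
# then apply simple rules for normal vs. abnormal behavior.
from collections import defaultdict

def summarize(log_lines):
    """Aggregate bytes downloaded and distinct hosts per client.

    Assumed line format: "client_ip host bytes_downloaded".
    """
    stats = defaultdict(lambda: {"bytes": 0, "hosts": set()})
    for line in log_lines:
        client, host, nbytes = line.split()
        stats[client]["bytes"] += int(nbytes)
        stats[client]["hosts"].add(host)
    return stats

def abnormal(stats, max_bytes=10_000_000, max_hosts=50):
    """Rule set: flag clients exceeding volume or host-diversity limits."""
    return [client for client, s in stats.items()
            if s["bytes"] > max_bytes or len(s["hosts"]) > max_hosts]

logs = [
    "10.0.0.5 news.example.com 120000",
    "10.0.0.5 cdn.example.com 80000",
    "10.0.0.9 exfil.example.net 25000000",  # unusually large transfer
]
print(abnormal(summarize(logs)))  # ['10.0.0.9']
```

A production pipeline would feed such features into a SIEM or analytics platform rather than flat thresholds, but the collect-summarize-rule structure is the same.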
Indeed, McMillan says that the behavior detection process, when done manually, is a recipe for failure. “A person has to go in to the system and review those logs manually. He or she has to go through literally lines and lines of log information to figure out exactly what the person had access to and if it was relevant,” he says.
With automation, however, there are basic rules the system understands, McMillan continues. “So when the system gets that log data, it says I have a user who has a specific set of privileges, works in a certain location, and there is a certain set of data he or she should have access to, and it automates the process of determining what a person did between X time and Y time in the system. Then it basically takes that data, indexes it, and applies the rules to it so you don’t have to do it manually. When someone is manually monitoring, they almost never get out of the reactive mode. They will typically only look when someone asks a question,” McMillan says.
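The automated pass McMillan describes—index audit events by user, then check each event against that user’s profile of privileges, location, and expected hours—might be sketched like this. The profile fields, event tuples, and the `apply_rules` helper are hypothetical, invented here to illustrate the idea.

```python
# Hedged sketch: check indexed audit events against each user's assumed
# profile (department and shift window). All names and fields are
# illustrative, not drawn from any specific audit product.
from datetime import time

PROFILES = {
    "jdoe": {"department": "pediatrics", "shift": (time(7), time(19))},
}

def apply_rules(events, profiles):
    """Return events that fall outside a user's department or shift.

    events: list of (user, record_department, event_time) tuples.
    """
    exceptions = []
    for user, dept, when in events:
        profile = profiles.get(user)
        if (profile is None
                or dept != profile["department"]
                or not (profile["shift"][0] <= when <= profile["shift"][1])):
            exceptions.append((user, dept, when))
    return exceptions

events = [
    ("jdoe", "pediatrics", time(9, 30)),   # matches profile: no alert
    ("jdoe", "oncology", time(10, 0)),     # outside department
    ("jdoe", "pediatrics", time(22, 15)),  # outside shift window
]
print(len(apply_rules(events, PROFILES)))  # 2 exceptions
```

The point of indexing first is that this check runs continuously over every event, rather than waiting for someone to ask a question about a specific user.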
McMillan adds that some organizations are recognizing the level of granularity it takes to automate this type of monitoring, while others fall woefully short in realizing this. But he notes that if more health systems take this sort of behavioral recognition approach, cybersecurity should improve uniformly.
“When you start monitoring your users proactively and you start communicating to them that these incidents have happened and here was the outcome, you have accountability associated with it,” he says. “That will begin to modify behavior, absolutely. And then less and less of that activity happens, as most people do not want to get caught and punished. Organizations that have a high sense of awareness of what’s going on in their environment, be it servers or internet traffic or user population access, tend to do better than those who are flying blind in those areas.”