Advice to Regulators: Be Specific

Federal healthcare regulations should include clear standards for information security so those implementing electronic health records know exactly what steps they need to take, says risk management expert Mac McMillan.

In an exclusive interview (full transcript below), McMillan:

  • Argues that the "meaningful use" rule for the Medicare and Medicaid EHR incentive program should have included required standards for encryption and other security controls, without mandating the use of particular technologies.
  • Contends that the proposed modification to the HIPAA rules should more precisely specify, for example, how often a risk assessment should be conducted and what risks are most important to address. Otherwise, the requirement for risk assessments will be "very much open to interpretation."

Security policies, unfortunately, are often driven by regulations, McMillan says. And if the regulations are vague, it's difficult to explain to upper management "what we need to do," he argues.

McMillan advises those shopping for an EHR system to include security specifications in their requests for proposals and ask for a demonstration of security functions before selecting a system. Once a system is selected, he says hospitals and clinics should ask the vendor to sign a detailed security agreement in addition to a broader business associate agreement.

McMillan is co-founder and CEO of CynergisTek Inc., an Austin, Texas-based firm specializing in information security and regulatory compliance. He is chair of the Healthcare Information and Management Systems Society's Privacy & Security Steering Committee. He was a contributing author and editor for the HIMSS book, "Information Security in Healthcare: Managing Risk."

HOWARD ANDERSON: In recent weeks, federal regulators issued two final rules for the Medicare and Medicaid electronic health records incentive program. One spells out the security capabilities that electronic records software must contain to qualify for the program, and the other defines how hospitals and physicians must meaningfully use that software to qualify for the incentives. The "meaningful use" rule stops short of actually requiring the use of any specific security technologies when implementing EHRs. Is that disappointing?

MAC McMILLAN: Disappointment is probably the wrong word. The government should stay away from specifying particular technologies; no one wants that, it wouldn't really be appropriate, and it isn't consistent with what they have done in other industries. However, what they could do, and many believe they should do, is be clearer in specifying minimal standards -- things like security controls and the types of technologies that would apply -- so that individuals still have the ability to select the best solution or the best technology for their particular environment, but they select it against a set of criteria that meets some standard, which in turn meets the compliance objective of the rule.

What the industry really needs is those security standards, not just so healthcare organizations can select the right technologies, but more importantly so that vendors developing applications and systems for healthcare build them with some consistency and with compliance in mind, and that burden doesn't get thrown back into the laps of the healthcare providers themselves.

I don't think anyone wants the government to specify particular technologies....I know certainly the government is not interested in doing that; they have expressed that over and over again....But I do believe the government could help by being clearer and more specific about the standard they expect people to meet, the types of technologies people need to be thinking about that apply to the provisions they are putting in the rules, etc.

I think the breach notification rule was a good example of this, or at least the closest one I have seen so far. The specifics they included in that rule's safe harbor provision around encryption and the related processes created a call to action for most vendors and covered entities, while not limiting how they chose to meet the standard. I think that's what we need more of.

ANDERSON: So to clarify, what types of standards would you have liked to see included in the meaningful use rule that were not in there?

McMILLAN: I think what would be appropriate is something more akin to what we saw in the breach notification rule.

In that rule, they took established standards created by a well-known standards body, the National Institute of Standards and Technology. They referenced the NIST guidelines for encryption, which don't specify a particular technology but do spell out the features, functions and specifications an appropriate encryption technology should meet in order to do what it is intended to do and serve its intended purpose.

And then they also referenced the FIPS guidelines, the Federal Information Processing Standards, for the encryption algorithms. So the breach notification rule provided a roadmap, which helped organizations figure out what technology they needed to select in order to meet the standard. But it didn't dictate exactly which technology they had to purchase or use. There are lots of encryption solutions out there, and lots of applications that include encryption, that meet that standard and therefore are acceptable.
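As a concrete illustration of that kind of standards-based choice (a sketch only, not anything the rules prescribe), the short Python example below encrypts a record at rest with AES-256-GCM, one of the FIPS-approved algorithms the NIST guidance covers. The record contents and key handling are hypothetical; a real deployment would rely on a validated cryptographic module and a managed key store.

```python
# Sketch: encrypting a record at rest with AES-256-GCM (a FIPS-approved
# algorithm). Uses the third-party "cryptography" package; the key and the
# record contents below are placeholders, not a recommended key-handling scheme.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    """Return nonce || ciphertext so the record can be stored as one blob."""
    nonce = os.urandom(12)  # 96-bit nonce, unique per message
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_record(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # in practice, from a managed key store
blob = encrypt_record(b"example patient record", key)
assert decrypt_record(blob, key) == b"example patient record"
```

The point, as McMillan describes it, is that any product whose algorithm and key handling meet the referenced standard would qualify, regardless of which vendor supplies it.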

That's the correct way to do it -- for the government to basically provide enough guidance and enough clarity in the rule and in the standard such that organizations can say, "OK, I know what I need to go get in order to meet it, and I know that what I am doing is going to be compliant because I understand the functions that the system must include, or I understand the level of rigor that the algorithm must meet."

ANDERSON: Earlier another proposal was unveiled calling for modifying the HIPAA privacy, security and enforcement rules. That proposal does not include any requirements for the use of certain categories of security technology, such as encryption or authentication. Would you like to see the federal government eventually revise HIPAA to include some sort of mandates for generic types of security technologies so they apply to all healthcare organizations and not just those participating in the EHR incentive program?

McMILLAN: Absolutely. What I really would like to see them do is include greater clarity or specificity about the standard organizations need to meet and the types of technologies involved, so they know which ones best apply to what they are trying to accomplish.

The folks who are actually trying to implement security programs in healthcare have to explain to clinicians, technologists and managers what they need to do to meet the requirements of the meaningful use rule, the HITECH Act or the HIPAA privacy and security rules. Oftentimes they don't have enough guidance or clarity around the requirement to say, "This is the technology we need, and I know it's the one I need because this is what it does, this is how it applies and this is how it meets this requirement."

I think that is one of the ways we have let them down, so to speak, with these rules, and made it harder for them to do the things they know they need to do. Oftentimes they can't justify or explain why they need a particular solution, because the requirement doesn't call for it or doesn't provide enough specificity about what is required for them to articulate exactly why it's needed. Security has always been, unfortunately, a regulation-driven phenomenon; people tend to follow the regulations when they address security. So if a regulation is vague, or doesn't include at least some minimal standard for compliance, it is very difficult for them to identify and explain to management exactly what they need to do or why it's important to do it.

There is a reason why, when they wrote HIPAA, they wanted it to remain flexible. You want a rule that can apply to the large, multi-hospital healthcare system and, at the same time, to the small physician's office.

But I think one of the things they failed to recognize is that you can still have a standard that describes what needs to be done to meet the requirement without specifying particular technologies. That way you don't end up with something that works in one environment but not another, and the organization still has the flexibility to figure out, based on its environment, its needs and its risk, how best to meet that standard, whether through a manual process or an automated one. You can still have that specificity with respect to the standard. That, quite frankly, is what we have in other industries when you look at the security standards or security frameworks that apply to them.

They are not specific in the sense of "you have to use this particular intrusion detection system, or this access control system, or this firewall." What they are specific about is the standard the technology or control must meet, and then the entity involved has the ability, based on its environment, to figure out the best technology to meet that particular standard or control objective.

That is what is missing with HIPAA. A classic example is risk assessment. In the meaningful use rule, they reinforce the need for organizations to conduct a risk assessment. But the reference they gave was the original specification within HIPAA, which is where the regulatory mandate to conduct that assessment comes from.

That particular specification in HIPAA provides no guidance or clarity around what must be done: how broad the scope of the assessment should be, how often it should be conducted and what areas are important to cover. So again, users are left in a situation where it is very much open to interpretation.

Now, what the HHS Office for Civil Rights did in its very first guidance document was publish an outstanding review of the risk assessment, or risk analysis, process, and it referenced the NIST guidelines on how to conduct a risk assessment. But once again, it fell short of identifying some of the specifics around what is expected or what the minimal standard is -- how often should I do this?

The bottom line is: In a guidance document like the NIST guidelines, you are not going to lay down the requirement; that ought to be in the rule. Because we are looking at the new rule modifying the HIPAA privacy and security provisions, it's an opportunity to actually provide some clarity around the requirements that are in those rules -- not add to them, not demand greater security, but just explain, using a recognizable standard, what is expected of an organization when we say "risk assessment" and what is considered compliant, so that organizations can take that and act on it just like they did with the breach notification rule. (That rule creates a safe harbor, stating that breaches of patient information that is encrypted do not need to be reported.)

You read that rule and say, "You know what, I now know what I need to do: I've got to encrypt data in motion, encrypt data at the transport layer, encrypt data at rest if I want to create safe harbor, and, by the way, the encryption standard I have to use is specified by NIST. Now I can go look at what's out there that meets that standard and choose the one that works best for me."
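To make the "data in motion" piece concrete, here is a minimal, hypothetical sketch using Python's standard ssl module to insist on modern TLS before sending anything to a remote system. The host name is a placeholder, and this is simply one way an organization might satisfy such a transport-encryption standard, not a mandated configuration.

```python
# Sketch: refusing anything older than TLS 1.2 for data in transit.
# "ehr.example.org" is a hypothetical host used only for illustration.
import socket
import ssl

context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols
context.check_hostname = True                     # verify the peer's identity
context.verify_mode = ssl.CERT_REQUIRED

with socket.create_connection(("ehr.example.org", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="ehr.example.org") as tls:
        print("negotiated:", tls.version(), tls.cipher())
```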

It should be the same thing with risk assessments: What minimally is required of me when conducting a risk assessment? What is an acceptable risk assessment as defined by the federal government with respect to this standard?
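As one illustration of what a minimal, repeatable risk assessment artifact could look like (a hypothetical sketch, not a regulatory definition), the example below scores a few made-up risks using the qualitative likelihood-times-impact approach common in NIST SP 800-30-style analyses.

```python
# Sketch: a tiny risk register scored as likelihood x impact.
# The scales, assets and threats are hypothetical examples only.
from dataclasses import dataclass

LEVELS = {"low": 1, "moderate": 2, "high": 3}

@dataclass
class Risk:
    asset: str          # e.g., the EHR database or a laptop holding PHI
    threat: str
    likelihood: str     # "low" | "moderate" | "high"
    impact: str

    def score(self) -> int:
        return LEVELS[self.likelihood] * LEVELS[self.impact]

register = [
    Risk("EHR database", "unauthorized access by insider", "moderate", "high"),
    Risk("unencrypted laptop", "loss or theft of device", "high", "high"),
    Risk("backup tapes in transit", "interception", "low", "high"),
]

# Highest-scoring risks surface first, so remediation effort goes where
# likelihood and impact are both significant.
for risk in sorted(register, key=lambda r: r.score(), reverse=True):
    print(f"{risk.score():>2}  {risk.asset}: {risk.threat}")
```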

ANDERSON: Based on your experience, what are the one or two most important steps hospitals and physicians implementing comprehensive EHRs should take to ensure clinical information remains secure?

McMILLAN: The first piece of it involves the vendor relationship. I would say for those organizations that haven't already implemented an EHR, absolutely incorporate security specifications into your RFP process or your selection criteria. Make it clear upfront that this is a requirement.

Obviously, make sure the privacy and security functionality is demonstrated during any demonstrations or proofs of concept you have the vendors go through, and ask what standard they are using. There are established standards out there for how things get coded and developed and where security requirements come from. What did your vendor select when building the functionality into their system? Is it something proprietary, which we know will eventually translate into interoperability issues? Or is it a recognizable standard, which we know will have a greater ability to communicate with other systems and integrate better into the environment?

Post-selection, provide that vendor not only with the required business associate agreement, but also consider giving them a security agreement covering what is expected of them after the sale, particularly if they are going to be involved in managing the system in any way or are going to host the system for you. I think that is very important in today's environment -- that we explain to vendors exactly what we expect of them with respect to security and privacy.

If you already have an EHR system, then you really need to work with your vendor to figure out exactly what functionality is there and whether it meets the requirement. If something is missing, determine the plan for the next version or upgrade that will provide that functionality, and make sure it is actually on the vendor's roadmap. Also revisit your business associate agreement, which should already have been done earlier this year, and consider presenting the vendor with a security agreement that spells out what you expect from them with respect to privacy and security.

And then conduct that risk analysis, because even if you have an EHR that is certifiable or has all the functionality, that functionality still has to be configured and set up properly in the system. The real key to doing that properly is understanding exactly how your data is created, how it is used, who uses it, where it is stored and where it could potentially be at risk or become unsecured, and then applying the right configurations, as well as the right security functionality, to protect that data appropriately.

ANDERSON: Thanks very much. We have been talking today with Mac McMillan, CEO at CynergisTek. This is Howard Anderson of Information Security Media Group.



