Breach Aftermath: Lessons Learned
CEO Offers Practical Advice Based on Experience

Following a breach, take responsibility for your actions as an organization and as a leadership team. That's an important lesson learned by the CEO of an organization that experienced a breach last year caused by the theft of an unencrypted laptop.
"I, especially, as a CEO, took individual responsibility for not providing the leadership and not providing the policies, and not providing the tools to those on the front line who were just trying to do their jobs and just trying to do the right thing," says Micky Tripathi of The Massachusetts eHealth Collaborative, a not-for-profit consultancy that experienced a breach last year.
Tripathi spelled out in a recent blog the details of the organization's breach, which involved the theft of an unencrypted laptop from an employee's car. The breach, which affected about 1,000 patients of the collaborative's physician group practice clients, cost almost $300,000 to resolve.
In an in-depth interview with HealthcareInfoSecurity's Howard Anderson (transcript below), Tripathi outlines eight important lessons learned. Among them:
- If you experience a breach, "treat it as your most high-priority project." Tripathi held daily meetings with a crisis team to coordinate breach resolution efforts.
- Do not underestimate how difficult it is to respond to and remediate a breach.
- Assume all portable devices contain sensitive information and take action to protect it, including the use of encryption.
In the wake of the breach, Massachusetts eHealth Collaborative broadened its use of encryption and trained all staff on how to use the technology. It now uses whole-disk encryption on laptops, file-level encryption for passing files to and from its clients, and secure e-mail.
Tripathi is president and CEO of the collaborative, which is supported by 34 non-profit healthcare organizations in Massachusetts. The organization specializes in advising physician group practices and others about the implementation of electronic health records. Tripathi also chairs the Health Information Exchange Workgroup of the federal Health IT Policy Committee, which makes recommendations about health information exchange to the Office of the National Coordinator for Health Information Technology in the U.S. Department of Health and Human Services. Before joining the collaborative, Tripathi was a manager at the Boston Consulting Group and served as founding president and CEO of the Indiana Health Information Exchange. He has a Ph.D. in political science from the Massachusetts Institute of Technology.
HOWARD ANDERSON: For starters, can you please briefly describe your organization for us?
MICKY TRIPATHI: The Massachusetts eHealth Collaborative is a non-profit organization that was founded in 2004 to facilitate the implementation and adoption of electronic health records and health information technologies primarily in the ambulatory part of the healthcare delivery market.
Data Breach Details
ANDERSON: You recently wrote a lengthy blog describing your organization's experience with a breach, offering valuable lessons learned. So for starters, please summarize the details of the incident briefly. It involved the theft of a laptop, right?
TRIPATHI: Yes, it did. The incident happened when one of our employees had a laptop with them while they were out in the field. [The employee] stopped for an appointment and left the laptop in their automobile, in a briefcase, and the briefcase was stolen. We believe, and have every indication, it was just a random, incidental theft. The laptop wasn't exposed; just the briefcase was. A window was broken and the briefcase taken with all of its contents. As it turned out, the laptop had some patient demographic information on it, which would thus constitute a breach of PHI [protected health information] by federal standards and PII [personally identifiable information] by Massachusetts state standards.
Lessons Learned
ANDERSON: In your blog you offered a summary of eight lessons learned from the experience. Can you go over those quickly for us now, please, so people can learn from your experience?
TRIPATHI: The first one was really just about ... self-awareness. It was basically saying that if you're a physician practice or a contractor, as I said in the blog, look in the mirror. Look at your organization and make sure that you really understand what's going on in the front lines of the organization. And one of the things that I think is probably true for many, many organizations - not because they're bad or negligent; I certainly don't consider us negligent - is that business processes develop over time, organizations get big and complicated, and while everyone believes they're doing the right thing, sometimes things start to happen on the front lines that may not be fully in alignment with what your company policy would require.
Often convenience is the enemy of good security practices. It's not because any individual wants to violate good security practices. It's because they have job requirements that suggest a certain level of performance and they start cutting corners - sometimes not even knowing that they're cutting corners - in order to get their job done.
One of the things that we found was that we needed to do a really meaningful, end-to-end self-examination of what's going on in our organization, and that's the first lesson learned I would offer to any organization. I would say the only organizations I know of that really know this are the ones who have been through a security breach. What I would like to do is see if we can get to the point where you don't have to go through the experience that we went through to actually do this kind of self-assessment, as painful as it might be, particularly for your front lines. Understand where the interface is between your organization and the outside world and your customers - where you're getting that PHI or where you're giving it back - and what the processes are ... on the front lines. I think you will almost always find that there's room for improvement.
Protect Mobile Devices
Second, I would say that it's probably a good practice to assume that your portable devices contain some kind of sensitive information. ... Your mobile device almost certainly contains some kind of sensitive information if you're dealing with PHI on a routine basis. The example that I drew on was not one that necessarily pertains to us, because we're not a covered entity. We're not a provider organization, but we work with many, many provider organizations, and one of the things that we found is that they have mobile devices where they use their EHRs. Those EHRs on the mobile devices are synced either with a cloud-based system or with a client-server system back in the office, and those EHRs are supposed to, whenever they're shut down, delete any locally, temporarily stored information that was made available on that mobile device. Well, it turns out that in almost every case that we've gone into, the mobile device actually has PHI on it, either because the software wasn't quite as robust as the vendor suggested or, more often, because the users were doing things that the vendor would probably say they shouldn't do. But the only guard against that was just the statement that you shouldn't do it [store PHI on mobile devices].
For example, saving a referral letter in the system [may] create a Word document. If we [open] up [the device], create the referral letter and then save it back into the EHR, there are many applications that, either by user choice or as a matter of routine course, will save a temporary file on that mobile device. And when the computer shuts down, it doesn't delete the temporary file, or the physician might inadvertently, or on purpose, save it locally - again, because they don't realize that it would constitute a real risk on their part. That's why we strongly recommend now, and we're strong believers in, encryption as a back-stop to those policies. It doesn't mean you shouldn't have those policies, but encryption is an important back-up.
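To make the risk Tripathi describes concrete, here is a minimal audit sketch - not a tool the collaborative mentions - that scans for document-like files lingering in temporary locations. The directories and extensions are assumptions you would tailor to wherever your EHR client actually writes temporary files.

```python
import os
from pathlib import Path

# Illustrative assumptions: the places an EHR client might leave
# temporary copies, and the file types worth a manual PHI review.
TEMP_DIRS = [Path(os.environ.get("TEMP", "/tmp")), Path.home() / "Downloads"]
SUSPECT_EXTENSIONS = {".doc", ".docx", ".pdf", ".tmp", ".csv"}

def find_possible_phi_residue():
    """Return document-like files lingering in temporary locations."""
    hits = []
    for directory in TEMP_DIRS:
        if not directory.is_dir():
            continue
        for path in directory.rglob("*"):
            if path.is_file() and path.suffix.lower() in SUSPECT_EXTENSIONS:
                hits.append(path)
    return hits

for path in find_possible_phi_residue():
    print(f"Review for PHI: {path}")
```

A scan like this only flags candidates for review; as Tripathi notes, encryption remains the back-stop when a file slips through.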
Understand Work Flow
Third, we would strongly recommend that you understand who's working in your practice and what they're doing. We certainly understand the challenges with that - there are a couple of them. One is that physician offices are really, really busy. There are people coming in and out all the time, and they have complex systems - not just IT systems, but all sorts of other systems going on in their practice - and there are vendors they rely on coming in and out, and it's hard to know what each and every one of them is doing. But that said, for anyone who is touching your EHR system or touching PHI, you probably ought to get a handle on what they're doing. That doesn't mean that there's day-to-day diligence applied to what they're doing, but it's probably worth having a conversation upfront with that vendor about what exactly they are going to be doing in the way of having to access PHI and how they handle PHI.
One of the things we do now as a company, based on this experience, is force that conversation with our customers. As an ... initial part of our engagement, we have a little statement that we've created about ... the data that we expect and need access to, and how we treat and manage that data. That doesn't really protect us from anything. I mean, if we have a breach - we have a breach. But we really use it as a vehicle to generate awareness and to have the conversation with the provider organization, to make sure they're aware of what we're doing. Because, at the end of the day, one of the things that we were surprised to find - and somewhat disappointed, because we wanted to take full responsibility for this - was that the provider organizations we worked for were, from a federal perspective, the ones who were really held responsible.
The federal law basically assumes that you as a covered entity know what your vendors are doing, and if something happens with one of your vendors - one of your contractors - the responsibility is yours. I don't think many provider organizations really know that. I think we've come to sort of take business associate agreements for granted, and providers think, "Well, I've signed a business associate agreement, so all the responsibility is now theirs." Not true from a federal perspective.
Need for Crisis Management
The fourth lesson I would offer is that you need to take it seriously. What I mean by that is - and we didn't come to this right away - after the initial shock, we contacted our attorneys, and I assumed that they would start to manage the process. But what we found was that it was a much more engaged activity than that. It took us about a day or two to realize that we were going to have to treat this as a serious crisis ... I put everything aside, cleared my desk, cleared my calendar for the first day or two while we [discerned] what the problem was. Then [we] got together a crisis team, which included our attorneys, my security officer and our customers, and held literally daily meetings at 5 p.m. We had an hour-long phone call where we'd manage the progress. We put together a project plan.
My company took responsibility for all of this, but we did ask that all those people be a part of those daily phone calls, and obviously our attorneys were working by our sides. But we wanted to make sure that we were, on a daily basis, all in touch, all staying on top of any new information that was coming out as we did the forensic analysis, and, most important, that we were all on the same page with respect to what the actions were that we needed to take going forward as we were trying to navigate the state and federal laws and what the notification requirements were, and media notices and all of that.
So I would certainly say treat it as a serious project ... even think of it as a crisis. Put together a crisis team. Put together a project plan and then treat it as your most high-priority project.
Sorting Out Complexities
The fifth lesson I would say is to not underestimate how difficult it's going to be just to figure out what you actually did - what you might have violated, what you might not have violated and what remediation you're supposed to do - because one of the things that we found was that it takes a while to figure that out. And one of the recommendations that we had - and this gets to the sixth lesson - is to keep a daily log of your activities from day one. This was advice from one of our board members who worked for a large provider organization that had been through a number of these [breaches] before.
One of the things that we just came to recognize ... was that as hard as it is to get your arms around this - and you're working really hard, and we think that we were doing everything in the provider's interest and the patient's interest - you'll still get criticism from those on the outside who don't realize that some of this stuff takes time, just to figure out what actually happened, let alone to then take the remediation steps. You'll inevitably find someone, a significant stakeholder, who will look at something where you've taken two and a half weeks and say, "What were you doing? You were sitting on it." The reality was that you were working incredibly hard just to get to the point where you could find out who needs to be notified of what, and what actually happened, so that you don't give anyone any misinformation. A daily log can really help with that.
Take Responsibility
Number seven - as I say in the blog - is you've got to "man up." What I mean by that is not staff up, but man up in the sense of taking responsibility. Take responsibility for your actions. It's really, really easy, we found, to sort of shirk responsibility and to say, "Well, it's their fault, and if only the process was better, and this is a multi-stakeholder thing, and it's your fault and our fault." The reality of the situation was that we were the ones who had the information and made a mistake, and the information was taken. What we found was that everyone was incredibly respectful and forgiving.
All of our customers were incredibly sympathetic to our situation - most importantly the one whose breach affected more than 500 individuals and who arguably suffered most from all of this, having had their name published on the federal list of large breaches and media notices issued on their behalf that they had a breach. I think that a part of that was because we took accountability from the beginning, made sure that they understood that we took accountability, and made clear that we were going to do everything in our power to make them whole. Some things you can't make them whole on, like their loss of reputation, but they were incredibly understanding as well.
Executive Leadership
The last thing that I would offer is really about the importance of not only taking responsibility as an organization, but also taking responsibility from a management and leadership perspective.
One of the things that we found, as we did the analysis of the work flow - what our front-line people were doing and how so much patient information could end up on a laptop in unencrypted form - was that some of the other people on the front lines were basically saying, "There but for the grace of God go I," meaning that this was a mistake by a person who definitely violated company policy and definitely did something they shouldn't have done, but who, on the other hand, probably didn't have enough education and training. They probably didn't have enough tools to allow them to do their job securely. Because, as I said earlier, in our opinion, convenience is the biggest enemy of security. We can say all we want about policy - you must do this and you have to do that - but if we're also telling them you have to cover these 10 practices and you'd better get this done in the next two months, then you've given them a conflicting set of objectives that's very hard to meet in a secure way if they're also not given security tools that will allow them to get the job done according to the company's goals ...
We took responsibility as an entire company. And I, especially, as a CEO, took individual responsibility for not providing the leadership and not providing the policies and not providing the tools to those on the front line who were just trying to do their jobs and just trying to do the right thing. That was a part of the responsibility that we took. We did an end-to-end analysis, we brought in the front-line people, we asked them to describe in great detail what their work flow was and what their need for PHI was [in various] circumstances ... and then we crafted a set of policies that would abide by all federal and state laws as well as our company priorities - our ethical responsibilities as a company.
Then we provided them with a set of very specific tools that we made sure every one of our front-line people had, and we made sure they had the training to use those tools. We did each of those steps with the participation of the front-line folks, so they had a hand in selecting the tools as well. That gave us the confidence that the organization, from the top all the way down to the bottom ... had all bought in and understood why we had all these tools in place. We had some confidence that they were going to be used appropriately and on a daily basis.
Those are the eight lessons learned. As I said, there are many, many others. I'm sure many of your readers and listeners would have many others to offer, but I hope that these lessons learned can be lessons that other people can take and learn from without having to go through our experience.
Use of Encryption
ANDERSON: When you refer to tools, you're now using encryption on all your mobile devices? Is that the most significant thing you did - making sure that people have that technology, know how to use it and are using it consistently?
TRIPATHI: We have three specific tools that we rolled out. One was whole-disk encryption. I made sure that I was the first person to get whole-disk encryption on my laptop, just to convey to the entire company that this is incredibly important and a high priority, even though I never have PHI on my laptop because I don't do any real work - I'm just a CEO. But I wanted to make sure that, whatever pain they were going through - because it's a little bit of an inconvenience ... to have the encryption software on there; it adds to the boot-up time - everyone understood that this is really important. So whole-disk encryption was one.
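As an illustration of how an organization might verify that whole-disk encryption is actually enabled, here is a minimal sketch that queries BitLocker status on a Windows laptop. BitLocker is an assumption for illustration; the interview does not name the product the collaborative deployed, and Linux or macOS fleets would need different checks.

```python
import platform
import subprocess

def disk_encryption_status(drive: str = "C:") -> str:
    """Return the BitLocker status report for one drive (Windows only)."""
    if platform.system() != "Windows":
        raise RuntimeError("This sketch only checks BitLocker on Windows.")
    # manage-bde ships with Windows and typically needs administrator rights.
    result = subprocess.run(
        ["manage-bde", "-status", drive],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(disk_encryption_status())
```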
The second was a tool for file-level encryption. Often we want to be able to pass a file - get a file from a customer, receive it on our end, and be able to pass it back. We want to do that in a secure way, so we provided a tool that lets them zip something up according to the FIPS encryption standard and give it to someone else.
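The interview doesn't name the zip tool the collaborative used, but the compress-then-encrypt idea can be sketched with AES-256-GCM, a FIPS-approved algorithm, using the third-party Python cryptography package. Key exchange and storage - the hard part in practice - are deliberately left out.

```python
import os
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(in_path: str, out_path: str, key: bytes) -> None:
    """Compress a file, then encrypt it with AES-256-GCM."""
    nonce = os.urandom(12)  # 96-bit nonce; must be unique per message
    with open(in_path, "rb") as f:
        compressed = zlib.compress(f.read())
    with open(out_path, "wb") as f:
        f.write(nonce + AESGCM(key).encrypt(nonce, compressed, None))

def decrypt_file(in_path: str, key: bytes) -> bytes:
    """Reverse encrypt_file: split off the nonce, decrypt, decompress."""
    with open(in_path, "rb") as f:
        blob = f.read()
    nonce, ciphertext = blob[:12], blob[12:]
    return zlib.decompress(AESGCM(key).decrypt(nonce, ciphertext, None))

key = AESGCM.generate_key(bit_length=256)  # share out-of-band in practice
```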
The last tool that we provided was secure e-mail. There are sometimes circumstances where they need to be able to e-mail something back and forth. It may be too big, or it may be inconvenient for them to zip it up and then send it - which would also be HIPAA compliant - so we offered them the secure e-mail option as well: a secure channel for them to send whatever they need to send.
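Secure e-mail products vary widely, and the interview doesn't identify the one chosen. As one transport-level building block, the sketch below forces TLS on the SMTP connection; real HIPAA-grade deployments layer on message-level encryption, recipient portals and audit logging, none of which is shown here. Host, account and credentials are placeholders.

```python
import smtplib
import ssl
from email.message import EmailMessage

def send_over_tls(msg: EmailMessage, host: str, user: str, password: str) -> None:
    """Send one message over SMTP with an enforced, verified TLS session."""
    context = ssl.create_default_context()  # verifies the server certificate
    with smtplib.SMTP(host, 587) as smtp:
        smtp.starttls(context=context)
        smtp.login(user, password)
        smtp.send_message(msg)
```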
Those three tools were [in] the tool kit that we came up with, with the input of the front-line folks, to provide them with enough flexibility to use any one, or any combination, of them to get their jobs done.