Privacy & Security Roundtable: Putting a Lock on Patient Data

As the healthcare industry drives greater use of patient data through interoperability, information exchange and care coordination, privacy and security concerns surrounding that very data also grow. Clinical Innovation + Technology hosted a roundtable at which participants discussed the practicalities of accounting of disclosures, the impact of the Omnibus Rule, managing data breaches and more.

CIT: Are a certain number of breaches part and parcel of the healthcare industry? Office for Civil Rights Director Leon Rodriguez has said that just because an organization experiences a breach doesn’t mean it’s doing anything wrong.

Jennings Aske: On some level, no breach is acceptable in the sense that patients are harmed by it. However, I do agree with Leon Rodriguez that there are times when organizations are frankly doing everything they can and, due to accident or error, incidents still happen. I would say all institutions, including my own, can do more.

Partners HealthCare has committed to investing in security and privacy at a level we never have before, and part of that is driven by the fact that the threat environment we face is different. The penetration of consumer and mobile technologies has made the increase in our privacy and security efforts necessary, and we recognize there is more we can do because we owe it to our patients.

Linn Foster Freedman: I deal with emergency breach responses for healthcare entities all over the country. I can tell you that since the OCR increased its enforcement in this area, I’m seeing covered entities step up their privacy and security efforts and understand their importance, because ever since HITECH and its self-reporting obligations, we are seeing how many breaches there really are. I am absolutely seeing the healthcare community, from large to small, spend more, invest more, take it more seriously, train their employees and really try to comply with the HIPAA regulations.

Adrian Gropper: The question that should be asked is whether there is enough transparency for us to recognize the privacy aspect of what is going on. Using security and privacy interchangeably muddies that distinction. Encryption is all about security, but the problem is that if all of the relevant communications are happening under treatment, payment and operations and are not accessible to patients to decide whether a disclosure was warranted or unwarranted, then privacy isn’t really being served by accounting for breaches.

JA: As an institution, we’ve tried to implement technologies to help us with that accounting concept. We implemented what I refer to as a self-audit, where I can actually audit who is looking at my clinical record at the Partners institution where I am receiving care. On paper that sounds great, but the reality is that when I get a report it’s not particularly meaningful to me, because it’s hard to ascertain whether the individuals who looked at my record really were authorized. This is one of the struggles we’re going to face in trying to enforce or facilitate privacy through an accounting of disclosures mechanism. It’s an important thing for us to work on.

I do not feel that privacy and security are mutually exclusive concepts, but the Venn diagram is collapsing, if you will. One of the ways we’re trying to bridge the gap on the accounting issue is by implementing technologies that prospectively audit and enforce privacy by determining whether a co-worker is looking at a co-worker’s record or a clinician is just browsing the EHR.

How do you practically provide that information in a way that a patient can consume it and, for an institution like Partners that has tens of millions of electronic transactions a week, how are we going to distill those into a meaningful report? I’m concerned about the practicality of meeting the goal with the existing technology that’s on the market.
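Conceptually, the prospective auditing Aske describes amounts to running rules over EHR access logs and surfacing only the events worth human review. The sketch below is a minimal, hypothetical illustration in Python; the log fields, rule names and sample data are assumptions made for clarity, not a description of Partners HealthCare’s actual system.

```python
# Minimal sketch of a prospective access-audit rule over EHR access logs.
# The log format, field names and rules are hypothetical illustrations only.
from dataclasses import dataclass

@dataclass
class AccessEvent:
    accessor_id: str                  # employee who opened the record
    accessor_dept: str                # accessor's department
    patient_id: str                   # patient whose record was opened
    patient_is_employee: bool         # is the patient also a staff member?
    has_treatment_relationship: bool  # is the accessor on the care team?

def flag_suspicious(events):
    """Return events worth human review: possible co-worker lookups and
    accesses with no documented treatment relationship ("browsing")."""
    flagged = []
    for e in events:
        if e.patient_is_employee and not e.has_treatment_relationship:
            flagged.append((e, "possible co-worker snooping"))
        elif not e.has_treatment_relationship:
            flagged.append((e, "no treatment relationship on record"))
    return flagged

if __name__ == "__main__":
    sample = [
        AccessEvent("rn_102", "cardiology", "pt_555", False, True),
        AccessEvent("rn_102", "cardiology", "pt_777", True, False),
    ]
    for event, reason in flag_suspicious(sample):
        print(f"REVIEW {event.accessor_id} -> {event.patient_id}: {reason}")
```

In practice the hard part Aske raises remains: reducing tens of millions of weekly transactions to a report a patient can actually interpret.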

Phyllis Patrick: I think privacy and security do go together. If you have a security incident, chances are you have a privacy issue. The skillsets for privacy and security officers are very similar in that both are responsible for program development, policy and other key aspects of integrating privacy and security into an organization’s culture. Some technical skills are needed, but the emphasis is more on management, planning, negotiation, presentation and other skills related to getting the message out and keeping the organization on track. It’s a myth that the security officer should sit in IT. The role deals much more with teaching, programs and budgets and working with senior management, not fixing the computer or installing a new firewall.

CIT: What are the most important provisions of the Omnibus Rule?

PP: The rule represents the most significant changes since HIPAA was passed in 1996. The changes to the privacy rule emphasize patient rights, engaging patients more and involving them in their own care. Changes to the breach notification rule establish a presumption of breach unless proven otherwise, again underscoring patients’ right to know about breaches that affect their information. The security rule did not change. There was nothing unexpected in the new rule as far as I’m concerned.

One area that is going to be difficult for covered entities to implement from an operational point of view is the provision on out-of-pocket payment, which gives patients the right to restrict disclosures when they pay for a service in full. It’s a good right to include, but when you think through how to achieve it, it will require a carefully constructed, multidisciplinary effort. EHRs are quite complex, so you have to make the restriction flow from scheduling and registration all the way through to billing, and make sure the information is not given out at a later date.

Achieving that within an institution is going to be difficult if not impossible. Most organizations have not yet realized that they need to take their process apart and make it work for this out-of-pocket patient experience. I’m not sure how many people will avail themselves of this right but I think it will catch on over time, particularly as we see health insurance changes.

LFF: The new standard after September 23 is that an unauthorized access, use or disclosure that is not permitted by HIPAA is presumed to be a breach unless the covered entity can show that there was a low probability of compromise of that information. From my perspective, that’s a pretty big change in the standard from a practical point of view. I’m trying to help my clients figure out whether notification is required on a case-by-case basis. There is no definition in the Omnibus Rule of the word "compromise," and many covered entities, myself included frankly, are grappling with when you need to notify individuals, because compromise is a pretty broad word. I think people will notify patients more than they did before, and I’m concerned about that not only as a patient but as someone who is a big advocate of privacy. People are getting really desensitized to these notification letters. We’re seeing only about a 10 to 15 percent uptake on credit monitoring, which means that people are just throwing these letters away.

Deborah C. Peel: We were very concerned about certain problems with the risk assessment, because breached entities now get to decide if the exposure to risk is low and there’s really an inherent conflict of interest. We would have much preferred seeing external auditing and external risk assessments by someone who doesn’t have an interest in deciding that the probability of risk is low. 

The other problem is that the new rule allows covered entities and business associates to determine the likelihood of re-identification. That is actually out of step with computer science: re-identification is really easy, and again the holder of the data gets to determine risk. Our other problem is that the Notice of Privacy Practices doesn’t require covered entities to explain how patients can exercise their rights to control their protected health information (PHI) under stronger state laws, common law, tort law and federal laws other than HIPAA, such as 42 CFR Part 2 and Section 7332. It really doesn’t give patients an adequate explanation of all the situations that should require their authorization or informed consent.

People are going to be desensitized because there are so many risks, but the public feels very, very differently about health data compared to other data, so we need to do a much different job. It’s in the entities’ interest to make things hard to understand and not alarming, so that people won’t take them up on credit monitoring and other services.

LFF: We’re required under Massachusetts law to send out a very specific notice, and I don’t blame the entities for the legal mumbo jumbo. You said there are certain things that aren’t required in a Notice of Privacy Practices. I disagree with you. Both the HITECH Act and the Omnibus Rule require covered entities to put in their Notices of Privacy Practices that patients have the right to be notified of a breach of unsecured PHI and that they have the right to restrict disclosure of their information if they pay for the service in full. That’s in every one of my new Notices of Privacy Practices.

JA: You said organizations make things difficult in terms of the notifications and I have to disagree with that as someone who’s involved in the decision making around breaches. I simply disagree that people are doing it with some kind of malicious intent or even subconsciously having some kind of intent to withhold information from patients. I think that would be the most unethical thing that I could do as the chief information security and privacy officer for Partners.

PP: I think that privacy is in the eye of the beholder. Letters don’t mean anything to some people. There are consumers paying more attention and wondering why their medical information is less well protected than their financial information. The security provisions and controls of the financial industry are only now coming into healthcare.

CIT: Many of the HIPAA audits conducted by the Office for Civil Rights indicated that covered entities aren’t performing risk assessments, aren’t performing them correctly or aren’t acting on identified vulnerabilities.

LFF: Many entities haven’t done security risk assessments since the security rule came out in 2013 but that’s one of the first things the OCR asks for in every investigation. I’m advising all of my clients to get that piece high on their radar screen. I’m seeing much more activity in this area and it’s being taken very seriously. Organizations are following up and making sure that compliance programs, including the required security rule policies and procedures, are in place. I’ve seen a dramatic change in compliance with the security rule over the past two years.

DP: HIPAA always required covered entities to assess their security risks but no one did anything because there was never any enforcement. Now that we finally have some meaningful penalties in the new rules and Leon Rodriguez has taken a very different approach, we’re finally seeing institutions that hold health data just now beginning to wake up.

PP: Our clients are doing risk assessments. This is an integral part of an organization’s information security program and provides the foundation for protecting confidential information in the most appropriate ways—administrative, physical and technical—based on the uniqueness of the environment. As far as protecting information contained and stored in digital copiers, I advise clients to look at the FTC website which has good, clear and specific documentation that can easily be turned into a policy to protect against a breach of information related to copiers.

A lot of people think conducting a risk assessment is a more onerous process than it needs to be. They make the mistake of assuming it’s more technical than it is, but risk is risk. It doesn’t have to be a long document, but you need a multidisciplinary team because other people understand risk and bring a lot to the table. It can be an interesting and almost fun process, if done correctly. Once you have an established process, updating the risk analysis and implementing risk mitigation plans become easier. Risk is a mindset, and the concept is pretty well understood in healthcare today.

CIT: How is the boom in mobile health impacting healthcare facilities and their privacy and security practices?

PP: We work with not just hospitals, but start-ups, pharma, physician practices and claims companies. Some of these companies are creating apps that involve monitoring and PHI. We have developed some models, and we always start with a discussion of PHI mapping: if you don’t know where your PHI is, you can’t protect it. With mHealth, it’s a question of tracing it out. If PHI is on a mobile device or something a person is wearing, the information may be sitting on a server or hitting different cell towers. You need to look at that process and chain of events and figure out the vulnerable points.

We’re going to see a lot more mHealth and we’ll also see some good solutions for mobile security that we do not yet have.

JA: We’ve developed a few mobile apps for use by our clinicians. They’ve been developed by teams that work closely with my office to make sure they align with our policies. We test the apps to make sure they can’t be compromised.

In general, healthcare is still struggling to come to a consensus as to how to deal with this. There’s been such a dramatic transformation in terms of how people expect to communicate. The question of texting was raised to me. Should we have our patients text from a secure, dedicated app to their clinicians or should they use a texting app on a device? It’s a real challenge to work through. When we talk about the plethora of devices people are bringing into healthcare and research scenarios, we don’t have a complete idea of how to deal with that. The regulating bodies are struggling with this as well.

DP: A lot of the foundation of this technology came from commerce. In commerce, the standards for patient control and patient access to information about themselves are not as high—not nearly the same as in healthcare. Technology developers have no idea about patient rights. It’s shocking to me that vendors come into this field with no idea of the laws and rights of people they’re designing the technology for.

JA: We need a dialogue outside of the regulatory realm between patient advocacy groups and provider groups. There’s only so much Washington can do to facilitate this discussion. My fear is that the practical aspects aren’t understood but, at the same time, there are things we can do to listen better to the patient community. No one side has all of the insight. The most important thing is dialogue.

CIT: The National Institute of Standards and Technology (NIST) released a preliminary cybersecurity framework in July. How will this impact privacy and security efforts?

PP: Cybersecurity is a new threat and a term organizations are just waking up to. NIST is working with various stakeholders to develop this voluntary framework. I’m curious to see the outline of the framework as it progresses. I think it will probably fit in very nicely with other security frameworks.

I’m starting to include this in information I share with my clients. Hospitals are not paying attention to this yet. Some may see it as a small risk or are not aware of possible risks, but I’d like to see it included in the risk assessment process. The academic side where a lot of research is being done is a prime target.

The framework is going to be technology neutral and flexible. It follows some of the mantras included in the security rule. The final version is expected by next February and I’m anxiously awaiting it to see how we might integrate it into existing work.

JA: Partners HealthCare is currently reviewing the draft framework to determine how it may play a role in our planned information security roadmap. We currently leverage NIST Special Publications and the ISO 27000 series as our frameworks for that roadmap. Ideally, the new draft cybersecurity framework will tie into our current efforts.

Additionally, Partners has joined the National Health Information Sharing and Analysis Center (NH-ISAC) and, as a member, plans on contributing to the national dialogue related to security frameworks in healthcare. We believe these are important initiatives to mature the information security programs of the healthcare industry.

Beth Walsh,

Editor

Editor Beth earned a bachelor’s degree in journalism and master’s in health communication. She has worked in hospital, academic and publishing settings over the past 20 years. Beth joined TriMed in 2005, as editor of CMIO and Clinical Innovation + Technology. When not covering all things related to health IT, she spends time with her husband and three children.
