HIPAA Cyber Incident Response Requirements

So you’ve had a breach. To put it plainly, HIPAA requires you to “Identify and respond to suspected or known security incidents; mitigate, to the extent practicable, harmful effects of security incidents that are known to the covered entity or business associate; and document security incidents and their outcomes.”

HIPAA also requires covered entities and business associates to report breaches to the Office for Civil Rights (OCR) within the U.S. Department of Health and Human Services (HHS). HIPAA defines a “security incident” as “the attempted or successful unauthorized access, use, disclosure, modification, or destruction of information or interference with system operations in an information system.”

These requirements apply to electronic protected health information (ePHI), which includes Electronic Health Records (EHRs), Electronic Medical Records (EMRs), and Personal Health Records (PHRs) if you are a covered entity offering one to your patients.

OK… so what do you need to do to comply with these requirements?

The HIPAA Cyber Incident Response Process:

  1. Detection: 

In the Ideal World: The HIPAA Administrative Safeguards require an “Information System Activity Review,” and the Technical Safeguards require “Audit Controls.” So ideally you have (as required by HIPAA/HITECH) “implemented hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use ePHI.” Then, through your “login monitoring” and by “regularly reviewing records of information system activity, such as audit logs, access reports, and security incident tracking reports,” you detected the suspicious activity.

^This is important, more on this later.

We understand that's not always how things happen. Maybe you received a ransom note or noticed some other unusual behavior.

Either way, now that the “security incident” has been detected, a countdown to the reporting deadline has started. More on this later.

IMPORTANT: Be careful not to delete or overwrite evidence; doing so can complicate compliance and make the breach more costly.
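To illustrate the kind of “information system activity review” HIPAA describes, here’s a minimal sketch of one detection check: flagging accounts with an unusual number of failed logins in an audit log. The CSV log format, field names, and threshold here are all hypothetical assumptions, not a prescribed HIPAA format.

```python
# Illustrative sketch only -- the log format and threshold are assumptions.
# Hypothetical audit log line: "2024-05-01T03:12:09,jdoe,LOGIN_FAILURE"
import csv
from collections import Counter

FAILURE_THRESHOLD = 10  # failed logins per user worth a closer look (made-up value)

def suspicious_users(log_path):
    """Return users whose failed-login count meets or exceeds the threshold."""
    failures = Counter()
    with open(log_path, newline="") as f:
        for timestamp, user, event in csv.reader(f):
            if event == "LOGIN_FAILURE":
                failures[user] += 1
    return [user for user, count in failures.items() if count >= FAILURE_THRESHOLD]
```

A real environment would typically feed logs like these into a SIEM rather than a script, but the principle is the same: record activity, review it regularly, and alert on anomalies.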

2. Investigation & Containment: Possibly the Most Important Step.

There’s a lot of data suggesting that the shorter the breach lifecycle, the less costly it is, especially in healthcare. We suggest hiring cyber incident response experts (like us) who specialize in these types of incidents.

We can perform forensics on all the devices, accounts, and other affected systems to determine what happened, which vulnerabilities were exploited, and what information was accessed. While we perform our investigation, we’ll use forensic tools to pull the evidence needed for documentation off of the systems. We’ll also take additional steps to contain the breach while we investigate.

Without a thorough investigation, documentation, and report, you may be forced to assume that all the information in the affected network was exposed. THIS CAN BE MUCH MORE COSTLY.

3. Remediate the Vulnerabilities that Were Exploited:

There’s not much point in kicking the hackers out of the network if they can easily walk right back in. That’s why we’ll need to remediate any vulnerabilities that were exploited, along with any other vulnerabilities we discovered that the hackers likely did too.

4. Eradicate the Threat & Revoke Unauthorized Access:

Hackers can be good at engineering persistence and backdoors into your IT environment. So in more complex environments and attacks, after we analyze the information from the investigation, we’ll create and execute a plan to boot them from the network. A methodical approach is important to prevent the response from becoming a game of “whack-a-mole.”

5. Documentation: A Must for Compliance.

Documenting the incident is crucial for compliance. Organizations should maintain records of security incidents, response actions taken, and the outcomes of the investigation. This documentation is critical for reporting and potential audits.

It's also important to document the required technical, administrative, and physical security safeguards that were in place.

6. Notification and Reporting: 60-Day Deadline

HIPAA mandates that covered entities and their business associates must report breaches of unsecured protected health information to affected individuals, the Department of Health and Human Services (HHS), and, in certain cases, the media.

The timeline for notifications depends on the number of individuals affected by the breach. If more than 500 people’s records are affected, you must report within 60 days of discovering the breach. If it’s under 500, you have 60 days from the end of the calendar year in which the breach was discovered.
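The two HHS deadlines described above can be sketched as a simple date calculation. This is a simplification for illustration only (the function name and the 500-individual cutoff logic are our own framing; individual and media notification have their own timelines, and state laws may be stricter):

```python
# Simplified illustration of the HHS reporting deadlines described above.
# Assumes: 500+ individuals -> 60 days from discovery;
# under 500 -> 60 days from the end of the calendar year of discovery.
from datetime import date, timedelta

def hhs_reporting_deadline(discovery_date, individuals_affected):
    """Return the latest date to notify HHS of a breach (simplified model)."""
    if individuals_affected > 500:
        return discovery_date + timedelta(days=60)
    year_end = date(discovery_date.year, 12, 31)
    return year_end + timedelta(days=60)
```

For example, a breach of 600 records discovered March 1 must be reported by the end of April, while a breach of 120 records discovered the same day could be reported as late as the following March. Don’t rely on a script for legal deadlines; confirm them with counsel.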

This doesn’t include state breach disclosure and notification laws, which vary by state and are based on where the patients reside.

We are not attorneys, so don’t take this as legal advice. We know a lot of great cybersecurity / data privacy attorneys that specialize in HIPAA; we can refer you to them.

Prepare for an HHS Audit After a HIPAA Cyber Incident

After reporting a data breach, you are now on HHS’ radar, and there’s a good chance you’ll receive some regulatory scrutiny. HHS can audit covered entities and business associates at pretty much any time. Under the “Administrative Safeguards” rule, HIPAA requires organizations to conduct or implement:

  1. A “Risk analysis,” which they define as “Conducting an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information held by the covered entity or business associate.”

  2. “Risk management,” which they define as “Implementing security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level” to comply with the HIPAA security standards.

  3. A “Sanction policy,” which they define as “Apply[ing] appropriate sanctions against workforce members who fail to comply with the security policies and procedures.”

  4. As we mentioned earlier, an “Information system activity review,” which they define as “Implementing procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports.”

While HIPAA does allow some flexibility in its security requirements, we strongly recommend beefing up security in preparation for a potential audit. This means reviewing HIPAA’s technical, physical, and administrative security requirements. The administrative actions above are a good place to start:

  1. Perform a formal security risk review, internal and external. Document it.

  2. Implement risk mitigation measures (new security controls). Document those.

  3. This one is tricky; we don’t recommend punishing employees for honest security mistakes. That makes them more likely to hide mistakes instead of reporting them to whoever is responsible for IT, and it can ruin the culture of the company. But if, for example, employees consistently click phishing emails, you may want to revoke their access to PHI. Document that.

  4. Ongoing monitoring. In addition to being required, over half of cyber attack victims are re-targeted within a few months, so having a professional monitor your environment is recommended.

Conclusion:

Ensuring the security and privacy of patient Protected Health Information (PHI) is an absolute requirement under the Health Insurance Portability and Accountability Act (HIPAA). HIPAA sets specific guidelines for handling cyber incidents in the healthcare sector, and complying with its cyber incident response and reporting requirements is a serious legal obligation. Failure to report can result in fines ranging from hundreds of thousands to multiple millions of dollars.
