
HIPAA Part 3: The Security Rule

The Health Insurance Portability and Accountability Act (HIPAA) set in motion many reforms to the healthcare industry. One category of reform was related to the protection of electronic health information, and within that category HIPAA has given us the Transaction Code Sets Rule and the Privacy Rule. April 21 marked the third HIPAA compliance deadline related to electronic health information. This time the deadline was for the Security Rule.

The Security Rule is a distinct and independent rule, different from its predecessors. While the Transaction Code Sets Rule got everyone’s attention because it required computers to all speak the same language if you wanted to get paid for claims, and the Privacy Rule got everyone buzzing about protecting patient privacy, the Security Rule has gone relatively unnoticed. Perhaps it’s because we are all tired of HIPAA. Maybe it’s because many people view the Privacy Rule as a lot of busy work for nothing. It could be that HIPAA has become a generic term for all things health-information-related, and therefore companies think they are already compliant with HIPAA. Or maybe the Security Rule is just too technical for people to even know where to begin. Regardless of why you’re not ready, the deadline has now passed, and implementation of this rule is much more complex and time-consuming than was implementation of the Privacy Rule.

For those who think HIPAA is much ado about nothing, many services are finding out the hard way that compliance is not optional when patient complaints lead to audits of their policies. If the Office for Civil Rights (OCR) receives a patient complaint (and the more patients realize there's an avenue for them to make complaints, the more seem to be filed), the first thing it will do is check whether the service is fully complying with HIPAA, including a review of written policies. Whether or not the complaint itself is valid, services are discovering that if the policies are not in place, the OCR will issue warnings and fines for noncompliance.

So, what does it take to comply with the Security Rule? First, a risk analysis of 24 areas, some with sub-parts called "specifications," each of which has to be analyzed for potential risk of improper health information disclosure. About half of the specifications are "required," and the other half are "addressable." Then you have to implement security measures to prevent the types of improper disclosures your risk analysis identified. (The rule has built-in security measures, so you don't have to come up with your own.)

But only about half of those are "required," so there's some good news, right? Wrong: The "addressable" areas are not exactly optional. Addressable means you have to perform your risk analysis and determine whether some risk of improper disclosure exists. If there is none, you create a written summary of your risk analysis, what you found and why you concluded that no security measure was required. If you find there is a risk of improper disclosure, you have to implement the security measure. Alternatively, if the recommended security measure is overkill for your particular circumstances, you can set up your own security measure in its place. So the "addressable" specifications will take as much time and effort as the "required" ones, or more.
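For services tracking their compliance work in software, that decision procedure for an addressable specification can be sketched as a small function. This is a hypothetical illustration only; the function name, inputs and outcome labels are assumptions, not part of the rule:

```python
# Hypothetical sketch of the decision flow for one "addressable"
# implementation specification under the HIPAA Security Rule.
# Inputs and outcome labels are illustrative assumptions.

def resolve_addressable(risk_found: bool,
                        standard_measure_reasonable: bool) -> str:
    """Return the documented outcome for one addressable specification."""
    if not risk_found:
        # No risk identified: document the analysis and the conclusion
        # that no security measure is required.
        return "document rationale; no measure implemented"
    if standard_measure_reasonable:
        # Risk exists and the recommended measure fits: implement it.
        return "implement recommended measure"
    # Risk exists but the recommended measure is overkill for this
    # service: implement an equivalent alternative and document why.
    return "implement alternative measure; document rationale"
```

Every branch ends either in an implemented measure or in written documentation, which is why the addressable specifications generate at least as much paperwork as the required ones.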

This isn’t simple, and many of you will find that a computer or information technology professional is necessary to develop and implement many of the security measures you must have in order to be in compliance. But what are the areas that have to be analyzed, documented and secured?

These areas are broken down first into three major categories: administrative safeguards, physical safeguards and technical safeguards. Each of these three is further broken down into "standards," and each standard may include implementation "specifications," some required and some addressable. Put another way, you have three areas that require security: administrative (mainly written policies and procedures), physical (literally securing physical access to health information) and technical (computer hardware and software security measures). For each of these areas, there are standards, and for each standard there are security measures you must implement (the required specifications) and others that you may opt out of, alter or implement in whole (the addressable specifications), depending on how much risk of improper disclosure there is at your facility.

What follows is a summary of how the Security Rule is laid out, and which specifications are required and which are addressable:

Administrative Safeguards

1. Security Management Process

Implementation Specifications:

i. Risk analysis (required)

ii. Risk management (required)

iii. Sanction policy (required)

iv. Information system activity review (required)

2. Assigned Security Responsibility

Implementation Specifications: None

3. Workforce Security

Implementation Specifications:

i. Authorization and/or supervision (addressable)

ii. Workforce clearance procedure (addressable)

iii. Termination procedures (addressable)

4. Information Access Management

Implementation Specifications:

i. Clearinghouse functions (required, but not relevant for most EMS services)

ii. Access authorization (addressable)

iii. Access establishment and modification (addressable)

5. Security Awareness and Training

Implementation Specifications:

i. Security reminders (addressable)

ii. Protection from malicious software (addressable)

iii. Log-in monitoring (addressable)

iv. Password management (addressable)

6. Security Incident Procedures

Implementation Specifications:

i. Response and reporting (required)

7. Contingency Plans

Implementation Specifications:

i. Data backup plan (required)

ii. Disaster recovery plan (required)

iii. Emergency mode operation plan (required)

iv. Testing and revision procedures (addressable)

v. Applications and data criticality analysis (addressable)

8. Evaluation

Implementation Specifications: None

Physical Safeguards

1. Facility Access Control

Implementation Specifications:

i. Contingency operations (addressable)

ii. Facility security plan (addressable)

iii. Access control and validation procedures (addressable)

iv. Maintenance records (addressable)

2. Workstation Use

3. Workstation Security

4. Device and Media Controls

Implementation Specifications:

i. Disposal (required)

ii. Media reuse (required)

iii. Accountability (addressable)

iv. Data backup and storage (addressable)

Technical Safeguards

1. Access Control

Implementation Specifications:

i. Unique user identification (required)

ii. Emergency access procedure (required)

iii. Automatic log-off (addressable)

iv. Encryption and decryption (addressable)

2. Audit Controls

3. Integrity

Implementation Specifications:

i. Mechanism to authenticate electronic PHI (addressable)

4. Person or Entity Authentication

5. Transmission Security

Implementation Specifications:

i. Integrity controls (addressable)

ii. Encryption (addressable)
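The outline above is, in effect, a nested checklist: categories contain standards, and standards contain specifications flagged required or addressable. As a hypothetical sketch (the data covers only two of the administrative standards, and the structure and field names are assumptions, not part of the rule), a service could represent and tally it like this:

```python
# Hypothetical compliance-checklist sketch for Security Rule
# specifications. Only two Administrative standards are shown;
# the dictionary layout is an illustrative assumption.

CHECKLIST = {
    ("Administrative", "Security Management Process"): {
        "Risk analysis": "required",
        "Risk management": "required",
        "Sanction policy": "required",
        "Information system activity review": "required",
    },
    ("Administrative", "Workforce Security"): {
        "Authorization and/or supervision": "addressable",
        "Workforce clearance procedure": "addressable",
        "Termination procedures": "addressable",
    },
}

def count_by_type(checklist):
    """Tally how many specifications are required vs. addressable."""
    counts = {"required": 0, "addressable": 0}
    for specs in checklist.values():
        for kind in specs.values():
            counts[kind] += 1
    return counts
```

A structure like this makes it easy to confirm that every specification, addressable ones included, has a documented disposition before an audit.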

Security measures for each of these specifications had to be implemented by April 21. In the case of addressable specifications, that was also the deadline for documenting why a security measure was not implemented.

The above overview will likely leave you with more questions than answers. However, due to the complexity and nature of the rule and the requirement that each individual service has to conduct its own analysis of its particular circumstances, this article is not and cannot be an exhaustive guide to compliance with the Security Rule. Other resources are listed below.

• For a complete copy of the Federal Register Final Rule, go to and search in Volume 68 for page number 8,333.

• For CMS’s page on HIPAA Security, go to

• For the OCR’s HIPAA resource page, go to

• For more information on HIPAA and for help with your compliance program, go to
