
How Data Tokenization Can Help Healthcare Entities Improve Their Data Security

Whether you are a covered entity (CE) or a business associate (BA), you must comply with the Health Insurance Portability and Accountability Act (HIPAA) and, if you handle payment card data, with the Payment Card Industry Data Security Standard (PCI DSS). To help achieve compliance with these requirements, healthcare organizations of all sizes and complexities should consider implementing a data tokenization strategy. Data tokenization will not only reduce the overall cost of your HIPAA and PCI DSS compliance efforts, but will also improve your organization's security posture.

What is Data Tokenization? 
Data tokenization is the process of replacing sensitive data, such as Primary Account Numbers (PANs), electronic Protected Health Information (ePHI), and Nonpublic Personal Information (NPPI), with a unique value that is not sensitive (e.g., 9aF1Yq169523nw4cvl3y2). This non-sensitive value acts as a unique identifier and is the "token" for a sensitive record. A token can persist over time and be used by many end users, allowing them to work with the tokenized data directly, without having to decrypt and re-encrypt the sensitive data each time they access the information.
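To make the definition above concrete, here is a minimal sketch of a token vault in Python. The `TokenVault` class and its method names are hypothetical illustrations, not a real product's API; a production vault would be a hardened, access-controlled datastore, not an in-memory dictionary.

```python
import secrets


class TokenVault:
    """Minimal in-memory token vault: maps sensitive values to random tokens."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, sensitive_value: str) -> str:
        # Return the existing token, so each value maps to one persistent token
        # that many end users can share.
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is random, so it carries no information about the value.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only systems authorized to reach the vault can recover the original.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")          # a sample (test) PAN
assert token != "4111111111111111"                  # token reveals nothing
assert vault.tokenize("4111111111111111") == token  # persistent token
assert vault.detokenize(token) == "4111111111111111"
```

Because the mapping lives only in the vault, downstream users can store, display, and match on the token without ever handling the sensitive value itself.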

What Are The Benefits of Tokenization?
One of the most common benefits of data tokenization is a reduction in the scope of the organization's compliance audits and validations. This in turn reduces the cost of regulatory compliance services and allows the organization to focus its compliance efforts on other areas of sensitive and restricted data.

Another benefit is a reduced risk of unauthorized access. The token itself carries no intrinsic meaning or value, so there is very little risk tied to it. This is valuable given that exposure of sensitive data to unauthorized individuals can lead to legal and regulatory violations and penalties, as well as the potential exploitation of customer or patient data.

Furthermore, implementing data tokenization entails training employees on how the data will be represented and used. Training, along with the daily use of tokens, can help employees acquire a deeper understanding of the importance of data protection and thus better equip them to identify control deficiencies that exist in other areas of the business.

What Are Some Guidelines to Follow for Implementation?
The American National Standards Institute (ANSI) developed the ANSI X9.119 standard, which addresses both data encryption and data tokenization. The standard is separated into two parts, with ANSI X9.119 Part 2 addressing data tokenization.

The PCI Security Standards Council also developed a tokenization guidelines document that provides a high-level summary of how tokenizing cardholder data (CHD), such as a PAN, can affect a merchant's compliance with PCI DSS.

How Can Tokenization Help You Comply with PCI DSS 3.1?
Data tokenization can help organizations meet the following control requirements in PCI DSS 3.1:

Requirement 2.4 Maintain an inventory of system components that are in scope for PCI DSS 
By implementing data tokenization, the number of in-scope systems and users can be significantly reduced. Tokenized data does not meet the definition of CHD used by PCI. The PCI 3.1 Glossary defines CHD as follows: "At a minimum, cardholder data consists of the full PAN. Cardholder data may also appear in the form of the full PAN plus any of the following: cardholder name, expiration date and/or service code."
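As a hedged illustration of this scope reduction, the sketch below replaces the full PAN in a record with a random token that preserves only the last four digits (which PCI DSS permits to be displayed). The helper name and record fields are hypothetical; whether a given system actually falls out of scope is ultimately for a Qualified Security Assessor to validate.

```python
import secrets


def tokenize_pan_keep_last4(pan: str) -> str:
    """Replace a PAN with a random numeric token that keeps the last four
    digits, so downstream lookup/display behavior is preserved without the
    system ever storing the full PAN."""
    random_part = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    return random_part + pan[-4:]


record = {"name": "Jane Doe", "pan": "4111111111111111", "exp": "12/27"}
record["pan"] = tokenize_pan_keep_last4(record["pan"])

# The record no longer contains the full PAN; systems holding only this
# record are candidates for removal from the CHD environment inventory.
assert record["pan"].endswith("1111")
assert len(record["pan"]) == 16
```

Systems that store only such tokens can then be excluded from the Requirement 2.4 inventory of in-scope components, shrinking the audit footprint.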

Requirement 9.3 Control physical access for onsite personnel to sensitive areas
Since data tokens do not meet the definition of CHD, the number of areas that process sensitive data can be significantly reduced. This can be especially helpful in data processing departments, where the CHD in purchase records can be replaced with tokens.

Requirement 11.3 Penetration Testing
PCI DSS version 3.1 has expanded the penetration testing requirements under Requirement 11.3. Testing must be performed internally, externally, and against any scope-reducing controls. Once data tokenization is implemented, the control can be validated and the scope of the penetration test can be reduced. Penetration tests are resource intensive, so reducing the number of in-scope systems makes the test more efficient.

How Can Tokenization Help You Comply With The HIPAA Security Rule?
Tokenization of ePHI may help mitigate the costs of a required breach notification. A token alone does not represent ePHI, so if the token is stolen, the argument can be made that no ePHI was stolen. Reducing the footprint of ePHI throughout an organization by using tokenization will decrease the overall compliance cost for the CE or BA.

The scope of a Gap Analysis (a review of an organization's controls) against HIPAA's Security Rule can also be reduced by implementing a tokenization process. CEs are responsible for any of their ePHI stored by their BAs, so a CE can be more comfortable with control reviews knowing that its BAs are accessing only tokens, not the underlying ePHI.

Data tokenization of ePHI can also help your organization address the following HIPAA citations:
164.308(a)(3)(i) Implement policies and procedures to ensure that all members of its workforce have appropriate access to electronic protected health information 

164.308(a)(4)(ii)(B) Implement policies and procedures for granting access to electronic protected health information 

164.312(a)(1) Implement technical policies and procedures for electronic information systems that maintain electronic protected health information to allow access only to those persons or software programs that have been granted access 

164.312(c)(2) Implement electronic mechanisms to corroborate that electronic protected health information has not been altered or destroyed in an unauthorized manner

164.312(e)(1) Implement technical security measures to guard against unauthorized access to electronic protected health information that is being transmitted over an electronic communications network

 

For questions or assistance with implementing data tokenization at your organization, contact Michael Kanarellis, IT Assurance Senior Manager, at 617-428-5408 or mkanarellis@wolfandco.com.