Tokenization’s Role in PCI DSS Compliance
What is PCI DSS?
Any organization that accepts credit or debit card payments is required to keep that data as safe and secure as possible. The Payment Card Industry Data Security Standard (PCI DSS) is the set of security requirements established and mandated by the credit card companies to protect payment information and ensure the security of card transactions.
How Does Tokenization Help Meet PCI DSS Requirements?
Tokenization addresses PCI DSS Requirement 3: protecting stored cardholder data (CHD). PCI DSS seeks to reduce retention of sensitive data and to safely govern its storage and deletion. Tokenization satisfies this critical requirement by eliminating stored CHD from as many systems as possible. Credit card tokenization not only replaces sensitive data, it also minimizes the amount of data an organization needs to keep on file, which ultimately helps reduce the cost of complying with industry standards and government regulations.
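To make the idea concrete, the sketch below shows, in minimal Python, how a token vault works in principle: the real PAN lives only in the vault, and downstream systems store a meaningless token in its place. The `TokenVault` class and its methods are hypothetical illustrations, not any particular provider's API.

```python
import secrets

class TokenVault:
    """Minimal sketch of a token vault: the only place the real PAN is stored.

    Downstream systems hold only the token, so stored cardholder data is
    removed from their environment. Real vaults are hardened, audited
    services operated by the tokenization provider.
    """

    def __init__(self):
        self._vault = {}  # token -> PAN; in practice a secured, access-controlled store

    def tokenize(self, pan: str) -> str:
        # Issue a random, meaningless token; the PAN never leaves the vault.
        token = "tok_" + secrets.token_urlsafe(16)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault (i.e., the token provider) can map a token back to the PAN.
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. tok_9f2K...; safe to store in order records
print(vault.detokenize(token))  # only the vault can recover the original PAN
```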
Additionally, the responsibility to protect stored CHD and maintain the proper level of PCI DSS certification shifts to the tokenization solution provider, which further reduces the scope and cost of PCI compliance. This makes an especially big difference for organizations that want to support card-on-file or recurring payments.
Tokenization and Data Privacy Compliance
While data privacy regulations do not mandate the type of technology used to secure data, many of them discuss pseudonymization and encryption as relevant data security measures.
- Pseudonymization encodes personal data with artificial identifiers, such as a random alias or code. Pseudonymization is not true anonymization, because the data can still be linked back to a person; however, the personal identifiers needed to re-identify the data subject are stored outside of the company’s systems and network, which makes it a secure practice. Tokenization is an advanced form of pseudonymization (see the sketch after this list).
- Encryption renders data unintelligible to those who are not authorized to access it. Data encryption translates data into another form, or code, so that only those with access to the decryption key can read it.
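The sketch below contrasts the two techniques in Python. It is purely illustrative: the email address and helper names are hypothetical, and the encryption half assumes the third-party `cryptography` package is installed.

```python
import secrets
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# --- Pseudonymization: replace the identifier with an artificial alias and keep
#     the lookup table apart from the working data set (here, a separate dict).
alias_map = {}  # alias -> real identifier; stored outside the main system

def pseudonymize(email: str) -> str:
    alias = "user_" + secrets.token_hex(8)
    alias_map[alias] = email
    return alias

record = {"customer": pseudonymize("jane.doe@example.com"), "order_total": 42.50}
print(record)  # the working record carries only the artificial alias

# --- Encryption: transform the data itself; only holders of the key can reverse it.
key = Fernet.generate_key()
f = Fernet(key)
ciphertext = f.encrypt(b"jane.doe@example.com")
print(ciphertext)             # unintelligible without the key
print(f.decrypt(ciphertext))  # original value restored with the key
```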
One reason tokens are increasingly used for sensitive personal information is their versatility: they can be engineered to preserve the length and format of the data that was tokenized. Tokens can also be generated to preserve specific parts of the original data values; because they adapt to the formats of conventional databases and applications, tokens can eliminate the need to change the database schema or business processes. Organizations can treat tokens as if they were the actual data strings.
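For example, a format-preserving token for a 16-digit card number can remain 16 digits and retain the last four, so existing field lengths, validation rules, and “last 4” displays keep working. The function below is a hypothetical sketch of that shape, not a real provider’s algorithm; production systems use vetted schemes backed by a vault or format-preserving encryption keys.

```python
import secrets

def format_preserving_token(pan: str, keep_last: int = 4) -> str:
    """Illustrative sketch: produce a token with the same length and character
    class as the original PAN, retaining the last four digits. Real systems
    also guard against tokens colliding with live card numbers."""
    digits = "".join(secrets.choice("0123456789") for _ in range(len(pan) - keep_last))
    return digits + pan[-keep_last:]

pan = "4111111111111111"
token = format_preserving_token(pan)
print(len(token) == len(pan), token[-4:] == pan[-4:])  # True True
print(token)  # e.g. 8302749136521111: fits existing PAN fields and "last 4" displays
```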
To learn more, read our article Understanding PHI, PII, Data Privacy & Tokenization.