Tokenization

Tokenization Definition

Tokenization is a data protection method in which sensitive information is replaced with randomly generated non-sensitive information called a token, which has no value if breached. The format of the original data is retained, but the token has no mathematical relationship to the data it replaces.

The mapping between the original data and its token is stored in a database called a token vault. A token itself has no value and cannot be decrypted.
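The vault-based flow described above can be sketched in a few lines. This is a minimal, assumed illustration: the `TokenVault` class, the in-memory dictionaries, and the use of Python's `secrets` module are stand-ins for what would be a hardened, access-controlled database in a real deployment.

```python
import secrets
import string

class TokenVault:
    """Illustrative in-memory token vault (a real vault is a secured database)."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse an existing token so the same input always maps to one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random, format-preserving token: digits stay digits,
        # letters stay letters, separators pass through unchanged. There is
        # no mathematical relationship between the token and the value.
        token = "".join(
            secrets.choice(string.digits) if ch.isdigit()
            else secrets.choice(string.ascii_letters) if ch.isalpha()
            else ch
            for ch in value
        )
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only a vault lookup recovers the original data; the token
        # cannot be decrypted or reversed on its own.
        return self._token_to_value[token]

vault = TokenVault()
card = "4111-1111-1111-1111"
token = vault.tokenize(card)
print(token)  # random digits in the same xxxx-xxxx-xxxx-xxxx format
print(vault.detokenize(token) == card)  # True
```

Note that the token preserves the shape of the original card number, so downstream systems that expect that format keep working without ever holding the real data.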

Use Cases of Tokenization

Tokenization is often used to protect credit card information and other sensitive data handled by a payment processor. Credit card numbers are substituted with a random alphanumeric ID. The tokenization process removes any connection between the transaction and the sensitive data, which limits the risk of data breaches. Tokenization can be used to protect any kind of sensitive data, including account numbers, Social Security numbers, email addresses, and phone numbers.

Data Masking vs. Tokenization

Unlike data masking and encryption, which use algorithms to replace sensitive data elements, tokenization uses a database, called a token vault, which stores the relationship between the sensitive data elements and the token. The original data is securely stored in the vault and does not leave the organization. As with data masking, the format of the original data is maintained.
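The contrast can be made concrete with a small masking function to set against the vault lookup above. The specific masking rule here, overwriting every digit except the last four with `*`, is a common convention chosen for illustration, not part of any standard.

```python
def mask_card(card_number: str) -> str:
    """Data masking: an algorithmic, irreversible transform.
    Every digit except the last four is overwritten, so the
    original number cannot be recovered from the masked form."""
    total_digits = sum(ch.isdigit() for ch in card_number)
    digits_seen = 0
    out = []
    for ch in card_number:
        if ch.isdigit():
            digits_seen += 1
            # Keep only the final four digits visible.
            out.append(ch if digits_seen > total_digits - 4 else "*")
        else:
            out.append(ch)  # preserve separators, keeping the format
    return "".join(out)

print(mask_card("4111-1111-1111-1234"))  # ****-****-****-1234
```

The key difference: masking keeps no record of what it replaced, while tokenization stores the mapping in the vault, so an authorized system can still retrieve the original value when needed.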

Data masking and tokenization both produce output that cannot be mathematically reversed. In the case of tokenization, there is no algorithmic relationship to the original data; the original value can be retrieved only through a lookup in the token vault.

Tokenization For Compliance

Tokenization meets certain compliance requirements of the PCI Data Security Standard (DSS), which governs the usage of credit card information.

PCI DSS has 12 requirements for compliance, namely:

  • Having a firewall in place.

  • Not using vendor-supplied defaults for system passwords and security parameters.

  • Protecting stored cardholder data.

  • Encrypting transmission of cardholder data over public networks.

  • Having regularly updated anti-virus and anti-malware software in place.

  • Developing and maintaining secure systems and applications.

  • Restricting access to cardholder data.

  • Identifying and authenticating access to system components.

  • Restricting physical access to cardholder data.

  • Tracking and monitoring all access to cardholder data.

  • Testing security systems and processes regularly.

  • Having an information security policy in place.

Tokenization is used to safeguard credit card and bank account numbers so that organizations can transmit this data safely. Because the token severs the connection between a transaction and the underlying sensitive data, it limits exposure to breaches and helps organizations maintain PCI DSS compliance when processing credit card data.
