Tokenization

Tokenization is a data masking technique that replaces a field value with a ‘token’: a synthetic value that stands in for the real one. The pattern of the generated token is configurable and can be chosen to match the format of the source data, so data formats are preserved (e.g. for testing and development).
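
To illustrate, here is a minimal sketch of format-preserving token generation in Python. The function name and behaviour are assumptions for illustration only, not a specific product's API: it replaces digits with random digits and letters with random letters while keeping separators, so the token has the same shape as the source value.

```python
import secrets
import string


def format_preserving_token(value: str) -> str:
    """Generate a random token with the same character classes as the input.

    Digits become random digits, letters become random letters (case
    preserved), and all other characters (dashes, spaces, etc.) are kept,
    so the token keeps the shape of the original value.
    """
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(secrets.choice(string.digits))
        elif ch.isupper():
            out.append(secrets.choice(string.ascii_uppercase))
        elif ch.islower():
            out.append(secrets.choice(string.ascii_lowercase))
        else:
            out.append(ch)  # keep separators unchanged
    return "".join(out)


print(format_preserving_token("4111-1111-1111-1111"))  # e.g. '8302-5947-0216-7789'
```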

Tokenization can also be applied consistently, meaning that the same input value is always replaced with the same token, so referential integrity is preserved across the dataset.
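
One way to achieve this consistency, sketched below under the assumption that a secret key is stored securely elsewhere, is to derive the replacement characters from a keyed hash of the value instead of a random source. The same input then always maps to the same token, so joins between tables still line up after masking. This is an illustrative sketch, not a vetted format-preserving encryption scheme.

```python
import hashlib
import hmac
import string

# Hypothetical key for illustration; in practice this would come from a key vault.
SECRET_KEY = b"replace-with-a-securely-stored-key"


def consistent_token(value: str, key: bytes = SECRET_KEY) -> str:
    """Deterministically map a value to a token of the same format.

    The replacement characters are drawn from an HMAC of the value, so the
    same input always yields the same token, preserving referential
    integrity wherever that value appears.
    """
    digest = hmac.new(key, value.encode("utf-8"), hashlib.sha256).digest()
    out = []
    for i, ch in enumerate(value):
        b = digest[i % len(digest)]
        if ch.isdigit():
            out.append(string.digits[b % 10])
        elif ch.isupper():
            out.append(string.ascii_uppercase[b % 26])
        elif ch.islower():
            out.append(string.ascii_lowercase[b % 26])
        else:
            out.append(ch)  # keep separators unchanged
    return "".join(out)


# The same value tokenizes identically wherever it occurs.
print(consistent_token("123-45-6789"))
print(consistent_token("123-45-6789"))  # identical output
```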
