Consistent Tokenization:

Tokenization is a de-identification technique that replaces a sensitive value with a randomly generated surrogate value – a token. Consistent Tokenization ensures that the same input value is always replaced by the same token. Where desired and safe, this property preserves referential integrity and relationships in the dataset that may be needed for analysis.
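As an illustrative sketch only (not from the source), consistent tokenization can be achieved with a keyed pseudorandom function, so the same input always yields the same token without storing a mapping table. The `tokenize` helper, the `tok_` prefix, and the HMAC-SHA-256 construction below are assumptions chosen for demonstration:

```python
import hmac
import hashlib
import secrets

# Hypothetical sketch: derive a deterministic token from a value using a
# keyed pseudorandom function (HMAC-SHA-256). The same input and key always
# produce the same token, preserving referential integrity across records;
# without the secret key, tokens cannot be mapped back to original values.
SECRET_KEY = secrets.token_bytes(32)  # must be stored and protected

def tokenize(value: str, key: bytes = SECRET_KEY) -> str:
    digest = hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"tok_{digest[:16]}"  # truncated for readability

# The same input is consistently replaced by the same token:
assert tokenize("alice@example.com") == tokenize("alice@example.com")
# Distinct inputs receive distinct tokens (with overwhelming probability):
assert tokenize("alice@example.com") != tokenize("bob@example.com")
```

An alternative design keeps a persistent lookup table of truly random tokens per value; the keyed-hash approach shown here trades that storage for key management.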
