CertNova

Tokenization

noun

Definition

  1. Replacing sensitive data with non-sensitive substitute values (tokens) while retaining usability.

Example

Payment systems may use tokenization to reduce card-data exposure.
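A minimal sketch of the idea, assuming a vault-style design: a random token stands in for the card number, and only the vault can map it back. The `TokenVault` class and its methods are illustrative names, not a real library API; production systems use a secured, audited vault service rather than an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Hypothetical in-memory token vault for illustration only."""

    def __init__(self):
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        # Replace the sensitive value (e.g. a card number) with a
        # random token that reveals nothing about the original.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can recover the original value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                      # token is non-sensitive
assert vault.detokenize(token) == "4111111111111111"    # usability is retained
```

Downstream systems can store and pass the token freely; exposure is confined to the vault itself.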
