Data Protection Glossary
Term: Tokenization
Definition: Tokenization is the process of replacing sensitive values with non-sensitive surrogate tokens and storing the token-to-value mapping in a secured vault to reduce …
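
Example (Python): a minimal sketch of the vault-based mapping described above. The names here (TokenVault, tokenize, detokenize) are illustrative assumptions, an in-memory dictionary stands in for the secured vault, and Python's secrets module mints the random surrogate tokens; a production deployment would rely on a hardened, access-controlled token vault instead.

import secrets


class TokenVault:
    """Maps random, non-sensitive tokens to the original sensitive values (hypothetical sketch)."""

    def __init__(self) -> None:
        # In-memory stand-in for the secured vault that holds the mapping.
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        """Return the existing token for a value, or mint a new random one."""
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is random, so it reveals nothing about the original value.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only callers with vault access can do this."""
        return self._token_to_value[token]


if __name__ == "__main__":
    vault = TokenVault()
    token = vault.tokenize("4111 1111 1111 1111")  # e.g. a card number
    print("stored downstream:", token)             # the token is safe to store or log
    print("recovered from the vault:", vault.detokenize(token))

Downstream systems keep only the token; the sensitive value can be recovered solely by querying the vault, which is why access to the vault is the part that must be tightly secured.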