Tokenization
What is Tokenization?
Tokenization is the process of replacing sensitive data, such as credit card numbers, with surrogate values called tokens. A token has no exploitable meaning or value on its own; the mapping between each token and the original sensitive value is kept in a securely protected central store, often called a token vault. Tokenization generates a surrogate for each sensitive value and records the mapping, so that an authorized system can later present the token and have it mapped back to the initial value.
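The mapping process described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not a production design: the in-memory dictionary stands in for the securely protected central store, and the class and method names (`TokenVault`, `tokenize`, `detokenize`) are invented for this example. A real system would use a hardened, access-controlled vault and audited key management.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault (illustrative sketch only)."""

    def __init__(self):
        self._token_to_value = {}  # token -> original sensitive value
        self._value_to_token = {}  # so repeat values reuse one token

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was seen before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random surrogate with no mathematical
        # relationship to the original value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Map the surrogate back to the initial value.
        return self._token_to_value[token]


vault = TokenVault()
card = "4111111111111111"
token = vault.tokenize(card)
print(token != card)                    # the token reveals nothing
print(vault.detokenize(token) == card)  # the vault recovers the value
```

Note that, unlike encryption, the token here is purely random: an attacker who obtains tokens but not the vault cannot reverse them, because there is no algorithmic relationship to invert.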