Tokenization and Encryption (Jan 1, 2000 – Jan 1, 2017)
Description:
Tokenization and encryption are processes used to protect information in transit and at rest. Both render the original data unreadable to unauthorized parties: tokenization replaces a sensitive value with a non-sensitive surrogate (a token), while encryption transforms the value with a key so it can later be decrypted. These techniques safeguard sensitive information stored and processed in the cloud and on mobile and wireless devices.
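The distinction between the two approaches can be sketched in a few lines. This is a minimal illustration with hypothetical helper names: the in-memory dictionary stands in for a secured token vault, and the XOR cipher stands in for a real algorithm such as AES, which production systems would use instead.

```python
import secrets

# Tokenization: replace the value with a random surrogate and keep the
# mapping in a vault (here, a plain dict for illustration only).
_vault = {}  # token -> original value

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random, meaningless token."""
    token = secrets.token_hex(16)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Recover the original value from the vault."""
    return _vault[token]

# Encryption: transform the value with a key so it is recoverable by
# anyone holding the key. A toy XOR cipher stands in for a real one.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Illustrative XOR stream cipher -- NOT secure; use AES in practice."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

card = "4111-1111-1111-1111"
token = tokenize(card)                        # no mathematical link to card
key = secrets.token_bytes(32)
ciphertext = xor_cipher(card.encode(), key)   # reversible only with the key

assert detokenize(token) == card
assert xor_cipher(ciphertext, key).decode() == card
```

Note the design difference: a token is random, so it cannot be reversed without access to the vault, whereas ciphertext can always be decrypted by whoever holds the key.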