Along with the many gains of going cashless, financial digitisation also opens the door to a massive influx of cyber crime. The upsurge in fraudulent transactions and security threats has posed a serious challenge to central banks and clearing houses. According to data breach statistics, 267,088 data records are lost or stolen every hour. Only 4% of the breaches committed since 2013 were "secure breaches", in which encryption rendered the stolen data useless. The social media industry suffered the greatest impact, accounting for 56.18% of the total compromised records in 2018. Identity theft has been the most prevalent breach type since 2013, accounting for almost 3 billion compromised records last year.
Digital payments are categorized as card-not-present transactions: transactions carried out without the physical card, usually payments made over the internet. Internet transactions make it difficult for the merchant to verify that it's the actual cardholder making the purchase, which makes them an easy target for cyber criminals.
In 2005, Shift4 Payments, a pioneer in secure payment processing solutions, introduced tokenization. Credit and debit card data is often stored on the computers and networks involved in online purchases, and payment card information passes through various points in the authorization process, where it can be intercepted and is therefore at risk of fraud. Tokenization protects this data by replacing the actual card number with a random 16-character alphanumeric globally unique ID called a 'token'. A token can be defined as an algorithmically generated data element that substitutes for a more valuable piece of information. The token can be mapped back to the sensitive data only through the tokenization system. This limits the exposure of a security breach and restricts which parties can receive the information, and in what context. As the token has no extrinsic value of its own, any tokenized data intercepted by thieves and hackers is useless. Once the transaction succeeds, a confirmation is sent to the online seller along with a randomly generated token ID, which is stored in place of the actual PAN data in their systems.
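The core idea can be shown in a few lines of Python. This is a minimal sketch, not a production scheme: the in-memory dictionary stands in for the token vault, and a real system would encrypt the vault and run it in a hardened environment.

```python
import secrets
import string

# Hypothetical in-memory token vault mapping token -> PAN.
# A real vault would be an encrypted, access-controlled database.
_vault = {}

_ALPHABET = string.ascii_uppercase + string.digits

def tokenize(pan: str) -> str:
    """Replace a PAN with a random 16-character alphanumeric token."""
    token = "".join(secrets.choice(_ALPHABET) for _ in range(16))
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Map a token back to the original PAN via the vault lookup."""
    return _vault[token]

token = tokenize("4111111111111111")
# The token itself reveals nothing about the PAN; only the vault
# can map it back.
assert detokenize(token) == "4111111111111111"
```

Note that `secrets` (rather than `random`) is used so the token is unpredictable: there is no formula an attacker could invert to recover the PAN.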
TOKENIZATION AND ENCRYPTION
Tokenized data cannot be mathematically reversed back to its original value. Encryption, by contrast, takes a value, runs it through an algorithm and transforms the plain-text information into a non-readable form called ciphertext. To retrieve the plain text, the ciphertext is decrypted using an algorithm and an encryption key. The strength of encryption rests on the complexity of the algorithm used to secure the data. Tokenization, on the other hand, uses no such complex algorithm to transform the sensitive information into a token, nor does it require an encryption key to derive the original data from the token. Vault-based tokenization uses what's popularly known as a token vault, a database that stores the relationship between the original data and the token; the original data is secured in the vault via encryption. Vaultless tokenization, a more recent and efficient technology, requires no token vault. Whenever it receives a request, it generates a random number, which may be numeric or alphanumeric; during de-tokenization, it simply decodes this number and returns the actual card number.
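The distinction can be made concrete. The snippet below contrasts a toy cipher (a one-time-pad-style XOR, for illustration only, not a real-world algorithm) with a token: the ciphertext can be reversed by anyone who holds the key, while the token is a pure random value with no mathematical relationship to the PAN at all.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher: applying it twice with the same key
    # restores the original bytes. Illustration only, not secure code.
    return bytes(b ^ k for b, k in zip(data, key))

# Encryption: reversible by anyone holding the key.
key = secrets.token_bytes(16)
ciphertext = xor_cipher(b"4111111111111111", key)
assert xor_cipher(ciphertext, key) == b"4111111111111111"

# Tokenization: the token is just random; no key or algorithm can
# recover the PAN from it without a lookup in the token vault.
token = secrets.token_hex(8)  # 16 hex characters
assert len(token) == 16
```

This is why a stolen token is useless on its own, whereas stolen ciphertext remains attackable as long as the key might be recovered.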
Let's take a typical example to understand how a credit card payment is processed under vault-based tokenization:
1. As soon as the user punches in their card details at an eCommerce website, the PAN is passed to the credit card tokenization system.
2. The tokenization system either generates a string of 16 random characters to replace the PAN or retrieves the previously associated token, and records the correlation in the data vault.
3. The token returns to the eCommerce site, where it is used to represent the customer's credit card in the system.
4. The token is then sent to the payment processor, who uses the same technology to de-tokenize it and fetch the original credit card number used for authorization. If the organization uses a third-party tokenization solution, the token is sent to the third party, who de-tokenizes it and sends the card number to the payment processor for processing.
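The steps above can be sketched as a small simulation. The class name and flow here are illustrative assumptions, not a real provider's API; the point is that the merchant's record only ever holds the token, and only the shared tokenization service can translate it back for authorization.

```python
import secrets

class TokenizationService:
    """Hypothetical shared vault-based tokenization service."""

    def __init__(self):
        self._vault = {}   # token -> PAN
        self._by_pan = {}  # PAN -> token, so a repeat card reuses its token

    def tokenize(self, pan: str) -> str:
        if pan in self._by_pan:          # step 2: retrieve existing token
            return self._by_pan[pan]
        token = secrets.token_hex(8)     # step 2: or generate a fresh one
        self._vault[token] = pan
        self._by_pan[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]        # step 4: processor-side lookup

service = TokenizationService()

# Step 1: shopper enters the PAN; it goes straight to the service.
token = service.tokenize("4000123412341234")

# Step 3: the merchant stores only the token, never the PAN.
merchant_record = {"order": 1001, "card": token}
assert merchant_record["card"] != "4000123412341234"

# Step 4: the processor de-tokenizes for authorization.
assert service.detokenize(merchant_record["card"]) == "4000123412341234"
```

A breach of `merchant_record` exposes only tokens, which is exactly the scope reduction tokenization is meant to buy.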
For the end user, this operation appears to be performed nearly instantaneously by the browser or application. With cloud-based tokenization, the data is stored in the cloud in a different format. At no point are the original card details stored within the retailer's environment, preventing the user's sensitive information from being compromised. The token data may be fully or partially visible during the transaction, which helps speed up the process and alleviate strain on system resources; the original data, however, remains completely hidden at all times.
Single-use tokens usually represent a single transaction and have a faster processing rate than multi-use tokens. A unique token is created every time a repeat customer purchases something; for this reason, single-use tokens pose problems during recurring transactions and refund/return processing. Single-use tokens also lead to more token collision scenarios than multi-use tokens. A token collision arises when two identical tokens represent two different pieces of data. Validating against previously issued tokens is therefore critical to avoid collisions.
Multi-use tokens, as the name suggests, may be used to track an individual PAN across multiple transactions. The same token represents a given payment card across internet purchases made from the same retailer.
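The collision check described above amounts to a validate-before-issue loop. A minimal sketch, assuming the issued tokens are tracked in a simple dictionary:

```python
import secrets

def new_single_use_token(pan: str, issued: dict) -> str:
    """Issue a fresh single-use token for this transaction.

    `issued` maps every previously issued token to its PAN; we loop
    until we draw a token not already in use, avoiding a collision
    where one token would point at two different PANs.
    """
    while True:
        token = secrets.token_hex(8)
        if token not in issued:
            issued[token] = pan
            return token

issued = {}
# Single-use: the same card gets a different token on each purchase.
t1 = new_single_use_token("4111111111111111", issued)
t2 = new_single_use_token("4111111111111111", issued)
assert t1 != t2
assert issued[t1] == issued[t2] == "4111111111111111"
```

With a 64-bit random token the retry loop almost never fires in practice, but the check is what guarantees correctness rather than probability.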
TOKENIZATION AND PCI DSS STANDARDS
The Payment Card Industry Data Security Standard (PCI DSS) is a set of guidelines that must be complied with by any organization that stores, transmits or processes cardholder information. The basic idea behind this initiative is to bolster security around credit and debit card transactions and safeguard cardholder information. All parties involved in facilitating a transaction fall within the PCI scope. PCI scoping is defined as the identification of the people, processes and technologies that interact with, or could otherwise impact, the security of cardholder data (CHD). Tokenization is applied to payment card data with the intention of reducing the PCI scope by eliminating electronic CHD stored in the environment. Tokenization certainly reduces the risk of data breaches; however, it's important to ensure that the payment processors you use are reliable and comply with the PCI DSS.
VAULTED VS VAULTLESS TOKENIZATION
Vaulted tokenization requires a database, or "vault," to store the relationship between the card information and its corresponding token. This concept has some limitations. To avoid data loss, every new transaction must be continuously backed up. As the credit card count grows, the database may choke, slowing down processing and reducing efficiency. Vaulted tokenization therefore requires high maintenance and costly synchronization capabilities to ensure smooth transactions and consistency across data centers. Storing all sensitive content in one database also leaves it susceptible to theft. To overcome these challenges, alternative solutions like stateless and vaultless tokenization are being used. Both technologies are independently validated to reduce PCI DSS compliance scope.
Vaultless tokenization is a lightweight and more powerful alternative to vault-based tokenization. It eliminates token databases and the need to store cardholder or other sensitive data. This method offers faster token generation and quick recovery of the original data when needed. Stateless tokenization maps live data elements to substitute values without the need for a database, while retaining the isolation properties of tokenization.
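One way to get vault-free, reversible mapping is a keyed, format-preserving construction: the token is derived from the PAN and a secret key, and the same key inverts it, so nothing needs to be stored. Real deployments use a standardized algorithm such as NIST's FF1; the toy balanced Feistel network below (key, round count and 16-digit-PAN assumption are all illustrative) only demonstrates the idea.

```python
import hashlib
import hmac

SECRET_KEY = b"demo-key"  # hypothetical per-tenant secret

def _round_value(half: str, round_no: int) -> int:
    """Keyed round function: derive a number from one 8-digit half."""
    digest = hmac.new(SECRET_KEY, f"{round_no}:{half}".encode(),
                      hashlib.sha256).hexdigest()
    return int(digest, 16)

def _feistel(digits: str, rounds: int, encrypting: bool) -> str:
    """Balanced Feistel over a 16-digit string; runs rounds in
    reverse order (and subtracts) to invert."""
    left, right = digits[:8], digits[8:]
    order = range(rounds) if encrypting else range(rounds - 1, -1, -1)
    for r in order:
        if encrypting:
            left, right = right, (
                f"{(int(left) + _round_value(right, r)) % 10**8:08d}")
        else:
            left, right = (
                f"{(int(right) - _round_value(left, r)) % 10**8:08d}"), left
    return left + right

def tokenize(pan: str) -> str:
    return _feistel(pan, rounds=4, encrypting=True)

def detokenize(token: str) -> str:
    return _feistel(token, rounds=4, encrypting=False)

token = tokenize("4111111111111111")
assert detokenize(token) == "4111111111111111"  # no vault consulted
```

Because the mapping is a keyed permutation over 16-digit strings, the token keeps the original format (useful for legacy systems that expect a card-shaped value) while the only state anywhere is the secret key.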
COMPATIBILITY WITH OTHER TECHNOLOGIES
Tokenization also works with alternative payment systems such as NFC (near-field communication) payments, ACH transactions and Apple Pay. Building alternative payment systems requires multiple entities to work together to deliver payment services to the end user. To ensure interoperability between the different players, a trusted service manager (TSM) is needed to establish a link between mobile network operators and service providers. Tokenization can help mediate such services. Apple Pay uses a proprietary tokenization system, whereas most other NFC wallets rely on the payment tokenization standard published by EMVCo.