Internal Tokenization Business Benefits

We’ve had a ton of interest in the tokenization broker lately, and I wanted to share some of the interesting use cases that are popping up in the field as well as put some boundaries on the definition of internal tokenization, which is becoming an important selling point in the market.

First, let’s remember we are in the context of PCI DSS: by tokenization we mean replacing credit card numbers (PAN data) with format-preserving tokens. The use cases we are seeing revolve around replacing PANs in documents, files or API calls with tokens, mainly to reduce the scope of a PCI audit as well as the chance of a breach.
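To make the idea concrete, here is a minimal sketch of an internal token vault in Python. It is purely illustrative (the class and method names are my own invention, and a real tokenization product would use a vetted scheme rather than random digits): the vault swaps each PAN for a same-length, all-digit token that preserves the last four digits, and keeps the token-to-PAN mapping on-premise so the PAN can be recovered later.

```python
import secrets

class TokenVault:
    """Illustrative on-premise token vault (NOT production-grade tokenization)."""

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        """Replace a PAN with a format-preserving token: same length,
        all digits, last four digits preserved for display purposes."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]  # reuse the existing token
        while True:
            body = "".join(secrets.choice("0123456789")
                           for _ in range(len(pan) - 4))
            token = body + pan[-4:]
            # Retry on the (unlikely) collision with an existing token or the PAN itself.
            if token not in self._token_to_pan and token != pan:
                break
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        """Exchange a token back for the original PAN (vault lookup)."""
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
```

Because the mapping lives inside the Enterprise, any downstream system that only handles `token` drops out of PCI scope, while authorized applications can still call `detokenize` when the real PAN is required.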

When we say internal tokenization, we’re talking about allowing the Enterprise to own, store, and manage its own tokens on-premise. This contrasts with external tokenization, which generally pairs token generation with a payment processor and involves multiple vendors from the point of sale out to the bank. I will argue that both schemes have value, and which one you ultimately choose will depend on the nature of your business.

In the age of cloud, it seems natural to want to outsource everything, including PCI DSS and tokenization. This point alone makes it very tempting to look for an external tokenization solution. However, we’re finding that many Enterprises are interested in internal tokenization for a number of reasons:

Internal Tokenization Business Benefits

1. Payment Processor Agnostic – Many external tokenization solutions are bound to a specific payment processor; most are advertised alongside a single one. Internal tokenization keeps the Enterprise flexible with payment processors – what if it needs to make a change in the future?

2. Avoid Token Migration Issues – Once tokens are stored by a third party, what happens if you need the original PANs back at a later date? What if a post-payment application needs to exchange a token for the original PAN for some reason? Ideally this shouldn’t be required, but there may be audit or loss prevention applications that require the original PAN data.

3. Control your own Token Generation – Internal tokenization solutions allow the Enterprise and QSA to tightly control how tokens are generated. For example, some applications may need to preserve parts of the original PAN and this policy may change over time.

4. Remove Datacenter Scope – Large Enterprises and retailers may have legacy business processes that involve PAN data – for better or worse. External tokenization plays a big role in reducing scope for payment applications, but generally internal tokenization solutions are required to take existing databases and mainframes out of scope – especially when they already contain live PANs. For example, many large enterprises have data warehousing, customer loyalty and HR applications that may be infected with PANs – an internal tokenization solution is ideal for cleaning up these applications and saving assessment costs.
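Point 3 above – controlling token generation – can be sketched as a configurable policy. The function below is hypothetical (name and parameters are my own, and a real product would use a vetted format-preserving encryption scheme such as NIST FF1 rather than random digits), but it shows how an internally controlled policy can decide which parts of the PAN survive in the token, and how that policy can change over time without involving an external vendor:

```python
import secrets

def make_token(pan: str, keep_first: int = 0, keep_last: int = 4) -> str:
    """Generate a format-preserving token under a configurable policy.
    keep_first/keep_last control how much of the original PAN is preserved,
    e.g. keep_first=6 preserves the BIN for analytics applications.
    Illustrative only -- not a production tokenization algorithm."""
    middle_len = len(pan) - keep_first - keep_last
    if middle_len <= 0:
        raise ValueError("policy would preserve the entire PAN")
    head = pan[:keep_first]
    tail = pan[-keep_last:] if keep_last else ""
    middle = "".join(secrets.choice("0123456789") for _ in range(middle_len))
    return head + middle + tail

# Policy today: preserve BIN and last four; tomorrow the QSA may tighten it.
t = make_token("4111111111111111", keep_first=6, keep_last=4)
```

Because the Enterprise owns the generation code, the QSA can audit exactly how much real PAN data leaks into each token, and the policy can be tightened later without a vendor migration.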

Ultimately, the issue comes down to control, as well as the range of customer service options Enterprises wish to offer their customers. Do you ever wonder why you have to re-enter your credit card number over and over again in some applications? It’s because the Enterprise is outsourcing the payment processing to comply with PCI DSS.

Small merchants wishing to offload as much of the PCI responsibility as possible should look toward external tokenization, but larger merchants and Enterprises with legacy business processes that involve live PANs in documents, files, databases, mainframes or data warehouse applications may want to look toward internal tokenization to reduce PCI scope and increase security.

Blake


About Blake Dournaee

Passionate, energetic product manager that lives at the intersection of business, innovation and technology. I currently work at Intel in the Datacenter Software Division working on products and technologies relating to mobile, APIs, cloud services and big data.
