Complexity Management with Tokenization

Tokenization is a major trend in application and data security, and gateways are an ideal location to deploy tokenization services. Tokenization replaces sensitive data with benign surrogate data. The classic example is PCI DSS, and the business value of tokenization is summed up in the graphic below:

[Figure: Tokenization and PCI scope]

Now I am no graphic designer, but let me take advantage of the Chinese saying that “1,001 words is worth more than a picture.” As much as I like the graphic above, it does not tell the whole story. The left two-thirds of the graphic is “PCI scope” and the right third is outside PCI scope. In my experience the real value of tokenization and gateways is that more like 10-20% of the system is isolated down to being in scope for PCI, and the remaining 80-90% is out of PCI scope. *This* is the value of tokenization: it abstracts away a ton of complexity. As we discussed in the last post, complexity is the main enemy for security people. Tokenization services do not eliminate the sprawl of sensitive data, but they massively reduce it, and in doing so they reduce the burden of complexity across the system because the rest of the system isn’t dragged into scope.

The reality is that most of the system does not need to access sensitive data such as payment information; it only needs to be able to reference authorization codes and the like. There are so many ways to mess up code that simply removing the sensitive data from as many places as possible is frequently the single most effective security mechanism. To quote Ken Thompson, “when in doubt, use brute force.” What’s simpler: A) exhaustive audits, quarterly vulnerability assessments, Requirement 10 level audit logging, and the full compliance checkbox olympics across your whole system, or B) brute force: isolate the sensitive data, audit that island, and expose only tokens and authorization codes to the rest? It’s not even close.
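
To make the split concrete, here is a minimal sketch of what an out-of-scope record looks like once the gateway has swapped the card number for a token. The names (OrderRecord, record_sale, tok_ prefix) are hypothetical, not any particular vendor’s API:

```python
from dataclasses import dataclass

@dataclass
class OrderRecord:
    """What the rest of the system stores: no cardholder data anywhere."""
    order_id: str
    card_token: str    # opaque surrogate issued by the gateway, e.g. "tok_8f3a..."
    auth_code: str     # authorization code returned by the payment processor
    amount_cents: int

def record_sale(order_id: str, card_token: str, auth_code: str, amount_cents: int) -> OrderRecord:
    # No PAN, no CVV, no expiry date ever reaches this code path, so this service
    # (and its logs, backups, and databases) stays out of PCI scope.
    return OrderRecord(order_id, card_token, auth_code, amount_cents)
```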

The counterargument to the above is that a gateway introduces a new layer in the system, so it’s another middlebox for the app server to talk to, another system on the critical path. Fair enough, but it is there for a reason, the same as the app server is in the middle tier. The app server sits in the middle tier so that business logic and rules are centralized and reused. This is the same rationale for tokenization on a gateway: centralize token generation and verification. Do you want all your developers writing code for generating and verifying tokens? Not bloody likely.
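
As a rough sketch of what gets centralized, here is a hypothetical TokenService that lives once on the gateway instead of being reimplemented by every development team. It is illustrative only; a real deployment would back the mapping with a hardened vault or format-preserving encryption rather than an in-memory dict:

```python
import secrets

class TokenService:
    """Centralized token generation and verification, deployed once on the gateway."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # token -> original sensitive value (in-scope storage)

    def generate(self, sensitive_value: str) -> str:
        # Random surrogate with no mathematical relationship to the original value.
        token = "tok_" + secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def verify(self, token: str) -> bool:
        # Downstream systems only ever need to ask "is this a token we issued?"
        return token in self._vault

    def detokenize(self, token: str) -> str:
        # Only callable from inside the in-scope island, e.g. when settling with the processor.
        return self._vault[token]
```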

Tokenization is a major trend in security because it allows systems to reduce the sprawl of sensitive data and the attendant vulnerability and audit issues. Gateways are the ideal way to deploy tokenization because:

1) the internal core operations, token generation and verification, are too important to be left to individual developers, and they are generic enough to be reused.

2) the external interfaces to the tokenization service, generating a token and verifying a token, are very simple (see the short sketch after this list).
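
Continuing the hypothetical TokenService sketch above, the entire external surface boils down to two calls:

```python
svc = TokenService()

# Generate: the only call that ever touches the real card number.
token = svc.generate("4111111111111111")

# Verify: everything else in the system only needs this.
assert svc.verify(token)
assert not svc.verify("tok_made_up")
```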

This is a mix that solves an important security problem in a simple way, and in a way that scales. Frankly, there are not too many times in security architecture where this is the case, and that makes tokenization on gateways a design pattern for the long haul.
