Tokenization Vs. End-to-End Encryption: Experts Weigh In

Pros and Cons of the Emerging Technologies Eyed to Improve Data Security
Tokenization or end-to-end encryption - which solution will win the hearts of data protectors in the race to secure data?

A recent study conducted by PricewaterhouseCoopers on behalf of the Payment Card Industry Security Standards Council shows that end-to-end encryption and tokenization are the top choices for companies seeking to employ emerging technologies to protect payment card and other critical data. Both approaches have their public proponents, including Heartland Payment Systems (HPY) CEO Robert Carr, who has been encryption's most vocal supporter in the wake of his organization's historic breach.

But what are the pros and cons of each approach? We turned to a panel of information security experts for their analyses of tokenization vs. end-to-end encryption.

Defining the Solutions
A quick look at the essence of these two solutions:

Tokenization replaces sensitive card data with unique identifying symbols (tokens) that retain all the essential information without compromising its security. This approach has become popular as a way to increase the security of credit card and e-commerce transactions, while minimizing the cost and complexity of complying with industry regulations and standards - especially the Payment Card Industry Data Security Standard (PCI DSS).
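In rough terms, a tokenization service might work as follows. This is a minimal illustrative sketch, not any vendor's API: the class and method names are invented, and a real token vault would be a hardened, access-controlled service, not an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random surrogate tokens to original card numbers."""

    def __init__(self):
        self._store = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, so it has no mathematical relationship
        # to the real card number and is useless if stolen.
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the real card number.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")

# Downstream systems store and transmit only the token...
assert token != "4111111111111111"
# ...and only the vault can recover the original card number.
assert vault.detokenize(token) == "4111111111111111"
```

Because every system outside the vault handles only tokens, those systems hold nothing of value to an attacker - which is the basis of tokenization's scope-reduction argument.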

End-to-end encryption, also defined by Visa as data field encryption, is the continuous protection of the confidentiality and integrity of transmitted data: the data is encrypted at its origin and decrypted only at its destination. The encrypted data can travel safely through vulnerable channels such as public networks, because only the intended recipient can decrypt it. One example is a virtual private network (VPN) that uses end-to-end encryption.
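The encrypt-at-origin, decrypt-at-destination flow can be sketched as below. Note the cipher here is a deliberately toy XOR construction chosen only to show where encryption and decryption happen; real payment deployments use vetted algorithms such as AES, typically with hardware key management.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR each byte with the repeating key.
    # NOT secure - for illustration of the data flow only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# A shared key provisioned to both endpoints in advance.
shared_key = secrets.token_bytes(16)

# Origin (e.g., the point-of-sale terminal) encrypts before transmission.
plaintext = b"4111111111111111"
ciphertext = xor_cipher(plaintext, shared_key)

# In transit, intermediaries and public networks see only ciphertext.
assert ciphertext != plaintext

# Destination (e.g., the processor) decrypts on arrival.
assert xor_cipher(ciphertext, shared_key) == plaintext
```

The key property is that no intermediate hop ever holds the plaintext - which is also why, as discussed below, malware that reads memory *after* decryption at the destination can still defeat the scheme.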

The question for many organizations is not either/or, but rather which approach best fits into their existing security architecture.

Pros and Cons
Size is a factor for organizations weighing tokenization and end-to-end encryption, says Dave Shackleford, former chief security strategist at EMC and now principal at Blue Heron Group. "I would probably choose tokenization for smaller organizations, but larger ones will likely benefit more in the long run from looking to implement robust encryption practices and technologies," Shackleford says. Tokenization may not encompass all the data that needs to be protected by larger organizations, he adds.

Anton Chuvakin, author of the books "Security Warrior" and "PCI Compliance," says he would choose tokenization over end-to-end encryption any day. "I just don't believe that anybody can roll out end-to-end encryption and have it be usable and secure at the same time," Chuvakin says.

Chuvakin is concerned about recent breaches that indicate "in-application" sniffer malware might have been in use. Such malware will grab the data from memory after decryption, he says, completely defeating an end-to-end encryption solution such as Heartland's E3. "Admittedly, getting such malware on the processing server is very hard, but such a scenario still worries me," Chuvakin says.

Both end-to-end encryption and tokenization are about scope reduction, and will "most likely fall under the purview of the Payment Card Industry Security Standards Council Scoping Special Interest Group," says David Taylor, founder of PCI KnowledgeBase and a member of the group he referenced. "Both end-to-end encryption and tokenization are based on the whole idea that not actually having credit card data available on 'as many' or 'any' systems will move those systems out of scope," Taylor says. However, he notes, there are concerns on several fronts, which the Visa best practices thus far do not address.

Among the concerns that experts weigh:

Tokenization Pros:

The data is never stored or sent in its "real" form (except possibly on the initial transaction);
Easier to establish and maintain than encryption.

Tokenization Cons:

May not address all the data in use by larger organizations;
May not work with all applications and processing technologies.

End-to-End Encryption Pros:

Data is secured all the way from each endpoint to the processing destination;
May integrate better with existing technology.

End-to-End Encryption Cons:

May introduce more processing overhead;
Key management and other encryption processes may be hard to manage.

Where is the Industry Headed?
The future of these two emerging technologies depends on several factors and forces, say the experts. Standards and acceptance of general best practices are still being hashed out, as seen in Visa's recent announcement of a best practices guideline for data field encryption.

"We'll see both used for quite some time," says Shackleford. "I think smaller organizations that can really outsource everything will be more likely to choose tokenization, but bigger enterprises that have lots of data and applications will continue to look to encryption."

Kevin Nixon, an independent security consultant, says he frequently hears customers, colleagues and news media say "We need end-to-end encryption ...," but he isn't sold on the approach. "For a security practitioner, ironically, it is a very bad idea," Nixon says. "Proponents of end-to-end encryption are simply looking for a way to do security 'on the cheap.'"

A proponent of "Defense in Depth," Nixon favors tokenization as a solution. "Quite simply, if the creator of the data is already infected (and contains a worm in the payload), anyone using end-to-end encryption successfully manages to protect the worm with the end-to-end encryption solution," he says. "Defense in depth is comprised of multiple layers of offensive and defensive tools intended to slow or stop the progression of the aggressor. It is a natural, logical migration to deploy the concept to the IT computing environment."

Avivah Litan, a noted Gartner analyst, says time may favor tokenization. She envisions another two to three years to get a final end-to-end encryption standard, "and that's an aggressive timetable," Litan says. Tokenization, on the other hand, "is more robust, more adopted by merchants and vendors providing solutions, and will probably be more accepted before end-to-end encryption."

About the Author

Linda McGlasson

Managing Editor

Linda McGlasson is a seasoned writer and editor with 20 years of experience in writing for corporations, business publications and newspapers. She has worked in the Financial Services industry for more than 12 years. Most recently, Linda headed information security awareness and training and the Computer Incident Response Team for Securities Industry Automation Corporation (SIAC), a subsidiary of the NYSE Group (NYX). As part of her role she developed infosec policy and new awareness testing, and led the company's incident response team. In the last two years she has been involved with the Financial Services Information Sharing and Analysis Center (FS-ISAC), editing its quarterly member newsletter and identifying speakers for member meetings.
