Data Shows Security Experts Are Unhappy With Traditional Tokenization
Cybersecurity experts may look to Portal26 for all the benefits of traditional tokenization and none of the tradeoffs.
Portal26’s 2022 State of Enterprise Tokenization Survey shows that the vast majority of cybersecurity experts are dissatisfied with their current tokenization tools. Despite spending $1 million annually on tokenization security tools, 99% of respondents said they were unhappy with the tradeoffs they have to make when using traditional tokenization. Security tools such as traditional tokenization may soon take a backseat to newer, more innovative solutions designed to deliver the strong security users require without sacrificing the usability of the underlying data.
The 2022 State of Enterprise Tokenization Survey revealed:
- Almost 40% of respondents spend over $1,000,000 on tokenization every year, yet 70% have experienced sensitive data theft by an external adversary in the last 12 months.
- Of the 70% who experienced data theft over the last year, nearly all, 98.63%, said they believe that this could have been prevented with a more modern data security solution.
Tokenization Explained
Tokenization is a security method originally designed for payment card numbers: the original number is swapped for a token, simply a secondary string of numbers. The token maps one-to-one to the original card number while having no monetary value by itself, so it can flow through transactions seamlessly and minimize the potential for payment card information to be stolen. This method is still the best practice for payment card numbers, since they are used only for monetary transactions and for no other purpose. Today, many companies apply it to other types of sensitive data as well, with the idea of providing the same level of protection for other types of PII as is available for payment card numbers.
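To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of how a token vault works: a random token is generated for each card number, the one-to-one mapping is stored, and detokenization is a reverse lookup. This is an illustration of the concept only, not a production vault design; the in-memory dictionary stands in for a hardened, access-controlled vault service.

```python
import secrets

# Minimal illustrative token vault (assumption: an in-memory dict stands in
# for a hardened, access-controlled vault in a real deployment).
class TokenVault:
    def __init__(self):
        self._token_to_value = {}   # token -> original value
        self._value_to_token = {}   # original value -> token (one-to-one mapping)

    def tokenize(self, card_number: str) -> str:
        # Reuse the existing token so the mapping stays one-to-one.
        if card_number in self._value_to_token:
            return self._value_to_token[card_number]
        # A random token has no mathematical relationship to the card number.
        token = secrets.token_hex(8)
        self._token_to_value[token] = card_number
        self._value_to_token[card_number] = token
        return token

    def detokenize(self, token: str) -> str:
        # Reversing the process requires access to the vault itself.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
print(t)                    # e.g. '9f2c1a7b3e5d4c60' -- worthless on its own
print(vault.detokenize(t))  # the original card number, only via the vault
```

Because the token is random, it reveals nothing about the card number if stolen; the sensitive value is only recoverable through the vault lookup.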
Tokenization Benefits and Limitations
Unfortunately, tokenization was never meant to secure data more complex than payment card numbers, and cybercriminals now seek a wide variety of data that goes well beyond card numbers. Attackers are identifying and targeting Personally Identifiable Information (PII) and Protected Health Information (PHI), stealing addresses, Social Security numbers, medical information and far more. In response, some organizations have chosen to invest in the previously tried-and-true security method of tokenization.
However, these companies quickly discovered tokenization’s limits. After spending millions of dollars on integrating tokenization systems into their workflows, companies are now faced with the realization that it is only useful if they truly have no need to analyze the underlying data. The actual reality on the ground is that the same sensitive data they seek to protect is also required daily to understand customer behaviors, needs and pain points. Information such as names and addresses must be indexed and readily available for rich searches that contribute valuable insight to customer relations.
Some companies process this sensitive data in clear text and accept the risk of a data breach; others detokenize the data, reversing the tokenization process in mass quantities, only to delete the clear-text copies afterward. Neither approach is secure, and the detokenization process is time-consuming.
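The short sketch below illustrates why that workaround is costly. The `vault` dictionary here is a stand-in for the detokenization service in the earlier sketch; the point is that any search or analysis over tokenized records requires a lookup per row, re-creating exactly the clear-text data the tokens were meant to hide.

```python
# Standalone illustration: 'vault' stands in for a detokenization service;
# here it is just a dict mapping token -> clear-text name.
vault = {"tok_01": "Alice Smith", "tok_02": "Bob Jones"}

records = [{"name_token": "tok_01"}, {"name_token": "tok_02"}]

# To run any rich search or analysis, every row must be detokenized first,
# producing a clear-text copy that then has to be secured and deleted again.
clear_text_copy = [{"name": vault[r["name_token"]]} for r in records]
print(clear_text_copy)
```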
The Current State of Tokenization
Considering tokenization’s original intent alongside the complexity of today’s data brings glaring issues to light in security plans that use tokenization as the main shield. The survey also shows that the top three types of data organizations house today are employee data, customer data and payment card data. While the theft of any of these could have dire consequences, tokenization, again, was only designed to secure payment card data.
The same report concluded that nearly half (47%) of companies using tokenization cannot tokenize all of the data they need to protect due to insufficient insight, while others are held back by lack of performance (44%) or lack of context (41%). Not only are companies spending millions of dollars on a security tool intended for entirely different data, but that same tool cannot be relied upon to prevent data loss.
In fact, 99% of companies surveyed are unhappy with the invasive and disruptive nature of tokenization. Tokenization has its time and place, but protecting PII and PHI, which make up the majority of compromised data, just isn’t it. Companies must look ahead and let go of traditional security tools that no longer mitigate or prevent modern-day ransomware attacks.
The Next Generation of Tokenization
The bottom line: traditional tokenization simply cannot provide the coverage required in today’s threat environment. Cybercriminals are already incorporating data exfiltration and extortion tactics into their breach attempts, and current tools aren’t keeping up.
Traditional tokenization is simply inadequate!
According to the survey, over 85% of companies using traditional tokenization methods bring the data back into clear text format, negating the security benefits. This puts companies at a disadvantage and at a higher risk of data loss during a breach.
Companies seeking to invest in more modern and advanced technologies should consider products like Portal26. Portal26 delivers all the benefits of tokenization without the severe data usability and performance restrictions that organizations have had to live with in the past. Built for high-performance, petabyte-scale analytic use cases, Portal26 enables full-featured search and analytics without any decryption or detokenization.
To further explore how Portal26 works to protect your data against data exfiltration, ransomware, and insider attacks, please schedule a demo on our website.