Data Privacy

Best Practices in Data Tokenization

Tokenization is the process of replacing sensitive data with unique identifiers (tokens) that have no inherent meaning, which protects the original underlying data against unauthorized access or use. Tokenization was invented in 2001 to secure payment card data and quickly became the dominant […]
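The excerpt stops before any concrete example, so here is a minimal, illustrative sketch of vault-style tokenization (not taken from the guide itself): each sensitive value is swapped for a random token that has no mathematical relationship to the original, and the mapping is held only in a protected store. The TokenVault class and its in-memory storage below are hypothetical stand-ins for a real, encrypted, access-controlled token vault.

```python
import secrets

class TokenVault:
    """Minimal in-memory sketch of vault-style tokenization.

    A production system would back this with an encrypted, access-controlled
    data store; the plain dictionaries here are purely illustrative.
    """

    def __init__(self):
        self._token_to_value = {}  # token -> original sensitive value
        self._value_to_token = {}  # original value -> token (keeps tokenization idempotent)

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a random token that carries no meaning."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_urlsafe(16)  # random; not derived from the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only the vault can perform this mapping."""
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # example card number
print(token)                    # opaque token, safe to pass to downstream systems
print(vault.detokenize(token))  # original value, retrievable only through the vault
```

Because the token is random rather than derived from the data, stealing the tokenized records alone reveals nothing; an attacker would also need access to the vault.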



Keeping Elasticsearch data secure from attacks and exposure

In the last few years, enterprises have seen an unprecedented amount of data lost from vulnerable Elasticsearch clusters. Tens of billions of records of highly sensitive data across the financial services, healthcare, high-tech, retail, and hospitality segments have been exposed or stolen. A simple web search for Elasticsearch […]
