Reducing PCI Compliance Costs and Effort with SafeNet Transparent Tokenization


WHITE PAPER

Tokenization is gaining increased adoption across a range of organizations and industries. By effectively taking PCI data out of scope, tokenization presents a host of benefits, helping organizations both boost security and reduce PCI compliance effort and cost. This paper offers a detailed look at tokenization and provides practical guidelines for employing it successfully so that organizations can maximize its potential benefits.

Introduction: Challenges of Compliance

How good is good enough? When it comes to security, the question continues to be a vexing one for just about any organization. For companies regulated by the Payment Card Industry Data Security Standard (PCI DSS), the question remains even after a successfully completed audit. The very next day a new system may be installed, a new threat discovered, a new user added, a new patch released. Even if an audit is passed, a subsequent breach could still be devastating. IT infrastructures, security solutions, threats, regulations, and their interpretation continue to evolve. That's why, when it comes to security, organizations need to take a defense-in-depth approach, and the work is never done.

This holds true for organizations in virtually any industry. A company needs to maintain vigilance in securing the personally identifiable information of employees, whether national IDs, social security numbers, or other identifiers. Organizations complying with Sarbanes-Oxley, the Health Insurance Portability and Accountability Act (HIPAA), HITECH, the EU Data Privacy Directive, or any other regulation have a fundamental requirement to secure sensitive data.

Within this context, business and security leaders must constantly strive to find a balance, weighing budget allocations, staffing, new investments, and ongoing costs against security objectives. Given that, it is incumbent upon security teams to refine their approaches in order to maximize efficiency while they maximize security. That's why many organizations have looked to tokenization.

This paper offers a detailed look at tokenization and how it can support organizations' PCI compliance efforts. It compares tokenization to encryption and other approaches, including some of the factors to consider in choosing which approach is best for a given deployment scenario. In addition, the paper describes SafeNet's approach, transparent tokenization, and reveals some of the specific advantages and benefits this solution offers to organizations looking to safeguard sensitive data in the most effective and efficient manner possible.

Weighing Tokenization Alternatives: Encryption, Data Masking, and Other Approaches

In today's security landscape, there are many alternatives organizations can choose from as they set out to ensure optimal security and sustain compliance. Following is an overview of several approaches that represent an alternative or a complement to tokenization.

"Encrypted data may be deemed out of scope if, and only if, it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it." -- PCI SSC Statement on Scope of Encrypted Data, FAQ 10359, issued 11/10/2009

Encryption

In most PCI-regulated organizations, cardholder data will need to be retrieved in the clear at some point. Given that, encryption will be a fundamental requirement: a way to ensure sensitive payment information is only accessible by authorized users for authorized purposes. When plotting security strategies, however, it is important to factor in the degree to which encryption affects the scope of an organization's PCI compliance efforts. As the PCI Security Standards Council makes clear, encrypted data is still in scope in any organization that has mechanisms in place for decrypting that data. For example, if a merchant uses an off-site storage facility and encrypts payment data before it is transported off site, that facility's operations would not be in scope, as long as there were no capabilities within the facility to decrypt the data. In this way, encryption can help reduce the scope of compliance. However, within an organization that employs encryption mechanisms, and so has the ability to decrypt data, care should be taken to minimize the number of systems that store or access encrypted data. This is true for several reasons:

Scope of compliance and costs. The systems managing encryption, and those housing and transmitting encrypted data, are very much in scope of PCI DSS, and so must adhere to the full spectrum of PCI requirements, including malware protection, multi-factor authentication, and, perhaps most importantly, rigorous key protection mechanisms. Further, each of these systems falls under the purview of a PCI audit, and the more such systems are audited, the higher the overall audit expense.

Application integration. Applications that need to access encrypted data will typically need to be modified to accommodate the changes in data type and field size that accompany the move from clear text and binary data to the lengthier fields of ciphertext. Depending on the number and type of applications involved, these changes can represent a significant investment in time and money.

Format Preserving Encryption

Format preserving encryption has been introduced by several vendors in recent years in order to minimize the implications of encryption for associated applications. However, at the time of publication of this paper, the PCI Security Standards Council has not issued a formal policy on format preserving encryption, leaving open whether, and which of, these techniques are acceptable to meet compliance mandates. Further, many of the algorithms and modes have not been approved by standards bodies such as the National Institute of Standards and Technology (NIST). Because format preserving encryption must return a shorter value than strong encryption algorithms would normally produce, the strength of the ciphertext is reduced in comparison to transparent tokenization, which is based on proven algorithms. Additionally, if a malicious attack results in the capture of the key used for format preserving encryption and its associated algorithm, the clear text can be derived. A token, by contrast, cannot be reversed by the systems interacting with the tokenized data, which is why those systems remain out of audit scope.
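To see the application-integration issue concretely, here is a minimal Python sketch using the open-source cryptography package (illustrative only, not SafeNet's implementation). A 16-digit PAN encrypted with AES-256-GCM becomes a 60-character value once the nonce and authentication tag are Base64-encoded for storage, while a format-preserving token keeps the original width:

```python
# pip install cryptography
import base64
import os
import secrets

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

pan = "4111111111111111"  # 16-digit test PAN

# Standard strong encryption (AES-256-GCM): the output is binary and
# wider than the input once the nonce and auth tag are included.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, pan.encode(), None)
stored_value = base64.b64encode(nonce + ciphertext).decode()

print(len(pan))           # 16 -> fits a CHAR(16) column
print(len(stored_value))  # 60 -> breaks fixed-width PAN fields and schemas

# A format-preserving token: same width, same character class, so
# downstream applications and database schemas need no changes.
token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
print(len(token))         # 16
```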

Comparison                      Transparent Tokenization    Format Preserving Encryption
Reduces audit scope             Yes                         No
Not vulnerable to decryption    Yes                         No
Higher security strength        Yes                         No
Proven algorithms               Yes                         No

Data Masking

Data masking is another approach to consider for many enterprises' security and compliance objectives. It is typically used in testing and development environments, and is particularly useful when outsourcing application development, where it ensures that development environments don't compromise the security of real customer data. With data masking, sensitive data is replaced with realistic, but not real, data. While data masking can be a useful technique, development organizations need to ensure that such aspects as referential integrity are addressed, and that the mechanism used to mask data isn't susceptible to reverse engineering techniques that could uncover real data.

Given the characteristics and considerations of the alternatives above, tokenization is an approach that is gaining increased market acceptance. The following section offers a range of insights and considerations for employing tokenization most effectively.

"Transparent tokenization is a very useful technique to remove sensitive data from a database system, by replacing it with similarly formatted data that is not sensitive in any way," explained Alexandre Pinto, CISSP-ISSAP and PCI QSA, CIPHER Security.

Keys to Successful Tokenization

In recent years, tokenization has increasingly become an integral approach to PCI compliance, helping organizations strengthen the security of payment data while reducing overall security and PCI audit costs. Employed for online credit card transactions or transmission of other sensitive data, tokenization works by replacing sensitive data with tokens that retain the characteristics of the original data. With tokenization, security teams can ensure that databases, applications, and users cannot access sensitive data, and only interact with placeholders for that sensitive data. Tokenization systems convert the sensitive data to an encrypted token in the same format as the original data, allowing associated applications to continue operating seamlessly; a minimal sketch of this token-vault pattern appears below. Masking features can also be maintained if a subset of the data needs to be available for authentication.

Effectively implemented tokenization can significantly reduce an organization's security and PCI compliance costs. When applications have access to tokens, but have no means to reverse tokenization and access cardholder data in the clear, those applications are considered out of scope. As a result, organizations don't need to employ the range of PCI-mandated security mechanisms on these systems, which in turn reduces the cost of ongoing PCI audits.

According to Simon Sharp, Director, Illumis, "An assessor will also inspect samples of all systems to ensure that cardholder data is not present, particularly where primary account numbers (PANs) used to be, in order to ensure that tokenization is working. Therefore, it is important to make sure there is a distinction between the tokenized values and the PAN so the system can be removed from scope."
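Before turning to implementation considerations, the token-vault pattern just described can be made concrete with a minimal, self-contained Python sketch. The class and method names are invented for illustration; this is not SafeNet's implementation. It issues a random format-preserving token, encrypts the PAN, and keeps the mapping so that only the vault can detokenize:

```python
# pip install cryptography
import secrets

from cryptography.fernet import Fernet

class TokenVault:
    """Illustrative token vault mapping tokens to encrypted PANs."""

    def __init__(self) -> None:
        # In a real deployment this key lives in a hardened appliance or HSM.
        self._fernet = Fernet(Fernet.generate_key())
        self._vault: dict[str, bytes] = {}  # token -> encrypted PAN

    def tokenize(self, pan: str) -> str:
        # Random digits of the same length: format-preserving, and not
        # mathematically derived from the PAN, so systems holding only
        # the token have no way to reverse it.
        token = "".join(secrets.choice("0123456789") for _ in pan)
        while token in self._vault:  # regenerate on the (rare) collision
            token = "".join(secrets.choice("0123456789") for _ in pan)
        self._vault[token] = self._fernet.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault itself (an in-scope system) can recover the PAN.
        return self._fernet.decrypt(self._vault[token]).decode()

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. 7305129948310276 -- safe to hand downstream
print(vault.detokenize(token))  # 4111111111111111 -- recoverable only in the vault
```

Because the token is random rather than derived from the PAN, there is no key whose capture would let downstream systems reverse it; that property is what keeps those systems out of audit scope.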

Following are some important considerations and strategies to weigh when planning new tokenization implementations.

"One of the biggest areas of value we can provide is in helping reduce audit scope, both by consolidating systems and processes and really ensuring that there's a good business reason for keeping sensitive payment data accessible to a given system or process," explained Brian Serra, PCI Program Manager, CISSP, QSA, and ISO ISMS Lead Auditor, Accuvant. "Practically, for every system taken out of audit scope, a business generally saves about two hours of auditing time, plus a great deal of expense in applying and maintaining all the security mechanisms required by the PCI standard."

Minimize Instances of Sensitive Cardholder Data, Whether through Deletion or Tokenization

Before employing encryption, tokenization, or any other security mechanism, organizations should start by ensuring cardholder data is only stored and accessible where there is an absolute business need. Where there isn't, eliminating the sensitive data completely, and with it the inherent exposure, is a critical first step.

It is also critical to assess the impact of removing, encrypting, or tokenizing the data that resides on a given system. Once sensitive data has been discovered, it needs to be analyzed in terms of its associations and interdependencies with other systems. For example, if a business process requires access to the sensitive data, will that process be affected by encrypting or tokenizing it? If not accounted for, the impact of tokenization on those associated processes may cause significant problems for the business.

Next, security teams need to determine where and how tokenization can be employed. Today, tokenization is typically employed in one of two ways (a sketch of the outsourced pattern follows this section):

Outsourced. Within an e-commerce scenario, a retailer can outsource tokenization entirely, so they never have the potential to access cardholder data in the clear within their systems. For example, after an online transaction is completed, the card information can be transparently redirected to the service provider, who converts the card data into a token and returns the token to the retailer. The downside of this approach is that it can be very difficult for a retailer to change service providers, given the complexity of migrating tokens and payment data. Further, this approach may not be an option for retailers that use multiple card processors.

In house. Here, the merchant manages converting card numbers into tokens, so associated downstream applications cannot access cardholder data in the clear. While this approach does not reduce the scope of compliance nearly as much as the first scenario, the trade-off is that the merchant retains more ongoing flexibility and avoids being locked into a given service provider.

In either case, these approaches can provide substantial benefits. On the other hand, security teams may not want to use tokenization in cases in which users or applications need to access payment data in the clear. If systems or users need to be authorized to use cardholder data in clear text, encryption may be a better alternative or complement to tokenization. Particularly where there is unstructured data, for example in spreadsheets and Word documents, encryption is complementary to tokenization employed with structured data.

"One of the areas that is often a focus for our auditing efforts is the security of the lookup table, which relates the token to the original PAN," said Benj Hosack, Director, Foregenix. "This is fundamental to the solution and needs to be protected accordingly. That's why working with reputable suppliers with experience and expertise in this area is recommended."

Leverage Proven Third-Party Solutions

When PCI auditors are verifying the compliance of encryption and tokenization, a critical first question concerns the types of technologies used. If a merchant or financial institution has employed an internally developed system for all or part of these areas, the scope of the audit will inherently grow: each facet of the implementation, everything from access controls to key rotation, will need to be inspected and verified. Consequently, internally developed systems can significantly increase audit costs, not to mention upfront investments and ongoing development. On the other hand, if organizations employ compliant commercial solutions that have already been vetted by PCI auditors, they simplify the audit process, enabling auditors to focus on the manner in which the security systems are implemented, rather than the mechanisms themselves. Further, it is important to view the tokenization infrastructure in a cohesive fashion and ensure all aspects are secured.
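As an illustration of the outsourced pattern described above, the following sketch posts card data to a hypothetical provider endpoint and stores only the returned token. The URL, field names, credential, and response shape are all invented for this example and do not correspond to any particular provider's API:

```python
# pip install requests
import requests

# Hypothetical endpoint and credential -- not any specific provider's API.
TOKENIZE_URL = "https://tokens.provider.example/v1/tokenize"
API_KEY = "example-api-key"

def tokenize_card(pan: str, expiry: str) -> str:
    """Hand the PAN to the service provider and keep only the returned token."""
    resp = requests.post(
        TOKENIZE_URL,
        json={"pan": pan, "expiry": expiry},       # field names are illustrative
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["token"]

# The merchant stores only the token; the clear PAN never persists on
# merchant systems, which is what keeps those systems out of audit scope.
order = {"order_id": 1001, "card_token": tokenize_card("4111111111111111", "12/27")}
```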

Centrally Manage Encryption and Tokenization

Whenever possible, organizations should leverage systems that offer integrated capabilities for both encryption and tokenization on one platform. These solutions offer a range of benefits:

Cost savings. If tokenization solutions operate independently of encryption, the cost of the upfront purchase, initial integration, and ongoing maintenance will typically be much higher.

Simplified auditing and remediation. When logs and policies are gathered and tracked across various point solutions, demonstrating and maintaining compliance grows more complex.

Centralized key management. By leveraging key management from a common platform, administrators can establish best practices for tokenized data in accordance with PCI DSS or VISA guidance, as well as for encrypted data. For instance, they gain the flexibility to use the strongest keys for the components of the token vault, such as AES-256 for the ciphertext of the PAN and SHA-256 for protecting the associated hash or token lookup value (a sketch of such a vault record appears at the end of this section).

Consistent enforcement of policy. It is also important to centrally enforce protection policies, to control not only what data is protected in which manner (tokenized or encrypted) and where, but also to manage the permissions of privileged users and systems.

To optimize these benefits, organizations should look for solutions that offer the scalability required to accommodate high transaction volumes. Further, they should employ solutions that offer the broadest support for industry standards and for tokenization and encryption approaches, to ensure initial investments are maximized in the long term. This is especially important given that compliance is not a static event but an ongoing effort for as long as an organization has to manage sensitive data.

"I'm a big proponent that tokenization, key management, and encryption should be done in hardware wherever possible," stated Simon Sharp, Illumis. "No matter how many anti-malware mechanisms may be employed, ultimately, software may still be vulnerable: a hacker using an inline key logger may still be able to compromise access controls. Hardware-based solutions offer an additional layer of security that is critical for these vital systems."

Optimize Security with Hardware-based Platforms

Whenever possible, organizations should leverage hardware-based platforms, which provide a vital layer of protection for sensitive business assets. Robust hardware-based encryption and tokenization platforms feature capabilities like centralized, secure backup and more limited access points, which can significantly strengthen overall security.

SafeNet Transparent Tokenization

SafeNet offers robust, comprehensive, and flexible solutions that enable organizations to boost security, ensure PCI compliance, and reduce security costs. With SafeNet, security teams get the capabilities they need to maximize the benefits of tokenization in reducing audit scope and strengthening security. Through its integrated tokenization and encryption capabilities, SafeNet gives security teams the flexibility they need to apply tokenization and encryption in ways that yield the biggest benefit for their business and security objectives.
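Returning to the centralized key management bullet above: assuming, per that guidance, AES-256 for the PAN ciphertext and SHA-256 for the associated lookup value (implemented here as a keyed HMAC, a common hardening choice, rather than a bare hash), an illustrative vault record might be built as follows. This is a sketch under those assumptions, not SafeNet's actual schema:

```python
# pip install cryptography
import hashlib
import hmac
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# In practice both keys would be generated and held by the appliance/HSM.
AES_KEY = AESGCM.generate_key(bit_length=256)  # AES-256 for the PAN ciphertext
HMAC_KEY = os.urandom(32)                      # keys the SHA-256 lookup digest

def make_vault_record(pan: str, token: str) -> dict:
    """Build one token-vault row: AES-256 ciphertext plus a SHA-256 lookup value."""
    nonce = os.urandom(12)
    return {
        "token": token,
        # AES-256-GCM ciphertext of the PAN; recoverable only with the vault key.
        "pan_ciphertext": nonce + AESGCM(AES_KEY).encrypt(nonce, pan.encode(), None),
        # Keyed SHA-256 (HMAC) digest of the PAN, letting the vault find an
        # existing token for a repeat card without decrypting every row.
        "pan_lookup": hmac.new(HMAC_KEY, pan.encode(), hashlib.sha256).hexdigest(),
    }

record = make_vault_record("4111111111111111", "7305129948310276")
```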

Benefits

By employing SafeNet tokenization, organizations can enjoy a range of benefits:

Ensure PCI compliance and strengthen security. With SafeNet, organizations can address PCI rules by securing credit card information with format-preserving tokenization. They can further optimize the security of sensitive data through the hardened DataSecure appliance, which features secure key storage and backup, granular administrative controls, and more. SafeNet also enables businesses to protect a wide range of data types beyond credit card information, including bank transaction data, personnel records, and more.

Reduced audit costs. SafeNet helps security teams save time and money by restricting the number of devices that need to be audited. When facing an audit for PCI compliance, many organizations must certify regulatory compliance for each server where sensitive data resides. Because SafeNet Tokenization replaces sensitive data in databases and applications with tokens, there are fewer servers to audit, and reducing the scope of audits saves time and money.

Streamline security administration and integration. With SafeNet, organizations can leverage a central platform for managing policies, lifecycle key management, maintenance, and auditing through a single solution for both tokenization and encryption. Further, they can deploy tokenization with full application transparency, which eliminates the need to customize applications to accommodate tokenized data.

Alignment with VISA best practices. SafeNet Transparent Tokenization is in alignment with the recently published VISA Best Practices for Tokenization, version 1.0, with regard to token generation, token mapping, use of a data vault as an encrypted cardholder data repository, and strong cryptographic key management. (http://usa.visa.com/download/merchants/tokenization_best_practices.pdf)

SafeNet Tokenization offers a variety of integration options, providing customers with the flexibility to choose the right security technique for their environment, while enabling them to protect more data types without affecting business logic, database architecture, storage systems, or other critical enterprise components. SafeNet Tokenization also enables development teams to move or replicate production data to test environments without having to de-identify or mask data. With SafeNet Tokenization, organizations can keep data protected with optimal efficiency and cost-effectiveness.

Features

SafeNet offers a range of critical features (a sketch of the token-format options appears after this list):

Format-preserving tokenization. Ensure transparent interactions with applications and users by defining the format of the unique value, or token, during assignment. Because token values preserve the format of the original data, applications that interact with the data do not require customization. SafeNet supports various data formats, including partially masked data, such as XXXXX6789.

Token variations. Choose from a range of token variations: tokenizing with random digits or sequential numbers, preserving the first two or first six digits, or preserving the first two and the last four.

Support for an array of data types. Protect a full array of data, ranging from credit card numbers and member IDs to social security numbers and driver's license numbers.

Broad platform support. Enjoy complete deployment flexibility through SafeNet's support for a wide range of application and Web servers, including Oracle, IBM, BEA, J2EE, Apache, Sun ONE, and JBoss. In addition, SafeNet offers data and token storage for Oracle and Microsoft SQL Server.
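To illustrate the token variations and partial masking described in the feature list, here is a minimal Python sketch (illustrative only, not SafeNet's API) that preserves chosen leading and trailing digits and renders a masked display form such as XXXXX6789:

```python
import secrets

def token_preserve(pan: str, keep_first: int = 6, keep_last: int = 4) -> str:
    """Token preserving leading/trailing digits, with a random middle."""
    middle_len = len(pan) - keep_first - keep_last
    middle = "".join(secrets.choice("0123456789") for _ in range(middle_len))
    # Products commonly also ensure the result is distinguishable from a
    # real PAN (e.g. it fails the Luhn check), as assessors look for this.
    return pan[:keep_first] + middle + pan[-keep_last:]

def mask(value: str, keep_last: int = 4) -> str:
    """Partially masked display form, e.g. XXXXX6789 for a 9-digit value."""
    return "X" * (len(value) - keep_last) + value[-keep_last:]

print(token_preserve("4111111111111111"))        # 411111xxxxxx1111-style token
print(token_preserve("4111111111111111", 2, 4))  # first two and last four kept
print(mask("123456789"))                         # XXXXX6789
```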

SafeNet Transparent Tokenization Deployment

Following is an overview of how the tokenization process works:

1. Sensitive data comes in through an e-commerce system.
2. The sensitive data is passed to the Tokenization Manager.
3. The Tokenization Manager encrypts the sensitive data, stores it, and returns a token.
4. Other enterprise systems are passed tokens transparently; where authorized, the Tokenization Manager decrypts and returns the sensitive data.
5. The PCI auditor only needs to inspect the tokenized database, or data vault, and sample any active applications to verify proper tokenization technique; the remaining systems can be removed from scope.

[Figure: How Tokenization Works. Sensitive data flows from the e-commerce/enterprise application to the Tokenization Manager, backed by the DataSecure appliance. Order processing, payment, and customer service systems exchange tokens with the Tokenization Manager, which decrypts and returns sensitive data only where authorized; the PCI auditor inspects only the tokenized database and active applications.]

Conclusion

For organizations tasked with ensuring PCI compliance, the battle is never over. In this effort, tokenization is becoming an increasingly prevalent approach, one that can take PCI data out of scope and so both strengthen security and reduce compliance costs. Today, SafeNet offers leading transparent tokenization solutions that enable organizations to fully realize the benefits of tokenization.

About SafeNet

Founded in 1983, SafeNet is a global leader in information security. SafeNet protects its customers' most valuable assets, including identities, transactions, communications, data, and software licensing, throughout the data lifecycle. More than 25,000 customers, across both commercial enterprises and government agencies and in over 100 countries, trust their information security needs to SafeNet.

Contact Us: For all office locations and contact information, please visit www.safenet-inc.com
Follow Us: www.safenet-inc.com/connected

© 2010 SafeNet, Inc. All rights reserved. SafeNet and the SafeNet logo are registered trademarks of SafeNet. All other product names are trademarks of their respective owners. WP (EN)-08.10.10