Reducing PCI Compliance Costs and
Effort with SafeNet Transparent Tokenization
WHITE PAPER




Tokenization is gaining adoption across a range of organizations and
industries. By effectively taking PCI data out of scope, tokenization presents a host
of benefits, helping organizations both boost security and reduce PCI compliance
efforts and costs. This paper offers a detailed look at tokenization and provides
practical guidelines for helping organizations successfully employ tokenization so
they can maximize the potential benefits.

Introduction: Challenges of Compliance
How good is good enough? When it comes to security, the question continues to be a vexing one
for just about any organization. For companies regulated by the Payment Card Industry Data
Security Standard (PCI DSS), the question remains, even after a successfully completed audit.
The very next day a new system may be installed, a new threat discovered, a new user added, a
new patch released.

Even if an audit has been passed, a breach could still be potentially devastating. IT
infrastructures, security solutions, threats, regulations, and their interpretation continue to
evolve. That’s why, when it comes to security, organizations need to take a defense-in-depth
approach, and the work is never done. This holds true for organizations in virtually any industry.
A company needs to maintain vigilance in securing the personally identifiable information of
employees, whether national IDs, social security numbers, or other identifiers. Organizations complying with
Sarbanes-Oxley, the Health Insurance Portability and Accountability Act (HIPAA), HITECH, the
EU Data Privacy Directive, or any other regulation have a fundamental requirement to secure
sensitive data.

Within this context, business and security leaders must constantly strive to find a balance,
weighing budget allocations, staffing, new investments, and ongoing costs vs. security
objectives. Given that, it is incumbent upon security teams to refine their approaches in order
to maximize efficiency while they maximize security. That’s why many organizations have
looked to tokenization.

This paper offers a detailed look at tokenization and how it can support organizations’ PCI
compliance efforts. The paper compares tokenization to encryption and other approaches,
including some of the factors to consider in choosing which approach is best for a given
deployment scenario.

In addition, the paper describes an approach from SafeNet, transparent tokenization, and it
reveals some of the specific advantages and benefits this solution offers to organizations
looking to safeguard sensitive data in the most effective and efficient manner possible.


Weighing Tokenization Alternatives: Encryption, Data Masking, and Other Approaches
In today’s security landscape, there are many alternatives organizations can choose from as they
set out to ensure optimal security and sustain compliance. Following is an overview of several
approaches that represent an alternative or a complement to tokenization.

“Encrypted data may be deemed out of scope if, and only if, it has been validated that the entity
that possesses encrypted cardholder data does not have the means to decrypt it.”
--PCI SSC Issues Statement on Scope of Encrypted Data via FAQ 10359, Issued 11/10/2009

Encryption
In most PCI-regulated organizations, cardholder data will need to be retrieved in the clear at
some point. Given that, encryption will be a fundamental requirement, a way to ensure sensitive
payment information is only accessible by authorized users for authorized purposes. When
plotting security strategies, however, it is important to factor in the degree to which encryption
affects the scope of an organization’s PCI compliance efforts.

As the PCI Security Standards Council makes clear, encrypted data is still in scope in any
organization that has mechanisms in place for decrypting that data. In other words, if a merchant
uses an off-site storage facility, and encrypts payment data before it is transported off site, that
facility’s operations would not be in scope—as long as there were no capabilities within the
facility to decrypt that data.

In this way, encryption can help reduce the scope of compliance. However, within an organization
that is employing encryption mechanisms, and so has the ability to decrypt data, care should be
taken to minimize the occurrence of systems that store or access encrypted data. This is true for
several reasons:

  • Scope of compliance and costs. It is important to bear in mind that the systems managing
    encryption, and the housing and transmission of encrypted data, are very much in scope of
    PCI, and so must adhere to the spectrum of PCI regulations, including malware protection,
    multi-factor authentication, and, perhaps most importantly, rigorous key protection
    mechanisms. Further, each of these systems will be under the purview of a PCI audit, and the
    more such systems audited, the higher the overall audit expense will be.

  • Application integration. All the applications that need to access encrypted data will
    typically need to be modified to accommodate the changes in data type and field size
    that accompany the move from clear text and binary data to the lengthier field sizes of
    cipher text. Depending on the number and type of applications involved, these changes can
    represent a significant investment in time and money, as the sketch below illustrates.

Format Preserving Encryption
Format preserving encryption has been introduced by several vendors in recent years in order
to minimize the implications of encryption on associated applications. However, at the time of
the publication of this paper, the PCI Security Standards Council has not issued a formal policy
around format preserving encryption, leaving open whether, and which of, these techniques are
acceptable to meet compliance mandates. Further, many of the algorithms and modes involved
may not have been approved by standards bodies such as the National Institute of Standards and
Technology (NIST). Because format preserving encryption must return a shorter value than strong
encryption algorithms would normally produce, the resulting ciphertext is weaker than that of
transparent tokenization, which is based on proven algorithms. Additionally, if a malicious attack
captures the key used for format preserving encryption, along with its associated algorithm, the
clear text can be derived. A token, by contrast, cannot be reversed by the systems interacting with
the tokenized data, which is why those systems remain out of audit scope.

Comparison                      Transparent Tokenization    Format Preserving Encryption
Reduce audit scope              ✓
Not vulnerable to decryption    ✓
Higher security strength        ✓
Proven algorithms               ✓




Data Masking
Data masking is another approach to consider when it comes to many enterprises’ security and
compliance objectives. Data masking is typically used in testing and development
environments, and is particularly useful when outsourcing application development. It is
used to ensure that application development environments don’t compromise the
security of real customer data. With data masking, sensitive data is replaced with realistic, but
not real, data. While data masking may be a useful technique, development organizations need
to ensure such aspects as referential integrity are addressed, and that the mechanism used to
mask data isn’t susceptible to reverse engineering techniques that could uncover real data.

Given the characteristics and considerations of the alternatives above, tokenization is an
approach that is gaining increased market acceptance. The following section offers a range of
insights and considerations for employing tokenization most effectively.

Keys to Successful Tokenization
“Transparent tokenization is a very useful technique to remove sensitive data from a database
system, by replacing it with similarly formatted data that is not sensitive in any way. Although
this means that no changes in the database schema have to be made, there are still some changes
that may be required on associated systems in order for them to integrate properly with the
tokenization solution. In addition, special care has to be taken to make sure that the systems are
not misinterpreting the new token data, for example, using it as basis for a business intelligence
solution. It is important to understand that most operations would still need to be performed on
the actual data.”
--Alexandre Pinto, Senior Technical Manager and PCI QSA, CIPHER Security

In recent years, tokenization has increasingly become an integral approach for PCI compliance,
helping organizations both strengthen the security of payment data and reduce overall
security and PCI audit costs.

Employed for online credit card transactions or transmission of other sensitive data, tokenization
works by replacing sensitive data with tokens that retain the characteristics of the original data.
With tokenization, security teams can ensure that databases, applications, and users cannot
access sensitive data, and only interact with placeholders for that sensitive data. Tokenization
systems convert the sensitive data to an encrypted token in the same format as the original data,
allowing associated applications to continue operating seamlessly. Masking features can also be
maintained if a subset of the data needs to be available for authentication.

Effectively implemented tokenization can significantly reduce an organization’s security and
PCI compliance costs. When applications have access to tokenization, but have no means to
reverse tokenization and access cardholder data in the clear, those applications are considered
out of scope. As a result, organizations don’t need to employ the range of PCI-mandated security
mechanisms on these systems, which in turn reduces the cost of ongoing PCI audits. According
to Simon Sharp, Director, Illumis, “An assessor will also inspect samples of all systems to ensure
that cardholder data is not present, particularly where personal account numbers (PANs) used
to be, in order to ensure that tokenization is working. Therefore, it is important to make sure
there is a distinction between the tokenized values and the PAN so the system can be removed
from scope.”
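To make the mechanism described above concrete, the sketch below shows one simplified way a tokenization engine could behave; it is an illustration, not SafeNet's implementation. A random token is generated in the same format as the PAN (here preserving the last four digits), the original value is kept only in a vault, and downstream systems see nothing but the token.

```python
# Simplified, hypothetical tokenization sketch -- not SafeNet's implementation.
# The "vault" here is an in-memory dict standing in for a protected token vault.
import secrets

vault: dict[str, str] = {}  # token -> PAN; in practice the PAN would be encrypted at rest


def tokenize(pan: str) -> str:
    """Return a token in the same format as the PAN, preserving the last four digits."""
    random_part = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    token = random_part + pan[-4:]
    vault[token] = pan  # only the vault can map a token back to the real PAN
    return token


def detokenize(token: str) -> str:
    """Only systems with access to the vault can recover the original PAN."""
    return vault[token]


pan = "4111111111111111"
token = tokenize(pan)
print(token)              # e.g. "8302957146021111" -- same length, same last four digits
print(detokenize(token))  # downstream applications never need to make this call
```

A production system would additionally guarantee token uniqueness, encrypt the vaulted PAN, and keep tokens clearly distinguishable from real PANs, as the assessor guidance quoted above notes.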

Following are some important considerations and strategies to keep in mind when planning new
tokenization implementations.

Minimize Instances of Sensitive Cardholder Data—Whether through Deletion or Tokenization
Before employing encryption, tokenization, or any other security mechanism, organizations
should start by ensuring cardholder data is only stored and accessible where there’s an absolute
business need to do so. If there isn’t, eliminating the sensitive data completely, and the inherent
exposure, is a critical first step.

It is critical to assess the impact of removing, encrypting, or tokenizing the data that resides
on a given system. Once sensitive data has been discovered, it needs to be analyzed in terms
of its associations and interdependencies with other systems. For example, if a business
process requires access to the sensitive data, will those processes be affected by encrypting
or tokenizing that sensitive data? If not accounted for, the impact of tokenization on those
associated processes may cause significant problems for the business.




Next, security teams need to determine where and how tokenization can be employed. Today,
tokenization is typically employed in one of two ways:

  • Outsourced. Within an e-commerce scenario, a retailer can outsource tokenization
    entirely so they never have the potential to access cardholder data in the clear within their
    systems. For example, after an online transaction is completed, the card information can be
    transparently redirected to the service provider, who then converts the card data into a token
    and returns the token to the retailer. The downside with this approach is that it can be very
    difficult for a retailer to change service providers, given the complexity of migrating tokens
    and payment data. Further, this approach may not be an option for retailers that use multiple
    card processors.

  • In house. Here, the merchant would manage converting card numbers into tokens so
    associated downstream applications would not be able to access cardholder data in the
    clear. While this approach does not reduce the scope of compliance nearly as much as the
    first scenario, the trade-off is that the merchant will have more ongoing flexibility and will
    avoid the potential for being locked into a given service provider.

“One of the biggest areas of value we can provide is in helping reduce audit scope, both by
consolidating systems and processes—and really ensuring that there’s a good business reason
for keeping sensitive payment data accessible to a given system or process,” explained Brian
Serra, PCI Program Manager, CISSP, QSA, and ISO ISMS Lead Auditor, Accuvant. “Practically, for
every system taken out of audit scope, a business generally saves about two hours of auditing
time—plus a great deal of expense in applying and maintaining all the security mechanisms
required by the PCI standard.”

In either case, these approaches can provide substantial benefits. On the other hand, security
teams may not want to use tokenization in cases in which users or applications need capabilities
to access payment data in the clear. If systems or users need to be authorized to use cardholder
data in clear text, encryption may be a better alternative or complement to tokenization.
Particularly in cases in which there is unstructured data, for example, the data in spreadsheets
and Word documents, encryption would be complementary to tokenization employed with
structured data.

Leverage Proven Third-Party Solutions
When PCI auditors are verifying the compliance of encryption and tokenization, a critical first
question concerns the types of technologies used. If a merchant or financial institution
has employed an internally developed system for all or part of these areas, the scope of an audit
will inherently grow—each facet of the implementation, everything from access controls to key
rotation, will need to be inspected and verified. Consequently, internally developed systems can
significantly increase audit costs, not to mention increased upfront investments and ongoing
development effort.

On the other hand, if organizations employ compliant commercial solutions that are already
vetted by PCI auditors, they simplify the audit process, enabling auditors to focus on the manner
in which the security systems are implemented, rather than the mechanisms themselves.

Further, it is important to view the tokenization infrastructure in a cohesive fashion and ensure
all aspects are secured.

“One of the areas that is often a focus for our auditing efforts is the security of the lookup table,
which relates the token to the original PAN,” said Benj Hosack, Director, Foregenix. “This is
fundamental to the solution and needs to be protected accordingly. That’s why working with
reputable suppliers with experience and expertise in this area is recommended.”

Centrally Manage Encryption and Tokenization
Whenever possible, organizations should leverage systems that offer integrated capabilities for
both encryption and tokenization on one platform. These solutions offer a range of benefits:

  • Cost savings. If tokenization solutions operate independently of encryption, the cost of
    upfront purchase, initial integration, and ongoing maintenance will typically be much higher.

  • Simplified auditing and remediation. When logs and policies are gathered and tracked
    across various point solutions, demonstrating and maintaining compliance grows more
    complex.

  • Centralized key management. By leveraging key management from a common platform,
    administrators can establish best practices for tokenized data in accordance with PCI
    DSS or VISA, as well as for encrypted data. For instance, they gain the flexibility to use the
    strongest algorithms for the components of the token vault, such as AES-256 for the
    ciphertext of the PAN and SHA-256 for protecting the associated hash or token value (a
    sketch of such a vault record follows this list).

  • Consistent enforcement of policy. It is also important to centrally enforce protection policies
    to control not only what data is protected in which manner (tokenized or encrypted) and
    where, but to also manage the permissions for privileged users and systems.
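As a hedged illustration of that key management point, the sketch below shows what a single token-vault record might contain when AES-256 protects the PAN and a SHA-256 digest supports lookup. The field names, the use of GCM mode, and the salted digest are assumptions made for illustration; they do not describe SafeNet's actual schema.

```python
# Hypothetical token-vault record -- field names are illustrative, not SafeNet's schema.
# AES-256-GCM protects the PAN; a salted SHA-256 digest supports lookup without exposing it.
import hashlib
import os
from dataclasses import dataclass

from cryptography.hazmat.primitives.ciphers.aead import AESGCM


@dataclass
class VaultRecord:
    token: str             # format-preserving token handed to applications
    pan_ciphertext: bytes  # AES-256-GCM ciphertext of the PAN
    nonce: bytes           # per-record nonce for AES-GCM
    pan_sha256: str        # salted SHA-256 digest used to find existing entries


def make_record(pan: str, token: str, key: bytes, salt: bytes) -> VaultRecord:
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, pan.encode(), None)
    digest = hashlib.sha256(salt + pan.encode()).hexdigest()
    return VaultRecord(token, ciphertext, nonce, digest)


key = AESGCM.generate_key(bit_length=256)   # would come from central key management
record = make_record("4111111111111111", "8302957146021111", key, salt=os.urandom(16))
print(record.token, record.pan_sha256[:16], len(record.pan_ciphertext))
```

Keeping both values under one key management platform is what lets administrators apply a single set of rotation and access policies to the vault.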




To optimize these benefits, organizations should look for solutions that offer the scalability
required to accommodate high transaction volumes. Further, they should employ solutions that
offer the broadest support for industry standards, tokenization and encryption approaches,
and more, to ensure initial investments can be maximized in the long term. This is especially
important knowing compliance is not a static event but an ongoing effort for as long as an
organization has to manage sensitive data.

Optimize Security with Hardware-based Platforms
Whenever possible, organizations should leverage hardware-based platforms, which provide a
vital layer of protection for sensitive business assets. Robust hardware-based encryption and
tokenization platforms feature capabilities like centralized, secure backup and more limited
access points, which can significantly strengthen overall security.

“I’m a big proponent that tokenization, key management, and encryption should be done in
hardware wherever possible,” stated Simon Sharp, Illumis. “No matter how many malware
mechanisms may be employed, ultimately, software may still be vulnerable—a hacker using an
inline key logger may still be able to compromise access controls. Hardware-based solutions
offer an additional layer of security that is critical for these vital systems.”

SafeNet Transparent Tokenization
SafeNet offers the robust, comprehensive, and flexible solutions that enable organizations to
boost security, ensure PCI compliance, and reduce security costs. With SafeNet, security teams
get the capabilities they need to maximize the benefits of tokenization in reducing audit scope
and strengthening security. Through its integrated tokenization and encryption capabilities,
SafeNet gives security teams the flexibility they need to apply tokenization and encryption in
ways that yield the biggest benefit for their business and security objectives.

Benefits
By employing SafeNet tokenization, organizations can enjoy a range of benefits:

  • Ensure PCI compliance and strengthen security. With SafeNet, organizations can address
    PCI rules by securing credit card information with format-preserving tokenization. Further,
    they can optimize the security of sensitive data through the hardened DataSecure appliance,
    which features secure key storage and backup, granular administrative controls, and more.

    Further, SafeNet enables businesses to protect a wide range of data types in addition to
    credit card information, including bank transaction data, personnel records, and more.

  • Reduce audit costs. SafeNet helps security teams save time and money by restricting
    the number of devices that need to be audited. When facing an audit for PCI compliance,
    many organizations must certify regulatory compliance for each server where sensitive
    data resides. Because SafeNet Tokenization replaces sensitive data in databases and
    applications with tokens, there are fewer servers to audit. Reducing the scope of audits
    helps save time and money.

  • Streamline security administration and integration. With SafeNet, organizations can
    leverage a central platform for managing policies, lifecycle key management, maintenance,
    and auditing through a single solution for both tokenization and encryption. Further, they
    can deploy tokenization with full application transparency, which eliminates the need to
    customize applications to accommodate tokenized data.

  • Align with VISA best practices. SafeNet Transparent Tokenization is in alignment
    with the recently published VISA Best Practices for Tokenization version 1.0 in regards to
    token generation, token mapping, use of a data vault as a cardholder data repository using
    encryption, and strong cryptographic key management. (http://usa.visa.com/download/
    merchants/tokenization_best_practices.pdf)

SafeNet Tokenization offers a variety of integration options, providing customers with the
flexibility to choose the right security technique for their environment, while enabling
them to protect more data types—without affecting business logic, database architecture,
storage systems, or other critical enterprise components. SafeNet Tokenization also enables
development teams to move or replicate production data to test environments without having to
de-identify or mask data. With SafeNet Tokenization, organizations can keep data protected with
optimal efficiency and cost-effectiveness.




Features
SafeNet offers a range of critical features:

  • Format-preserving tokenization. Ensure transparent interactions with applications and
    users by defining the format of the unique value or token during assignment. By preserving
    the format of the data in the token values, applications that interact with the data will not
    require customization. SafeNet supports various data formats, including partially masked
    data, such as XXXXX6789.

  • Token variations. Choose from a range of token variations by tokenizing random digits,
    sequential numbers, preserving the first two or six digits, or the first two and the last four
    (illustrated in the sketch after this list).

  • Support for an array of data types. Protect a full array of data, ranging from credit card
    numbers and member IDs to social security numbers and driver’s license numbers.

  • Broad platform support. Enjoy complete deployment flexibility through SafeNet’s support
    for a wide range of applications and Web servers, including Oracle, IBM, BEA, J2EE, Apache,
    Sun ONE, and JBoss. In addition, SafeNet offers data and token storage for Oracle and
    Microsoft SQL Server.
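The sketch below roughly illustrates those format options; the helper functions are hypothetical and are not SafeNet's API. Each one produces a token or masked value in the stated shape from a made-up card number.

```python
# Illustrative token-format variations -- hypothetical helpers, not SafeNet's API.
import secrets


def random_digits(n: int) -> str:
    return "".join(secrets.choice("0123456789") for _ in range(n))


def token_random(pan: str) -> str:
    """Fully random token of the same length."""
    return random_digits(len(pan))


def token_keep_first6(pan: str) -> str:
    """Preserve the first six digits, randomize the rest."""
    return pan[:6] + random_digits(len(pan) - 6)


def token_keep_first2_last4(pan: str) -> str:
    """Preserve the first two and the last four digits."""
    return pan[:2] + random_digits(len(pan) - 6) + pan[-4:]


def masked_display(pan: str) -> str:
    """Partially masked value (e.g. XXXXX6789 for a nine-digit ID)."""
    return "X" * (len(pan) - 4) + pan[-4:]


pan = "4111111111116789"
print(token_random(pan), token_keep_first6(pan), token_keep_first2_last4(pan))
print(masked_display(pan))  # XXXXXXXXXXXX6789
```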

SafeNet Transparent Tokenization Deployment
Following is an overview of how the tokenization process works (a minimal sketch of the flow
appears after the list):

  1. Sensitive data comes in through an e-commerce system.

  2. The sensitive data is passed to the Tokenization Manager.

  3. The Tokenization Manager encrypts the sensitive data, stores it, and returns a token.

  4. Other enterprise systems are passed tokens transparently.

  5. The PCI auditor only needs to inspect the tokenized database or data vault and sample
     any active applications to ensure proper tokenization technique; otherwise, the systems
     can be removed from scope.
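To show how this flow looks from the e-commerce application's side, here is a hedged sketch under stated assumptions: the function names and the in-memory vault are invented stand-ins, not SafeNet's actual interfaces, and the real Tokenization Manager would encrypt the PAN before storing it.

```python
# Hedged sketch of the deployment flow from the application's perspective.
# Function names and the in-memory vault are hypothetical stand-ins, not SafeNet's interfaces.
import secrets

_vault: dict[str, str] = {}  # stand-in for the protected data vault behind the Tokenization Manager


def tokenization_manager_tokenize(pan: str) -> str:
    """Stand-in for steps 2-3: the Tokenization Manager stores the PAN and returns a token."""
    token = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4)) + pan[-4:]
    _vault[token] = pan  # the real service would encrypt the PAN before storing it
    return token


def save_order(order_id: str, card_token: str) -> None:
    print(f"storing order {order_id} with token {card_token}")  # stand-in for a database write


def handle_checkout(pan: str, order_id: str) -> None:
    # Step 1: sensitive data arrives through the e-commerce system.
    token = tokenization_manager_tokenize(pan)   # steps 2-3
    # Step 4: downstream systems (order processing, payment, customer service)
    # receive only the token, which is what keeps them out of audit scope (step 5).
    save_order(order_id, card_token=token)


handle_checkout("4111111111111111", order_id="A-1001")
```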


[Figure: How Tokenization Works. Sensitive data enters through an e-commerce system (1) and is
passed to the Tokenization Manager (2), which encrypts the data via DataSecure, stores it, and
returns a token (3). Other enterprise systems, such as order processing, payment, and customer
service systems, pass tokens to the Tokenization Manager (4), which decrypts and returns
sensitive data only to authorized systems (5). The PCI auditor only needs to inspect the tokenized
database and active applications (6).]
Conclusion
For organizations tasked with ensuring PCI compliance, the battle is never over. In this effort,
tokenization is becoming an increasingly prevalent approach, one that can take PCI data out
of scope, and so both strengthen security and reduce compliance costs. Today, SafeNet offers
leading transparent tokenization solutions that enable organizations to maximize the
benefits of tokenization.

About SafeNet
Founded in 1983, SafeNet is a global leader in information security. SafeNet protects its
customers’ most valuable assets, including identities, transactions, communications, data
and software licensing, throughout the data lifecycle. More than 25,000 customers across
both commercial enterprises and government agencies and in over 100 countries trust their
information security needs to SafeNet.




Contact Us: For all office locations and contact information, please visit www.safenet-inc.com
Follow Us: www.safenet-inc.com/connected
©2011 SafeNet, Inc. All rights reserved. SafeNet and SafeNet logo are registered trademarks of SafeNet.
All other product names are trademarks of their respective owners. WP (EN) A4-3.07.11

Reducing PCI Compliance Costs and Effort with SafeNet Transparent Tokenization White Paper                7

Weitere ähnliche Inhalte

Mehr von SafeNet

Cloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business Model
Cloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business ModelCloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business Model
Cloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business ModelSafeNet
 
SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...
SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...
SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...SafeNet
 
A Single Strong Authentication Platform for Cloud and On-Premise Applications
A Single Strong Authentication Platform for Cloud and On-Premise ApplicationsA Single Strong Authentication Platform for Cloud and On-Premise Applications
A Single Strong Authentication Platform for Cloud and On-Premise ApplicationsSafeNet
 
Securing Digital Identities and Transactions in the Cloud Security Guide
Securing Digital Identities and Transactions in the Cloud Security GuideSecuring Digital Identities and Transactions in the Cloud Security Guide
Securing Digital Identities and Transactions in the Cloud Security GuideSafeNet
 
Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...
Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...
Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...SafeNet
 
Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...
Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...
Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...SafeNet
 
Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...
Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...
Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...SafeNet
 
Hardware Security Modules: Critical to Information Risk Management
Hardware Security Modules: Critical to Information Risk ManagementHardware Security Modules: Critical to Information Risk Management
Hardware Security Modules: Critical to Information Risk ManagementSafeNet
 
Strong Authentication: Securing Identities and Enabling Business
Strong Authentication: Securing Identities and Enabling BusinessStrong Authentication: Securing Identities and Enabling Business
Strong Authentication: Securing Identities and Enabling BusinessSafeNet
 
Building Trust into eInvoicing: Key Requirements and Strategies
Building Trust into eInvoicing: Key Requirements and StrategiesBuilding Trust into eInvoicing: Key Requirements and Strategies
Building Trust into eInvoicing: Key Requirements and StrategiesSafeNet
 
A Question of Trust: How Service Providers Can Attract More Customers by Deli...
A Question of Trust: How Service Providers Can Attract More Customers by Deli...A Question of Trust: How Service Providers Can Attract More Customers by Deli...
A Question of Trust: How Service Providers Can Attract More Customers by Deli...SafeNet
 
Payment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNet
Payment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNetPayment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNet
Payment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNetSafeNet
 
E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...
E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...
E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...SafeNet
 
SafeNet DataSecure vs. Native SQL Server Encryption
SafeNet DataSecure vs. Native SQL Server EncryptionSafeNet DataSecure vs. Native SQL Server Encryption
SafeNet DataSecure vs. Native SQL Server EncryptionSafeNet
 
Building Trust into DNS: Key Strategies
Building Trust into DNS: Key StrategiesBuilding Trust into DNS: Key Strategies
Building Trust into DNS: Key StrategiesSafeNet
 
Charting Your Path to Enterprise Key Management
Charting Your Path to Enterprise Key ManagementCharting Your Path to Enterprise Key Management
Charting Your Path to Enterprise Key ManagementSafeNet
 
Secure PIN Management How to Issue and Change PINs Securely over the Web
Secure PIN Management How to Issue and Change PINs Securely over the WebSecure PIN Management How to Issue and Change PINs Securely over the Web
Secure PIN Management How to Issue and Change PINs Securely over the WebSafeNet
 
An Enterprise Guide to Understanding Key Management
An Enterprise Guide to Understanding Key ManagementAn Enterprise Guide to Understanding Key Management
An Enterprise Guide to Understanding Key ManagementSafeNet
 
4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...
4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...
4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...SafeNet
 
Securing the Smart Grid with SafeNet HSMs
Securing the Smart Grid with SafeNet HSMsSecuring the Smart Grid with SafeNet HSMs
Securing the Smart Grid with SafeNet HSMsSafeNet
 

Mehr von SafeNet (20)

Cloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business Model
Cloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business ModelCloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business Model
Cloud Monetization: A Step-by-Step Guide to Optimizing Your SaaS Business Model
 
SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...
SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...
SafeWord 2008 Migration Bundle Building a Fully Trusted Authentication Enviro...
 
A Single Strong Authentication Platform for Cloud and On-Premise Applications
A Single Strong Authentication Platform for Cloud and On-Premise ApplicationsA Single Strong Authentication Platform for Cloud and On-Premise Applications
A Single Strong Authentication Platform for Cloud and On-Premise Applications
 
Securing Digital Identities and Transactions in the Cloud Security Guide
Securing Digital Identities and Transactions in the Cloud Security GuideSecuring Digital Identities and Transactions in the Cloud Security Guide
Securing Digital Identities and Transactions in the Cloud Security Guide
 
Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...
Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...
Securing Network-Attached HSMs: The SafeNet Luna SA Three-Layer Authenticatio...
 
Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...
Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...
Introduction to PKI & SafeNet Luna Hardware Security Modules with Microsoft W...
 
Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...
Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...
Cloud Computing and the Federal Government: Maximizing Trust Supporting the M...
 
Hardware Security Modules: Critical to Information Risk Management
Hardware Security Modules: Critical to Information Risk ManagementHardware Security Modules: Critical to Information Risk Management
Hardware Security Modules: Critical to Information Risk Management
 
Strong Authentication: Securing Identities and Enabling Business
Strong Authentication: Securing Identities and Enabling BusinessStrong Authentication: Securing Identities and Enabling Business
Strong Authentication: Securing Identities and Enabling Business
 
Building Trust into eInvoicing: Key Requirements and Strategies
Building Trust into eInvoicing: Key Requirements and StrategiesBuilding Trust into eInvoicing: Key Requirements and Strategies
Building Trust into eInvoicing: Key Requirements and Strategies
 
A Question of Trust: How Service Providers Can Attract More Customers by Deli...
A Question of Trust: How Service Providers Can Attract More Customers by Deli...A Question of Trust: How Service Providers Can Attract More Customers by Deli...
A Question of Trust: How Service Providers Can Attract More Customers by Deli...
 
Payment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNet
Payment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNetPayment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNet
Payment Card Security: 12-Steps to Meeting PCI-DSS Compliance with SafeNet
 
E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...
E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...
E-Passport: Deploying Hardware Security Modules to Ensure Data Authenticity a...
 
SafeNet DataSecure vs. Native SQL Server Encryption
SafeNet DataSecure vs. Native SQL Server EncryptionSafeNet DataSecure vs. Native SQL Server Encryption
SafeNet DataSecure vs. Native SQL Server Encryption
 
Building Trust into DNS: Key Strategies
Building Trust into DNS: Key StrategiesBuilding Trust into DNS: Key Strategies
Building Trust into DNS: Key Strategies
 
Charting Your Path to Enterprise Key Management
Charting Your Path to Enterprise Key ManagementCharting Your Path to Enterprise Key Management
Charting Your Path to Enterprise Key Management
 
Secure PIN Management How to Issue and Change PINs Securely over the Web
Secure PIN Management How to Issue and Change PINs Securely over the WebSecure PIN Management How to Issue and Change PINs Securely over the Web
Secure PIN Management How to Issue and Change PINs Securely over the Web
 
An Enterprise Guide to Understanding Key Management
An Enterprise Guide to Understanding Key ManagementAn Enterprise Guide to Understanding Key Management
An Enterprise Guide to Understanding Key Management
 
4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...
4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...
4 Steps to Financial Data Security Compliance Technologies to Help Your Finan...
 
Securing the Smart Grid with SafeNet HSMs
Securing the Smart Grid with SafeNet HSMsSecuring the Smart Grid with SafeNet HSMs
Securing the Smart Grid with SafeNet HSMs
 

Reducing PCI Compliance Costs and Effort with SafeNet Transparent Tokenization

  • 1. Reducing PCI Compliance Costs and Effort with SafeNet Transparent Tokenization WHITE PAPER Tokenization is gaining increased adoption in a range of organizations and industries. By effectively taking PCI data out of scope, tokenization presents a host of benefits, helping organizations both boost security and reduce PCI compliance efforts and costs. This paper offers a detailed look at tokenization and offers practical guidelines for helping organizations successfully employ tokenization so they can maximize the potential benefits. Introduction: Challenges of Compliance How good is good enough? When it comes to security, the question continues to be a vexing one for just about any organization. For companies regulated by the Payment Card Industry Data Security Standard (PCIDSS), the question remains, even after a successfully completed audit. The very next day a new system may be installed, a new threat discovered, a new user added, a new patch released. If an audit is passed and a breach occurs, the impact would still potentially be devastating. IT infrastructures, security solutions, threats, regulations, and their interpretation continue to evolve. That’s why, when it comes to security, organizations need to take a defense-in-depth approach, and the work is never done. This holds true for organizations in virtually any industry. A company needs to maintain vigilance in securing the personally identifi able information of employees, whether national IDs, social security numbers, etc. Organizations complying with Sarbanes-Oxley, the Health Insurance Portability and Accountability Act (HIPAA), HITECH, the EU Data Privacy Directive, or any other regulation have a fundamental requirement to secure sensitive data. Within this context, business and security leaders must constantly strive to fi nd a balance, weighing budget allocations, staffi ng, new investments, and ongoing costs vs. security objectives. Given that, it is incumbent upon security teams to refi ne their approaches in order to maximize effi ciency while they maximize security. That’s why many organizations have looked to tokenization. This paper offers a detailed look at tokenization and how it can support organizations’ PCI compliance efforts. The paper compares tokenization to encryption and other approaches, including some of the factors to consider in choosing which approach is best for a given deployment scenario. In addition, the paper describes an approach from SafeNet, transparent tokenization, and it reveals some of the specifi c advantages and benefi ts this solution offers to organizations looking to safeguard sensitive data in the most effective and effi cient manner possible. Reducing PCI Compliance Costs and Effort with SafeNet Transparent Tokenization White Paper 1
  • 2. Weighing Tokenization Alternatives: Encryption, Data Masking, and Other Approaches In today’s security landscape, there are many alternatives organizations can choose from as they set out to ensure optimal security and sustain compliance. Following is an overview of several approaches that represent an alternative or a complement to tokenization. “Encrypted data may be deemed out of scope if, and only if, it Encryption In most PCI-regulated organizations, cardholder data will need to be retrieved in the clear at has been validated that the some point. Given that, encryption will be a fundamental requirement, a way to ensure sensitive entity that possesses encrypted payment information is only accessible by authorized users for authorized purposes. When cardholder data does not have plotting security strategies, however, it is important to factor in the degree to which encryption the means to decrypt it.” affects the scope of an organization’s PCI compliance efforts. --PCI SSC Issues Statement on As the PCI Security Standards Council makes clear, encrypted data is still in scope in any Scope of Encrypted Data via FAQ organization that has mechanisms in place for decrypting that data. In other words, if a merchant 10359, Issued 11/10/2009 uses an off-site storage facility, and encrypts payment data before it is transported off site, that facility’s operations would not be in scope—as long as there were no capabilities within the facility to decrypt that data. In this way, encryption can help reduce the scope of compliance. However, within an organization that is employing encryption mechanisms, and so has the ability to decrypt data, care should be taken to minimize the occurrence of systems that store or access encrypted data. This is true for several reasons: • Scope of compliance and costs. It is important to bear in mind that the systems managing encryption, and the housing and transmission of encrypted data, are very much in scope of PCI, and so must adhere to the spectrum of PCI regulations, including malware protection, multi-factor authentication, and, perhaps most importantly, rigorous key protection mechanisms. Further, each of these systems will be under the purview of a PCI audit, and the more such systems audited, the higher the overall audit expense will be. • Application integration. All the applications that need to access encrypted data will typically need to be modifi ed to accommodate the changes in data type and fi eld size that accompany the move from clear text and binary data to accommodate the lengthier fi eld sizes of cipher text. Depending on the number and type of applications involved, these changes can represent a signifi cant investment in time and money. Format Preserving Encryption Format preserving encryption has been introduced by several vendors in recent years in order to minimize the implications of encryption on associated applications. However, at the time of the publication of this paper, the PCI Security Standards Council has not issued a formal policy around format preserving encryption, leaving open whether, and which of, these techniques are acceptable to meet compliance mandates. Further, many algorithms and modes may not have been approved by standards bodies, such as the National Institute of Standards and Technology (NIST). 
Because format preserving encryption must return a shorter value than strong encryption algorithms would normally create, the strength of the ciphertext is reduced in comparison to transparent tokenization which is based on proven algorithms. Additionally, if a malicious attack results in the capture of the key used for the format preserving encryption and its associated algorithm, then the clear text could be derived whereas, a token cannot be derived by the systems interacting with the tokenized data which is why those systems remain out of audit scope. Comparison Transparent Tokenization Format Preserving Encryption Reduce Audit Scope   Not vulnerable to decryption   Higher security strength   Proven algorithms   Reducing PCI Compliance Costs and Effort with SafeNet Transparent Tokenization White Paper 2
  • 3. Data Masking Data masking is another approach to consider when it comes to many enterprise’s security and “Transparent tokenization is a compliance objectives. Data masking is an approach typically used in testing and development very useful technique to remove environments, and is particularly useful when outsourcing application development. Data sensitive data from a database masking is used to ensure that application development environments don’t compromise the system, by replacing it with security of real customer data. With data masking, sensitive data is replaced with realistic, but not real, data. While data masking may be a useful technique, development organizations need similarly formatted data that is to ensure such aspects as referential integrity are addressed, and that the mechanism used to not sensitive in any way,” mask data isn’t susceptible to reverse engineering techniques that could uncover real data. “Although this means that no Given the characteristics and considerations of the alternatives above, tokenization is an changes in the database schema approach that is gaining increased market acceptance. The following section offers a range of have to be made, there are still insights and considerations for employing tokenization most effectively. some changes that may be required on associated systems Keys to Successful Tokenization In recent years, tokenization has increasingly become an integral approach for PCI compliance, in order for them to integrate helping organizations both strengthen the security of payment data while reducing overall properly with the tokenization security and PCI audit costs. solution. In addition, special care has to be taken to make Employed for online credit card transactions or transmission of other sensitive data, tokenization sure that the systems are not works by replacing sensitive data with tokens that retain the characteristics of the original data. With tokenization, security teams can ensure that databases, applications, and users cannot misinterpreting the new token access sensitive data, and only interact with placeholders for that sensitive data. Tokenization data, for example, using it as systems convert the sensitive data to an encrypted token in the same format as the original data, basis for a business intelligence allowing associated applications to continue operating seamlessly. Masking features can also be solution. It is important to maintained if a subset of the data needs to be available for authentication. understand that most operations Effectively implemented tokenization can signifi cantly reduce an organization’s security and would still need to be performed PCI compliance costs. When applications have access to tokenization, but have no means to on the actual data.” reverse tokenization and access cardholder data in the clear, those applications are considered Alexandre Pinto, out of scope. As a result, organizations don’t need to employ the range of PCI-mandated security mechanisms on these systems. Further, these approaches thus reduce the cost of ongoing PCI Senior Technical Manager and audits. According to Simon Sharp, Director, Illumis, “An assessor will also inspect samples of PCI QSA, CIPHER Security all systems to ensure that cardholder data is not present, particularly where personal account numbers (PANs) used to be in order to ensure that tokenization is working. 
Therefore, it is important to make sure there is a distinction between the tokenized values and the PAN so the system can be removed from scope."

Following are some important considerations and strategies to keep in mind when planning new tokenization implementations.

Minimize Instances of Sensitive Cardholder Data—Whether through Deletion or Tokenization
Before employing encryption, tokenization, or any other security mechanism, organizations should start by ensuring cardholder data is only stored and accessible where there is an absolute business need to do so. If there isn't, eliminating the sensitive data completely, along with its inherent exposure, is a critical first step.

It is also critical to assess the impact of removing, encrypting, or tokenizing the data that resides on a given system. Once sensitive data has been discovered, it needs to be analyzed in terms of its associations and interdependencies with other systems. For example, if a business process requires access to the sensitive data, will that process be affected by encrypting or tokenizing the data? If not accounted for, the impact of tokenization on those associated processes may cause significant problems for the business.
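As a starting point for that discovery work, a common first pass is to scan database fields, exports, and logs for values that look like card numbers. The following minimal sketch is illustrative only, not a SafeNet tool; it combines a digit-run pattern with the Luhn checksum that payment card numbers must satisfy, and commercial discovery tools go considerably further.

```python
import re

PAN_PATTERN = re.compile(r"\b\d{13,19}\b")  # candidate card-number lengths

def luhn_valid(number: str) -> bool:
    """Luhn checksum used by payment card numbers."""
    checksum = 0
    for i, digit in enumerate(int(d) for d in reversed(number)):
        if i % 2 == 1:          # double every second digit from the right
            digit *= 2
            if digit > 9:
                digit -= 9
        checksum += digit
    return checksum % 10 == 0

def find_candidate_pans(text: str) -> list[str]:
    """Return digit runs that look like PANs and pass the Luhn check."""
    return [m for m in PAN_PATTERN.findall(text) if luhn_valid(m)]

# Example: scanning an exported field or log line for cardholder data
print(find_candidate_pans("order=98765 card=4111111111111111 ref=1234567890123"))
# -> ['4111111111111111']
```

Any hits then feed the dependency analysis described above: which processes read the field, and what breaks if it is deleted, encrypted, or tokenized.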
Next, security teams need to determine where and how tokenization can be employed. Today, tokenization is typically employed in one of two ways:

• Outsourced. Within an e-commerce scenario, a retailer can outsource tokenization entirely so they never have the potential to access cardholder data in the clear within their systems. For example, after an online transaction is completed, the card information can be transparently redirected to the service provider, who then converts the card data into a token and returns the token to the retailer. The downside of this approach is that it can be very difficult for a retailer to change service providers, given the complexity of migrating tokens and payment data. Further, this approach may not be an option for retailers that use multiple card processors.

• In house. Here, the merchant manages converting card numbers into tokens so associated downstream applications are not able to access cardholder data in the clear (a minimal sketch of this flow appears later in this section). While this approach does not reduce the scope of compliance nearly as much as the first scenario, the trade-off is that the merchant will have more ongoing flexibility and will avoid the potential of being locked into a given service provider.

In either case, these approaches can provide substantial benefits. On the other hand, security teams may not want to use tokenization in cases in which users or applications need to access payment data in the clear. If systems or users need to be authorized to use cardholder data in clear text, encryption may be a better alternative or complement to tokenization. Particularly where there is unstructured data, for example in spreadsheets and Word documents, encryption would complement tokenization employed with structured data.

"One of the biggest areas of value we can provide is in helping reduce audit scope, both by consolidating systems and processes—and really ensuring that there's a good business reason for keeping sensitive payment data accessible to a given system or process," explained Brian Serra, PCI Program Manager, CISSP, QSA, and ISO ISMS Lead Auditor, Accuvant. "Practically, for every system taken out of audit scope, a business generally saves about two hours of auditing time—plus a great deal of expense in applying and maintaining all the security mechanisms required by the PCI standard."

Leverage Proven Third-Party Solutions
When PCI auditors are verifying the compliance of encryption and tokenization, a critical first question concerns the types of technologies used. If a merchant or financial institution has employed an internally developed system for all or part of these areas, the scope of an audit will inherently grow—each facet of the implementation, everything from access controls to key rotation, will need to be inspected and verified. Consequently, internally developed systems can significantly increase audit costs, not to mention increased upfront investments and ongoing development. On the other hand, if organizations employ compliant commercial solutions that are already vetted by PCI auditors, they simplify the audit process, enabling auditors to focus on the manner in which the security systems are implemented, rather than the mechanisms themselves. Further, it is important to view the tokenization infrastructure in a cohesive fashion and ensure all aspects are secured.
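To make the in-house model described above concrete, the sketch below shows a point-of-capture application exchanging a PAN for a token through an internal tokenization service and persisting only the token. The service URL, endpoint, and JSON fields are hypothetical illustrations, not an actual SafeNet API.

```python
import requests  # hypothetical internal service; URL, endpoint, and fields are illustrative

TOKENIZATION_SERVICE = "https://tokenization.internal.example.com"

def save_order(order_id: str, card_token: str) -> None:
    # Placeholder persistence: downstream order, payment, and customer service
    # systems only ever receive the token, which keeps them out of audit scope.
    print(f"order {order_id} stored with token {card_token}")

def capture_payment(order_id: str, pan: str) -> None:
    # The point-of-capture system is the only in-house component that handles the PAN.
    response = requests.post(f"{TOKENIZATION_SERVICE}/tokenize",
                             json={"value": pan}, timeout=5)
    response.raise_for_status()
    token = response.json()["token"]
    save_order(order_id, card_token=token)
```

In the outsourced model, the same exchange happens at the service provider, so even the capture flow within the retailer's systems never holds the PAN.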
Centrally Manage Encryption and Tokenization
Whenever possible, organizations should leverage systems that offer integrated capabilities for both encryption and tokenization on one platform. These solutions offer a range of benefits:

• Cost savings. If tokenization solutions operate independently of encryption, the cost of upfront purchase, initial integration, and ongoing maintenance will typically be much higher.

• Simplified auditing and remediation. When logs and policies are gathered and tracked across various point solutions, demonstrating and maintaining compliance grows more complex.

• Centralized key management. By leveraging key management from a common platform, administrators can establish best practices for tokenized data in accordance with PCI DSS or VISA, as well as for encrypted data. For instance, they gain the flexibility to use the strongest encryption keys for the components of the token vault, such as AES-256 for the ciphertext of the PAN and SHA-256 for protecting the associated hash or token value (see the sketch below).

• Consistent enforcement of policy. It is also important to centrally enforce protection policies to control not only what data is protected, in which manner (tokenized or encrypted), and where, but also to manage the permissions for privileged users and systems.

"One of the areas that is often a focus for our auditing efforts is the security of the lookup table, which relates the token to the original PAN," said Benj Hosack, Director, Foregenix. "This is fundamental to the solution and needs to be protected accordingly. That's why working with reputable suppliers with experience and expertise in this area is recommended."
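As an illustration of those key management choices, and of the lookup table protection Hosack highlights, the sketch below builds a vault record that stores the PAN only as AES-256-GCM ciphertext and uses a SHA-256 digest purely as the lookup value. This is a simplified, assumption-laden example rather than the DataSecure design; in practice the key would be generated and held in the hardware appliance, and a keyed HMAC would typically be preferred over a bare hash.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Illustrative only: the key below would normally live in an appliance or HSM,
# never alongside the vault data.
vault_key = AESGCM.generate_key(bit_length=256)   # AES-256 for the PAN ciphertext
aesgcm = AESGCM(vault_key)

def vault_record(pan: str) -> dict:
    nonce = os.urandom(12)
    ciphertext = aesgcm.encrypt(nonce, pan.encode(), None)
    # The SHA-256 digest gives a fixed-length lookup value that does not expose
    # the PAN on its own.
    lookup = hashlib.sha256(pan.encode()).hexdigest()
    return {"lookup": lookup, "nonce": nonce, "ciphertext": ciphertext}

record = vault_record("4111111111111111")
print(record["lookup"][:16], len(record["ciphertext"]))
```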
To optimize these benefits, organizations should look for solutions that offer the scalability required to accommodate high transaction volumes. Further, they should employ solutions that offer the broadest support for industry standards, tokenization and encryption approaches, and more, to ensure initial investments can be maximized in the long term. This is especially important given that compliance is not a static event but an ongoing effort for as long as an organization has to manage sensitive data.

Optimize Security with Hardware-based Platforms
Whenever possible, organizations should leverage hardware-based platforms, which provide a vital layer of protection for sensitive business assets. Robust hardware-based encryption and tokenization platforms feature capabilities like centralized, secure backup and more limited access points, which can significantly strengthen overall security.

"I'm a big proponent that tokenization, key management, and encryption should be done in hardware wherever possible," stated Simon Sharp, Illumis. "No matter how many anti-malware mechanisms may be employed, ultimately, software may still be vulnerable—a hacker using an inline key logger may still be able to compromise access controls. Hardware-based solutions offer an additional layer of security that is critical for these vital systems."

SafeNet Transparent Tokenization
SafeNet offers robust, comprehensive, and flexible solutions that enable organizations to boost security, ensure PCI compliance, and reduce security costs. With SafeNet, security teams get the capabilities they need to maximize the benefits of tokenization in reducing audit scope and strengthening security. Through its integrated tokenization and encryption capabilities, SafeNet gives security teams the flexibility they need to apply tokenization and encryption in ways that yield the biggest benefit for their business and security objectives.

Benefits
By employing SafeNet tokenization, organizations can enjoy a range of benefits:

• Ensure PCI compliance and strengthen security. With SafeNet, organizations can address PCI rules by securing credit card information with format-preserving tokenization. Further, they can optimize the security of sensitive data through the hardened DataSecure appliance, which features secure key storage and backup, granular administrative controls, and more. SafeNet also enables businesses to protect a wide range of data types in addition to credit card information, including bank transaction data, personnel records, and more.

• Reduce audit costs. SafeNet helps security teams save time and money by restricting the number of devices that need to be audited. When facing an audit for PCI compliance, many organizations must certify regulatory compliance for each server where sensitive data resides. Because SafeNet Tokenization replaces sensitive data in databases and applications with tokens, there are fewer servers to audit. Reducing the scope of audits helps save time and money.

• Streamline security administration and integration. With SafeNet, organizations can leverage a central platform for managing policies, lifecycle key management, maintenance, and auditing through a single solution for both tokenization and encryption. Further, they can deploy tokenization with full application transparency, which eliminates the need to customize applications to accommodate tokenized data.

• Alignment with VISA best practices.
SafeNet Transparent Tokenization is in alignment with the recently published VISA Best Practices for Tokenization version 1.0 with regard to token generation, token mapping, use of a data vault as a cardholder data repository using encryption, and strong cryptographic key management (http://usa.visa.com/download/merchants/tokenization_best_practices.pdf).

SafeNet Tokenization offers a variety of integration options, providing customers with the flexibility to choose the right security technique for their environment, while enabling them to protect more data types—without affecting business logic, database architecture, storage systems, or other critical enterprise components. SafeNet Tokenization also enables development teams to move or replicate production data to test environments without having to de-identify or mask data. With SafeNet Tokenization, organizations can keep data protected with optimal efficiency and cost-effectiveness.
Features
SafeNet offers a range of critical features:

• Format-preserving tokenization. Ensure transparent interactions with applications and users by defining the format of the unique value or token during assignment. By preserving the format of the data in the token values, applications that interact with the data will not require customization. SafeNet supports various data formats, including partially masked data, such as XXXXX6789.

• Token variations. Choose from a range of token variations by tokenizing to random digits or sequential numbers, preserving the first two or six digits, or preserving the first two and the last four (a minimal sketch of these variations follows at the end of this section).

• Support for an array of data types. Protect a full array of data, ranging from credit card numbers and member IDs to social security numbers and driver's license numbers.

• Broad platform support. Enjoy complete deployment flexibility through SafeNet's support for a wide range of applications and Web servers, including Oracle, IBM, BEA, J2EE, Apache, Sun ONE, and JBoss. In addition, SafeNet offers data and token storage for Oracle and Microsoft SQL Server.

SafeNet Transparent Tokenization Deployment
Following is an overview of how the tokenization process works:
1. Sensitive data comes in through an e-commerce system.
2. The sensitive data is passed to the Tokenization Manager.
3. The Tokenization Manager encrypts the sensitive data, stores it on DataSecure, and returns a token.
4. Other enterprise systems, such as order processing, payment, and customer service systems, handle tokens transparently and pass them to the Tokenization Manager when needed.
5. For authorized requests, the Tokenization Manager decrypts and returns the sensitive data.
6. The PCI auditor only needs to inspect the tokenized database or data vault and sample any active applications to ensure proper tokenization technique; otherwise, those systems can be removed from scope.

[Figure: How Tokenization Works. The flow above involves the enterprise application, Tokenization Manager, DataSecure appliance, the PCI auditor, and order processing, payment, and customer service systems.]
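The token variation and masking options above can be pictured with a short, purely illustrative sketch: one helper keeps selected digits (for example, the first six and the last four) and randomizes the rest, while another produces a partial mask such as XXXXX6789. A production token server would add collision handling, vault storage, and access control; the function names here are illustrative, not SafeNet APIs.

```python
import secrets

def format_preserving_token(pan: str, keep_first: int = 6, keep_last: int = 4) -> str:
    """Illustrative token that keeps selected digits (e.g. BIN and last four)
    and randomizes the rest, so existing formats and field lengths still work."""
    middle_len = len(pan) - keep_first - keep_last
    middle = "".join(secrets.choice("0123456789") for _ in range(middle_len))
    return pan[:keep_first] + middle + pan[-keep_last:]

def masked_display(value: str, visible: int = 4) -> str:
    """Partial mask of the kind noted in the feature list (e.g. XXXXX6789)."""
    return "X" * (len(value) - visible) + value[-visible:]

pan = "4111111111111111"
print(format_preserving_token(pan))   # e.g. 4111118325671111 -- random middle digits
print(masked_display("123456789"))    # XXXXX6789
```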
Conclusion
For organizations tasked with ensuring PCI compliance, the battle is never over. In this effort, tokenization is becoming an increasingly prevalent approach, one that can take PCI data out of scope and so both strengthen security and reduce compliance costs. Today, SafeNet offers leading transparent tokenization solutions that enable organizations to fully maximize the benefits of tokenization.

About SafeNet
Founded in 1983, SafeNet is a global leader in information security. SafeNet protects its customers' most valuable assets, including identities, transactions, communications, data, and software licensing, throughout the data lifecycle. More than 25,000 customers across both commercial enterprises and government agencies and in over 100 countries trust their information security needs to SafeNet.

Contact Us: For all office locations and contact information, please visit www.safenet-inc.com
Follow Us: www.safenet-inc.com/connected

©2011 SafeNet, Inc. All rights reserved. SafeNet and the SafeNet logo are registered trademarks of SafeNet. All other product names are trademarks of their respective owners. WP (EN) A4-3.07.11