The use of data "tokens" as a substitute for sensitive cardholder data has achieved widespread acceptance among merchants seeking to minimize or eliminate cardholder data in their information processing systems. In response to the many different "tokenization" solutions offered by competing vendors, the PCI Security Standards Council has issued an Information Supplement entitled "PCI DSS Tokenization Guidelines" to provide guidance for payment industry stakeholders when developing, evaluating, or implementing data replacement technologies.
In its most basic form, data replacement technology, or "tokenization," is a process in which the card number, when first received by the merchant, is stored in a secure database environment, and a link, or "token," referencing that entry is generated and used in place of the actual card data for all subsequent processing operations. For example, merchants who keep card numbers "on file" for future transactions have traditionally retained sensitive card data in the same database as other information such as customer names and addresses, an approach that places the entire customer database (and every system that accesses it) in "scope" for PCI compliance. With tokenization, the actual card data is securely stored in a separate database, and only the corresponding "tokens" are stored with the customer data. Because the customer database and related systems no longer contain any sensitive cardholder data, they are removed from PCI "scope."
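The flow described above can be sketched as a minimal token vault. This is an illustrative model only (the class and field names are assumptions, not Monetra's implementation), and a real vault would encrypt its contents:

```python
import secrets

class TokenVault:
    """Illustrative token vault: holds PANs separately and hands out
    opaque tokens for storage in ordinary application databases."""

    def __init__(self):
        self._pan_by_token = {}  # vault storage (encrypted in a real system)
        self._token_by_pan = {}  # ensures one token per PAN (multi-use tokens)

    def tokenize(self, pan: str) -> str:
        # Return the existing token if this PAN has been seen before.
        if pan in self._token_by_pan:
            return self._token_by_pan[pan]
        token = secrets.token_hex(8)  # opaque surrogate value
        self._pan_by_token[token] = pan
        self._token_by_pan[pan] = token
        return token

# The customer database stores only the token alongside customer data:
vault = TokenVault()
token = vault.tokenize("5405222222222226")
customer_record = {"name": "Jane Doe", "card_token": token}  # no PAN here
```

Because the customer record carries only the surrogate value, systems that read it never touch cardholder data.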
The method by which a tokenization solution generates tokens, the manner and location in which the actual card data is stored, and the process by which tokens can be used to retrieve or otherwise reproduce the corresponding card data can all affect the security provided by a particular tokenization solution and the steps required to ensure PCI compliance. This document describes the tokenization features of the Monetra transaction processing software, explains how those features align with the PCI Tokenization Guidelines, and shows how they can be used to reduce the work required to achieve and maintain PCI compliance.
Tokenization features have been part of Monetra since 2002. The Monetra transaction tracking identifier (TTID), for instance, may be used instead of the card number when performing subsequent operations such as tip adjustments, voids, refunds, and reversals, and application programs typically do exactly that. Monetra TTID tokens remain valid until the transaction batch data has been purged, or "secured," which typically occurs after 30-90 days, depending on the merchant's business requirements.
The Monetra Data Security Shield subsystem (Monetra DSS) was added to Monetra in 2006 to provide recurring billing and secure "card-on-file" features. The Monetra DSS generates unique token values that can be stored and used in place of actual card numbers. These tokens remain valid until they are explicitly deleted or the underlying card expires, whichever occurs first; their lifetime may therefore extend to 24 months or more.
The Monetra CardShield® system, introduced in 2010, provides support for "end-to-end encryption" and, in some configurations, produces special short-term tokens, or "tickets," that enable "tokenization" of data from card swipe readers, isolating legacy applications from sensitive cardholder data. CardShield "tickets" generally expire within a few minutes.
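A short-lived "ticket" of this kind can be modeled as a token paired with an expiry timestamp. The sketch below is a generic illustration of that mechanism; the names and the time-to-live value are assumptions, not CardShield internals:

```python
import secrets
import time

class TicketStore:
    """Generic short-lived token ("ticket") store with a fixed time-to-live."""

    def __init__(self, ttl_seconds: float = 300.0):  # "a few minutes" by default
        self.ttl = ttl_seconds
        self._store = {}  # ticket -> (card_data, expiry time)

    def issue(self, card_data: str) -> str:
        """Swap freshly captured card data for an opaque short-term ticket."""
        ticket = secrets.token_hex(8)
        self._store[ticket] = (card_data, time.monotonic() + self.ttl)
        return ticket

    def redeem(self, ticket: str):
        """Return the card data once, or None if the ticket is unknown or expired."""
        entry = self._store.pop(ticket, None)
        if entry is None:
            return None
        card_data, expires_at = entry
        if time.monotonic() > expires_at:
            return None  # ticket expired before it was redeemed
        return card_data

# Demo with a deliberately tiny TTL so expiry is observable:
store = TicketStore(ttl_seconds=0.05)
ticket = store.issue("4111111111111111")
```

A legacy application can pass such a ticket through its existing code paths; by the time an attacker could harvest it, it has expired or been consumed.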
In the outline below we have taken highlights directly from each section of the PCI Tokenization Guidelines and explained how the data replacement features of Monetra compare against that guidance.
Please contact us if you have any questions regarding Monetra features or how to achieve and maintain PCI compliance when using our products.
|The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value.||Monetra has been validated to securely generate tokens via the use of incremental values that conform to documented specifications. Because Monetra tokens are generated incrementally, it is impossible for an attacker to derive the original PAN from a Monetra-generated token.|
|One of the primary goals of a tokenization solution should be to replace sensitive PAN values with non-sensitive token values. For a token to be considered non-sensitive, and thus not require any security or protection, the token must have no value to an attacker.||Key points to consider when evaluating Monetra tokenization. Original PAN: 5405 2222 2222 2226; Monetra token: 1001 0000 0000 0015. The example token uses the default BIN range of 1001, is 16 digits long, and passes a MOD-10 check. This format allows merchants who wish to keep their current in-house data stores to simply replace the sensitive PANs with secure tokens.|
|Tokens can be generally identified as either single-use or multi-use. A single-use token is typically used to represent a specific, single transaction. A multi-use token represents a specific PAN, and may be used to track an individual PAN across multiple transactions.||Tokens generated by the Monetra DSS system are considered multi-use tokens and may be used until either (a) the actual PAN on record expires or (b) the token is removed from the system.|
|Token mapping is the process of assigning a token to the original PAN value. When a PAN is submitted for tokenization, the generated token and the original PAN are typically stored in the card-data vault. Token mapping provides the ability to retrieve either a particular PAN or a particular token, depending on how the solution is implemented and the type of request.||A key point regarding Monetra token mapping: this architecture assures there is no way to programmatically retrieve a PAN from a Monetra token.|
|In a tokenization system, the card data vault (or "data vault") is the central repository for PANs and tokens and is used by the token-mapping process. Wherever PAN data exists, it must be managed and protected in accordance with PCI DSS requirements.||Monetra has been validated to use the same proven, highly secure database for token storage as it does for general transaction data. All sensitive data is stored only in encrypted format using approved, high-security cryptographic ciphers.|
|In a tokenization solution, cryptographic key management applies to keys used for encrypting PAN in the card data vault, as well as any keys used in the generation of the tokens themselves.||Monetra tokens are not cryptographically generated, so no keys are used for token generation. The PAN, however, is stored encrypted within the card data vault, and those cryptographic keys are managed with secure, PA-DSS validated procedures.|
|There are numerous ways to implement a tokenization solution. As a general principle, tokenization and de-tokenization operations should occur only within a clearly defined tokenization system that includes a process for approved applications to submit tokenization and de-tokenization requests.||All tokenization requests within Monetra happen at the transaction level and are governed by the currently validated security infrastructure. Note: Monetra does not provide a programmatic way to de-tokenize stored PANs.|
|Only authenticated users and system components should be allowed access to the tokenization system and tokenization/de-tokenization processes.||All tokenization requests within Monetra happen at the transaction level and are governed by the currently validated security infrastructure including full authentication requirements.|
|The tokenization system should provide comprehensive and robust monitoring.||All tokenization requests within Monetra happen at the transaction level and are governed by the currently validated security infrastructure including robust logging and reporting.|
|The tokenization solution should include a mechanism for distinguishing between tokens and actual PANs.||By design, Monetra allows the BINs used in the token system to be configured by the end user. This makes distinguishing tokens from actual PANs very simple.|
|Because the tokenization system stores, processes and/or transmits cardholder data, it must be installed, configured, and maintained in a PCI DSS compliant manner.||Monetra's tokenization system is built-in and turned on by default. There is nothing extra or special required to use the DSS/token system from within Monetra.|
|Roles and responsibilities for a tokenization solution may be distributed between the various stakeholders, typically the merchant and tokenization service provider (TSP), depending on its particular implementation or deployment model.||All tokenization features happen within the core of Monetra. Like all other features, they are governed by strong role-based access controls and extensive logging and audit tools.|
|The TSP has the overall responsibility for the design of an effective tokenization solution.||The Monetra tokenization features are designed to be simple to use and make it easy for merchants and TSPs to comply with PCI requirements.|
|In general, tokenization can provide a model to centralize cardholder data storage and minimize the number of cardholder data occurrences in an environment.||The Monetra tokenization features enable merchants and gateway providers to centralize cardholder data storage and can greatly reduce or even eliminate cardholder data from the merchant environment.|
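The token format discussed above (a configurable BIN prefix, 16 digits, MOD-10 valid) can be illustrated with a simple counter-based generator. This is a sketch of one plausible scheme consistent with that format, not Monetra's actual algorithm:

```python
def luhn_check_digit(payload: str) -> str:
    """Compute the digit that makes payload + digit pass a MOD-10 (Luhn) check."""
    total = 0
    # Walk right to left; the check digit will occupy position 1 of the final
    # number, so payload digits start at position 2 and every other is doubled.
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:          # positions 2, 4, 6, ... of the final number
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def luhn_valid(number: str) -> bool:
    """True if the full number passes the MOD-10 check."""
    return luhn_check_digit(number[:-1]) == number[-1]

def make_token(counter: int, bin_prefix: str = "1001") -> str:
    """Format: 4-digit BIN + 11-digit incremental counter + Luhn check digit.
    make_token(1) yields "1001000000000015", the same 16-digit,
    MOD-10-valid shape as the sample token above."""
    payload = bin_prefix + str(counter).zfill(11)
    return payload + luhn_check_digit(payload)

def looks_like_token(number: str, bin_prefix: str = "1001") -> bool:
    """Tokens are trivially distinguishable from real PANs by their BIN."""
    return number.startswith(bin_prefix)
```

Because the tokens carry only a BIN and a counter, no amount of analysis of a token reveals anything about the PAN it stands for, and the reserved BIN makes token/PAN discrimination a one-line prefix check.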
When tokens are used to replace PANs in the merchant environment, both the tokens and the systems they reside on must be evaluated to determine whether they require protection and should be in scope for PCI DSS. To be considered out of scope, both the tokens and the systems they reside on must have no value to an attacker attempting to retrieve PAN, and must not be able to influence the security of cardholder data or the CDE in any way. Monetra's proven tokenization features make those objectives easy to achieve, with little or no disruption to existing systems.