ISSA Chicago – Next Generation Tokenization, Ulf Mattsson, April 2011
DESCRIPTION
Data tokenization for PCI DSS, HIPAA HITECH and US state laws
TRANSCRIPT
Next Generation Tokenization for Compliance and Cloud Data
Protection
Ulf Mattsson, CTO, Protegrity
ulf . mattsson AT protegrity . com
Ulf Mattsson
20 years with IBM Development & Global Services
Inventor of 22 patents – Encryption and Tokenization
Co-founder of Protegrity (Data Security)
Research member of the International Federation for Information Processing (IFIP) WG 11.3 Data and Application Security
Member of
• PCI Security Standards Council (PCI SSC)
• American National Standards Institute (ANSI) X9
• Cloud Security Alliance (CSA)
• Information Systems Security Association (ISSA)
• Information Systems Audit and Control Association (ISACA)
ISACA - Articles About Compliance
PCI DSS & Visa USA – 10 Years
Cloud SecurityCloud Security
No Confidence in Cloud Security (Oct 2010)
CSO Magazine Survey: Cloud Security Still a Struggle for Many Companies
A recent article by Bill Brenner, senior editor at CSO Magazine, reveals that companies are still wary of putting critical data in the cloud. Results from the 8th Annual Global Information Security Survey, conducted by CSO along with CIO and PriceWaterhouseCoopers, cite: 62% of companies have little to no confidence in their ability to secure any assets put in the cloud. Also, of the 49% of respondents who have ventured into cloud computing, 39% have major qualms about security.
Source: CSO, October 2010: http://www.csoonline.com/
Risks Associated with Cloud Computing
Uptime/business continuity
Weakening of corporate network security
Threat of data breach or loss
Handing over sensitive data to a third party
Chart: The evolving role of IT managers and CIOs – findings from the 2010 IBM Global IT Risk Study (% of respondents citing each risk: inability to customize applications; financial strength of the cloud computing provider; uptime/business continuity)
Cloud Computing to Fuel Security Market (Oct 2010)
1. "Concerns about cloud security have grown in the past year”
2. "In 2009, the fear was abstract: a general concern as there is with all new technologies when they're introduced ...
3. “Today, however, concerns are both more specific and more weighty”
4. “We see organizations placing a lot more scrutiny on cloud providers as to their controls and security processes; and they are more likely to defer adoption because of security inadequacies than to go ahead despite them."
5. Opportunities in the cloud for vendors are data security, identity and access management, cloud governance, application security, and operational security.
http://www.eweek.com/c/a/Security/Forrester-Cloud-Computing-to-Fuel-Security-Market-170677/
What Amazon AWS’s PCI Compliance Means
1. Just because AWS is certified doesn't mean you are. You still need to deploy a PCI compliant application/service and anything on AWS is still within your assessment scope.
2. The open question? PCI-DSS 2.0 doesn't address multi-tenancy concerns
3. That AWS is certified as a service provider doesn't mean all cloud IaaS providers will be
4. You can store PAN data on S3, but it still needs to be encrypted in accordance with PCI-DSS requirements
5. Amazon doesn't do this for you -- it's something you need to implement yourself, including key management, rotation, logging, etc.
6. If you deploy a server instance in EC2 it still needs to be assessed by your QSA
7. What this certification really does is eliminate any doubts that you are allowed to deploy an in-scope PCI system on AWS
8. This is a big deal, but your organization's assessment scope isn't necessarily reduced
9. It might be when you move to something like a tokenization service, where you reduce your handling of PAN data
Source: securosis.com
Data Breaches
The Changing Threat Landscape (Aug, 2010)
Some issues have stayed constant:
1. Threat landscape continues to gain sophistication
2. Attackers will always be a step ahead of the defenders
Different motivation, methods and tools today:
• We're fighting highly organized, well-funded crime syndicates and nations
Move from detective to preventative controls needed:
• Several layers of security to address more significant areas of risks
Source: http://www.csoonline.com/article/602313/the-changing-threat-landscape?page=2
Six years, 900+ breaches, and over 900 million compromised records
The majority of cases have not yet been disclosed and may never be
Over half of the breaches occurred outside of the U.S.
Data Breaches
Online data is compromised most frequently (chart: % of breaches by data location)
Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS
Compromised records:
1. 90% lost in highly sophisticated attacks
2. Hacking and malware are more dominant
Threat Action Categories
Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS
Patching Software vs. Locking Down Data
Diagram: an attacker's path to data runs through the application, database, OS file system, storage system, and backup. In the studied breaches, not a single intrusion exploited a patchable vulnerability.
Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS
Not Enough to Encrypt the Pipe & Files
Diagram: an attacker on the public network is stopped by SSL-encrypted data in motion (PCI DSS) and by encrypted data at rest (PCI DSS), yet clear text data still flows through the application, database, OS file system, and storage system on the private network.
Protecting the Data Flow - Choose Your Defenses
Data Protection
Data Security Today is a Catch-22
We need to protect both data and the business processes that rely on that data
Enterprises are currently on their own in deciding how to apply emerging technologies for PCI data protection
Data Tokenization - an evolving technology
How to reduce PCI audit scope and exposure to data
Evaluating Options
Different Ways to Render the PAN Unreadable
Two-way cryptography with associated key management processes
One-way cryptographic hash functions
Index tokens and pads
Truncation
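Two of these options – one-way hashing and truncation – can be sketched in a few lines. The following is a minimal Python illustration (the example PAN and salt are hypothetical; a real deployment must manage the salt like a key):

```python
import hashlib
import hmac

def hash_pan(pan: str, salt: bytes) -> str:
    """One-way cryptographic hash of a PAN (HMAC-SHA-256, keyed with a salt)."""
    return hmac.new(salt, pan.encode(), hashlib.sha256).hexdigest()

def truncate_pan(pan: str) -> str:
    """Truncation: keep at most the first six and last four digits."""
    return pan[:6] + "*" * (len(pan) - 10) + pan[-4:]

# Hypothetical example PAN; the salt must be protected like an encryption key.
pan = "4000001234567899"
print(hash_pan(pan, b"per-deployment-secret"))  # 64 hex chars, irreversible
print(truncate_pan(pan))                        # 400000******7899
```

Note that an unsalted hash over the small PAN space is weak; the salt (secret) is what makes the one-way function resist brute force.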
Positioning Different Protection Options
Table comparing strong field encryption, formatted encryption, and distributed tokens (Best – Worst) across:
• Security: high risk data; compliance to PCI, NIST
• Initial cost: transparent to applications; expanded storage size; transparent to database schema; performance impact when loading data
• Operational cost: performance impact when loading data; long life-cycle data; Unix or Windows mixed with “big iron” (EBCDIC); easy re-keying of data in a data flow; disconnected environments; distributed environments
Choose Your Defenses – Different Approaches
Diagram: defenses along the data flow – web application firewall, application monitoring, database activity monitoring, and data loss prevention – with encryption/tokenization protecting database columns, data files, and database log files on the database server.
Choose Your Defenses – Cost Effective PCI DSS
Chart (% of respondents rating each technology cost effective): correlation or event management systems; identity & access management systems; access governance systems; encryption for data in motion; anti-virus & anti-malware solutions; encryption/tokenization for data at rest; firewalls; web application firewalls (WAF); ID & credentialing systems; database scanning and monitoring (DAM); intrusion detection or prevention systems; data loss prevention systems (DLP); endpoint encryption solutions.
Source: 2009 PCI DSS Compliance Survey, Ponemon Institute
Current, Planned Use of Enabling Technologies
Strong interest in database encryption, data masking, tokenization.
Chart (Evaluating / Current Use / Planned Use <12 Months) for: access controls; database activity monitoring; database encryption; backup/archive encryption; data masking; application-level encryption; tokenization.
Risk Adjusted Data Protection
Table comparing data protection methods on performance, storage, security, and transparency (Best – Worst): system without data protection; monitoring + blocking + masking; format controlling encryption; downstream masking; strong encryption; tokenization; hashing.
Protection Method Extensibility: Data Protection Methods must evolve with
changes in the security industry and with compliance requirements. The Protegrity
Data Security Platform can be easily extended to meet security and compliance
requirements.
Hiding Data in Plain Sight – Data Tokenization
Diagram: at data entry, the PAN 400000 123456 7899 is sent to the tokenization server, which returns the data token 400000 222222 7899; applications and databases store only the token, while the tokenization server keeps the protected mapping (e.g., Y&SFD%))S( internally).
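The flow in the diagram can be sketched as a vault-based tokenization service. This is a minimal Python sketch (class and method names are hypothetical, and the in-memory dictionaries stand in for the tokenization server's protected store):

```python
import secrets

class TokenVault:
    """Minimal tokenization-server sketch: random tokens, vault lookup."""

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        if pan in self._pan_to_token:       # multi-use token: same PAN, same token
            return self._pan_to_token[pan]
        while True:
            # keep the first six (BIN) and last four, randomize the middle
            middle = "".join(secrets.choice("0123456789")
                             for _ in range(len(pan) - 10))
            token = pan[:6] + middle + pan[-4:]
            if token not in self._token_to_pan and token != pan:
                break                       # avoid collisions and identity tokens
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_pan[token]
```

Applications downstream of the service only ever see the token; detokenization stays confined to the server, which is what takes those systems out of PCI scope.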
PCI Case Study – Large Chain Store
Diagram: token servers act as an aggregating hub for store authorization traffic across the channel, with integration points into settlement, loss prevention, analysis (EDW), and ERP servers.
Case Study
Large Chain Store Uses Tokenization to Simplify PCI Compliance
By segmenting cardholder data with tokenization, a regional chain of 1,500 local convenience stores is reducing its PCI audit from seven to three months
“We planned on 30 days to tokenize our 30 million card numbers. With Protegrity Tokenization, the whole process took about 90 minutes”
Qualified Security Assessors had no issues with the effective segmentation provided by Tokenization
“With encryption, implementations can spawn dozens of questions”
Case Study
“There were no such challenges with tokenization”
Case Study
Faster PCI audit – half the time
Lower maintenance cost – don't have to apply all 12 requirements of PCI DSS to every system
Better security – able to eliminate several business processes, such as generating daily reports for data requests and access
Strong performance – rapid processing rate for initial tokenization, sub-second transaction SLA
Ramon Krikken:
Ask vendors what their token-generating algorithms are
Be sure to analyze anything other than strong random number generators for security.
What Exactly Makes a “Secure Tokenization” Algorithm?
Best Practices for Tokenization*
Token generation via a one-way irreversible function** – a unique sequence number, or a hash with a secret per merchant – or via a randomly generated value.
*: Published July 14, 2010
**: Multi-use tokens
Best Practices from Visa
Best practices for token generation, by token type (single-use vs. multi-use):
• Algorithm and key: a known strong algorithm; an ANSI- or ISO-approved algorithm; unique per use
• One-way irreversible function: unique sequence number; hashing with a secret per transaction (single-use) or a secret per merchant (multi-use)
• Randomly generated value: acceptable for both single-use and multi-use tokens
Visa's recommendation should have been simply to use a random number
You should not write your own 'home-grown' token servers
Comments on Visa’s Tokenization Best Practices
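The warning above can be made concrete: a token derived from an unsalted hash of the PAN is not a token at all, because the remaining search space is tiny. This hypothetical sketch (the scheme shown is deliberately weak, not any vendor's actual method) shows how an attacker who knows the BIN and last four digits recovers the PAN by enumerating just the middle six digits:

```python
import hashlib

# Hypothetical weak scheme: "token" = unsalted SHA-256 of the PAN.
def weak_token(pan: str) -> str:
    return hashlib.sha256(pan.encode()).hexdigest()

# With the first six (BIN) and last four known, only 10**6 middle
# values remain -- trivial to enumerate on any laptop.
def brute_force(token: str, bin6: str, last4: str) -> str:
    for middle in range(10 ** 6):
        candidate = f"{bin6}{middle:06d}{last4}"
        if weak_token(candidate) == token:
            return candidate
    raise LookupError("not found")

stolen = weak_token("4000001234567899")
print(brute_force(stolen, "400000", "7899"))  # recovers the full PAN
```

A token drawn from a strong random number generator has no such structure to attack, which is exactly why "anything other than strong random number generators" deserves scrutiny.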
Centralized vs. Distributed Tokenization
Large companies may need to utilize tokenization services for locations throughout the world. How do you deliver tokenization to many locations without the impact of latency?
Table comparing database file encryption, database column encryption, centralized tokenization (old), and distributed tokenization (new) on: scalability, availability, latency, CPU consumption.
Evaluating Encryption & Tokenization Approaches
Table comparing encryption and tokenization (Best – Worst) on: security, data flow protection, compliance scoping, key management, randomness, separation of duties.
Evaluating Encryption Granularity & Tokenization
Table comparing database file encryption, database column encryption, dynamic tokenization (old), and static tokenization (new) on: scalability, availability, latency, CPU consumption.
Table comparing encryption and tokenization (Best – Worst) on: security, data flow protection, compliance scoping, key management, randomness, separation of duties.
Evaluating Field Encryption & Tokenization
Diagram: intrusiveness (to applications and databases) vs. data encoding.
• Standard encryption – hashing: !@#$%a^///&*B()..,,,gft_+!@4#$2%p^&* (longer than the original); strong encryption: !@#$%a^.,mhu7/////&*B()_+!@ (longer than the original)
• Tokenizing or formatted encryption – original data length preserved: numeric token 123456 777777 1234; alpha token 123456 aBcdeF 1234; partial token (first six and last four in the clear); clear text data 123456 123456 1234
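The token shapes above differ only in which characters are randomized. A minimal Python sketch of the three format-preserving shapes (function names are hypothetical; a real tokenizer would also vault the mapping):

```python
import secrets
import string

def numeric_token(pan: str) -> str:
    """Same length, all digits -- fits an existing numeric column unchanged."""
    return "".join(secrets.choice(string.digits) for _ in pan)

def alpha_token(pan: str) -> str:
    """Same length, letters in the middle (e.g. 123456 aBcdeF 1234)."""
    mid = "".join(secrets.choice(string.ascii_letters) for _ in pan[6:-4])
    return pan[:6] + mid + pan[-4:]

def partial_token(pan: str) -> str:
    """First six and last four preserved for BIN routing and receipts."""
    mid = "".join(secrets.choice(string.digits) for _ in pan[6:-4])
    return pan[:6] + mid + pan[-4:]

pan = "1234561234561234"
print(numeric_token(pan), alpha_token(pan), partial_token(pan))
```

Because the token matches the original length and character class, applications and database schemas downstream need no changes – the transparency advantage the slide attributes to tokenizing and formatted encryption.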
Visa Best Practices for Tokenization, Version 1 (published July 14, 2010)
Token generation, by token type (single-use vs. multi-use):
• Algorithm and key (reversible): known strong algorithm (NIST approved); unique sequence
• One-way irreversible function: unique sequence number; hash with a secret per transaction or a secret per merchant
• Randomly generated value: acceptable for both token types
Making Data Unreadable – Protection Methods (Pros & Cons)
Evaluating different tokenization implementations by system layer, IO interface granularity, and protection method (AES/CBC, AES/CTR; formatted encryption; data tokenization; hashing; data masking):
• Application: column/field
• Database: record, column, table, table space
• OS file: IO block
• Storage system: IO block
(Scale: Best – Worst)
Evaluating Column Encryption & Tokenization
Table comparing formatted encryption, strong encryption, dynamic tokenization (old), and static tokenization (new) on: scalability, availability, latency, CPU consumption.
Table comparing database column encryption and tokenization (Best – Worst) on: security, data flow protection, compliance scoping, key management, randomness, separation of duties.
Tokenization Server Location
Table evaluating tokenization server locations – mainframe (DB2 Workload Manager or separate address space) vs. remote (in-house or out-sourced) – on: availability, operational latency, performance, security separation, PCI DSS scope. (Best – Worst)
Positioning Different Protection Options
Table comparing strong encryption, formatted encryption, and distributed tokens (Best – Worst) across:
• Security: high risk data; compliance to PCI, NIST
• Initial cost: transparent to applications; expanded storage size; transparent to database schema; performance impact when loading data
• Operational cost: long life-cycle data; Unix or Windows mixed with “big iron” (EBCDIC); easy re-keying of data in a data flow; disconnected environments; distributed environments
Positioning Different Protection Options
Summary table comparing strong encryption, formatted encryption, and tokens (Best – Worst) on: security & compliance, total cost of ownership, use of encoded data.
Strong Column Encryption (AES/CBC …)
Diagram: users access application databases through a crypto service; the protected sensitive information is stored as @#$#@$$///, and only authorized users see the unprotected value 123456 123456 1234.
Formatted Column Encryption (FCE, FPE …)
Diagram: a crypto service encrypts the column while preserving its format; the protected sensitive information is stored as 123456 999999 1234, and only authorized users see the unprotected value 123456 123456 1234.
Data Tokens
Diagram: a tokenization service sits between users and application databases; the protected sensitive information is stored as the data token 123456 999999 1234, and authorized users retrieve the unprotected value 123456 123456 1234 via the tokenization servers.
Issues with Traditional Tokenization Approaches
• Replication of tokenization servers: high cost, size & complexity
• Dynamic token tables: low performance, high latency, token collisions and duplications
• Encryption-based tokens: keys must be managed & the algorithm could be breached
Benefits with a Pre-generated Tokenization Approach
• Static token tables: high performance, low latency
• No token collisions and no duplications
• No replication: low cost, small size & simple configuration
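The idea behind the static-table approach can be sketched briefly: generate a collision-free table of random token parts once, then distribute the identical table to every server, so runtime tokenization is a pure lookup. This is a hypothetical illustration, not Protegrity's actual implementation:

```python
import secrets

def pregenerate_table(size: int, length: int = 6) -> list:
    """Pre-generate a static table of unique random token parts once;
    the same table is then pushed to every tokenization server."""
    parts = set()
    while len(parts) < size:
        parts.add("".join(secrets.choice("0123456789") for _ in range(length)))
    return sorted(parts)

class StaticTokenizer:
    """Lookup-only at runtime: no collisions, no replication traffic."""

    def __init__(self, table):
        self._table = table

    def tokenize(self, pan: str, index: int) -> str:
        # index would come from a per-PAN sequence in a real deployment
        return pan[:6] + self._table[index] + pan[-4:]

table = pregenerate_table(1000)
server_a = StaticTokenizer(table)   # identical static tables at every site
server_b = StaticTokenizer(table)
```

Uniqueness is guaranteed at generation time, which is why static tables avoid the collision and latency problems of dynamic token tables.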
Pre-generated Tokenization Approach
Diagram: a security admin pushes pre-generated static token tables to each tokenization server at installation time, so every location can tokenize locally.
Data Protection Challenges
Actual protection is not the challenge
Management of solutions
• Key management
• Security policy
• Auditing, Monitoring and reporting
Minimizing impact on business operations
• Transparency
• Performance vs. security
Minimizing the cost implications
Maintaining compliance
Implementation Time
Best Practices – Data Security Management
Diagram: an enterprise data security administrator centrally manages policy, audit log, and secure archive for the database protector, file system protector, application protector, and tokenization server (encryption services).
Case Study: Granularity of Reporting and Separation of Duties
• Database-native encryption: a complete log of user access to patient health records (e.g., user x read record a; DBA read record b; user z wrote record c), but the log is open to possible DBA manipulation.
• 3rd-party database encryption: no read log – only writes (user z wrote record c) are captured.
• OS file system encryption: the log shows only database-process IO (e.g., database process 0001 read/wrote health data file PHI002), with no information on the user or record, and possible DBA manipulation.
Choose Your Defenses – Total Cost of Ownership
Chart: cost vs. protection strength. Expected losses from the risk fall as protection moves from weak to strong, while the cost of aversion (protection of data) rises; the total cost curve reaches an optimal risk level between the two extremes.
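The optimum in the chart is just the minimum of the sum of the two curves. A small numeric sketch, with assumed (purely illustrative) cost shapes:

```python
# Hypothetical cost curves: expected loss falls with protection strength x,
# protection cost rises; their sum has an interior minimum.
def expected_loss(x: float) -> float:
    return 100.0 * (1.0 - x) ** 2     # assumed shape, for illustration only

def protection_cost(x: float) -> float:
    return 60.0 * x ** 2              # assumed shape, for illustration only

def total_cost(x: float) -> float:
    return expected_loss(x) + protection_cost(x)

levels = [i / 100 for i in range(101)]     # 0 = weak ... 1 = strong protection
optimal = min(levels, key=total_cost)
print(optimal)  # an interior optimum: neither weakest nor strongest
```

The takeaway matches the slide: the cost-optimal level of protection is rarely at either extreme.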
Developing a Risk-adjusted Data Protection Plan
1. Know Your Data
2. Find Your Data
3. Understand Your Enemy
4. Understand the New Options in Data Protection
5. Deploy Defenses
6. Crunch the Numbers
Matching Data Protection Solutions with Risk Level
Risk level → solution:
• Low risk (1–5): monitor
• At risk (6–15): monitor, mask, access control limits, format controlling encryption
• High risk (16–25): replacement (tokenization), strong encryption
Deploy Defenses – example risk levels by data field:
• Credit card number: 25
• Social security number: 20
• CVV: 20
• Customer name: 12
• Secret formula: 10
• Employee name: 9
• Employee health record: 6
• Zip code: 3
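The mapping in the table above is straightforward to express in code. A minimal Python sketch (the thresholds and field scores are the slide's; the function name is hypothetical):

```python
# Map a risk score (1-25) to the slide's recommended protection solution.
def solution_for(risk: int) -> str:
    if risk <= 5:
        return "monitor"
    if risk <= 15:
        return "monitor, mask, access control limits, format controlling encryption"
    return "replacement (tokenization), strong encryption"

fields = {
    "Credit Card Number": 25,
    "Social Security Number": 20,
    "CVV": 20,
    "Customer Name": 12,
    "Secret Formula": 10,
    "Employee Name": 9,
    "Employee Health Record": 6,
    "Zip Code": 3,
}
for name, risk in fields.items():
    print(f"{name}: {solution_for(risk)}")
```

Such a lookup makes the risk-adjusted plan auditable: every field's protection method traces back to its assessed risk score.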
Please contact me for more information
Ulf Mattsson
Ulf . Mattsson AT protegrity . com