How We Did It: The Case of the Credit Card Breach
1. How We Did the Investigations: “The Case of the Credit Card Breach”
5. Taylor & Swift PCI DSS Security Audit Taylor & Swift passed an external PCI DSS audit showing their systems to be compliant with the PCI DSS industry standard for protecting credit card data. Bob Shields gave a copy of the report to both Frazier and Lola to study.
6. Taylor & Swift System Architecture (From PCI Audit) Lola notes that this is a common retail system architecture in which credit card transactions collected at stores or on the web flow through data centers to an EDW and ultimately to backup.
7. T&S’s Front-End (Store and Web, Data Center) Data Flow Processes and System Architecture T&S mini-batch loads POS data to the data centers every hour. Web transactions drop into the data centers immediately. The multiple data centers provide high availability, disaster recovery, and workload balancing.
8. Taylor & Swift Back-Office Data Flows The data centers run inventory and ERP financial applications. Data flows at one-hour intervals into Teradata for marketing and merchandising purposes.
12. Lola Investigated the Front-End Systems Lola worked with the Store and Web IT groups to investigate the front-end systems, which had passed a full PCI audit review with approval. All systems came out clean with no intrusions, and the data is protected from the swipe through the point where the transaction data moves to the data centers.
13. Frazier Investigated the Back-End Systems Frazier worked with the back-end team to investigate how credit cards are handled in back-end processing at the data centers. The back-end systems had also passed a full PCI audit review with approval. They came out clean with no intrusions, and the data is protected inside the back-end systems.
14. Frazier Checked the Role- and Time-Based Security Access Controls That Were Set Up For the Credit Card field, Protegrity’s tool defined two roles, one with High clearance and one with Low, permitting access to the data only for High clearance and only during daytime working hours. Frazier found that the Protegrity policy controls were unchanged from their initial setup: no security hole there.
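To make that control concrete, here is a minimal sketch of a role- and time-based rule expressed as a SQL Server view. The table, view, and role names are invented for illustration, and Protegrity defines such policies in its own tooling rather than as a database view.

```sql
-- Hypothetical illustration only: a role- and time-based access rule
-- expressed as a SQL Server view. Protegrity's real policy definitions
-- live in its own policy tooling, not in a view like this.
CREATE VIEW orders_protected AS
SELECT order_id,
       CASE
         WHEN IS_MEMBER('HighClearance') = 1                   -- role check
              AND DATEPART(HOUR, GETDATE()) BETWEEN 8 AND 17   -- daytime hours only
         THEN credit_card_number                               -- clear value
         ELSE 'XXXX-XXXX-XXXX-' + RIGHT(credit_card_number, 4) -- masked otherwise
       END AS credit_card_number
FROM dbo.orders;
```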
15. Frazier Ran a Protegrity Detailed Report on the Card Number Column – the Decrease Was the Clue! Frazier dug into the Protegrity reports on key data elements, in this case the production Credit Card Number, and found a suspicious dip in the number of daily touches.
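A report like this boils down to counting daily accesses on a protected column. Here is a hypothetical sketch of such a query, assuming an access-log table (column_access_log, one row per touch); Protegrity's actual detailed reports come from its own audit facility.

```sql
-- Hypothetical sketch: daily touch counts on the card number column.
-- The column_access_log table and its schema are assumptions.
SELECT CAST(access_time AS DATE) AS access_day,
       COUNT(*)                  AS daily_touches
FROM column_access_log
WHERE column_name = 'credit_card_number'
GROUP BY CAST(access_time AS DATE)
ORDER BY access_day;  -- a sudden dip in daily_touches is the anomaly to chase
```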
16. Frazier Inspected Credit Card Column Access Frazier drilled down into each of the repositories and found that touches of the Credit Card data in the SQL Servers had dropped to zero.
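The drill-down is the same count broken out per repository. A sketch under the same assumed schema, joined against a repository list so that systems with zero touches still appear in the result:

```sql
-- Hypothetical sketch: touches per repository over the last week.
-- LEFT JOIN from the repository list so a repository with no touches
-- shows a count of 0 instead of vanishing from the result.
SELECT r.repository_name,
       COUNT(l.access_time) AS touches_last_week
FROM repositories r
LEFT JOIN column_access_log l
       ON l.repository_name = r.repository_name
      AND l.column_name  = 'credit_card_number'
      AND l.access_time >= DATEADD(DAY, -7, GETDATE())
GROUP BY r.repository_name
ORDER BY touches_last_week;  -- the SQL Server staging databases show 0
```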
17. Core Problem: IT Swapped Out Protected Operational Data Stores at the Data Center, Forgot to Protect the New Ones Lola and Frazier had a call with the Data Center IT Manager and found out that the staging databases had been changed from SQL Server to another third-party database. No PCI audit was done after the switch, and the new databases had not been protected.
18. Audit Logs in the SQL Servers at the Data Centers Show Suspicious Activity by a DBA Lola went back to the log activity on the unprotected system and found unusual SELECT * queries against the Orders and Customers tables. The queries were executed by a DBA at the Las Vegas Data Center named Joe Nagel.
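On SQL Server, audit records like these can be read back with sys.fn_get_audit_file, SQL Server's built-in audit reader; the file path below and the assumption that statement text was captured are illustrative.

```sql
-- Sketch: scan SQL Server audit files for full-table reads of the
-- Orders and Customers tables. sys.fn_get_audit_file is SQL Server's
-- audit reader; the path and audit configuration here are assumptions.
SELECT event_time,
       server_principal_name,   -- the login that ran the query
       statement
FROM sys.fn_get_audit_file('D:\Audits\*.sqlaudit', DEFAULT, DEFAULT)
WHERE statement LIKE '%SELECT *%'
  AND (statement LIKE '%Orders%' OR statement LIKE '%Customers%')
ORDER BY event_time;
```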
19. Records from the DBA’s Query Matched the List Pulled from the Complaining Customers Frazier ran a query joining the 500 complaint records against the credit card transactions. Every customer who had been breached showed up in the unprotected database. They all matched!
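A minimal sketch of that match, assuming a complaint_customers table holding the 500 complaint records and a card-transactions table in the unprotected staging database (both names are hypothetical):

```sql
-- Hypothetical sketch of Frazier's match: join the 500 complaint records
-- against the card transactions sitting in the unprotected staging database.
SELECT c.customer_id,
       c.card_number,
       t.transaction_id,
       t.transaction_time
FROM complaint_customers c
JOIN staging.dbo.card_transactions t
  ON t.card_number = c.card_number
ORDER BY c.customer_id;
-- Every one of the 500 complaint records finds a match: all of these
-- customers were exposed through the unprotected database.
```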
20. The Culprit: Joe Nagel, DBA Lola worked with the data center IT staff to pull the security tapes and confirm that Joe was working that day … here he is exiting the facility in the early morning hours. Bob then confiscated his laptop and found customer credit card information on it. NABBED!
Editor's Notes
Much of the power in the Protegrity solution comes from the way security policies capture all of the details on data protection within an organization. By bringing all of this information together in one place, the environment becomes quite simple to manage, with all of the transparency that security regulations require. We’ll see how each of these things is handled by DPS.
Key Talking Points: Opportunity to discuss the complexities and importance of protecting data throughout the enterprise, from acquisition to archive or deletion. This slide builds on the previous graphic, which introduced Protegrity’s ‘comprehensive’ solutions, and prepares for the next slide, which discusses the different data protection options provided by Protegrity.
[1] Collection: Begin to paint the ‘real-life’ picture of data being collected from multiple access points (Web, POS, remote locations, applications, etc.). Point out that the data being collected may already have been encrypted using different ‘keys’ from different systems. Explain that Protegrity provides an API to help bring that diverse information together for aggregation.
[2] Aggregation: Continue painting the data flow process by explaining the need to protect and ‘normalize’ the data coming from multiple ‘key zones’ into one protected ‘key zone’ that prepares the data to advance to the necessary operational systems for use within the organization. These operational systems may also have their own separate ‘zone’ requirements. Explain that Protegrity can provide both database and application protection at this stage of the data flow. Key management becomes increasingly important.
[3] Operations: Continue defining the data flow process by highlighting the complexity of managing the data and associated data security keys at the operational level, which may contain many different applications, databases, and technologies that rely on independent data protection. Explain that Protegrity can provide ‘homogeneous’ data protection across those operational environments by leveraging database-, file-, and application-level protection, depending on the customer's requirements and technology landscape.
[4] Analysis: Continue the data flow discussion by introducing the data flowing into the data warehouse. Mention the need to protect sensitive data at the column level, given the volumes of data that might exist within these large data warehouses, and the additional need for high performance and scalability, given the business analysis derived from the DW. Explain that Protegrity provides database protection for some of the largest data warehouses, across most industries. This is a great time to introduce our Teradata relationship and the value proposition Protegrity brings to the TD EDW.
[5] Storage: Complete the ‘from acquisition to archive or deletion’ data flow by highlighting the need to eventually off-load and store the large volumes of historical data outside the data warehouse for efficiency, but in a protected environment. (You might mention some war stories about tapes falling off the back of a truck at this point.) Explain that Protegrity can provide a secure archive environment by protecting data that has been encrypted throughout the organization and storing it in the customer's preferred back-up devices using Protegrity’s database and file protection capabilities. Most importantly, the data in these archived systems can be restored if and when necessary.
Finally, reintroduce the importance of having a centralized and comprehensive key management system to manage the encryption and decryption process at each stage of the enterprise data flow. This is also a great time to re-introduce the concept of Risk-Adjusted Data Security and the need for multiple methods to secure the data at each stage.
It also bridges to the next slide.