
What is tokenization in blockchain?

Tokenization on Blockchain is a steady trend of 2018. It seems that everything is being tokenized on Blockchain, from paintings, diamonds and company stocks to real estate. In each case we take an asset, tokenize it, and create a digital representation of it that lives on Blockchain, where Blockchain guarantees that the ownership information is immutable.
Unfortunately, some problems need to be solved before we can successfully tokenize real-world assets on Blockchain. The main problem stems from the fact that, so far, no country has solid regulation for cryptocurrency. For example, what happens if a company that handles tokenization sells the property? The token holders have no legal rights to the property and are thus not protected by the law. Another problem is that this system reintroduces a degree of centralization, while the whole idea of Blockchain, and especially smart contracts, is to create a trustless environment.

Tokenization is a method that converts a digital value into a digital token; it can also be used to convert rights to an asset into a digital token. The tokenization system can be implemented locally, next to the data being tokenized, or offloaded to the cloud. Tokenization in the cloud can provide a lower total cost of ownership by sharing resources, implementation, and administration. A high level of security can be achieved by separating the tokenization system into a container that can be run on-prem (for larger banks) or isolated in a remote private cloud.
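For readers who want the mechanics, below is a minimal Python sketch of the vault-based variant described above. The class and method names are illustrative assumptions, not any vendor's actual API; a production system would add authentication, auditing and durable storage.

```python
# Minimal sketch of a vault-based tokenization service (illustrative only).
# The vault's token <-> value mapping is the sensitive piece that the text
# above suggests isolating in an on-prem container or remote private cloud.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal inputs map to equal tokens.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)   # random; not derived from the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111 1111 1111 1111")
print(t, "->", vault.detokenize(t))
```

Systems that hold only tokens stay outside the sensitive zone; only the isolated vault can map a token back to the original value.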
Please join my session that will discuss tokenization, blockchain and tokenization in blockchain.


What is tokenization in blockchain?

  1. What I Learned at Gartner Summit 2019 – Ulf Mattsson, www.TokenEx.com. What is tokenization in Blockchain?
  2. Ulf Mattsson • Head of Innovation at TokenEx • Chief Technology Officer at Protegrity • Chief Technology Officer at Atlantic BT Security Solutions • Chief Technology Officer at Compliance Engineering • Developer at IBM Research and Development • Inventor of 70+ issued US patents • Providing products and services for Data Encryption and Tokenization, Data Discovery, Cloud Application Security Broker, Web Application Firewall, Managed Security Services and Security Operation Center
  3. Blockchain
  4. What does Blockchain Offer? (Gartner, 2019)
  5. Blockchain Strengths, Weaknesses, Opportunities and Threats (SWOT), Gartner
  6. Gartner Forecast: Blockchain Business Value, Worldwide
  7. Board-Level Opinions on Blockchain and Digital Currencies, Gartner
  8. The Gartner Top 10 Strategic Technology Trends for 2019: Exploiting AI in the Development Process
  9. Blockchain Security
  10. Blockchain Security – What keeps your transaction data safe? (Source: IBM)
  11. Blockchain is decentralized (Source: IBM)
  12. Blockchain is virtually impossible to hack (Source: IBM)
  13. Blockchains can be private or public (Source: IBM)
  14. Blockchain offers validation, encryption and potentially tokenization (Source: IBM)
  15. Blockchain & Tokenization
  16. The idea behind asset tokenization • It allows converting the rights to assets with economic value into digital tokens. • Such tokens can be stored and managed on a blockchain network. • Tokenization on Blockchain is a steady trend of 2018. • It seems that everything is being tokenized on Blockchain, from paintings, diamonds and company stocks to real estate. • Let us forget about Blockchain and smart contracts for a moment. • Imagine you want to invest in real estate, but your initial investment is modest, say $5,000. • Perhaps you want to start small and increase your investment gradually, for instance by investing a couple thousand every three or four months. • Obviously, this is quite awkward to do on the traditional real estate market. • How are you supposed to buy two or three square meters in an apartment?
  17. The idea behind asset tokenization • Let us reverse the situation. • Imagine that you have some property, say an apartment. • You need cash quickly. • The apartment is valued at $150,000 but you just need $10,000. • Can you get that quickly and without much friction? • To the best of my knowledge, this is next to impossible.
  18. Enter tokenization • Tokenization is a method that converts rights to an asset into a digital token. • Suppose there is a $200,000 apartment. • Tokenization can transform this apartment into 200,000 tokens (the number is totally arbitrary; we could have issued 2 million tokens instead). • Thus, each token represents a 0.0005% share of the underlying asset (see the worked example below). • Finally, we issue the tokens on some platform supporting smart contracts, for example Ethereum, so that they can be freely bought and sold on different exchanges.
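The arithmetic on this slide can be checked directly. The following snippet is a plain Python illustration, not a smart contract; it reproduces the 0.0005% figure and shows what the modest $5,000 investment from slide 16 would buy:

```python
# Worked example: a $200,000 apartment split into 200,000 tokens.
asset_value = 200_000     # USD
total_tokens = 200_000    # arbitrary; 2,000,000 tokens would work the same way

share_per_token = 1 / total_tokens
price_per_token = asset_value / total_tokens
print(f"{share_per_token:.4%} of the asset per token")   # 0.0005%
print(f"${price_per_token:.2f} per token")               # $1.00

tokens_bought = 5_000 / price_per_token
print(f"$5,000 buys {tokens_bought:,.0f} tokens "
      f"= {tokens_bought * share_per_token:.2%} of the apartment")   # 2.50%
```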
  19. New Requirements from Regulations
  20. Pseudonymisation Under the GDPR – What is Personal Data according to GDPR? Within the text of the GDPR, there are multiple references to pseudonymisation as an appropriate mechanism for protecting personal data. Pseudonymisation, replacing identifying or sensitive data with pseudonyms, is synonymous with tokenization, replacing identifying or sensitive data with tokens. Article 4 – Definitions • (1) ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); …such as a name, an identification number, location data, an online identifier… • (5) ‘pseudonymisation’ means the processing of personal data in such a manner that the data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately…
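To connect the definition to practice, here is a small Python sketch of pseudonymisation in the Article 4(5) sense: identifying fields are swapped for tokens, and the re-identification table is the "additional information" that must be kept separately. The field names and token format are assumptions made for illustration.

```python
# Pseudonymisation sketch: tokens replace identifiers; the lookup table that
# reverses them is stored separately from the pseudonymised records.
import secrets

lookup = {}   # the separately-kept "additional information" (GDPR Art. 4(5))

def pseudonymise(record: dict, fields=("name", "email")) -> dict:
    out = dict(record)
    for f in fields:
        token = "tok_" + secrets.token_hex(4)
        lookup[token] = out[f]
        out[f] = token
    return out

record = {"name": "Joe Smith", "email": "joe.smith@surferdude.org", "balance": 120}
print(pseudonymise(record))   # balance stays usable; identity needs the lookup
```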
  21. GDPR
  22. Example of Cross Border Data-centric Security (diagram: data sources feeding a data warehouse in Italy, with complete policy-enforced de-identification of sensitive data across all bank entities) • Protecting Personally Identifiable Information (PII), including names, addresses, phone, email, policy and account numbers • Compliance with EU Cross Border Data Protection Laws • Utilizing data tokenization and centralized policy, key management, auditing, and reporting
  23. Gartner Hype Cycle for Data Security (highlighting Data Classification and Blockchain for Data Security)
  24. GDPR Security Requirements – Encryption and Tokenization (Source: IBM): Discover Data Assets, Security by Design, Encryption and Tokenization
  25. Data Minimization • Increasingly, organizations are adopting data minimization strategies for security and privacy reasons. By deleting or reducing inessential, duplicate or unused data, organizations can minimize potential attack vectors. • Unlike prior discovery tools, BigID can not only quickly report on duplicate data but also provide residency and usage detail, so minimization strategies can be based on secondary factors like jurisdiction and activity history. • BigID is transforming enterprise protection and privacy of personal data. • Organizations are facing record breaches of personal information and proliferating global privacy regulations, with fines reaching 10% of annual revenue. • Today, enterprises lack dedicated, purpose-built technology to help them track and govern their customer data. • By bringing data science to data privacy, BigID aims to give enterprises the software to safeguard and steward the most important asset organizations manage: their customer data. Source: BigID (TokenEx partner)
  26. ML-Driven Data Classification • The definition of sensitive data is no longer readily encapsulated in a regular expression. • Increasingly, companies need to classify data that is sensitive based on its context: its relation to a person, or to a thing like a patent or an account. • This requires a new approach to classification that can identify contextually sensitive data across all modern data stores: unstructured, structured, Big Data, cloud and enterprise applications like SAP. • BigID provides a first-of-its-kind approach that combines Machine Learning and Contextual Intelligence to deliver advanced data classification, categorization, cataloging and correlation for privacy. Source: BigID (TokenEx partner)
  27. ML-Driven Classification • Traditional pattern-matching approaches to discovery and classification still struggle with accurately identifying contextually sensitive data like Personal Information (PI) and disambiguating similar-looking information (see the sketch below). • Moreover, the regular-expression-based classifiers that predominate in data loss prevention, database activity monitoring, and data access governance products tend to operate on a limited number of data sources, like relational databases or on-prem unstructured file shares. • BigID leverages machine learning to classify, categorize and compare data and files across structured, unstructured, semi-structured and Big Data, in the cloud or on-prem. • BigID can resolve similar-looking entities and build association graphs to correlate data back to a specific entity or person, which is essential for meeting emerging privacy use cases like personal data rights. Source: BigID (TokenEx partner)
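The struggle described in the first bullet is easy to demonstrate. In the Python sketch below, a regular-expression SSN classifier produces both a false positive and a false negative on invented sample strings; a context-aware classifier would be expected to catch both by scoring the surrounding words rather than matching shape alone.

```python
# Regex classification finds well-formed values but ignores context entirely.
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

samples = [
    "Applicant SSN: 076-39-2778",         # true positive
    "Part number 123-45-6789 in stock",   # false positive: not an SSN
    "Joe's social is O76 39 2778",        # false negative: OCR noise, no dashes
]
for text in samples:
    print(SSN.findall(text) or "no match", "<-", text)
```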
  28. Correlation plus classification • Even with AI and ML classification approaches like clustering or random forests, classifiers can improve accuracy through smarter matching and comparison analysis, but they lack the context to understand who the data relates to. • This is a common problem for privacy requirements and regulated industries. The capability to build a graph of connected or relevant data can be characterized as a correlation problem. • Correlation helps an organization find sensitive data because of its association with other sensitive data (see the sketch below). • BigID provides a first-of-its-kind model that can not only match similar data within the same class based on ML analysis, but also match connected data of different classes based on relevancy and connectedness. • This correlation-based classification is critical to privacy. Source: BigID (TokenEx partner)
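A minimal sketch of that correlation idea, using invented sample records: values that co-occur in a record are linked into an association graph, so an otherwise anonymous-looking account number inherits sensitivity from the person it connects to.

```python
# Correlation sketch: co-occurring values are linked, and sensitivity
# propagates through the resulting association graph.
from collections import defaultdict

records = [
    {"email": "joe.smith@surferdude.org", "phone": "760-278-3389"},
    {"phone": "760-278-3389", "account": "ACCT-9912"},
]

graph = defaultdict(set)
for rec in records:
    vals = list(rec.values())
    for v in vals:
        graph[v].update(x for x in vals if x != v)

def cluster(start: str) -> set:
    seen, stack = set(), [start]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(graph[v])
    return seen

# The account number reaches the phone, which reaches the email: one person.
print(cluster("ACCT-9912"))
```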
  29. Cataloging plus Classification • BigID's ML-based classifiers use advanced AI techniques to match data within a class and also to correlate data of different classes that have a common sensitivity level owing to a shared association. • But there is a third way sensitivity can be measured. Most data also has certain attributes associated with it, such as date of creation, last modification, and ownership and access details. • Unlike traditional classifiers, BigID can also integrate metadata analysis to provide a richer view of the data and its usage. • This metadata input can be used to better and more automatically catalog data for easier discovery via search, as well as to measure sensitivity risk. • The combination of intelligent classification, correlation and cataloging gives organizations the unique ability to find, inventory and map sensitive data by dimensions beyond just data class or category. • These include finding data by person, residency, application and ownership. Source: BigID (TokenEx partner)
  30. Intelligent labeling and tagging • Enforcement of security protection and privacy compliance requires knowledge of data risk and sensitivity. • BigID helps organizations understand data sensitivity through advanced ML-based classification, correlation and cataloging to provide a complete view of data. • To simplify enforcement on classified data, BigID enables customers to automatically assign data tags to files and objects. • These classification tags can be consumed through Microsoft's Azure Information Protection framework as policy labels, through BigID's labeling APIs, or through additional frameworks like Box. • Using these labels, organizations can classify or categorize data, such as Highly Sensitive, as well as Personal Data based on privacy, health or financial services compliance mandates. • These tags can then be used for more granular policy enforcement actions by DLP, information rights management, database activity monitoring or other enforcement products. Source: BigID (TokenEx partner)
  31. Tokens in Digital Business Ecosystems
  32. Main Purpose of Tokens in Digital Business Ecosystems (Value Proposition). While a large proportion of new token use cases focuses on monetary value representation enabled by blockchain technology, tokenization will achieve its real potential with value creation. An example of such value creation is enabling the design of new markets for data assets, autonomous organizations and labor.
  33. Encryption & Tokenization
  35. Encryption vs Tokenization. What is the difference? • Encryption: a data security measure using mathematical algorithms to generate rule-based values in place of original data. • Tokenization: a data security measure using mathematical algorithms to generate randomized values in place of original data. • Encryption alone is not a full solution: with encryption, sensitive data remains in business systems; with tokenization, sensitive data is removed completely from business systems and securely vaulted. • Tokens are versatile: format-preserving tokens can be utilized where masked information is required.
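The difference is easiest to see side by side. The sketch below contrasts the two; the encryption half assumes the third-party cryptography package, and the sample card number is taken from the table on the next slide.

```python
# Encryption vs tokenization in miniature.
from cryptography.fernet import Fernet   # third-party: pip install cryptography
import secrets

pan = b"3678 2289 3907 3378"

# Encryption: ciphertext is mathematically derived from the data, and anyone
# holding the key can reverse it, wherever the ciphertext travels.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(pan)
assert Fernet(key).decrypt(ciphertext) == pan

# Tokenization: the token is random, with no mathematical relationship to the
# data; only the vault (kept out of business systems) can map it back.
vault = {}
token = secrets.token_urlsafe(12)
vault[token] = pan
print("a stolen token is worthless without the vault:", token)
```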
  36. Examples of Protected Data

Field | Real Data | Tokenized / Pseudonymized
Name | Joe Smith | csu wusoj
Address | 100 Main Street, Pleasantville, CA | 476 srta coetse, cysieondusbak, CA
Date of Birth | 12/25/1966 | 01/02/1966
Telephone | 760-278-3389 | 760-389-2289
E-Mail Address | joe.smith@surferdude.org | eoe.nwuer@beusorpdqo.org
SSN | 076-39-2778 | 076-28-3390
CC Number | 3678 2289 3907 3378 | 3846 2290 3371 3378
Business URL | www.surferdude.com | www.sheyinctao.com
Fingerprint | (image) | Encrypted
Photo | (image) | Encrypted
X-Ray | (image) | Encrypted

Use cases span healthcare and financial services: doctor visits, prescriptions, hospital stays and discharges, clinical and billing data, consumer products and activities. Protection methods could equally be applied to the actual data, but this is not needed with de-identification.
  37. How Should I Secure Different Types of Data? (diagram: type of data vs. use case; structured data such as card holder data (PCI), protected health information (PHI) and personally identifiable information (PII), ranging from simple to complex, calls for tokenization of fields, while unstructured data calls for encryption of files)
  38. Balance Risk
  39. User Productivity, Creativity and Data (chart: access to data plotted against user productivity, each ranging from low to high)
  40. Risk Adjusted Data Security – Tokenized Data (chart: access to tokenized data plotted against risk exposure and user productivity and creativity)
  41. Data Security Approaches (chart, source: TokenEx; approaches including minimization, devaluation/pseudonymisation, data hashing/masking and encryption plotted on a spectrum from minimum utility with maximum protection to maximum utility with minimum protection)
  42. Reduction of Pain with Different Protection Techniques (timeline from 1970 to 2010, pain and TCO falling from high to low): strong encryption (AES, 3DES) turns an input value such as 3872 3789 1620 3675 into output like !@#$%a^.,mhu7///&*B()_+!@; format-preserving encryption (DTP, FPE) yields 8278 2789 2990 2789; vault-based tokenization yields the same format-preserving result with greatly reduced key management; vaultless tokenization yields it with no vault at all (see the sketch below).
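To illustrate the format-preserving behaviour in the newer techniques, here is a toy Python sketch that keeps the input's 16-digit shape, spacing and last four digits. It is deliberately one-way and uses a dummy key; real vaultless tokenization uses a reversible keyed scheme (for example NIST FF1 format-preserving encryption) with managed key material.

```python
# Toy format-preserving transform: same shape out as in, last four kept.
import hmac, hashlib

KEY = b"demo-key"   # dummy key for illustration only

def format_preserving_token(pan: str, keep_last: int = 4) -> str:
    digits = [c for c in pan if c.isdigit()]
    head, tail = digits[:-keep_last], digits[-keep_last:]
    mac = hmac.new(KEY, "".join(head).encode(), hashlib.sha256).digest()
    new_digits = iter([str(b % 10) for b in mac[:len(head)]] + tail)
    # Re-insert spacing so the output keeps the original format.
    return "".join(next(new_digits) if c.isdigit() else c for c in pan)

print(format_preserving_token("3872 3789 1620 3675"))  # e.g. xxxx xxxx xxxx 3675
```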
  43. Different Tokenization Approaches (table comparing properties of dynamic versus pre-generated tokens and of vault-based versus vaultless systems)
  44. Local Speed of Fine-Grained Protection Algorithms (chart: transactions per second, on a scale from 100 to 10,000,000, compared across format-preserving encryption, vaultless data tokenization, AES CBC standard encryption and vault-based data tokenization; speed will depend on the configuration)
  45. Descoping an E-commerce Solution: Minimize Cost of PCI Tokenization. A PCI SAQ A contains 22 controls, compared to more than 300 for the full PCI DSS. • Use a hosted iFrame or payments page provided by a validated service provider to capture and tokenize CHD. • Do not transmit, process or store CHD via any other acceptance channel, and utilize the payment services of the tokenization provider to process transactions (see the sketch below).
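A sketch of what the descoped merchant backend might look like: the provider's hosted iFrame posts the card data directly from the browser and hands back a token, so the merchant's server only ever forwards that token. The endpoint URL, field names and token format below are hypothetical, not TokenEx's actual API.

```python
# Hypothetical SAQ-A-style charge call: the merchant server handles a token,
# never raw card holder data (CHD).
import json, urllib.request

def charge_with_token(token: str, amount_cents: int) -> dict:
    req = urllib.request.Request(
        "https://api.example-tokenizer.com/v1/charge",   # hypothetical URL
        data=json.dumps({"token": token, "amount": amount_cents}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:            # live network call
        return json.load(resp)

# The token arrives from the hosted payment iFrame via the browser, e.g.:
# charge_with_token("tok_8278278929902789", 1999)
```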
  46. Cybercriminal Sweet Spot: Cloud can Help Mid-size Business (Source: calnet)
  47. Total Cost and Risk of Tokenization. On-premise tokenization: • Limited PCI DSS scope reduction – must still maintain a CDE with PCI data • Higher risk – sensitive data still resident in the environment • Associated personnel and hardware costs. Cloud-based tokenization: • Significant reduction in PCI DSS scope • Reduced risk – sensitive data removed from the environment • Platform-focused security • Lower associated costs – cyber insurance, PCI audit, maintenance
  48. On-premises, public / private clouds
  49. Enterprises Losing Ground Against Cyber-attacks • Verizon Data Breach Investigations Report: enterprises are losing ground in the fight against persistent cyber-attacks. We simply cannot catch the bad guys until it is too late, and this picture is not improving. Verizon's reports concluded that less than 14% of breaches are detected by internal monitoring tools. • JP Morgan Chase data breach: hackers were in the bank's network for months undetected. Network configuration errors are inevitable, even at the largest banks. • Capital One data breach: a hacker gained access to 100 million credit card applications and accounts hosted on Amazon Web Services, the cloud hosting company that Capital One was using.
  50. Cloud and Threat Vector Inheritance
  51. Cloud Data Security (diagram, source Armor.com: security controls layered across the operating system, OS file system, database, application framework, application source code and application data, plus external and internal networks and the application server, with security separation between the public cloud and a secure cloud)
  52. Security Separation in Cloud (diagram, source Armor.com: an internal network with administrator, remote and internal users; public cloud examples where each authorized field is in the clear; and a cloud gateway providing data security, including encryption, tokenization or masking of fields or files, in transit and at rest)
  53. Thank You! Ulf Mattsson, TokenEx – www.TokenEx.com
