DATA LEAKAGE DETECTION

ABSTRACT:


A data distributor has given sensitive data to a set of supposedly trusted agents
(third parties). Some of the data is leaked and found in an unauthorized place
(e.g., on the web or somebody’s laptop). The distributor must assess the
likelihood that the leaked data came from one or more agents, as opposed to
having been independently gathered by other means. We propose data allocation
strategies (across the agents) that improve the probability of identifying leakages.
These methods do not rely on alterations of the released data (e.g., watermarks).
In some cases we can also inject “realistic but fake” data records to further
improve our chances of detecting leakage and identifying the guilty party.


EXISTING SYSTEM:


 Traditionally, leakage detection is handled by watermarking, e.g., a unique code is embedded in each distributed copy.

 If that copy is later discovered in the hands of an unauthorized party, the leaker can be identified.


Disadvantages of Existing Systems:


 Watermarks can be very useful in some cases, but they involve some modification of the original data. Furthermore, watermarks can sometimes be destroyed if the data recipient is malicious. For example, a hospital may give patient records to researchers who will devise new treatments.

 Similarly, a company may have partnerships with other companies that require sharing customer data, and another enterprise may outsource its data processing, so data must be given to various other companies. We call the owner of the data the distributor and the supposedly trusted third parties the agents.


PROPOSED SYSTEM:


 Our goal is to detect when the distributor’s sensitive data has been leaked by agents and, if possible, to identify the agent that leaked the data.

 Perturbation is a very useful technique where the data is modified and made “less sensitive” before being handed to agents.


   We develop unobtrusive techniques for detecting leakage of a set of
     objects or records.


   We develop a model for assessing the “guilt” of agents.


   We also present algorithms for distributing objects to agents, in a way that
     improves our chances of identifying a leaker.
 Finally, we also consider the option of adding “fake” objects to the distributed set. Such objects do not correspond to real entities but appear realistic to the agents.


 In a sense, the fake objects act as a type of watermark for the entire set, without modifying any individual members. If it turns out that an agent was given one or more fake objects that were leaked, then the distributor can be more confident that the agent was guilty.




Problem Setup and Notation:


A distributor owns a set T = {t1, …, tm} of valuable data objects. The distributor wants to share some of the objects with a set of agents U1, U2, …, Un, but does not wish the objects to be leaked to other third parties. The objects in T could be of any type and size; e.g., they could be tuples in a relation, or relations in a database. An agent Ui receives a subset of objects Ri ⊆ T, determined either by a sample request or an explicit request:

1. Sample request
2. Explicit request
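
To make the two request types concrete, here is a minimal C# sketch (C# being the project's front-end language). The type and method names such as DataObject, SampleRequest, and ExplicitRequest are illustrative assumptions, not part of the base paper or of this project's actual code: a sample request returns mi objects chosen by the distributor, while an explicit request returns every object that satisfies an agent-supplied condition.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: models the two ways an agent Ui can obtain
// its subset Ri of the distributor's set T.
public class DataObject
{
    public int Id { get; set; }
    public string Payload { get; set; }
}

public static class Requests
{
    private static readonly Random Rng = new Random();

    // Sample request: the agent receives mi objects picked by the distributor.
    public static List<DataObject> SampleRequest(IList<DataObject> T, int mi)
    {
        return T.OrderBy(_ => Rng.Next()).Take(mi).ToList();
    }

    // Explicit request: the agent receives all objects satisfying its condition.
    public static List<DataObject> ExplicitRequest(IEnumerable<DataObject> T,
                                                   Func<DataObject, bool> condition)
    {
        return T.Where(condition).ToList();
    }
}
```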

Guilt Model Analysis:


Our model parameters interact, and to check whether the interactions match our intuition, we study two simple scenarios in this section: the impact of the probability p, and the impact of the overlap between Ri and S. In each scenario we have a target that has obtained all of the distributor’s objects, i.e., T = S.
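
For reference, the guilt probability that the base paper assigns to agent Ui given the leaked set S has roughly the following form. This is a paraphrase of the reference rather than the exact equation, using p for the probability that the target could have guessed an object independently and Vt for the set of agents that received object t:

```latex
\Pr\{G_i \mid S\} \;=\; 1 \;-\; \prod_{t \,\in\, S \cap R_i} \left( 1 - \frac{1 - p}{\lvert V_t \rvert} \right)
```

Intuitively, the more leaked objects that were given to Ui and to few other agents, the closer this probability gets to 1.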


Algorithms:

1. Evaluation of Explicit Data Request Algorithms
In the first place, the goal of these experiments was to see whether fake objects in the distributed data sets yield a significant improvement in our chances of detecting a guilty agent. In the second place, we wanted to evaluate our e-optimal algorithm relative to a random allocation.
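
As a rough illustration of the difference between a random allocation and an e-optimal-style allocation of fake objects for explicit requests (a simplification under our own assumptions, not the paper's exact algorithm), the greedy variant below hands each fake object to the agent whose current set overlaps most with the other agents' sets, since that agent is currently the hardest to distinguish:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified sketch of allocating fake objects across agents' explicit
// request sets. "Greedy" approximates the e-optimal idea; "Random" is the
// baseline it is compared against. Fake ids are assumed disjoint from real ids.
public static class FakeAllocation
{
    public static void Random(List<HashSet<int>> agentSets, IEnumerable<int> fakeIds, Random rng)
    {
        foreach (var id in fakeIds)
            agentSets[rng.Next(agentSets.Count)].Add(id);
    }

    public static void Greedy(List<HashSet<int>> agentSets, IEnumerable<int> fakeIds)
    {
        foreach (var id in fakeIds)
        {
            // Give the fake object to the agent with the largest total pairwise overlap.
            var target = agentSets
                .OrderByDescending(s => agentSets.Where(o => !ReferenceEquals(o, s))
                                                 .Sum(o => s.Intersect(o).Count()))
                .First();
            target.Add(id);
        }
    }
}
```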

2. Evaluation of Sample Data Request Algorithms


With sample data requests, agents are not interested in particular objects. Hence, object sharing is not explicitly defined by their requests. The distributor is “forced” to allocate certain objects to multiple agents only if the number of requested objects exceeds the number of objects in set T. The more data objects the agents request in total, the more recipients on average an object has; and the more objects are shared among different agents, the more difficult it is to detect a guilty agent.
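
The following is a simplified sketch of how sample requests might be served while keeping such sharing low (our own approximation of the idea, not the exact sample-request algorithm from the paper): each time an agent needs another object, it is given an object it does not yet hold that is currently allocated to the fewest agents.

```csharp
using System.Collections.Generic;
using System.Linq;

// Simplified sketch: serve sample requests of sizes m[i] from the distinct
// object ids in T, preferring objects currently held by the fewest agents,
// which keeps overlap (and thus ambiguity about a leaker) low.
// Assumes each m[i] <= |T|.
public static class SampleAllocation
{
    public static List<HashSet<int>> Allocate(IList<int> T, int[] m)
    {
        var shareCount = T.ToDictionary(t => t, _ => 0);  // agents holding each object
        var sets = m.Select(_ => new HashSet<int>()).ToList();

        for (int i = 0; i < m.Length; i++)
        {
            while (sets[i].Count < m[i])
            {
                int pick = T.Where(t => !sets[i].Contains(t))
                            .OrderBy(t => shareCount[t])
                            .First();
                sets[i].Add(pick);
                shareCount[pick]++;
            }
        }
        return sets;
    }
}
```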
MODULES:



1. Login / Registration
2. Data Distributor
3. Data Allocation Module
4. Fake Object Module
5. Data Leakage Protection Module
6. Finding Guilty Agents Module
7. Mobile Alert


MODULES DESCRIPTION:


1. Login / Registration:


This module provides a user/agent with the authority to access the other modules of the project. A user/agent gains this access only after registration.


2. Data Distributor:


The data distributor role is developed in this module. A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data is leaked and found in an unauthorized place (e.g., on the web or somebody’s laptop). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means.


3. Data Allocation Module:


The main focus of our project is the data allocation problem: how can the distributor “intelligently” give data to agents in order to improve the chances of detecting a guilty agent?


4. Fake Object Module:


Fake objects are objects generated by the distributor in order to increase the
chances of detecting agents that leak data. The distributor may be able to add
fake objects to the distributed data in order to improve his effectiveness in
detecting guilty agents. Our use of fake objects is inspired by the use of “trace”
records in mailing lists.
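
A minimal sketch of what this module's fake-object generation could look like. The field names and value lists are purely illustrative assumptions; the real module must produce records indistinguishable from genuine ones, while the distributor records each generated Id in its own database so the record can later be recognized in a leaked data set.

```csharp
using System;

// Illustrative only: a fake customer-like record that should look realistic
// to an agent while remaining identifiable to the distributor via its Id.
public class FakeRecord
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string City { get; set; }
    public string Phone { get; set; }
}

public static class FakeObjectFactory
{
    private static readonly string[] Names = { "Ravi", "Anita", "Kiran", "Meena" };
    private static readonly string[] Cities = { "Chennai", "Hyderabad", "Pune", "Kochi" };
    private static readonly Random Rng = new Random();

    public static FakeRecord Create()
    {
        return new FakeRecord
        {
            Id = Guid.NewGuid(),                       // stored by the distributor
            Name = Names[Rng.Next(Names.Length)],
            City = Cities[Rng.Next(Cities.Length)],
            Phone = "9" + Rng.Next(100000000, 999999999)
        };
    }
}
```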


5. Data Leakage Protection Module:


In this module, to protect against data leakage, a secret key is sent to the agent who requests the files. The secret key is sent to the email address of the registered agent. Without the secret key, the agent cannot access the file sent by the distributor.
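
A minimal sketch of the secret-key step, assuming the standard System.Net.Mail API. The SMTP host, port, credentials, and sender address are placeholders, and in the actual module the key would also be stored against the agent's record in SQL Server so that the download can be verified later.

```csharp
using System;
using System.Net;
using System.Net.Mail;

// Illustrative only: generate a secret key for the requesting agent and
// mail it to the agent's registered email address.
public static class SecretKeyMailer
{
    public static string SendKey(string agentEmail)
    {
        // A random key; a GUID is sufficient for this sketch.
        string key = Guid.NewGuid().ToString("N");

        var client = new SmtpClient("smtp.example.com", 587)   // placeholder host/port
        {
            EnableSsl = true,
            Credentials = new NetworkCredential("distributor@example.com", "password")
        };
        var message = new MailMessage("distributor@example.com", agentEmail,
                                      "File access key",
                                      "Your secret key is: " + key);
        client.Send(message);
        return key;   // caller stores the key so the file access can be verified
    }
}
```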
6. Finding Guilty Agents Module:


In the Optimization Module, the distributor’s data allocation to agents has one constraint and one objective. The distributor’s constraint is to satisfy agents’ requests by providing them with the number of objects they request or with all available objects that satisfy their conditions. His objective is to be able to detect an agent who leaks any portion of his data. This module is designed using the agent-guilt model. Here, a count based on the fake objects is incremented whenever an agent transfers data, and the fake objects are stored in the database.
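
A small sketch of the fake-object check behind this module (the names are our own assumptions): any agent whose allocated fake objects appear in the leaked set is flagged as a likely leaker.

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative only: flag agents whose fake objects show up in the leaked set S.
public static class GuiltCheck
{
    public static List<string> SuspectAgents(
        IDictionary<string, HashSet<int>> fakesPerAgent,  // agent name -> fake ids it received
        HashSet<int> leakedSet)                           // object ids found in the leak
    {
        return fakesPerAgent
            .Where(kv => kv.Value.Overlaps(leakedSet))
            .Select(kv => kv.Key)
            .ToList();
    }
}
```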




7. Mobile Alert:


In this module, an alert about the guilty agents who leaked the files is sent to the distributor’s mobile phone. It is developed using the Nokia SDK 5100. It is a manual process only, not an automatically triggered one.
Hardware Required:


 System       : Pentium IV 2.4 GHz
 Hard Disk    : 40 GB
 Floppy Drive : 1.44 MB
 Monitor      : 15-inch VGA colour
 Mouse        : Logitech
 Keyboard     : 110 keys enhanced
 RAM          : 256 MB




Software Required:


 O/S       : Windows XP
 Front End : ASP.NET, C#, Nokia SDK 5100
 Data Base : SQL Server 2005
 Browser   : IE / Firefox with Internet connection




REFERENCE:


Panagiotis Papadimitriou and Hector Garcia-Molina, “Data Leakage Detection,” IEEE Transactions on Knowledge and Data Engineering, Vol. 23, No. 1, January 2011.
