Page 1: ppt

DATA LEAKAGE DETECTION USING UNOBTRUSIVE TECHNIQUE

BY:

RAJASHEKAR BANALA (07H51A0538)
PRANEETH KUMAR PALADUGU (07H51A0536)
NISHANTH REDDY A (07H51A0529)
SIVARAM REDDY (07H51A0581)

Under the guidance of

Mrs. E.V.N. Jyothi
Assistant Professor

Page 2: ppt

Agenda

Problem Definition
Introduction
Issues
Scope
Analysis
Design
Implementation

Page 3: ppt

Problem Definition

In the course of doing business, sometimes sensitive data must be handed over to supposedly trusted third parties.

Our goal is to detect when the distributor's sensitive data has been leaked by agents and, if possible, to identify the agent that leaked the data.

Page 4: ppt

Introduction

• A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data is leaked and found in an unauthorized place (e.g., on the web or somebody’s laptop).

• The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means. We propose data allocation strategies (across the agents) that improve the probability of identifying leakages.

• These methods do not rely on alterations of the released data (e.g., watermarks). In some cases we can also inject “realistic but fake” data records to further improve our chances of detecting leakage and identifying the guilty party.

Page 5: ppt

Existing System

The existing system can detect hackers, but the amount of evidence (e.g., cookies) it collects is small. With so little evidence, the organization may be unable to proceed legally, so the chances of the hackers escaping are high.

Page 6: ppt

Proposed System

In the proposed system, hackers can be traced with a substantial amount of evidence. Leakage of data is detected by three methods: generating fake objects, watermarking, and encrypting the data.

Page 7: ppt

Types of employees that put your company at risk

The security illiterate: the majority of employees, with little or no knowledge of security; a corporate risk because of accidental breaches.

The gadget nerds: introduce a variety of devices to their work PCs and download software.

The unlawful residents: use company IT resources in ways they shouldn't, e.g., by storing music and movies or playing games.

The malicious/disgruntled employees: typically a minority of employees; gain access to areas of the IT system to which they shouldn’t and send corporate data (e.g., customer lists, R&D) to third parties.

Page 8: ppt

Issues

• We develop a model for assessing the “guilt” of agents. We also present algorithms for distributing objects to agents in a way that improves our chances of identifying a leaker.

• Finally, we also consider the option of adding “fake” objects to the distributed set.

• Such objects do not correspond to real entities but appear realistic to the agents.

• In a sense, the fake objects act as a type of watermark for the entire set, without modifying any individual members. If it turns out an agent was given one or more fake objects that were leaked, then the distributor can be more confident that the agent was guilty; the sketch below illustrates this check.
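The check in the last bullet is simple to state in code. What follows is a minimal illustrative sketch, not the project's implementation: it assumes the distributor records the ids of the fake records planted in each agent's share, and it flags every agent whose planted fakes appear in a leaked set.

```csharp
// Minimal sketch (illustrative only): flag agents whose planted fake objects
// appear among the leaked records. All identifiers here are hypothetical.
using System;
using System.Collections.Generic;

class FakeObjectEvidence
{
    // fakesGivenTo[agent] = ids of fake records planted in that agent's share
    public static List<string> SuspectAgents(
        Dictionary<string, HashSet<string>> fakesGivenTo,
        HashSet<string> leakedIds)
    {
        var suspects = new List<string>();
        foreach (var entry in fakesGivenTo)
            if (entry.Value.Overlaps(leakedIds))   // agent's fakes were leaked
                suspects.Add(entry.Key);
        return suspects;
    }

    static void Main()
    {
        var fakes = new Dictionary<string, HashSet<string>>
        {
            { "U1", new HashSet<string> { "f1" } },
            { "U2", new HashSet<string> { "f2" } }
        };
        var leaked = new HashSet<string> { "t7", "f2" };  // one real object, one fake
        Console.WriteLine(string.Join(", ", SuspectAgents(fakes, leaked)));  // prints: U2
    }
}
```

A leaked fake is strong evidence because fake objects never leave the distributor through any legitimate channel.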

Page 9: ppt

Scope

There are conventional techniques in use, including technical and fundamental analysis. The main issue with these techniques is that they are manual: they require laborious work as well as experience.

Page 10: ppt
Page 11: ppt

Analysis

Problem Setup and Notation

A distributor owns a set T = {t1, …, tm} of valuable data objects. The distributor wants to share some of the objects with a set of agents U1, U2, …, Un, but does not wish the objects to be leaked to other third parties. The objects in T could be of any type and size; e.g., they could be tuples in a relation, or relations in a database. An agent Ui receives a subset of the objects, determined either by a sample request or an explicit request:

1. Sample request
2. Explicit request

Page 12: ppt

Sample request Ri = SAMPLE(T, mi): any subset of mi records from T can be given to Ui.

Explicit request Ri = EXPLICIT(T, condi): agent Ui receives all the objects in T that satisfy condi.
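Both request types can be phrased as one-liners over a generic collection. This is a hedged sketch, not the project's API; the helper names and generics are assumptions.

```csharp
// Illustrative sketch of the two request types; not the project's API.
using System;
using System.Collections.Generic;
using System.Linq;

static class Requests
{
    static readonly Random Rng = new Random();

    // SAMPLE(T, mi): any subset of mi records from T may be given to Ui.
    public static List<T> Sample<T>(IList<T> t, int mi) =>
        t.OrderBy(_ => Rng.Next()).Take(mi).ToList();

    // EXPLICIT(T, condi): Ui receives every object of T that satisfies condi.
    public static List<T> Explicit<T>(IEnumerable<T> t, Func<T, bool> condi) =>
        t.Where(condi).ToList();
}

// Usage with hypothetical data:
//   var forU1 = Requests.Sample(records, 100);
//   var forU2 = Requests.Explicit(records, r => r.Region == "South");
```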

Page 13: ppt

Explicit Data Requests

R ← ∅                          ▷ agents that can receive fake objects
for i = 1, …, n do
    if bi > 0 then
        R ← R ∪ {i}
    Fi ← ∅
while B > 0 do
    i ← SELECTAGENT(R, R1, …, Rn)
    f ← CREATEFAKEOBJECT(Ri, Fi, condi)
    Ri ← Ri ∪ {f}
    Fi ← Fi ∪ {f}
    bi ← bi − 1
    if bi = 0 then
        R ← R \ {i}
    B ← B − 1

In the first place, the goal of these experiments was to see whether fake objects in the distributed data sets yield a significant improvement in our chances of detecting a guilty agent. In the second place, we wanted to evaluate our e-optimal algorithm relative to a random allocation.
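The pseudocode above translates almost line for line into C#. The sketch below is an assumption-laden transliteration, not the project's implementation: SelectAgent and CreateFakeObject stand in for the selection and fake-record policies, bi is agent i's fake-object budget, and B is the distributor's total budget.

```csharp
// Illustrative C# transliteration of the pseudocode above; not the project's code.
using System;
using System.Collections.Generic;
using System.Linq;

class ExplicitAllocation
{
    public static void AddFakeObjects(
        List<HashSet<string>> R,   // Ri: objects already allocated to agent i
        List<HashSet<string>> F,   // Fi: fake objects handed to agent i
        int[] b,                   // bi: per-agent fake-object budget
        int B)                     // distributor's total fake-object budget
    {
        // Agents that can still receive fake objects.
        var candidates = new HashSet<int>(
            Enumerable.Range(0, b.Length).Where(i => b[i] > 0));

        while (B > 0 && candidates.Count > 0)
        {
            int i = SelectAgent(candidates, R);
            string f = CreateFakeObject();
            R[i].Add(f);
            F[i].Add(f);
            if (--b[i] == 0) candidates.Remove(i);  // agent i's budget is exhausted
            B--;
        }
    }

    // Placeholder policy: favour the candidate agent holding the fewest objects.
    static int SelectAgent(HashSet<int> candidates, List<HashSet<string>> R) =>
        candidates.OrderBy(i => R[i].Count).First();

    // Placeholder: mint a fresh fake record id.
    static string CreateFakeObject() =>
        "fake-" + Guid.NewGuid().ToString("N").Substring(0, 8);
}
```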

Page 14: ppt

Sample Data Requests

a ← 0^|T|                      ▷ a[k]: number of agents who have received object tk
R1 ← ∅, …, Rn ← ∅
remaining ← Σi mi              ▷ total number of requested objects
while remaining > 0 do
    for all i = 1, …, n : |Ri| < mi do
        k ← SELECTOBJECT(i, Ri)    ▷ may also use additional parameters
        Ri ← Ri ∪ {tk}
        a[k] ← a[k] + 1
        remaining ← remaining − 1

With sample data requests, agents are not interested in particular objects, so object sharing is not explicitly defined by their requests. The distributor is “forced” to allocate certain objects to multiple agents only if the total number of requested objects exceeds the number of objects in set T. The more data objects the agents request in total, the more recipients an object has on average; and the more objects are shared among different agents, the more difficult it is to detect a guilty agent. A sketch of this allocation loop follows.
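The following C# sketch is illustrative only, assuming each mi ≤ |T| and treating objects as indices 0..|T|−1. SelectObject implements one policy consistent with the paragraph above: hand agent i the object currently held by the fewest agents, so that sharing, and hence ambiguity about the leaker, stays low.

```csharp
// Illustrative sketch of sample-request allocation; not the project's code.
using System;
using System.Collections.Generic;
using System.Linq;

class SampleAllocation
{
    public static List<HashSet<int>> Allocate(int tCount, int[] m)
    {
        var a = new int[tCount];                          // a[k]: agents holding object tk
        var R = m.Select(_ => new HashSet<int>()).ToList();
        int remaining = m.Sum();                          // total objects still to hand out

        while (remaining > 0)
        {
            for (int i = 0; i < m.Length; i++)
                if (R[i].Count < m[i])                    // agent i's request not yet filled
                {
                    int k = SelectObject(i, R[i], a);
                    R[i].Add(k);
                    a[k]++;
                    remaining--;
                }
        }
        return R;
    }

    // Policy: least-shared object that agent i does not already hold
    // (always exists while |Ri| < mi <= |T|).
    static int SelectObject(int i, HashSet<int> Ri, int[] a) =>
        Enumerable.Range(0, a.Length)
                  .Where(k => !Ri.Contains(k))
                  .OrderBy(k => a[k])
                  .First();
}
```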

Page 15: ppt

Login

Registration

Transfer data to agents

View data transfer between agents

Find Guilty Agents

Probability distribution of data transferred by guilty agents

MODULE DIAGRAM

Page 16: ppt

Login

Transfer data to agents

View transfer of data between agents

Find Guilty Agents

Frequency determination of data leakage between agents

OBJECT DIAGRAM

Page 17: ppt

LOGIN

DATA TRANSFER

ADDING FAKE OBJECTS WHEN DATA IS TRANSFERRED BY AGENTS

FIND GUILTY AGENTS

PROBABILITY DISTRIBUTION FOR DATA LEAKAGE

PROJECT FLOW DIAGRAM

Page 18: ppt

ARCHITECTURE DIAGRAM:

Page 19: ppt

Login as Distributor

Logout

Transfer data to Agents

Add fake objects when data is transferred by agents

Find Guilty Agents

Show the probability distribution of data

E-R DIAGRAM:

Page 20: ppt

Login

Data Transfer

Fake objects addition

Guilt Model Analysis

Show the probability distribution of data leakage

Logout

DATA FLOW DIAGRAM:

Page 21: ppt

The system has the following modules:

• Data Allocation

• Fake Object

• Optimization

• Data Distributor

Page 22: ppt

Data Allocation: The main focus of our project is the data allocation problem: how can the distributor “intelligently” give data to agents in order to improve the chances of detecting a guilty agent?

Page 23: ppt

Fake Object:

Fake objects are objects generated by the distributor in order to increase the chances of detecting agents that leak data. The distributor may be able to add fake objects to the distributed data in order to improve his effectiveness in detecting guilty agents. Our use of fake objects is inspired by the use of “trace” records in mailing lists.
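As an illustration of the trace-record idea, the sketch below mints a customer record that looks plausible but corresponds to no real person. All names, fields, and the domain are invented for the example; this is not the project's generator.

```csharp
// Illustrative fake-record generator in the spirit of mailing-list "trace" records.
using System;

class FakeRecordFactory
{
    static readonly string[] FirstNames = { "Anil", "Meera", "Ravi", "Sita" };
    static readonly string[] LastNames  = { "Rao", "Kumar", "Sharma", "Reddy" };
    static readonly Random Rng = new Random();

    public static (string Name, string Email) Create()
    {
        string name = FirstNames[Rng.Next(FirstNames.Length)] + " "
                    + LastNames[Rng.Next(LastNames.Length)];
        // The domain is controlled by the distributor, so any mail that arrives
        // at this address proves the record was leaked.
        string email = name.Replace(" ", ".").ToLowerInvariant() + "@trace.example.com";
        return (name, email);
    }
}
```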

Page 24: ppt

Optimization: In the optimization module, the distributor’s data allocation to agents has one constraint and one objective. The distributor’s constraint is to satisfy agents’ requests by providing them with the number of objects they request, or with all available objects that satisfy their conditions. His objective is to be able to detect an agent who leaks any portion of his data.

Page 25: ppt

Data Distributor: A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data is leaked and found in an unauthorized place (e.g., on the web or somebody’s laptop). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means.

Page 26: ppt

Preferred Technologies

Operating System: Windows XP, Vista, or Windows 7
Technologies: ASP.NET and C#.NET
Database: MS SQL Server 2005
IDE: MS Visual Studio .NET 2008

Page 27: ppt

System: Pentium IV, 2.4 GHz
Hard Disk: 40 GB
Floppy Drive: 1.44 MB
Monitor: 15" VGA colour
Mouse: Logitech
Keyboard: 110 keys, enhanced
RAM: 256 MB

Page 28: ppt

Recommended