MPRI 2.3.2 - Foundations of privacy Lecture 2 Kostas Chatzikokolakis Sep 26, 2016
Transcript
Page 1: MPRI 2.3.2 - Foundations of privacy Lecture 2

MPRI 2.3.2 - Foundations of privacy

Lecture 2

Kostas Chatzikokolakis

Sep 26, 2016

Page 2: MPRI 2.3.2 - Foundations of privacy Lecture 2

Plan of the course

Quantitative Information Flow

◮ Motivation, application examples

◮ Secrets and vulnerability

◮ Channels and leakage

◮ Multiplicative Bayes-Capacity

◮ Comparing systems, the lattice of information

◮ Applications and exercises

Page 3: MPRI 2.3.2 - Foundations of privacy Lecture 2

Quantitative Information Flow

Information Flow: Leakage of secret information via correlated observables

Ideally: No leak

• No interference [Goguen & Meseguer’82]

In practice: There is almost always some leak

• Intrinsic to the system (public observables, part of the design)

• Side channels

We need quantitative ways to measure the leak

Page 4: MPRI 2.3.2 - Foundations of privacy Lecture 2

Location-Based Systems

‣ Retrieval of Points of Interest (POIs).

‣ Mapping Applications.

‣ Deals and discounts applications.

‣ Location-Aware Social Networks.

A location-based system is a system that uses geographical information in order to provide a service.

Page 5: MPRI 2.3.2 - Foundations of privacy Lecture 2

Location-Based Systems

‣ Location information is sensitive (it can be linked to home, work, religion, political views, etc.).

‣ Ideally: we want to hide our true location.

‣ Reality: we need to disclose some information.

Page 6: MPRI 2.3.2 - Foundations of privacy Lecture 2

Example

‣ Find restaurants within 300 meters.

‣ Hide location, not identity.

‣ Provide approximate location.

Page 7: MPRI 2.3.2 - Foundations of privacy Lecture 2

Obfuscation

(figure: area of interest)

Page 8: MPRI 2.3.2 - Foundations of privacy Lecture 2

Obfuscation

(figure: reported position and area of interest)

Page 9: MPRI 2.3.2 - Foundations of privacy Lecture 2

Obfuscation

(figure: area of retrieval and area of interest)

Page 10: MPRI 2.3.2 - Foundations of privacy Lecture 2

Obfuscation

(figure: area of retrieval and area of interest)

Page 11: MPRI 2.3.2 - Foundations of privacy Lecture 2

Obfuscation

(figure: area of interest)

Page 12: MPRI 2.3.2 - Foundations of privacy Lecture 2

Issues to study

How can we generate the noise?

What kind of formal privacy guarantees do we get?

Which mechanism gives optimal utility?

What if we use the service repeatedly?

Page 13: MPRI 2.3.2 - Foundations of privacy Lecture 2

Timing Attacks in Cryptosystems

Remote timing attack [BonehBrumley03]

1024-bit RSA key recovered in 2 hours from a standard OpenSSL implementation across a LAN

Response time depends on the key!

Page 14: MPRI 2.3.2 - Foundations of privacy Lecture 2

Timing Attacks in Cryptosystems

What counter-measures can we use?

Make the decryption time constant

Too slow!

Force the set of possible decryption times to be small

Is it enough?

Must be combined with blinding

Careful analysis of the privacy guarantees is required

Page 15: MPRI 2.3.2 - Foundations of privacy Lecture 2

What is a secret?

My password: x = "dme3@21!SDFm12"

What does it mean for this password to be secret?

The adversary should not know it, i.e. x comes from a set X of possible passwords

Is x′ = 123 an equally good password? Why?

Passwords are drawn randomly from a probability distribution

Page 16: MPRI 2.3.2 - Foundations of privacy Lecture 2

What is a secret?

A secret x is something about which the adversary knows only a probability distribution π

π is called the adversary’s prior knowledge
◮ π could be the distribution from which x is generated
◮ or it could model the adversary’s knowledge of the population the user comes from

How vulnerable is x?

It’s a property of π, not of x

Page 17: MPRI 2.3.2 - Foundations of privacy Lecture 2

Vulnerability

How vulnerable is our secret under prior π?

The answer highly depends on the application

E.g.: assume uniformly distributed secrets, but the adversary knows the first 4 bytes

(this can be expressed by a prior π)

Is this threat substantial?
◮ No: if the secrets are long passwords
◮ Yes: if the first 4 bytes are a credit card PIN

Page 18: MPRI 2.3.2 - Foundations of privacy Lecture 2

Vulnerability

To quantify the threat we need an operational scenario

What is the goal of the adversary?

How successful is the adversary in achieving this goal?

Vulnerability: measure of the adversary’s success

Uncertainty: measure of the adversary’s failure

Page 19: MPRI 2.3.2 - Foundations of privacy Lecture 2

First approach

Assume the adversary can ask for properties of x

i.e. questions of the form “is x ∈ P?”

Goal: completely reveal the secret as quickly as possible

Measure of success: expected number of steps

Page 20: MPRI 2.3.2 - Foundations of privacy Lecture 2

First approach

eg:

X = {heads, tails}, π = (1, 0)

X = {heads, tails}, π = (1/2, 1/2)

X = {a, b, c, d, e, f, g, h}, π = (1/8, . . . , 1/8)

X = {a, b, c, d, e, f, g, h}, π = (1/4, 1/4, 1/8, 1/8, 1/16, 1/16, 1/16, 1/16)

Page 21: MPRI 2.3.2 - Foundations of privacy Lecture 2

First approach

Best strategy: at each step, split the search space into sets of equal probability mass

Page 22: MPRI 2.3.2 - Foundations of privacy Lecture 2

First approach

At step i, the remaining search space has total probability mass 2^−i.

If the probability of x is πx = 2^−i, then at step i the search space will contain only x. So it takes i = −log2 πx steps to reveal x.

So the expected number of steps is:

∑x −πx log2 πx

(if the probabilities in π are not all powers of 2, this is a lower bound)
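As a quick sanity check, here is a minimal Python sketch (the function name is mine) that computes this expected number of steps for the example priors listed earlier (page 20):

```python
import math

def expected_steps(pi):
    """Expected number of yes/no questions under the optimal halving
    strategy: sum over x of -pi_x * log2(pi_x)."""
    return sum(-p * math.log2(p) for p in pi if p > 0)

print(expected_steps([1, 0]))                    # 0.0
print(expected_steps([1/2, 1/2]))                # 1.0
print(expected_steps([1/8] * 8))                 # 3.0
print(expected_steps([1/4, 1/4, 1/8, 1/8,
                      1/16, 1/16, 1/16, 1/16]))  # 2.75
```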

Page 23: MPRI 2.3.2 - Foundations of privacy Lecture 2

Shannon Entropy

This is the well known formula of Shannon Entropy:

H(π) = ∑x −πx log2 πx

It’s a measure of the adversary’s uncertainty about x

Minimum value: H(π) = 0 iff πx = 1 for some x

Maximum value: H(π) = log2 |X| iff π is uniform

Page 24: MPRI 2.3.2 - Foundations of privacy Lecture 2

Shannon Entropy

The binary case X = {0, 1}, π = (x, 1 − x):

(plot of H(π) = −x log2 x − (1 − x) log2(1 − x), which is 0 at x = 0 and x = 1 and maximal, 1 bit, at x = 1/2)

Page 25: MPRI 2.3.2 - Foundations of privacy Lecture 2

Shannon Entropy

Very widely used to measure information flow

Is it always a good uncertainty measure for privacy?

Example: X = {0, . . . , 2^32}, π = (1/8, (7/8)·2^−32, . . . , (7/8)·2^−32)

H(π) = 28.543525

But the secret can be guessed with probability 1/8!

Undesired in many practical scenarios
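The two numbers above can be reproduced without materialising the 2^32-entry prior, since all entries except the first are equal (a small sketch; the variable names and the exact normalisation of the tail are my own):

```python
import math

n = 2 ** 32                 # size of the secret space X
p0 = 1 / 8                  # probability of the most likely secret
p_rest = (7 / 8) / (n - 1)  # remaining mass spread over the other secrets

# Shannon entropy is large, suggesting the secret is "hard to find"...
H = -p0 * math.log2(p0) + (n - 1) * (-p_rest * math.log2(p_rest))
print(H)                    # roughly 28.54 bits

# ...yet a single best guess succeeds with probability 1/8
print(max(p0, p_rest))      # 0.125
```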

Page 26: MPRI 2.3.2 - Foundations of privacy Lecture 2

Bayes Vulnerability

Adversary’s goal: correctly guess the secret in one try

Measure of success: probability of a correct guess

Optimal strategy: guess the x with the highest πx

Bayes Vulnerability:

Vb(π) = maxx πx
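In code this is simply the largest prior probability (a minimal sketch):

```python
def bayes_vulnerability(pi):
    """Probability that a one-try adversary guesses the secret:
    V_b(pi) = max_x pi_x."""
    return max(pi)

print(bayes_vulnerability([1/2, 1/2]))                  # 0.5
print(bayes_vulnerability([1/4, 1/4, 1/8, 1/8,
                           1/16, 1/16, 1/16, 1/16]))    # 0.25
```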

Page 27: MPRI 2.3.2 - Foundations of privacy Lecture 2

Bayes Vulnerability

Maximum value: Vb(π) = 1 iff πx = 1 for some x

Minimum value: Vb(π) = 1/|X| iff π is uniform

Min-entropy: −log2 Vb(π)

Bayes risk: 1− Vb(π)

Previous example: π = (1/8, (7/8)·2^−32, . . . , (7/8)·2^−32)

Vb(π) = 1/8

Page 28: MPRI 2.3.2 - Foundations of privacy Lecture 2

Bayes Vulnerability

The binary case X = {0, 1}, π = (x, 1 − x):

(plot of Vb(π) = max(x, 1 − x), which is minimal, 1/2, at x = 1/2)

Page 29: MPRI 2.3.2 - Foundations of privacy Lecture 2

Guessing Entropy

Adversary’s goal: correctly guess the secret in many tries

Measure of success: expected number of tries

Optimal strategy: try secrets in decreasing order of

probability

Guessing entropy:

G(π) = ∑i=1..|X| i · πxi

xi: indexing of X in decreasing probability order
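A direct implementation sorts the prior in decreasing order and weights each position by its rank (a sketch; the function name is mine):

```python
def guessing_entropy(pi):
    """Expected number of tries when guessing secrets in decreasing
    order of probability: G(pi) = sum_i i * pi_{x_i}."""
    ordered = sorted(pi, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

print(guessing_entropy([1, 0]))           # 1.0   (minimum)
print(guessing_entropy([1/4] * 4))        # 2.5 = (|X| + 1) / 2 (uniform)
print(guessing_entropy([1/2, 1/4, 1/4]))  # 1.75
```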

Page 30: MPRI 2.3.2 - Foundations of privacy Lecture 2

Guessing Entropy

Minimum value: G(π) = 1 iff πx = 1 for some x

Maximum value: G(π) = (|X| + 1)/2 iff π is uniform

The binary case, π = (x, 1 − x): (plot of G(π) = 1 + min(x, 1 − x), maximal, 3/2, at x = 1/2)

Page 31: MPRI 2.3.2 - Foundations of privacy Lecture 2

Still not completely satisfied

What if the adversary wants to reveal part of a secret?

Or is satisfied with an approximate value?

Or we are interested in the probability of guessing after multiple tries?

Page 32: MPRI 2.3.2 - Foundations of privacy Lecture 2

Example

Secret: database of 10-bit passwords for 1000 users:

pwd0, pwd1, . . . , pwd999

The adversary knows that the password of some user is z, but does not know which one (all are equally likely)

A1: guess the complete database
◮ Vb(π) = 2^−9990

A2: guess the password of a particular user i
◮ Create the distribution πi for that user
◮ Vb(πi) = 0.001 · 1 + 0.999 · 2^−10 ≈ 0.00198

A3: guess the password of any user
◮ intuitively, the secret is completely vulnerable
◮ how can we capture this vulnerability?
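The value for A2 is easy to reproduce: user i is the user known to have password z with probability 1/1000, and otherwise holds a uniformly random 10-bit password (a small sketch under that reading of the example):

```python
# Best guess for user i's password is z:
#  - with prob. 1/1000, user i is the known user, so the guess is right;
#  - otherwise the password is uniform over 2^10 values, so it equals z
#    with probability 2^-10.
p_known = 1 / 1000
v_b_i = p_known * 1 + (1 - p_known) * 2 ** -10
print(v_b_i)   # ~0.00198
```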

Page 33: MPRI 2.3.2 - Foundations of privacy Lecture 2

Abstract operational scenario

A makes a guess w ∈ W about the secret

The benefit provided by guessing w when the secret is x is given by a gain function:

g : W × X → R

Success measure: the expected gain of a best guess

Page 34: MPRI 2.3.2 - Foundations of privacy Lecture 2

g-vulnerability

Expected gain under guess w: ∑x πx g(w, x)

Choose the one that maximizes the gain

g-vulnerability:

Vg(π) = maxw∈W ∑x∈X πx g(w, x)

(sup if W is infinite)

l-uncertainty measures can also be defined using loss functions: Ul(π) = minw ∑x πx l(w, x)
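For finite W and X, Vg is just a maximum over guesses of an expected gain; a minimal Python sketch (the representation of π as a dict and of g as a function are my own choices):

```python
def g_vulnerability(pi, guesses, gain):
    """V_g(pi) = max over w in W of sum over x of pi_x * g(w, x)."""
    return max(sum(p * gain(w, x) for x, p in pi.items())
               for w in guesses)

# Example: the "exact guess" gain (1 iff the guess equals the secret)
pi = {'a': 1/2, 'b': 1/4, 'c': 1/4}
exact = lambda w, x: 1 if w == x else 0
print(g_vulnerability(pi, pi.keys(), exact))   # 0.5 = max_x pi_x
```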

Page 35: MPRI 2.3.2 - Foundations of privacy Lecture 2

The power of gain functions

Guessing a secret approximately.

g(w, x) = 1 − dist(w, x)

(figure: a square region with corners NW, NE, SW and SE, each with probability 1/4, at distance d from the centre C)

Guessing a part of a secret.

g(w, x) = Does w match the high-order bits of x?

(figure: lab location N 39.95185, W 75.18749)

Guessing a property of a secret.

g(w, x) = Is x of gender w?

(figure: Ann, Sue, Paul, Bob, Tom)

Guessing a secret in 3 tries.

g3(w, x) = Is x an element of set w of size 3?

(figure: a password dictionary: superman, apple-juice, johnsmith62, secret.flag, history123, ...)
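The last gain function is simple to check concretely: the optimal 3-element guess is the set of the three most likely secrets, so Vg3 is the sum of the three largest prior probabilities. A brute-force sketch (the probabilities attached to the dictionary words are invented for illustration):

```python
from itertools import combinations

def v_g3(pi):
    """V_{g3}(pi): probability of hitting the secret within 3 tries,
    computed by brute force over all guess sets w of size 3."""
    return max(sum(pi[x] for x in w) for w in combinations(pi, 3))

pi = {'superman': 0.4, 'apple-juice': 0.25, 'johnsmith62': 0.2,
      'secret.flag': 0.1, 'history123': 0.05}
print(v_g3(pi))                                    # 0.85
print(sum(sorted(pi.values(), reverse=True)[:3]))  # same: top-3 mass
```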

Page 36: MPRI 2.3.2 - Foundations of privacy Lecture 2

Password database example

Secret: database of 10-bit passwords for 1000 users:

pwd0, pwd1, . . . , pwd999

A3: guess the password of any user

W = {p | p ∈ {0, . . . , 1023}}

g(p, x) = 1, if x[u] = p for some user u; 0, otherwise.

Vg(π) = 1
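Why Vg(π) = 1: every database with non-zero prior probability contains z, so the guess p = z earns gain 1 with certainty. A scaled-down check (3 users and 4-bit passwords instead of 1000 users and 10 bits; all names are mine):

```python
from itertools import product

USERS, BITS, z = 3, 4, 7
PWDS = range(2 ** BITS)

# Prior over databases: a uniformly chosen user has password z,
# the remaining users hold independent uniform passwords.
prior = {}
for known in range(USERS):
    for rest in product(PWDS, repeat=USERS - 1):
        db = tuple(rest[:known]) + (z,) + tuple(rest[known:])
        prior[db] = prior.get(db, 0) + (1 / USERS) * len(PWDS) ** -(USERS - 1)

gain = lambda p, db: 1 if p in db else 0
v_g = max(sum(pr * gain(p, db) for db, pr in prior.items()) for p in PWDS)
print(v_g)   # 1.0 (up to float rounding) -- guessing p = z always succeeds
```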

Page 37: MPRI 2.3.2 - Foundations of privacy Lecture 2

Expressiveness of Vg

Can we express Bayes-vulnerability using g-vulnerability?

A guesses the exact secret x in one try

Guesses W = X

Gain function:

gid(w, x) = 1, if w = x; 0, if w ≠ x.

Vgid coincides with Vb
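A quick numeric check on random priors that the identity gain recovers Bayes vulnerability (a sketch):

```python
import random

def v_gid(pi):
    """g-vulnerability for the identity gain: W = X, gain 1 iff w = x."""
    return max(sum(p * (1 if w == x else 0) for x, p in enumerate(pi))
               for w in range(len(pi)))

for _ in range(5):
    raw = [random.random() for _ in range(6)]
    pi = [r / sum(raw) for r in raw]
    assert abs(v_gid(pi) - max(pi)) < 1e-12   # V_gid(pi) = V_b(pi)
```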

Page 38: MPRI 2.3.2 - Foundations of privacy Lecture 2

Expressiveness of Vg

What about guessing entropy?

It’s an uncertainty measure, so we need loss functions

Guesses W = permutations of X

eg w1 = (x3, x1, x2)

(think of w as the order of guesses)

Loss function: lG(w, x) = i, where i is the position of x in the permutation w

UlG(π) coincides with G(π)
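On a small X this can be checked by brute force: minimising the expected loss over all orderings of X yields exactly the guessing entropy (a sketch; names are mine):

```python
from itertools import permutations

def u_lG(pi):
    """Minimum over guess orderings w of the expected position of the secret."""
    xs = range(len(pi))
    return min(sum(pi[x] * (w.index(x) + 1) for x in xs)
               for w in permutations(xs))

def guessing_entropy(pi):
    ordered = sorted(pi, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

pi = [0.5, 0.1, 0.25, 0.15]
print(u_lG(pi), guessing_entropy(pi))   # both 1.85
```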

Page 39: MPRI 2.3.2 - Foundations of privacy Lecture 2

Expressiveness of Vg

What about Shannon entropy?

Again we need loss functions, and infinitely many guesses

Guesses W = probability distributions over X

(think of w as a way to construct the search tree)

Loss function: lS(w, x) = −log2 wx (the number of questions needed to find x using this tree)

Because of Gibbs’ inequality: UlS(π) = H(π)

We can restrict to countably many guesses

Page 40: MPRI 2.3.2 - Foundations of privacy Lecture 2

Expressiveness of Vg

What other measures can we express as Vg or Ul?

What is a reasonable uncertainty measure f?

Let’s fix some desired properties of f

Page 41: MPRI 2.3.2 - Foundations of privacy Lecture 2

Desired prop. of uncertainty measures

Domain and range: f : PX → [0, ∞)

Continuity: a small change in π should have a small effect on f(π)

Concavity

We flip a coin and give the adversary the prior π1 with probability c and the prior π2 with probability 1 − c

His uncertainty on average is c·f(π1) + (1 − c)·f(π2)

If we give him the single prior π = cπ1 + (1 − c)π2, his uncertainty should be at least as big:

f(∑i ci πi) ≥ ∑i ci f(πi), where ∑i ci = 1
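A numeric spot check of the concavity inequality for Shannon entropy on random priors (this only probes the property, it is not a proof):

```python
import math, random

def H(pi):
    return sum(-p * math.log2(p) for p in pi if p > 0)

def random_prior(n):
    raw = [random.random() for _ in range(n)]
    return [r / sum(raw) for r in raw]

for _ in range(1000):
    pi1, pi2 = random_prior(5), random_prior(5)
    c = random.random()
    mix = [c * a + (1 - c) * b for a, b in zip(pi1, pi2)]
    # H(c*pi1 + (1-c)*pi2) >= c*H(pi1) + (1-c)*H(pi2)
    assert H(mix) >= c * H(pi1) + (1 - c) * H(pi2) - 1e-9
```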

Page 42: MPRI 2.3.2 - Foundations of privacy Lecture 2

Desired prop. of uncertainty measures

Concavity implies continuity everywhere except on the boundary

Shannon-entropy, Bayes-vulnerability and Guessing-entropy satisfy these properties

Page 43: MPRI 2.3.2 - Foundations of privacy Lecture 2

Desired prop. of uncertainty measures

Let UX denote the set of all uncertainty measures (i.e. non-negative continuous concave functions of PX)

Let LX denote the set of l-uncertainty functions Ul

What’s the relationship between UX and LX?

