© 2014 IBM Corporation, July 8, 2015
Technologies to Protect ePrivacy
Lecture 1 – Introduction
Jan Camenisch, IBM Research – Zurich
@jancamenisch – www.camenisch.org/eprivacy
We leave eTraces, lots of eTraces!
… and are observed by lots of sensors
Information leaked, some examples
! Network layer
– MAC address, physical characteristics
– IP address, location, ...
! OS & applications
– Browser (fonts, plugins, time zone, screen size, color depth)
– Infrastructure services (instant messaging, email downloading, …)
– Cloud storage, syncing of devices
! Usage
– Web searches (Google, Bing, ...)
– Shopping history
– Friends networks (Facebook, LinkedIn)
– eGovernment
Why is this information collected?
! To offer (better) services
– Shipping address
– Not having to enter information each time
! To prevent attacks
– DoS
– Wikipedia edits
! To make money
– Advertisements (loads of money)
– Better service to customers
• Tailored products
– Price discrimination
! Homeland security, police, …
! Criminals :-(
(source: Wikipedia)
Some opinions ...
! “You have zero privacy anyway — get over it.”
Scott McNealy, CEO Sun Microsystems, January 1999
! “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”
Eric Schmidt, CEO Google, December 2009
! “People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people. That social norm is just something that has evolved over time.”
Mark Zuckerberg, CEO Facebook, January 2010
You have no privacy, get over it .....?!?
! Huge security problem!
– Millions of hacked passwords (100'000 followers: $115 – 2013)
– Lost credit card numbers ($5 – 2013)
– Stolen identities ($150 – 2005, $15 – 2009, $5 – 2013)
– Lots of unreported issues (industrial espionage, etc.)
! Difficult to put figures down
– Credit card fraud
– Spam & marketing
– Manipulating stock ratings, etc.
! We know secret services can do it easily, but they are not the only ones
– this is not about homeland security
– and there are limits to the degree of protection that one can achieve
! … and we have not even discussed social issues such as democracy, etc.
! Last but not least: data is the new money, so it needs to be protected
→ Privacy by Design
Need to protect our data!
! Devices, sensors, etc. cannot all be physically protected
– authentication of all devices
– authentication of all data
... makes it even worse :-(
! Data cannot be controlled
– minimize information
– encrypt information
– attach usage policies to each bit
So what can we do?
! Legal approach
– Regulate what information can be collected
– How to collect it
– How to use and protect it
– Issue fines for misbehavior
– Very different for different countries and cultures
! Technological approach
– Protect data by encryption
– Govern data by policies
– Minimize the data that needs to be used
A Closer Look at Data Collection
More data is collected
! Collect more
– Expand an existing person-specific data collection.
! Collect specifically
– Replace an existing aggregate data collection with a person-specific one.
! Collect if you can

Data collection per encounter for Illinois (bytes, based on field size):
Example          1983   1996
Birth            280    1864
Hospital visit   0      663
Grocery visit    32     1272

L. Sweeney. Information Explosion. Confidentiality, Disclosure, and Data Access: Theory and Practical Applications for Statistical Agencies
Identifiability in the Internet
! Every device connected to the Internet gets an Internet Protocol (IP) address assigned by the Internet Service Provider (ISP).
! As long as the device stays connected, the IP address is a unique device identifier within the Internet.
! Connection data may/must be logged for various reasons (e.g., billing purposes, advertisement, law enforcement, etc.)
! Connection data/time/duration & contacted server IP address
! In case of criminal offense, police may ask ISP for log files
! ISP has customer’s postal address (and bank data) for billing
! Service providers (SP) may also log activities of their users
! Linking data of SP and ISP reveals user’s activities on the Web
! MAC (media access control) addresses are unique identifiers per network interface (Ethernet, Bluetooth, Wifi, …) – IETF considers random MACs
! Hardware is unique and can be traced, even through hops.
We leak data – e.g., browsers
Property (bits of identifying info):
! User Agent (13.76 bits)
! HTTP ACCEPT headers (3.8 bits)
! Browser plugin details (19.56 bits)
! Time zone (2.58 bits)
! Screen size and color depth (4.98 bits)
! System fonts (19.56 bits)
! Are cookies enabled? (1.95 bits)
! Limited supercookie test (3.28 bits)
https://panopticlick.eff.org: “Your browser fingerprint appears to be unique among the 3,872,607 tested so far.”
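Assuming (optimistically for the tracker) that these properties are independent, their surprisals simply add up. A small Python sketch with the figures from the slide (hypothetical illustration, not Panopticlick's code):

```python
import math

# Surprisal (bits of identifying information) per browser property,
# using the figures reported on the slide (from EFF's Panopticlick).
bits = {
    "User Agent": 13.76,
    "HTTP ACCEPT headers": 3.80,
    "Browser plugin details": 19.56,
    "Time zone": 2.58,
    "Screen size and color depth": 4.98,
    "System fonts": 19.56,
    "Cookies enabled?": 1.95,
    "Supercookie test": 3.28,
}

# If the properties were independent, surprisals would simply add up.
total = sum(bits.values())
print(f"fingerprint carries about {total:.2f} bits")   # about 69.5 bits

# For comparison: log2(world population) bits suffice to single out
# one person among roughly 8 billion.
print(f"needed to identify one in 8 billion: {math.log2(8e9):.1f} bits")
```

About 33 bits already single out one person among 8 billion, so even a subset of these properties can make a browser unique.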
Disk Storage per Person (DSP)
A rough measure of the growth in personal data [Sweeney 2001]:
DSP = (hard disk storage sold) / (world population)
– 28 MB in 1996 (Sweeney)
– 472 MB in 2000 (Edelstein, 2003)
– 5,200 GB in 2020 (IDC, 2012)
Remarks
– Figures consider only non-removable media and new devices sold
– Of course, things other than PII are stored as well
From 2012 until 2020, the digital universe will about double every two years [Gantz and Reinsel, 2012]
Big Data
A new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis.
Also: many organizations (government and private) make data publicly available
Source: http://www.nature.com/msb/journal/v8/n1/full/msb201247.html
... privacy that was previously protected only by the fact that data gathering was difficult can now be trivially violated.
Samuel Weber, National Science Foundation (2012)
Example: “How To Break Anonymity of the Netflix Prize Dataset” by Arvind Narayanan and Vitaly Shmatikov
Radical Change in Advertising
E-mail, the Web, social networks, and mobile devices join the conventional marketing channels (print media, TV/cinema, outdoor advertising, telemarketing, direct mail).
! Benefits of digital media for the advertising industry
– Reaction of the target consumers is directly measurable in number of page views, click-through rate, or conversion rate
– Scattering losses can be minimized much better → pay-per-click
– Better validation of campaigns’ effectiveness
! Advertising inserted according to time of day, weather, region, behavior
→ audience-targeted advertisement
Facts:
! 90% of Facebook’s $3.8 billion revenue in 2011 came from ads.
! In 2012, online ad spending overtook print ad spending (USA).
Ad Networks
Source: www.futureofprivacy.org/2010/04/29/before-you-even-click
Summary
! Networks, devices, and apps are built to leak much more data than necessary
! It has become very profitable to mine data for money
! Big data – mining money from all possible data has just started
Is this all legal?
Laws and regulations – History
! The notion of privacy has changed throughout history
– Curtains, shades, etc.
! Code of fair information practices (1973)
– Notice/Awareness
– Choice/Consent
– Access/Participation
– Integrity/Security
– Enforcement/Redress
! Many laws follow the same principles
– US Privacy Act 1974
– OECD Guidelines 1980
– EU Data Protection Directive 1995
! But: often only a country's own citizens are protected!
– See, e.g., US laws
! Laws are always lagging behind technology, which is often difficult to capture
OECD Privacy Principles
! Collection Limitation
There should be limits to the collection of personal data; any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
! Data Quality
Personal data should be relevant to the purposes for which they are to be used and, to the extent necessary for those purposes, should be accurate, complete, and kept up-to-date.
! Purpose Specification
The purposes for which personal data are collected should be specified no later than at the time of data collection, and the subsequent use limited to the fulfillment of those purposes.
OECD Privacy Principles
! Use Limitation
Personal data should not be disclosed, made available, or otherwise used for purposes other than those specified under the preceding Purpose Specification principle
– except with the consent of the data subject, or
– by the authority of law.
! Security Safeguards
Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification, or disclosure.
! Accountability
A data controller should be accountable for complying with measures that give effect to the principles stated above.
OECD Privacy Principles
! Openness
There should be a general policy of openness about developments, practices, and policies with respect to personal data.
! Individual Participation
An individual should have the right:
– to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him;
– to have communicated to him, data relating to him
• within a reasonable time,
• at a charge, if any, that is not excessive,
• in a reasonable manner, and
• in a form that is readily intelligible to him;
– to be given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to challenge such denial; and
– to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended.
Laws Throughout the World
Privacy laws & regulations vary widely throughout the world
! European Data Protection Directive
– privacy = fundamental human right
– requires all European Union countries to adopt similar comprehensive privacy laws
– revision in process (higher fines are foreseen)
! The US relies on self-regulation and some sector-specific laws, with relatively minimal protections
– Constitutional law governs the rights of individuals w.r.t. the government
– Tort (Schadenersatz) law governs disputes between private individuals or other private persons
– The Federal Trade Commission (FTC) deals with consumer protection
• HIPAA (Health Insurance Portability and Accountability Act, 1996)
• COPPA (Children’s Online Privacy Protection Act, 1998)
• GLB (Gramm-Leach-Bliley Act, 1999)
→ over 7,000 state legislations & enforcement activity/class actions
! Australia: Privacy Act 1988, Amendment 2012
– Notify users about collection; notice if data are sent overseas; stronger access rights
In most countries, fines are minimal.
More information: http://www.informationshield.com/intprivacylaws.html
The Worst Data Breaches of 2011
535 breaches during 2011, involving 30.4 million sensitive records
(2013: 601 breaches recorded, involving 55 million records)
! Sony – after over a dozen data breaches, Sony faces class action lawsuits over its failure to protect over 100 million user records.
! Epsilon – 60 million customer email addresses
! Sutter Physicians Services – a stolen desktop contained 3.3 million patients’ medical details including name, address, phone number, email address & health insurance plan.
! Tricare and SAIC – backup tapes were stolen from the car of a Tricare employee; led to a $4.9 billion lawsuit being filed.
Privacy Rights Clearinghouse (PRC) (https://www.privacyrights.org/top-data-breach-list-2011)
And the trend continues :-(
→ Importance of strict privacy & security policies (incl. data retention)
→ Avoid “breaches” simply by properly encrypting sensitive data or, better, using privacy enhancing technologies...
What can we do? ..... Privacy by design!
of course there are limits...
! tracing is so easy– each piece of hardware is quite unique
– log files everywhere
! …. but that's not the point!– it's not about NSA et al.– active vs. passive “adversaries”
..... still, privacy by design!
Our Vision
In the Information Society, users can act and interact in a safe and secure way while retaining control of their private spheres.
PETs Can Help!
Privacy, identity, and trust management built in everywhere!
! Network layer anonymity
– ... in mobile phone networks
– ... in the Future Internet as currently discussed
– ... access points for ID cards
! Identification layer
– Access control & authorization
! Application layer
– “Standard” e-commerce
– Specific apps, e.g., eVoting, OT, PIR, .....
– Web 2.0, e.g., Facebook, Twitter, wikis, ....
RFID Tags & Tracking
Hidden Tracking
RFID tags
– Groceries, clothes, books
– Passports
– also Wi-Fi, Bluetooth, …
Problems
– Covert reading by radio
– Unique identity
Attacks
– Tracking and targeting, e.g., London: “the bins are watching you”
– Blacklisting, e.g., books
– Smart bomb, smart robbing (cash)
www.v3.co.uk/2288299
RFID tags basics
! RFID (radio frequency identification): read the identity of a tag by radio
– many different methods to communicate with the reader
– e.g., tree walking when many tags are present
! Tag
– passive, draws energy from the field, sends by field modulation
– chip (low number of gates & low power), antenna, packaging
! Successor of barcodes: wireless → significant improvements
– no line-of-sight needed, can store (and “process”) information
– multiple tags can be read concurrently
Use Cases
! Tracking and tracing
– luggage, products, livestock
– supply chain management
– inventory control
– sports timing
! Counterfeit detection
– drugs, mechanical parts, etc.
! Access control
– buildings
– EZ pass, ticketing
– credit cards
– passports & identity cards
Characteristics
Tag properties define the lifetime and the kind of data that can be stored and processed.
Source: Gildas Avoine, Privacy Challenges in RFID, 2011
Tag characteristics
! supply chain
! access control
Potential Privacy Issues
! Unique ID for all objects (worldwide)
! RFID tags carry sensitive information
! Users might not be aware of the tags they carry
– medication, clothes, credit cards, etc.
! Tags can be read from a distance
– largely depends on the signal strength of the reader
! Even if the contents are secure, location tracking is still possible
! Massive data aggregation, tracking and profiling
– e.g., libraries
! Altering of stored data
Privacy Implications
Information obtainable from reading tags, e.g.:
! Tag present → human present
! Determining the origin of a human
– e.g., ID cards, credit cards, library books, clothes, etc.
! Pure tracking (single tag, combination of tags)
! Hot-listing
– categorizing users of certain products, books, etc.
RFID Protection Mechanisms
Different Categories of Remedies
! Destroying tags
– specific commands
– physically
! Preventing tags from communicating with readers
– blocking
– implementation of access control schemes
! Better protocols, use of crypto
– difficult due to the very limited resources of tags
– lightweight crypto as a new research field
– symmetric crypto only → key management issues
Destroying tags
User-space solutions
! Faraday cage
! RFID zapper – destroys the tag
– emits a strong field, frying the tag
Industry-provided solutions
! Kill command, deactivates the chip
– needs password protection
– all-or-nothing privacy protection
– potential for misuse
• vandalism
• covert re-enabling
! Clipping tags
– consumer choice
– visual feedback on state
– may be read later via close (mechanical?) contact
Some simple protocols
Better Protocols and Use of Crypto
Restrictions
! Powered only when in range of a reader
– extremely limited time to perform computations
– pre-computation impossible when the tag is out of range
! Extremely few gates – as few as 5'000 are left for extra tasks
– even symmetric crypto impossible (encryption, hash functions, PRFs)
– maybe not even enough for a cryptographic PRNG
– some feedback shift-register generators work → no cryptographic security
→ only simple password comparison and XOR work
! Collision avoidance protocols require a (separate) identifier
→ must not use a static identifier, but true randomness is not available
! Any solution uses at best symmetric crypto, and will require (complex) password/key management
Goals
! Authenticate the reader towards the chip
! Using only very simple primitives
– XOR, maybe a hash function, maybe a PRF/PRNG
! Attacker model
– eavesdropping on communication (including replay attacks)
– opening tags
– cloning tags
– initiating own communication with tags
– tracking
Simple Authentication (Molnar/Wagner)
Assumptions:
! Channel reader → tag is eavesdroppable; channel tag → reader is secure.
! Shared key s with all tags (library scenario): BIG DRAWBACK (unrealistic)
Goal: authentication of the command, maintain privacy of tags
Protocol (reader and tag share the secret s):
1. Reader → Tag: Hello
2. Tag: chooses a random nonce r
3. Tag → Reader: r (over the secure tag-to-reader channel)
4. Reader: computes p = r ⊕ s
5. Reader → Tag: cmd, p
6. Tag: verifies p, i.e., checks that p ⊕ r = s
Analysis:
– security against passive eavesdropping on the reader-to-tag link
– adversary cannot replay protocol messages (fresh nonce r)
– the nonce cannot be used to distinguish different tags
– but requires a good source of randomness on the tag (open problem!)
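The XOR step of this scheme fits in a few lines of Python (a hypothetical toy model for illustration, not a real tag implementation; all names are made up):

```python
import secrets

# Toy model of the Molnar/Wagner-style scheme above. All tags share the
# secret s; the nonce r travels over the (assumed secure) tag-to-reader
# channel, so an eavesdropper on the reader-to-tag link sees only (cmd, p).

s = secrets.randbits(64)              # system-wide shared secret

def tag_nonce():
    # The tag needs a good source of randomness (open problem on real tags!)
    return secrets.randbits(64)

def reader_respond(r, cmd):
    # Reader authenticates cmd by masking the secret with the tag's nonce
    return cmd, r ^ s

def tag_verify(r, p):
    # Tag accepts iff unmasking p with its nonce yields the shared secret
    return p ^ r == s

r = tag_nonce()
cmd, p = reader_respond(r, "read")
assert tag_verify(r, p)               # honest reader is accepted
assert not tag_verify(r ^ 1, p)       # a stale p fails under a changed nonce
```

Since p is only valid for one nonce, replaying it under a fresh nonce fails; but anyone who extracts s from a single tag breaks the whole system.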
Basic PRF Private Authentication Scheme
Assumption: the channel is completely eavesdroppable, but the tag can evaluate a PRF f
Protocol (reader and tag share a secret s per tag):
1. Reader: chooses random r1
2. Reader → Tag: Hello, r1
3. Tag: chooses random r2, computes p1 = fs(0, r1, r2)
4. Tag → Reader: r2, p1
5. Reader: finds s such that p1 verifies, computes p2 = fs(1, r1, r2)
6. Reader → Tag: cmd, p2
7. Tag: verifies p2
Analysis
! no system-wide secret, but a secret per tag
! provides privacy for tags, secure against a passive eavesdropper
! linear work on the reader (can you improve that?)
– improved solutions require tags to share keys → reduces privacy ....
! Note: fs(i, r1, r2) can also be used to encrypt the i-th message with XOR
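A toy sketch of this scheme, with HMAC-SHA256 standing in for the PRF fs (an assumption for illustration; a real tag could not afford a full hash function). It shows the linear scan that makes the reader's work O(N):

```python
import hmac, hashlib, secrets

# Toy sketch of the basic PRF private-authentication scheme above
# (hypothetical illustration, not a real tag implementation). Each tag
# holds its own secret s; the reader stores all tag secrets and
# identifies the responding tag by linear search.

def f(s, i, r1, r2):
    return hmac.new(s, b"%d|%d|%d" % (i, r1, r2), hashlib.sha256).digest()

tag_keys = [secrets.token_bytes(16) for _ in range(100)]   # reader's database
s = tag_keys[42]                                           # this tag's secret

r1 = secrets.randbits(64)            # Reader -> Tag: Hello, r1
r2 = secrets.randbits(64)            # Tag: fresh nonce
p1 = f(s, 0, r1, r2)                 # Tag -> Reader: r2, p1

# Reader: linear scan over all tag keys -- the O(N) bottleneck
idx = next(i for i, k in enumerate(tag_keys)
           if hmac.compare_digest(f(k, 0, r1, r2), p1))
assert idx == 42                     # tag identified

p2 = f(tag_keys[idx], 1, r1, r2)     # Reader -> Tag: cmd, p2
assert hmac.compare_digest(p2, f(s, 1, r1, r2))   # tag accepts the reader
```

An eavesdropper who does not know s sees only nonces and PRF outputs, which look random and are fresh in every run, so tags cannot be linked.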
Sublinear Protocols (Cheon-Hong-Tsudik)
Idea:
! choose two sets of keys K1 and K2, each of size n = sqrt(N)
! assign a unique pair (k1, k2) ∈ K1 × K2 to each tag
Protocol (reader and tag share the pair (k1, k2)):
1. Reader: chooses random r1
2. Reader → Tag: Hello, r1
3. Tag: chooses random r2, computes p = fk1(r1, r2) ⊕ fk2(r1, r2)
4. Tag → Reader: r2, p
5. Reader: computes ui = fK1(i)(r1, r2) for i = 1, ..., n and vj = p ⊕ fK2(j)(r1, r2) for j = 1, ..., n, and finds (i, j) such that ui = vj
Analysis:
! Less work for the reader!
! But privacy decreases if too many tags are compromised (compromised keys leak information about other tags); also no forward privacy
! Extension: add a third key for authentication (use the above protocol for identification)
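The square-root trick can be sketched as follows (a hypothetical toy, with HMAC-SHA256 standing in for the PRF f): the reader evaluates 2n PRFs instead of N and matches the two lists.

```python
import hmac, hashlib, secrets

# Toy sketch of the sublinear scheme above (hypothetical illustration).
# With N = n^2 tags, the reader stores two key sets K1, K2 of size n and
# each tag holds a unique pair (k1, k2).

def f(k, r1, r2):
    return hmac.new(k, b"%d|%d" % (r1, r2), hashlib.sha256).digest()

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

n = 16                                   # n = sqrt(N), i.e., N = 256 tags
K1 = [secrets.token_bytes(16) for _ in range(n)]
K2 = [secrets.token_bytes(16) for _ in range(n)]
k1, k2 = K1[3], K2[7]                    # this tag's unique key pair

r1 = secrets.randbits(64)                # Reader -> Tag: Hello, r1
r2 = secrets.randbits(64)                # Tag: fresh nonce
p = xor(f(k1, r1, r2), f(k2, r1, r2))    # Tag -> Reader: r2, p

# Reader: 2n PRF evaluations instead of N
u = {f(k, r1, r2): i for i, k in enumerate(K1)}      # the u_i values
match = next((u[xor(p, f(k, r1, r2))], j)            # find u_i == v_j
             for j, k in enumerate(K2)
             if xor(p, f(k, r1, r2)) in u)
assert match == (3, 7)                   # tag identified with O(sqrt(N)) work
```

The price of the speed-up is visible in the setup: keys are shared across tags, so opening a few tags reveals key material that helps link other tags.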
Conclusion RFID
! RFID tags are spreading
! Tracking of equipment is tightly linked to tracking people (hidden!)
– the same or similar issues also exist for other devices (Wi-Fi, Bluetooth, NFC), but are easier to mitigate, at least in theory
! Privacy poorly addressed, proposed protocols still lacking
– hopefully more powerful tags in the future
– key management for large-scale systems might prevent any privacy
Excellent RFID literature collection: www.avoine.net/rfid
Anonymous Communication
Anonymous Communication
Today: Internet = postcard, sometimes encrypted (SSL, S/MIME, ...)
However, communication can be linked by IP addresses and messages!
Knowing sender and receiver might
– still reveal the content of a message
– be dangerous to sender or receiver
– defeat the purpose of the application (e.g., voting ...)
Anonymous Communication – Approaches
! Encrypt messages between routers, but the routers still learn the addresses
! Hide addresses from the routers – how?
Limitations
! If there are only Alice and Bob and all channels can be observed → no anonymity
– Assume the adversary is limited
– Assume enough messages travel (caching strategies)
A simple and elegant solution
Dining Cryptographers: Inputs
Three parties; one has input bit 1, the other two have no message (input 0).
Dining Cryptographers: Flip Pairwise Coins
Assume we have secure point-to-point channels.
Each pair of parties flips a shared coin; in the example, the three pairwise coins are 0, 1, and 0.
Dining Cryptographers: Announce Results
Each party announces the XOR of its two shared coins and its input bit:
– the sender (input 1, coins 0 and 0) announces 1 = 0 ⊕ 0 ⊕ 1
– the second party (coins 0 and 1) announces 1 = 0 ⊕ 1
– the third party (coins 1 and 0) announces 1 = 0 ⊕ 1
Dining Cryptographers: Compute Sent Msg
Everyone XORs all announcements: 1 ⊕ 1 ⊕ 1 = 1. Each coin appears in exactly two announcements and cancels, so the result is the sent bit 1, while nobody can tell who sent it.
Dining Cryptographers: Discussion
Security:
! Information-theoretic anonymity!
! Can easily add encryption (orthogonal)
Limitations:
! Requires a broadcast channel
! Rather inefficient for the whole internet (everyone would have to participate)
– Hierarchical schemes have been proposed
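The round described above can be condensed into a short simulation (a toy sketch; real DC-nets additionally need collision handling and an authenticated broadcast channel):

```python
import secrets

# Toy simulation of a dining-cryptographers round. Parties sit in a
# ring; adjacent parties share a coin. Each party announces the XOR of
# its two coins and its input bit; XORing all announcements recovers the
# message, while the coins hide who sent it.

def dc_round(inputs):
    n = len(inputs)
    # coin[i] is the coin shared between party i and party (i+1) mod n
    coin = [secrets.randbits(1) for _ in range(n)]
    announce = [coin[i - 1] ^ coin[i] ^ inputs[i] for i in range(n)]
    result = 0
    for a in announce:
        result ^= a       # every coin appears twice, so the coins cancel
    return result

assert dc_round([1, 0, 0]) == 1   # one party anonymously broadcasts a 1
assert dc_round([0, 0, 0]) == 0   # nobody sends
```

If two parties send in the same round, the bits collide (1 ⊕ 1 = 0), which is why practical schemes need slot reservation or retransmission.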
Trusted Third Party solutions
Anonymous Communication – Using a Proxy
All traffic is routed via an Anonymity Service (AS) (e.g., anonymizer.com, VPNs)
! Proxy and encryption (e.g., SSL)
– Send the web request via SSL to the AS; the AS fetches and provides the page
– Trust only the AS, not all the routers anymore
– The AS learns sender & receiver and the contents
! Using encryption (e.g., SSL to Bob with a VPN to the AS), so essentially
– Alice sends c1 = EncAS(Bob, EncBob(m)) to the AS
– The AS decrypts c1 and forwards EncBob(m) to Bob
– The AS learns sender & receiver and is a single point of failure
Mix Networks – To Weaken Trust in the AS
Route over several anonymity services:
! Alice selects a route she likes
! Alice sends to the first router (A1): c1 = EncA1(A2, EncA2(A3, EncA3(Bob, EncBob(m))))
! Each AS peels off a layer and forwards
! Still, there needs to be more than one message in the system at the same time
Properties:
! Need to trust only one AS to hide the address
! Again: if Bob does not use encryption, the end server learns the content
NOTE: SECURITY AGAINST CHOSEN CIPHERTEXT ATTACKS IS NEEDED!!!
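The layered wrapping and peeling can be sketched as follows. This is a toy model: a hash-based XOR stream cipher stands in for real public-key encryption (all names and keys are made up), so it illustrates only the structure; in particular the toy cipher is malleable, which underlines the chosen-ciphertext warning above.

```python
import hashlib, json

# Toy mix-network layering: Alice wraps the message in one layer per
# router; each router strips one layer and learns only the next hop.

def keystream(key, n):
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def enc(key, data):                     # XOR stream: enc and dec coincide
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def wrap(route, keys, m):
    """Alice builds Enc_A1(A2, Enc_A2(A3, Enc_A3(Bob, Enc_Bob(m))))."""
    onion = enc(keys[-1], m)            # innermost layer, for Bob
    for name, key in zip(route[::-1], keys[-2::-1]):
        onion = enc(key, json.dumps([name, onion.hex()]).encode())
    return onion

def peel(key, onion):
    """A router strips one layer and learns only the next hop."""
    next_hop, inner = json.loads(enc(key, onion))
    return next_hop, bytes.fromhex(inner)

keys = {n: n.encode() * 4 for n in ["A1", "A2", "A3", "Bob"]}
hop = wrap(["A2", "A3", "Bob"], list(keys.values()), b"hello")
for router, nxt in [("A1", "A2"), ("A2", "A3"), ("A3", "Bob")]:
    nxt_seen, hop = peel(keys[router], hop)
    assert nxt_seen == nxt              # router sees only the next hop
assert enc(keys["Bob"], hop) == b"hello"
```

Note that any single honest router on the path suffices to break the link between Alice's address and Bob's, matching the trust property above.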
Mix Networks – Variations
! Free path (onion routing) vs. fixed path (mixes)
! Provable vs. “take what you get”
– dropping, inserting, replaying, …
! Fixed batches vs. real-time mixing strategies
Onion Routing
Onion Routing (Free Routes), e.g., TOR
1. Each user chooses their own route
2. Each server peels off a layer of encryption and discovers whom to send to next
3. Problems:
– how to mix
– the encryption should not reveal the number of (current) layers
Onion Routing
Problem: the size of the ciphertexts depends on the number of remaining routers
EncA1(A2, EncA2(A3, EncA3(Bob, EncBob(m))))
– need to add the name of the next recipient
– and typical encryption schemes expand the length (cf. ElGamal, where an encryption of one group element results in a ciphertext of two group elements)
Onion Routing
Solution:
– Notation:
• {m}k denotes the symmetric encryption of m under key k
• {e}k^-1 denotes the symmetric decryption of e under key k
– Assume the sender shares symmetric keys ki with each router:
O1 = {{{{m}k4}k3}k2}k1, {{{Bob}k3}k2}k1, {{A3}k2}k1, {A2}k1
– Receiving O1, router A1 will compute:
O'1 = {{{m}k4}k3}k2, {{Bob}k3}k2, {A3}k2, A2
– A1 needs to grow the onion again: choose a random R and compute
O2 = {{{m}k4}k3}k2, R, {{Bob}k3}k2, {A3}k2
– A1 sends O2 to A2
Onion Routing
! Receiving O2 = {{{m}k4}k3}k2, R, {{Bob}k3}k2, {A3}k2, A2 computes
O'2 = {{m}k4}k3, {R}k2^-1, {Bob}k3, A3
! A2 needs to regrow the onion: choose a random R' and compute
O3 = {{m}k4}k3, R', {R}k2^-1, {Bob}k3
! And so on. At some point Bob will receive:
O4 = {m}k4, R'', {R'}k3^-1, {{R}k2^-1}k3^-1
He computes O'4 = m, {R''}k4^-1, {{R'}k3^-1}k4^-1, {{{R}k2^-1}k3^-1}k4^-1
and, as {{{R}k2^-1}k3^-1}k4^-1 does not match any name, outputs m
Onion Routing
! How are the keys distributed?
O1 = {{{{m}k4}k3}k2}k1, {{{Bob}k3}k2}k1, {{A3}k2}k1, {A2}k1
Replace, e.g., A3 by (C3, A3), where C3 is an encryption of k3 under A3's public key:
→ O1 = {{{{m}k4}k3}k2}k1, {{{C4, Bob}k3}k2}k1, {{C3, A3}k2}k1, {C2, A2}k1
! What about chosen-ciphertext security? I.e., the ciphertext must not be malleable!
– To make the ciphertext non-malleable, include a hash of the onion as a label in the public-key encryption
– → Routers can then no longer generate R' at random
– Instead, choose it pseudorandomly, e.g., R' = {0}k3^-1
O3 = {{m}k4}k3, {0}k3^-1, {R}k2^-1, {C4, Bob}k3
Onion Routing
! So onion O4 is computed as follows:
O4 = {m}k4, {0}k3^-1, {{0}k2^-1}k3^-1, {{{0}k1^-1}k2^-1}k3^-1, (C4, Bob),
where C4 = EncPKBob(k4, Hash(L4)) and L4 denotes the label (the rest of the onion)
! So onion O3 is computed as follows:
O3 = {{m}k4}k3, {{{0}k2^-1}k3^-1}k3, {{{{0}k1^-1}k2^-1}k3^-1}k3, {C4, Bob}k3, (C3, A3)
   = {{m}k4}k3, {0}k2^-1, {{0}k1^-1}k2^-1, {C4, Bob}k3, (C3, A3),
where C3 = EncPK3(k3, Hash(L3))
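The collapse from the first to the second form of O3 relies only on the identity {{x}k^-1}k = x. With a toy XOR stream cipher, where encryption and decryption are the same operation (hypothetical code, purely for checking the algebra), this is easy to verify:

```python
import hashlib

# Toy check of the onion algebra: with an XOR stream cipher, {x}k and
# {x}k^-1 are the same XOR, so encrypting and then decrypting under the
# same key (or vice versa) cancels out. Hence {{{0}k2^-1}k3^-1}k3
# collapses to {0}k2^-1, as used above.

def keystream(key, n):
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor_with(key, data):      # implements both {x}k and {x}k^-1
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

k2, k3 = b"key-2", b"key-3"
zero = bytes(16)
filler = xor_with(k2, zero)                           # {0}k2^-1
assert xor_with(k3, xor_with(k3, filler)) == filler   # k3 layers cancel
assert xor_with(k2, filler) == zero                   # and {filler}k2 = 0
```

This is exactly why the sender can precompute every router's pseudorandom filler blocks when building O1.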
Onion Routing
! So onion O2 is computed as follows:
O2 = {{{m}k4}k3}k2, {0}k1^-1, {{C4, Bob}k3}k2, {(C3, A3)}k2, (C2, A2),
where C2 = EncPK2(k2, Hash(L2))
! So onion O1 is computed as follows:
O1 = {{{{m}k4}k3}k2}k1, {{{C4, Bob}k3}k2}k1, {{(C3, A3)}k2}k1, {(C2, A2)}k1, (C1, A1),
where C1 = EncPK1(k1, Hash(L1))
Technologies to Protect ePrivacy
Lecture 2 – Anonymous Credentials
Jan Camenisch, IBM Research – Zurich
@jancamenisch – www.camenisch.org/eprivacy
Privacy at the Authentication Layer
Authentication without identification
What is an identity & identity management?
[Figure: a cloud of identity attributes – name, salary, credit card number, hobbies, phone number, address, language skills, nickname, blood group, marital status, birth date, health status – grouped into spheres such as leisure, shopping, work, public authority, health care, insurance]
! ID: set of attributes shared with someone
– attributes are not static: user & party can add to them
! ID management: two things make an ID useful
– authentication means
– means to transport attributes between parties
Let's see a scenario
Alice wants to watch a movie at Mplex
Alice
Movie Streaming Service
I wish to see Alice in Wonderland
Alice wants to watch a movie at Mplex
Alice
Movie Streaming Service
You need:
– a subscription
– to be older than 12
Watching the movie with the traditional solution
Alice
Movie Streaming Service
OK, here's
– my eID
– my subscription
Watching the movie with the traditional solution
Alice
Movie Streaming Service
Aha, you are
– Alice Doe
– born on Dec 12, 1975
– 7 Waterdrive
– CH 8003 Zurich
– married
– expires Aug 4, 2018
Mplex Customer
– #1029347
– Premium Subscription
– expires Jan 13, 2016
This is a privacy and security problem!
– identity theft
– profiling
– discrimination
Watching the movie with the traditional solution
Alice
Movie Streaming Service
With OpenID and similar solutions, e.g., log-in with Facebook
Aha, Alice is watching a 12+ movie
Aha, you are
– [email protected]
– born on Dec 12, 1975
– Alice's friends are ....
– Alice's public profile is ...
Mplex Customer
– #1029347
– Premium Subscription
– expires Jan 13, 2016
Identity Mixer solves this.
When Alice authenticates to the Movie Streaming Service with Identity Mixer, all the service learns is that Alice
– has a subscription
– is older than 12
and no more.
Like PKI, but better:
! One secret Identity (secret key)
! Many Public Pseudonyms (public keys)
Privacy-protecting authentication with IBM Identity Mixer
Like PKI, but better:
! Issuing a credential
Privacy-protecting authentication with IBM Identity Mixer
Name = Alice Doe
Birth date = April 3, 1997
Privacy-protecting authentication with Privacy ABCs
Alice
Movie Streaming Service
Privacy-protecting authentication with IBM Identity Mixer
Alice
I wish to see Alice in Wonderland
You need:
– a subscription
– to be older than 12
Movie Streaming Service
Like PKI
! but does not send the credential
! only minimal disclosure
Privacy-protecting authentication with IBM Identity Mixer
Alice
Movie Streaming Service
- valid subscription
- eID with age ≥ 12
Privacy-protecting authentication with IBM Identity Mixer
Alice
Aha, you are
– older than 12
– have a subscription
Movie Streaming Service
Like PKI
! but does not send the credential
! only minimal disclosure (plus the public verification key of the issuer)
Advantages of Identity Mixer
! For users: privacy
– minimizing disclosure of personal data
– keeping their identities safe
– pseudonymous/anonymous access
! For service providers: security, accountability, and compliance
– avoiding the risk of losing personal data or having it stolen
– compliance with legislation (access control rules, personal data protection)
– strong authentication (cryptographic proofs replace usernames/passwords)
– user identification if required (under certain circumstances)
Demo
Try yourself at www.ibm.biz/identitymixer on Privacy Day (January 28)
Further Concepts
TTP
Inspector parameters
Inspection grounds
• If the car is damaged: the ID needs to be retrievable by the insurance or government
• Similarly: verifiably encrypt any certified attribute (optional)
• The TTP is offline & can be distributed to lessen trust
Concept – Inspection
• If Alice was speeding, her license needs to be revoked!
• There are many different use cases and many solutions
• Variants of CRLs work (using crypto to maintain anonymity)
• Accumulators
• Signing entries & proofs, ....
• Limited validity – certs need to be updated
• ... For proving age, a revoked driver's license still works
Revocation authority parameters (public key)
Revocation info
Concept – Revocation
Concept – Usage Limitation
Degree of anonymity can be limited:
! If Alice and Eve are on-line at the same time, they are caught!
! Use Limitation – anonymous until:– If Alice used certs > 100 times total... – ... or > 10'000 times with Bob
! Alice's cert can be bound to hardware token (e.g., TPM)
© 2014 IBM Corporation99 July 8, 2015
A couple of use cases
© 2014 IBM Corporation100
Age verification
! Movie streaming services
! Gaming industry
! Online gambling platforms
! Dating websites
! Social benefits for young/old people
Proving 12+, 18+, 21+ without disclosing the exact date of birth – privacy and compliance with age-related legislation
© 2014 IBM Corporation101
Healthcare
! Anonymous access to patients' records– accessing medical test results
! Anonymous consultations with specialists– online chat with a psychologist – online consultation with IBM Watson
! Eligibility for the premium health insurance – proving that the body mass index (BMI) is in a certain range without disclosing the exact weight, height, or BMI
Anonymous treatment of patients (while enabling access control and payments)
© 2014 IBM Corporation102
Subscriptions, membership
! Patent databases
! DNA databases
! News/Journals/Magazines
! Transportation: tickets, toll roads
! Loyalty programs
Who accesses which data at which time can reveal sensitive information about the users (their research strategy, location, habits, etc.)
© 2014 IBM Corporation103
Polls, recommendation platforms
! Online polls – applying different restrictions on the poll participants: location, citizenship
! Rating and feedback platforms– anonymous feedback for a course only from the students who attended it– wikis– recommendation platforms
Providing anonymous, but at the same time legitimate feedback
© 2014 IBM Corporation104 July 8, 2015
Towards Realizing Anonymous Creds
© 2014 IBM Corporation
A Software Stack View on Identity Mixer
[Architecture diagram] Three layers on both the user's and the verifier's side:
– application layer: resource request; AC & app logic
– policy layer: presentation policy / presentation token; the user's Wallet with a policy–credential matcher, credential manager, store, and evidence generation orchestration; the verifier with a policy–token matcher, token manager, store, and evidence verification orchestration
– crypto layer: building blocks such as signatures (Sig), encryption (Enc), commitments (Com), zero-knowledge proofs (ZKP), ...
© 2014 IBM Corporation106 July 8, 2015
The Policy Layer – An Example: Presentation Policy

The verifier sends a presentation policy to the user; the user answers with a presentation token.
Example policy content:
– Terms and Conditions
– credential specification: https://movies.....com/specifications/voucher
– issuer parameters: https://movies....com/parameters/voucher
– timestamp: 2014-06-17T14:06:00Z
© 2014 IBM Corporation107 July 8, 2015
Privacy-protecting authentication with Privacy ABCs
Alice
Movie Streaming Service
© 2014 IBM CorporationJuly 8, 2015
Required Technologies

! Encryption schemes
! Signature schemes
! Commitment schemes
! Zero-knowledge proofs

..... the challenge is to do all this efficiently!
© 2014 IBM Corporation109 July 8, 2015
zero-knowledge proofs
© 2014 IBM Corporation110 July 8, 2015
Zero-Knowledge Proofs
! interactive proof between a prover and a verifier about the prover's knowledge
! properties:
– completeness: if the prover knows the secret, she can always convince the verifier
– soundness (proof of knowledge): the prover can convince the verifier only if she knows the secret
– zero-knowledge: the verifier learns nothing about the prover's secret

Protocol moves: Commitment, Challenge, Response
© 2014 IBM Corporation111 July 8, 2015
Zero-Knowledge Proofs of Knowledge of Discrete Logarithms

Given a group ⟨g⟩ of prime order q and an element y ∈ ⟨g⟩.
The prover wants to convince the verifier that she knows x s.t. y = g^x, such that the verifier only learns y and g.

Prover (knows x): choose random r, send t := g^r
Verifier: send random challenge c
Prover: send s := r - cx mod q
Verifier: check t = g^s y^c ?

notation: PK{(α): y = g^α }
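The protocol above can be sketched in a few lines of Python. The parameters below are a toy instance I picked for illustration (p = 2q+1 with g generating the subgroup of squares); real deployments use groups of cryptographic size.

```python
import secrets

# Toy Schnorr parameters (illustrative only): p = 2q + 1, and g = 4 generates
# the subgroup of squares mod p, which has prime order q.
p, q, g = 2039, 1019, 4

x = secrets.randbelow(q)            # prover's secret
y = pow(g, x, p)                    # public value y = g^x mod p

# Commitment: prover sends t = g^r
r = secrets.randbelow(q)
t = pow(g, r, p)
# Challenge: verifier sends random c
c = secrets.randbelow(q)
# Response: prover sends s = r - cx mod q
s = (r - c * x) % q
# Verification: g^s * y^c = g^(r - cx) * g^(cx) = g^r = t
assert t == pow(g, s, p) * pow(y, c, p) % p
```

Note how the verification equation only uses the public values y, g, t, c, s, matching the claim that the verifier learns nothing beyond y and g.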
© 2014 IBM CorporationJuly 8, 2015
Zero Knowledge Proofs

Proof of knowledge: if a prover can successfully convince a verifier, then the secret needs to be extractable.

The prover might do the protocol computation in any way it wants, and we cannot analyse its code. Thought experiment:
! Assume we have the prover as a black box → we can reset and rerun the prover
! Need to show how the secret can be extracted via the protocol interface

Rerunning the prover with the same commitment t but two different challenges c, c' yields responses s, s' with:
t = g^s y^c = g^s' y^c'  →  y^(c'-c) = g^(s-s')
→ y = g^((s-s')/(c'-c))
→ x = (s-s')/(c'-c) mod q
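The extraction formula can be checked concretely with the same toy parameters as before (a hypothetical sketch, not the lecture's code): "rewinding" the black-box prover simply means reusing the same commitment randomness r with two challenges.

```python
import secrets

# Toy parameters as before (illustrative only)
p, q, g = 2039, 1019, 4
x = secrets.randbelow(q)
y = pow(g, x, p)

# "Rewind": same commitment randomness r, two distinct challenges
r = secrets.randbelow(q)
t = pow(g, r, p)
c1, c2 = 17, 923                    # any two distinct challenges
s1 = (r - c1 * x) % q
s2 = (r - c2 * x) % q

# Extractor: x = (s1 - s2) / (c2 - c1) mod q, exactly as on the slide
x_extracted = (s1 - s2) * pow(c2 - c1, -1, q) % q
assert x_extracted == x
```

This is why soundness holds: any prover that answers two challenges for one commitment already "contains" x.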
© 2014 IBM CorporationJuly 8, 2015
Zero Knowledge Proofs: Security
Zero-knowledge property: the verifier does not learn anything (except the fact that Alice knows x = log_g y).

Idea: one can simulate whatever Bob "sees":
! choose random c', s' and compute t := g^s' y^c'
! send t; if the verifier's challenge c equals c', send s := s', otherwise restart

Problem: if the domain of c is too large, the success probability becomes too small.
© 2014 IBM Corporation114 July 8, 2015
Zero-Knowledge Proofs of Knowledge of Discrete Logarithms

One way to modify the protocol to get a large domain for c:

1. Verifier: choose random c, v and send h := H(c,v) (a commitment to the challenge)
2. Prover: choose random r and send t := g^r
3. Verifier: reveal (c, v)
4. Prover: check h = H(c,v) ? then send s := r - cx mod q
5. Verifier: check t = g^s y^c ?

notation: PK{(α): y = g^α }
© 2014 IBM CorporationJuly 8, 2015
Zero Knowledge Proofs: Security
One way to modify the protocol to get a large domain for c – simulating the modified protocol:

1. receive h from the verifier
2. choose random c', s', compute t' := g^s' y^c' and send t'
3. receive (c, v), then "reboot" the verifier to its state after sending h
4. now knowing c, choose random s, compute t := g^s y^c and send t
5. receive (c, v) again (the same c, since h commits to it) and send s
© 2014 IBM Corporation116 July 8, 2015
From Protocols To Signatures
Signature SPK{(α): y = g^α }(m):

Signing a message m:
– choose random r ∈ Zq and compute t := g^r
– compute c := H(g^r || m) = H(t || m) and s := r - cx mod q
– output (c, s)

Verifying a signature (c, s) on a message m:
– check c = H(g^s y^c || m) ?  ↔  t = g^s y^c ?

Security:
– the underlying protocol is a zero-knowledge proof of knowledge
– the hash function H(·) behaves as a "random oracle"
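The sign/verify pair above (the Fiat–Shamir transform of the Schnorr protocol) can be sketched as follows; parameters and the hash-to-Zq encoding are toy assumptions for illustration.

```python
import hashlib
import secrets

# Toy parameters (illustrative only)
p, q, g = 2039, 1019, 4
x = secrets.randbelow(q)            # signing key
y = pow(g, x, p)                    # verification key y = g^x

def H(t: int, m: str) -> int:
    """Hash t || m into Z_q; plays the role of the random oracle."""
    return int.from_bytes(hashlib.sha256(f"{t}|{m}".encode()).digest(), "big") % q

def sign(m: str):
    r = secrets.randbelow(q)
    c = H(pow(g, r, p), m)          # c := H(g^r || m)
    s = (r - c * x) % q             # s := r - cx mod q
    return c, s

def verify(m: str, c: int, s: int) -> bool:
    t = pow(g, s, p) * pow(y, c, p) % p   # recompute t = g^s y^c
    return c == H(t, m)

c, s = sign("hello")
assert verify("hello", c, s)
```

The verifier never sees r; it recomputes t from the public values and checks that the challenge was derived honestly.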
© 2014 IBM Corporation117 July 8, 2015
Zero Knowledge Proofs of Knowledge of Discrete Logarithms
Logical combinations:
PK{(α,β): y = g^α ∧ z = g^β ∧ u = g^β h^α }
PK{(α,β): y = g^α ∨ z = g^β }

Non-interactive (Fiat-Shamir heuristic, Schnorr signatures):
SPK{(α): y = g^α }(m)

Many exponents:
PK{(α,β,γ,δ): y = g^α h^β z^γ k^δ u^β }

Intervals and groups of different order (under SRSA):
PK{(α): y = g^α ∧ α ∈ [A,B] }
PK{(α): y = g^α ∧ z = ĝ^α ∧ α ∈ [0, min{ord(g), ord(ĝ)}] }
© 2014 IBM Corporation118 July 8, 2015
Some Example Proofs and Their Analysis
Let g, h, C1, C2, C3 be group elements.

Now, what does
PK{(α1,β1,α2,β2,α3,β3): C1 = g^α1 h^β1 ∧ C2 = g^α2 h^β2 ∧ C3 = g^α3 h^β3 ∧ C3 = g^α1 g^α2 h^β3 }
mean?

→ Prover knows values α1, β1, α2, β2, β3 such that
C1 = g^α1 h^β1 , C2 = g^α2 h^β2 and
C3 = g^α1 g^α2 h^β3 = g^(α1+α2) h^β3 = g^α3 h^β3
i.e., α3 = α1 + α2 (mod q)

And what about:
PK{(α1,...,β3): C1 = g^α1 h^β1 ∧ C2 = g^α2 h^β2 ∧ C3 = g^α3 h^β3 ∧ C3 = g^α1 (g^5)^α2 h^β3 }

→ C3 = g^α1 (g^5)^α2 h^β3 = g^(α1 + 5·α2) h^β3
i.e., α3 = α1 + 5·α2 (mod q)
© 2014 IBM Corporation119 July 8, 2015
Some Example Proofs and Their Analysis
Let g, h, C1, C2, C3 be group elements.

Now, what does
PK{(α1,..,β3): C1 = g^α1 h^β1 ∧ C2 = g^α2 h^β2 ∧ C3 = g^α3 h^β3 ∧ C3 = C2^α1 h^β3 }
mean?

→ Prover knows values α1, β1, α2, β2, β3 such that
C1 = g^α1 h^β1 , C2 = g^α2 h^β2 and
C3 = C2^α1 h^β3 = (g^α2 h^β2)^α1 h^β3 = g^(α2·α1) h^(β3+β2·α1)
so C3 = g^(α2·α1) h^(β3+β2·α1) = g^α3 h^β3'
i.e., α3 = α1 · α2 (mod q)

And what about
PK{(α1,β1,β2): C1 = g^α1 h^β1 ∧ C2 = g^α2 h^β2 ∧ C2 = C1^α1 h^β2 }

→ α2 = α1² (mod q)
© 2014 IBM Corporation120 July 8, 2015
Some Example Proofs and Their Analysis
Let g, h, C1, C2 be group elements.

Now, what does
PK{(α1,..,β2): C1 = g^α1 h^β1 ∧ C2 = g^α2 h^β2 ∧ g = (C2/C1)^α1 h^β2 }
mean?

→ Prover knows values α1, β1, β2 such that
C1 = g^α1 h^β1 and
g = (C2/C1)^α1 h^β2 = (C2 g^(-α1) h^(-β1))^α1 h^β2
→ g^(1/α1) = C2 g^(-α1) h^(-β1) h^(β2/α1)
→ C2 = g^α1 h^β1 h^(-β2/α1) g^(1/α1) = g^(α1 + 1/α1) h^(β1 - β2/α1)
With C2 = g^α2 h^β2 this gives
α2 = α1 + α1^(-1) (mod q)
© 2014 IBM Corporation121 July 8, 2015
signature schemes
© 2014 IBM Corporation122 July 8, 2015
RSA Signature Scheme – for reference
Rivest, Shamir, and Adleman, 1978

Secret key: two random primes p and q
Public key: n := pq, a prime e, and a collision-free hash function H: {0,1}* → {0,1}^ℓ

Computing a signature on a message m ∈ {0,1}*:
d := 1/e mod (p-1)(q-1)
s := H(m)^d mod n

Verification of a signature s on a message m ∈ {0,1}*:
s^e = H(m) (mod n)

Correctness: s^e = (H(m)^d)^e = H(m)^(d·e) = H(m) (mod n)
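The hash-then-sign scheme above can be sketched with toy primes (illustrative only; real RSA keys use large random primes, and hashing straight into Z_n is a simplification):

```python
import hashlib

# Toy RSA parameters (illustrative only)
p_, q_ = 61, 53
n = p_ * q_                         # n = pq = 3233
e = 17                              # public prime exponent
d = pow(e, -1, (p_ - 1) * (q_ - 1)) # d := 1/e mod (p-1)(q-1)

def Hn(m: str) -> int:
    """Toy stand-in for the slide's hash H, mapped into Z_n."""
    return int.from_bytes(hashlib.sha256(m.encode()).digest(), "big") % n

def sign(m: str) -> int:
    return pow(Hn(m), d, n)         # s := H(m)^d mod n

def verify(m: str, s: int) -> bool:
    return pow(s, e, n) == Hn(m)    # s^e =? H(m) (mod n)

assert verify("a message", sign("a message"))
```

Correctness is exactly the slide's identity: (H(m)^d)^e = H(m)^(d·e) = H(m) mod n.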
© 2014 IBM Corporation123 July 8, 2015
RSA Signature Scheme – for reference
Verification of a signature s on a message m ∈ {0,1}*: s^e = H(m) (mod n)

We would like to prove knowledge of a signature on a hidden message, e.g.,
PK{ (m,s): s^e = H(m) (mod n) }
But this is not a valid proof expression!!!! :-(
© 2014 IBM Corporation124 July 8, 2015
CL-Signature Scheme

Public key of signer: RSA modulus n and a_i, b, d ∈ QR_n
Secret key: factors of n

To sign k messages m1, ..., mk ∈ {0,1}^ℓ:
● choose a random prime e with 2^(ℓ+2) > e > 2^(ℓ+1) and a random integer s of about the length of n
● compute c:
  c = (d / (a1^m1 ··· ak^mk · b^s))^(1/e) mod n
● the signature is (c, e, s)
© 2014 IBM Corporation125 July 8, 2015
CL-Signature Scheme

To verify a signature (c, e, s) on messages m1, ..., mk:
● m1, ..., mk ∈ {0,1}^ℓ
● e > 2^(ℓ+1)
● d = c^e · a1^m1 ··· ak^mk · b^s mod n

Theorem: The signature scheme is secure against adaptively chosen message attacks under the Strong RSA assumption.
© 2014 IBM Corporation126 July 8, 2015
Proving Knowledge of a CL-signature

Recall: d = c^e a1^m1 a2^m2 b^s mod n

Observe:
! let c' = c·b^t mod n with randomly chosen t
! then d = c'^e a1^m1 a2^m2 b^(s-et) (mod n), i.e., (c', e, s* = s-et) is also a signature on m1 and m2

To prove knowledge of a signature (c', e, s*) on m2 and some hidden m1:
! provide c'
! PK{(ε, µ1, σ): d/a2^m2 = c'^ε a1^µ1 b^σ ∧ µ1 ∈ {0,1}^ℓ ∧ ε > 2^(ℓ+1) }
→ proves d = c'^ε a1^µ1 a2^m2 b^σ
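The verification equation and the randomization trick can be checked with toy numbers. Everything below (modulus, the chosen quadratic residues a1, a2, b, d, the prime e) is an illustrative stand-in; a real CL key uses a large special RSA modulus and the parameter ranges stated on the slide.

```python
import secrets

# Toy CL-style parameters (illustrative only)
p_, q_ = 61, 53
n = p_ * q_
phi = (p_ - 1) * (q_ - 1)
a1, a2, b, d = 4, 9, 25, 16         # public quadratic residues mod n (assumed)
m1, m2 = 5, 12                      # two messages
e = 37                              # the prime "e" of the signature

# Sign: c = (d / (a1^m1 a2^m2 b^s))^(1/e) mod n  (the signer uses phi, i.e.,
# the factorization of n, to compute the e-th root)
s = secrets.randbelow(n)
denom = pow(a1, m1, n) * pow(a2, m2, n) * pow(b, s, n) % n
c = pow(d * pow(denom, -1, n) % n, pow(e, -1, phi), n)

# Verify: d = c^e a1^m1 a2^m2 b^s mod n
assert d == pow(c, e, n) * denom % n

# Randomize: (c' = c b^t, e, s* = s - e t) is also a valid signature
t = secrets.randbelow(n)
c2 = c * pow(b, t, n) % n
s_star = s - e * t                  # may be negative; b is invertible mod n
denom2 = pow(a1, m1, n) * pow(a2, m2, n) * pow(b, s_star, n) % n
assert d == pow(c2, e, n) * denom2 % n
```

The second assertion is the point of the slide: the randomized triple verifies against the same public d, so revealing c' leaks nothing about the original c.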
© 2014 IBM CorporationJuly 8, 2015
Technologies to Protect ePrivacy
Lecture 3 – Anonymous Credentials II
Jan CamenischIBM Research – Zurich
@jancamenischwww.camenisch.org/eprivacy
© 2014 IBM Corporation128 July 8, 2015
Privacy-protecting authentication with Privacy ABCs
Alice
signature scheme
commitment scheme
zero-knowledge proofs
© 2014 IBM Corporation129 July 8, 2015
Zero Knowledge Proofs of Knowledge of Discrete Logarithms
Logical combinations:
PK{(α,β): y = g^α ∧ z = g^β ∧ u = g^β h^α }
PK{(α,β): y = g^α ∨ z = g^β }

Non-interactive (Fiat-Shamir heuristic, Schnorr signatures):
SPK{(α): y = g^α }(m)

Many exponents:
PK{(α,β,γ,δ): y = g^α h^β z^γ k^δ u^β }

Intervals and groups of different order (under SRSA):
PK{(α): y = g^α ∧ α ∈ [A,B] }
PK{(α): y = g^α ∧ z = ĝ^α ∧ α ∈ [0, min{ord(g), ord(ĝ)}] }
© 2014 IBM Corporation130 July 8, 2015
CL-Signature Scheme

Public key of signer: RSA modulus n and a_i, b, d ∈ QR_n
Secret key: factors of n

To sign k messages m1, ..., mk ∈ {0,1}^ℓ:
● choose a random prime e with 2^(ℓ+2) > e > 2^(ℓ+1) and a random integer s of about the length of n
● compute c:
  c = (d / (a1^m1 ··· ak^mk · b^s))^(1/e) mod n
● the signature is (c, e, s)
© 2014 IBM Corporation131 July 8, 2015
CL-Signature Scheme

To verify a signature (c, e, s) on messages m1, ..., mk:
● m1, ..., mk ∈ {0,1}^ℓ
● e > 2^(ℓ+1)
● d = c^e · a1^m1 ··· ak^mk · b^s mod n

Theorem: The signature scheme is secure against adaptively chosen message attacks under the Strong RSA assumption.
© 2014 IBM Corporation132 July 8, 2015
Proving Knowledge of a CL-signature

Recall: d = c^e a1^m1 a2^m2 b^s mod n

Observe:
! let c' = c·b^t mod n with randomly chosen t
! then d = c'^e a1^m1 a2^m2 b^(s-et) (mod n), i.e., (c', e, s* = s-et) is also a signature on m1 and m2

To prove knowledge of a signature (c', e, s*) on m2 and some hidden m1:
! provide c'
! PK{(ε, µ1, σ): d/a2^m2 = c'^ε a1^µ1 b^σ ∧ µ1 ∈ {0,1}^ℓ ∧ ε > 2^(ℓ+1) }
→ proves d = c'^ε a1^µ1 a2^m2 b^σ
© 2014 IBM Corporation133 July 8, 2015
commitment scheme
© 2014 IBM CorporationJuly 8, 2015
Commitment Scheme: Functionality

[Diagram] The committer locks a message m into a commitment (a box with combination lock, e.g., 2-36-17) and hands it over; later she opens it, and the verifier checks that the opened value is the committed m.
© 2014 IBM CorporationJuly 8, 2015
Commitment Scheme: Security

Binding: the committer cannot open a commitment to two different messages m ≠ m'; the verifier accepts only the originally committed m.
© 2014 IBM CorporationJuly 8, 2015
Commitment Scheme: Security

Hiding: for all messages m, m', a commitment to m is indistinguishable from a commitment to m'; before the opening, the verifier cannot tell which message is inside.
© 2014 IBM CorporationJuly 8, 2015
Commitment Schemes
Group G = ⟨g⟩ = ⟨h⟩ of prime order q

To commit to an element x ∈ Zq:
• Pedersen: perfectly hiding, computationally binding
  choose r ∈ Zq and compute c = g^x h^r
• ElGamal: computationally hiding, perfectly binding
  choose r ∈ Zq and compute c = (g^x h^r, g^r)

To open a commitment:
• reveal x and r to the verifier
• the verifier checks whether c = g^x h^r
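A Pedersen commit/open round trip can be sketched as follows. The toy group and the choice g = 4, h = 9 are assumptions for illustration; in a real setup nobody may know log_g h (in this toy group it is easily computable).

```python
import secrets

# Toy group (illustrative only): subgroup of squares mod p = 2q + 1
p, q = 2039, 1019
g, h = 4, 9                         # two generators; log_g h must be unknown in practice

def commit(x: int):
    """Pedersen commitment c = g^x h^r with fresh randomness r."""
    r = secrets.randbelow(q)
    return pow(g, x, p) * pow(h, r, p) % p, r

def open_ok(c: int, x: int, r: int) -> bool:
    """Verifier's check on an opened commitment."""
    return c == pow(g, x, p) * pow(h, r, p) % p

c, r = commit(42)
assert open_ok(c, 42, r)            # honest opening accepted
assert not open_ok(c, 43, r)        # wrong message rejected (same r)
```

Opening to a different message with the *same* r always fails; opening to a different message with a *different* r is what computational binding rules out, since (as the next slide shows) it would reveal log_g h.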
© 2014 IBM CorporationJuly 8, 2015
Pedersen's Commitment Scheme

Pedersen's scheme: choose r ∈ Zq and compute c = g^x h^r

Perfectly hiding:
Let c be a commitment and u = log_g h.
Thus c = g^x h^r = g^(x+ur) = g^((x+ur') + u(r-r')) = g^(x+ur') h^(r-r') for any r'.
I.e., for a given c and any x' there exists an r' such that c = g^x' h^r'.

Computationally binding:
Let c, (x', r') and (x, r) be such that c = g^x' h^r' = g^x h^r.
Then g^(x'-x) = h^(r-r'), and so u = log_g h = (x'-x)/(r-r') mod q, i.e., two different openings reveal the discrete logarithm of h.
© 2014 IBM CorporationJuly 8, 2015
Commitment Scheme: Extended Features

[Diagram]
! Proof of knowledge of contents: the committer proves knowledge of the message m inside a commitment.
! Proof of relations among contents: given commitments to m and m', prove e.g. that m = 2·m'.
© 2014 IBM CorporationJuly 8, 2015
Commitment Scheme: Extended Features

Let C = g^m h^r and C' = g^m' h^r' ; then:
! proof of knowledge of contents: PK{(α,β): C = g^β h^α }
! proof that m = 2·m': PK{(α,β,γ): C' = g^β h^α ⋀ C = (g^2)^β h^γ }
© 2014 IBM Corporation143 July 8, 2015
putting things together
© 2014 IBM Corporation144 July 8, 2015
Realizing Pseudonyms and Key Binding
! Let G = ⟨g⟩ = ⟨h⟩ be a group of prime order q
! User's secret key: random sk ∈ Zq
! To compute a pseudonym Nym:
– choose random r ∈ Zq
– compute Nym = g^sk h^r
© 2014 IBM Corporation145 July 8, 2015
Privacy-protecting authentication with Privacy ABCs

Concept: credentials – like PKI, but better:
! Issuing a credential with attributes, e.g.:
Name = Alice Doe
Birth date = April 3, 1997
© 2014 IBM Corporation146 July 8, 2015
Realizing Issuance of Credential
Recall: a signature (c,e,s) on messages m1, ..., mk satisfies
– m1, ..., mk ∈ {0,1}^ℓ
– e > 2^(ℓ+1)
– d = c^e a1^m1 ··· ak^mk b^s mod n

Problem: the pseudonym is not in the message space!
Solution: sign the secret key instead
→ d = c^e a1^sk · a2^m2 ··· ak^mk b^s mod n

New problem: how can we sign a secret message?
© 2014 IBM CorporationJuly 8, 2015
Realizing Issuance of Credential

Issuer's public key: n, a_i, b, d

User:
– compute C = a1^sk b^s' with random s'
– send C and name to the issuer
– prove PK{(µ1, σ'): C = a1^µ1 b^σ' }

Issuer:
– choose a random prime e and random s''
– compute c = (d / (C a2^name b^s''))^(1/e) mod n
– send (c, e, s'') to the user

Result: d = c^e a1^sk a2^name b^(s''+s') (mod n), i.e., the user holds a valid signature (c, e, s'+s'') on her secret key sk and the attribute name.
© 2014 IBM CorporationJuly 8, 2015
Realizing Issuance of Credential

Want to sign w.r.t. Nym = g^sk h^r (issuer's public key: n, a_i, b, d)

User:
– Nym = g^sk h^r
– compute C = a1^sk a2^r b^s' with random s'
– send C, Nym and name to the issuer
– prove PK{(µ1, ρ, σ'): Nym = g^µ1 h^ρ ⋀ C = a1^µ1 a2^ρ b^σ' }

Issuer:
– stores Nym, name
– computes c = (d / (C a3^name b^s''))^(1/e) mod n and sends (c, e, s'')

Result: d = c^e a1^sk a2^r a3^name b^(s''+s') (mod n)
© 2014 IBM CorporationJuly 8, 2015
Polling: Scenario and Requirements
Scenario:
! Pollster(s) and a number of users
! Only registered users (e.g., students who took a course) can voice an opinion (e.g., course evaluation)
! A user can voice an opinion only once (subsequent attempts are dropped)
! Users want to be anonymous
! A user's opinions in different polls must not be linkable
© 2014 IBM CorporationJuly 8, 2015
Polling – Solution: Registration
! User generates a pseudonym (ID for registration)
! User obtains a credential on the pseudonym stating that she is eligible for polls, i.e., (c,e,s) with
  d = c^e a1^sk a2^r a3^attr b^s (mod n)
! The credential can contain attributes (e.g., course ID) about her

Issuer's public key: (n, a1, a2, a3, b, d)
© 2014 IBM CorporationJuly 8, 2015
Polling – Solution: Submit Poll
1. User generates a domain pseudonym, with domain = pollID
2. User transforms her credential
3. User presents the transformed credential with a subset of the attributes
– the user is anonymous and unlinkable
– multiple opinions are detected because of the uniqueness of the domain pseudonym
© 2014 IBM CorporationJuly 8, 2015
Polling – Solution: Polling
1. Domain pseudonym: P = g_d^sk with g_d = H(pollID), i.e., P = H(pollID)^sk
   P1 = H(pollID1)^sk and P2 = H(pollID2)^sk are unlinkable (under the Decisional Diffie-Hellman assumption)
2. User transforms her credential:
– c' = c·b^s' mod n with randomly chosen s'
– SPK{(ε, µ1, µ2, µ3, σ): P = g_d^µ1 ⋀ d = c'^ε a1^µ1 a2^µ2 a3^µ3 b^σ (mod n)
  ⋀ µ1, µ2, µ3 ∈ {0,1}^ℓ ⋀ ε > 2^(ℓ+1) }(opinion)
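The domain-pseudonym mechanism alone is easy to sketch. The modulus and the hash-to-group function below are simplified assumptions for illustration (a real deployment fixes a proper prime-order group and a sound hash-to-group encoding):

```python
import hashlib

# Illustrative modulus: 2^521 - 1 is a (Mersenne) prime; real systems use a
# fixed prime-order group and a proper hash-to-group function.
P_MOD = 2**521 - 1

def hash_to_group(poll_id: str) -> int:
    """Toy stand-in for g_d = H(pollID)."""
    return int.from_bytes(hashlib.sha256(poll_id.encode()).digest(), "big") % P_MOD

sk = 123457                          # user's secret key (toy value)

def domain_pseudonym(poll_id: str) -> int:
    return pow(hash_to_group(poll_id), sk, P_MOD)   # P = H(pollID)^sk

# Same poll → same pseudonym, so a second opinion is detected ...
assert domain_pseudonym("course-eval-2015") == domain_pseudonym("course-eval-2015")
# ... while pseudonyms for different polls look unrelated (unlinkable under DDH)
assert domain_pseudonym("course-eval-2015") != domain_pseudonym("other-poll")
```

The pollster only ever sees P; linking P back to the user, or linking P1 and P2 across polls, would require solving a DDH-type problem.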
© 2014 IBM Corporation157 July 8, 2015
TTP
Inspector parameters
Inspection grounds
• If the car is damaged: the ID needs to be retrievable by the insurance or gov't
• Similarly: verifiably encrypt any certified attribute (optional)
• TTP is off-line & can be distributed to lessen trust
Concept – Inspection
© 2014 IBM CorporationJuly 8, 2015
Public Key Encryption
Key Generation
Encryption
Decryption
© 2014 IBM CorporationJuly 8, 2015
Security
Like Envelopes !?
No info about message
© 2014 IBM CorporationJuly 8, 2015
Security
or ?
Like Envelopes !?
This is called semantic security (secure if used once only).
© 2014 IBM CorporationJuly 8, 2015
ElGamal Encryption Scheme
! Group G = ⟨g⟩ of order q
! Secret key: x ∈ {1,...,q}; public key: y = g^x
! To encrypt a message m ∈ G:
– choose random r ∈ {1,...,q}
– compute c = (y^r m, g^r)
! To decrypt a ciphertext c = (c1, c2):
– we know c = (y^r m, g^r) = (g^xr m, g^r)
– thus m = c1 · c2^(-x) = y^r m g^(-xr) = m
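The encrypt/decrypt round trip can be sketched directly; parameters are a toy instance for illustration, and Python's three-argument `pow` with a negative exponent computes the modular inverse c2^(-x).

```python
import secrets

# Toy ElGamal parameters (illustrative only): subgroup of squares mod p = 2q + 1
p, q, g = 2039, 1019, 4

x = secrets.randbelow(q) + 1        # secret key
y = pow(g, x, p)                    # public key y = g^x

def encrypt(m: int):
    """m must be encoded as a group element; c = (y^r m, g^r)."""
    r = secrets.randbelow(q) + 1
    return pow(y, r, p) * m % p, pow(g, r, p)

def decrypt(c1: int, c2: int) -> int:
    return c1 * pow(c2, -x, p) % p  # m = c1 * c2^(-x) = y^r m g^(-xr)

m = pow(g, 321, p)                  # encode the plaintext as a group element
assert decrypt(*encrypt(m)) == m
```

Because r is fresh for every encryption, encrypting the same m twice gives unrelated ciphertexts, which is what the "semantic security" slide demands.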
© 2014 IBM CorporationJuly 8, 2015
Realizing Inspection
Nym = g^sk h^r
d = c^e a1^sk a2^r a3^name b^(s''+s') (mod n)
Inspector's public key: y = g^x

! Encrypt Nym: choose random u ∈ {1,...,q} and compute enc = (y^u Nym, g^u) = (e1, e2)
! Compute proof token (presentation token):
– compute c' = c·b^t mod n with randomly chosen t
– compute proof
  PK{(ε, µ1, µ2, µ3, σ, ρ):
    d = c'^ε a1^µ1 a2^µ2 a3^µ3 b^σ ∧ e1 = y^ρ g^µ1 h^µ2 ∧ e2 = g^ρ ∧
    µ1, µ2, µ3 ∈ {0,1}^ℓ ∧ ε > 2^(ℓ+1) }
© 2014 IBM Corporation
Revocation of credentials
© 2014 IBM Corporation164 July 8, 2015
Anonymous Credential Revocation

The issuer publishes revocation info; the verifier looks it up.
Alice should be able to convince the verifier that her credential is among the good ones!

Various reasons to revoke a credential:
& user lost credential / secret key
& misbehavior of user
© 2014 IBM Corporation165 July 8, 2015
Anonymous Credential Revocation

& pseudonyms → standard revocation lists don't work
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: First Solution
! Include in the credential some credential ID u_i as a message, e.g.,
  d = c^e a1^sk a2^u_i b^(s''+s') (mod n)
! Publish the list of all valid (or invalid) u_i's: (u1, ..., uk)
! Alice proves that her u_i is on the list:
– choose random g
– compute U_j = g^u_j for each u_j in (u1, ..., uk)
– prove PK{(ε, µ, ρ, σ): (d = c'^ε a1^ρ a2^µ b^σ (mod n) ∧ U1 = g^µ)
    ∨ ... ∨ (d = c'^ε a1^ρ a2^µ b^σ (mod n) ∧ Uk = g^µ) }
! Not very efficient, i.e., linear in the size k of the list :-(
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Second Solution
! Include in the credential some credential ID u_i as a message, e.g.,
  d = c^e a1^sk a2^u_i b^(s''+s') (mod n)
! Publish the list of all invalid u_i's: (u1, ..., uk)
! Alice proves that her u_i is not on the list:
– choose random h and compute U = h^u_i
– prove PK{(ε, µ, ρ, σ): d = c'^ε a1^ρ a2^µ b^σ (mod n) ∧ U = h^µ }
– the verifier checks whether U = h^u_j for all u_j on the list
! Better, as only the verifier needs to do linear work (and it can be improved using so-called batch verification...)
! What happens if we make the list of all valid u_i's public?
! If a credential is revoked, all past transactions become linkable...
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Second Solution
Variation: the verifier could choose h and keep it fixed for a while
! Can pre-compute the list U_i = h^u_i → single table lookup
! BUT: if the user comes again, the verifier can link!
! ALSO: the verifier could not change h at all, or use the same h as other verifiers!
– one way out: h = H(verifier, date), so the user can check correctness
– the date could be the time up to seconds, and the verifier could just store all the lists, i.e., pre-compute them
© 2014 IBM CorporationJuly 8, 2015
… a better implementation of the non-membership proof:
The issuer signs the sorted list of revoked serial numbers pairwise: Sig(0,#1), Sig(#1,#4), Sig(#4,#5), Sig(#5,N).
Alice's credential contains a serial #; she proves that # lies strictly between two adjacent signed values #i < # < #j, hence # is not on the revocation list.
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Third Solution

Using cryptographic accumulators:
! credentials contain a random serial number #
! the issuer accumulates all "good" serial numbers into one accumulator value
! Alice proves that her credential contains a # that is included in the accumulator
! the proof requires a witness
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Third Solution

Using cryptographic accumulators:
! to revoke #2, the issuer publishes a new accumulator value and new witnesses for the unrevoked credentials
! a proof for #2 would require a witness, which no longer exists
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Third Solution

Using so-called cryptographic accumulators:
! Key setup: RSA modulus n, seed v
! Accumulate:
– values are primes e_i
– accumulator value: z = v^(Π e_i) mod n
– publish z and n
– witness value x for e_j: s.t. z = x^e_j mod n; it can be computed as x = v^(e1···e_(j-1) · e_(j+1)···ek) mod n
! Show that your value e is contained in the accumulator:
– provide x for e
– the verifier checks z = x^e mod n
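The accumulate/witness/verify steps can be sketched with toy numbers (illustrative only; a real accumulator uses a large special RSA modulus whose factorization nobody knows after setup):

```python
from math import prod

# Toy RSA accumulator (illustrative only)
n = 61 * 53                         # toy modulus, n = 3233
v = 3                               # public seed
es = [5, 7, 11, 13]                 # accumulated primes (credential serials)

# Accumulator value z = v^(e1 * ... * ek) mod n
z = pow(v, prod(es), n)

# Witness for e_j = 7: x = v^(product of all *other* e_i) mod n
e_j = 7
x = pow(v, prod(e for e in es if e != e_j), n)

# Membership check: the verifier only needs z, n, x and e_j
assert z == pow(x, e_j, n)
```

Note the verifier's work is a single modular exponentiation, independent of how many values are accumulated, which is exactly the efficiency gain over the list-based solutions.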
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Third Solution

Security of the accumulator: producing x s.t. z = x^e mod n for an e that is not contained in the accumulator is hard:
– for a fixed e: equivalent to the RSA assumption
– for any e: equivalent to the Strong RSA assumption

Revocation: each cert is associated with an e, and each user gets a witness x with her certificate. But we still need:
– an efficient protocol to prove that a committed value is contained in the accumulator
– a dynamic accumulator, i.e., the ability to remove and add values to the accumulator as certificates come and go
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Third Solution
! Prove that your key is in the accumulator:
– commit to x:
• choose random s and g
• compute U1 = x·h^s, U2 = g^s and reveal U1, U2, g
– run the proof protocol with the verifier:
PK{(ε, µ, ρ, σ, ξ, δ):
  d = c'^ε a1^ρ a2^µ b^σ (mod n) ∧ z = U1^µ (1/h)^ξ (mod n)
  ∧ 1 = U2^µ (1/g)^ξ (mod n) ∧ U2 = g^δ (mod n) }
© 2014 IBM CorporationJuly 8, 2015
Revocable Credentials: Third Solution
! Analysis
– no information about x and e is revealed:
• (U1, U2) is a secure commitment to x
• the proof protocol is zero-knowledge
– the proof indeed shows that the e contained in the certificate is also contained in the accumulator:
a) 1 = U2^µ (1/g)^ξ = (g^δ)^µ (1/g)^ξ (mod n)  ⇒  ξ = δ·µ
b) z = U1^µ (1/h)^ξ = U1^µ (1/h)^(δµ) = (U1/h^δ)^µ (mod n)
c) d = c'^ε a1^ρ a2^µ b^σ (mod n)
© 2014 IBM CorporationJuly 8, 2015
Revocation: Third Solution
Dynamic Accumulator

! When a new user gets a certificate containing e_new:
– recall: z = v^(Π e_i) mod n
– thus: z' = z^e_new mod n
– but: then all existing witnesses are no longer valid, i.e., they need to be updated as x' = x^e_new mod n
© 2014 IBM CorporationJuly 8, 2015
Revocation: Third Solution
Dynamic Accumulator

! When a certificate containing e_rev is revoked:
– now z' = v^(Π_{i≠rev} e_i) = z^(1/e_rev) mod n
– witness update:
• use extended Euclid to compute a and b s.t. a·e_own + b·e_rev = 1
• set x' = x^b · z'^a mod n
• why: x'^e_own = ((x^b z'^a)^(e_own · e_rev))^(1/e_rev) mod n
              = ((x^e_own)^(b·e_rev) · (z'^e_rev)^(a·e_own))^(1/e_rev) mod n
              = (z^(b·e_rev) · z^(a·e_own))^(1/e_rev) mod n
              = z^(1/e_rev) mod n = z'  :-)
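The witness-update derivation above checks out numerically; here is a toy sketch (same illustrative parameters as before) in which Alice updates her witness after a revocation without any secret information:

```python
from math import prod

# Toy dynamic-accumulator update (illustrative only)
n, v = 61 * 53, 3
es = [5, 7, 11, 13]
z = pow(v, prod(es), n)

e_own, e_rev = 7, 11                # Alice holds e_own; e_rev gets revoked
x = pow(v, prod(e for e in es if e != e_own), n)   # Alice's old witness
assert z == pow(x, e_own, n)

# Issuer publishes z' = z^(1/e_rev) mod n (it can, knowing the factorization)
z_new = pow(v, prod(e for e in es if e != e_rev), n)

# Alice: extended Euclid gives a, b with a*e_own + b*e_rev = 1
a = pow(e_own, -1, e_rev)
b = (1 - a * e_own) // e_rev
assert a * e_own + b * e_rev == 1

# Updated witness x' = x^b * z'^a mod n (negative exponents = modular inverses)
x_new = pow(x, b, n) * pow(z_new, a, n) % n
assert z_new == pow(x_new, e_own, n)
```

The final assertion is exactly the slide's identity x'^e_own = z': Alice's serial e_own is still accumulated, while the revoked e_rev no longer has a witness.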
© 2014 IBM CorporationJuly 8, 2015
Revocation: Third Solution (improved)
Dynamic Accumulator: in case the issuer knows the factorization of n
! When a new user gets a certificate containing e_new:
– recall: z = v^(Π e_i) mod n
– actually, v never occurs anywhere... so set v' = v^(1/e_new) mod n and x = z^(1/e_new) mod n
– thus z need not be changed when a new member joins!
! Witnesses need to be recomputed upon revocation only!
© 2014 IBM Corporation179 July 8, 2015
Revocation: Zeroth Solution

Update of credentials: encode the validity time as an attribute
! if the credential is valid → no need to check revocation updates from the issuer
! no additional effort for the verifier
© 2014 IBM CorporationJuly 8, 2015
Revocation: Zeroth Solution

Re-issue certificates (off-line – interaction might be too expensive).

Recall issuing for Identity Mixer:
– User: computes U := a1^m1 a2^m2 b^s' and sends U
– Issuer: chooses e, s'' and computes c = (d / (U a3^m3 a4^time b^s''))^(1/e) mod n
– Issuer sends (c, e, s'')
© 2014 IBM CorporationJuly 8, 2015
Revocation: Zeroth Solution

Re-issue certificates (off-line – interaction might be too expensive).

! Idea: just repeat the last step for each new time period time':
– Issuer chooses e_i, s_i'' and computes c_i = (d / (U a3^m3' a4^time' b^(s_i'')))^(1/e_i) mod n
! The update information (c_i, e_i, s_i'') can be pushed to the user by many different means
© 2014 IBM CorporationJuly 8, 2015
References
! D. Chaum, J.-H. Evertse, and J. van de Graaf. An improved protocol for demonstrating possession of discrete logarithms and some generalizations. In EUROCRYPT ’87, vol. 304 of LNCS, pp. 127–141. Springer-Verlag, 1988.
! S. Brands. Rapid demonstration of linear relations connected by boolean operators.In EUROCRYPT ’97, vol. 1233 of LNCS, pp. 318–333. Springer Verlag, 1997.
! Mihir Bellare: Computational Number Theory http://www-cse.ucsd.edu/~mihir/cse207/w-cnt.pdf
! Camenisch, Lysyanskaya: Dynamic Accumulators and Applications to Efficient Revocation of Anonymous Credentials. Crypto 2002, Lecture Notes in Computer Science, Springer Verlag.
! Ateniese, Song, Tsudik: Quasi-Efficient Revocation of Group Signatures. In Financial Cryptography 2002, Lecture Notes in Computer Science, Springer Verlag.
! Jan Camenisch, Natalie Casati, Thomas Gross, Victor Shoup: Credential Authenticated Identification and Key Exchange. CRYPTO 2010:255-276
! Jan Camenisch, Maria Dubovitskaya, Gregory Neven: Oblivious transfer with access control. ACM Conference on Computer and Communications Security 2009: 131-140
! M. Bellare, C. Namprempre, D. Pointcheval, and M. Semanko: The One-More-RSA-Inversion Problems and the Security of Chaum's Blind Signature Scheme. Journal of Cryptology, Volume 16, Number 3. Pages 185 -215, Springer-Verlag, 2003.
! E. Bangerter, J. Camenisch and A. Lysyanskaya: A Cryptographic Framework for the Controlled Release Of Certified Data. In Twelfth International Workshop on Security Protocols 2004. www.zurich.ibm.com/~jca/publications
! Stefan Brands: Untraceable Off-line Cash in Wallets With Observers: In Advances in Cryptology – CRYPTO '93. Springer Verlag, 1993.
! J. Camenisch and A. Lysyanskaya: Efficient Non-transferable Anonymous Multi-show Credential System with Optional Anonymity Revocation. www.zurich.ibm.com/~jca/publications
© 2014 IBM CorporationJuly 8, 2015
References
! David Chaum: Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms. In Communications of the ACM, Vol. 24 No. 2, pp. 84—88, 1981.
! David Chaum: Blind Signatures for Untraceable Payments. In Advances in Cryptology – Proceedings of CRYPTO '82, 1983.
! David Chaum: Security Without Identification: Transaction Systems to Make Big Brother obsolete: in Communications of the ACM, Vol. 28 No. 10, 1985.
! Camenisch, Shoup: Practical Verifiable Encryption and Decryption of Discrete Logarithms. CRYPTO 2003: 126-144
! Victor Shoup: A computational introduction to Number Theory and Algebra. Available from: http://www.shoup.net/ntb/
! D. Molnar and D. Wagner: Privacy and Security in Library RFID. Issues, Practices, and Architectures. In ACM CCS 2004.
! G. Avoine, M.A. Bingöl, X. Carpent. S.B.O. Yalcin: Privacy-Friendly Authentication in RFID Systems: On Sublinear Protocols Based on Symmetric-Key Cryptography. In IEEE Transactions on Mobile Computing, 2013.
! J.H. Cheon, J. Hong, G. Tsudik: Reducing RFID reader load with the meet-in-the-middle strategy. In Journal of Communications and Networks (14): 10-14 (2012)
! M. Ohkubo, K. Suzuki, S. Kinoshita: Efficient hash-chain based RFID privacy protection scheme. In: UbiComp Workshop, Ubicomp Privacy: Current Status and Future Directions (2004)
! D. Chaum: The Dining Cryptographers Problem: Unconditional Sender and Recipient Untraceability. Journal of Cryptology, 1988.
! J. Camenisch and A. Lysyanskaya: A Formal Treatment of Onion Routing. In Advances in Cryptology - CRYPTO 2005.
! J. Camenisch and A.Mityagin: Mix-Network with Stronger Security. In Privacy Enhancing Technologies – PET 2005.
! T. ElGamal: A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms. In Advances in Cryptology - CRYPTO '84.
© 2014 IBM Corporation184 July 8, 2015
Conclusions
! Roadmap
– explain possibilities to engineers, policy makers, etc.
– usable prototypes
– provide transparency
– public infrastructure for privacy protection
– laws with teeth (encourage investment in privacy)

! Challenges
– Internet services get paid with personal data (inverse incentive)
– end users are not able to handle their data (user interfaces...)
– security technology is typically invisible and hard to sell

! Towards a secure information society
– society changes quickly and gets shaped by technology
– consequences are hard to grasp (time will show...)
– we must inform and engage in a dialog
© 2014 IBM Corporation185 July 8, 2015
Thank you!
! eMail: [email protected]
! Links:
– www.abc4trust.eu
– www.futureID.eu
– www.au2eu.eu
– www.PrimeLife.eu
– www.zurich.ibm.com/idemix
– idemixdemo.zurich.ibm.com
! Code:
– github.com/p2abcengine & abc4trust.eu/idemix
E-Privacy – Privacy in the Electronic Society