
Trapdoor Hash Functions and Their Applications

Nico Döttling (CISPA Helmholtz Center), Sanjam Garg (UC Berkeley), Yuval Ishai (Technion), Giulio Malavolta (Carnegie Mellon University), Tamer Mour (Weizmann Institute), Rafail Ostrovsky (UC Los Angeles)

TPMPC 2019, Bar Ilan University

Setting: Sender-Receiver Computation

The sender holds an input y ∈ Y, the receiver holds an input x ∈ X, and the two parties communicate. The receiver outputs f(x, y), where f : X × Y → Z.

Ideal world: simple and optimal solutions.

Real world (semi-honest security): the sender and receiver do not trust each other, and want to keep their inputs private.

Focus: two-message protocols (a receiver message followed by a sender message) with minimum communication.

Main question in this work: can secure protocols be as efficient as ideal-world solutions?

Setting I: Sender input is larger than receiver input (|y| ≫ |x|)

Example: String Oblivious Transfer (OT). Sender input: y_0, y_1 ∈ {0,1}^n. Receiver input: x ∈ {0,1}. Output: y_x.

In the ideal world, communication is dominated by the length of the second message, since |f(x, y)| = n. The goal is therefore to optimize the length of the second message, i.e. the

  download rate = |f(x, y)| / |second message|.

The ideal-world solution has download rate 1.

Negative answer: secure protocols with exact rate 1 do not exist (even with correlated randomness). For 2^λ security, |second message| > n + 2λ, so the best we can hope for is download rate → 1 as n → ∞.

Positive answer: protocols with rate approaching 1, using Trapdoor Hash.
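A quick numeric sanity check of these rate claims (the concrete string length and security parameter below are assumptions chosen for illustration):

```python
# Toy numbers: n-bit strings, security parameter lambda = 128.
n, lam = 1024, 128

# A rate-1/2 string OT sends a ~2n-bit second message:
print(n / (2 * n))                  # 0.5

# The lower bound |second message| > n + 2*lam caps the rate of any secure protocol:
print(n / (n + 2 * lam))            # ~0.80

# ... but the cap tends to 1 as n grows, which is what the rate-1 protocols approach:
print(10**6 / (10**6 + 2 * lam))    # ~0.99974
```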

Results in Setting I: Rate-1 Oblivious Transfer and More

Life before Trapdoor Hash:
- Rate-1/2 OT is easy to achieve.
- The only known solution for higher rate uses high-rate homomorphic encryption.
- The only rate-1 homomorphic encryption scheme is the Damgård-Jurik scheme, based on DCR. (Exception: [Gentry-Halevi'19].)

                                               DCR             LWE       QR        DDH
  Oblivious Transfer (OT)                      rate-1 [DJ00]   rate-1/2  rate-1/2  rate-1/2

Life after Trapdoor Hash:

                                               DCR      LWE      QR       DDH
  Oblivious Transfer (OT)                      rate-1   rate-1   rate-1   rate-1
  Oblivious Linear-Function Evaluation (OLE)   rate-1   rate-1   rate-1   rate-1
  Oblivious Matrix-Vector Product (OMV)        -        rate-1   rate-1   -

Results in Setting I: Applications of Rate-1 OT

1. Single-Server Private Information Retrieval [Kushilevitz-Ostrovsky'97]
   First rate-1 PIR protocols with polylogarithmic communication [Ishai-Paskin'07].

                 DCR                  LWE                  QR                       DDH
   Before TDH    O(log^c n), rate-1   O(log^c n)           O(2^√(log n)) [KO97]     O(2^√(log n)) [KO97]
   After TDH     O(log^c n), rate-1   O(log^c n), rate-1   O(log^c n), rate-1       O(log^c n), rate-1

2. Homomorphic Encryption for Branching Programs
   First semi-compact homomorphic encryption from DDH, QR [Ishai-Paskin'07].

3. Lossy Trapdoor Functions [Peikert-Waters'07]
   First rate-1 constructions.

                 DCR      LWE      QR       DDH
   Before TDH    rate-1   -        -        constant rate [GGH18]
   After TDH     rate-1   rate-1   rate-1   rate-1

Setting II: Receiver input is larger than sender input (|y| ≪ |x|)

Example: RAM computation on big data (DNA analysis, suspects lookup, etc.). Receiver input: x ∈ {0,1}^n. Sender input: a RAM machine M with running time T ≪ n. Output: M(x).

In the ideal world, communication is |y| ≪ n, i.e. much smaller than n.

Goal: communication smaller than |x| = n. This is non-trivial for two-message protocols.

Results in Setting II: Sublinear Secure RAM Computation

                                          Overall Comm.   Assumption   Security
  Laconic Function Evaluation [QWW18]     Õ(T)            LWE          full
  Laconic OT [CDGGMP17]                   Õ(T)            DDH          UMA
  Private Laconic OT (through TDH)        O(T·√n)         DDH          full
                                          O(T·∛n)         SXDH+        full

Life before Trapdoor Hash: the only fully secure solution is based on lattice assumptions.

Life after Trapdoor Hash: the first sublinear two-message protocols under number-theoretic assumptions.

Open question (already): can we close the efficiency gap?

Trapdoor Hash Functions

A trapdoor hash function combines two familiar objects. A hash function H compresses its input; a trapdoor function F can be inverted, F^{-1}(td), given a trapdoor td. A trapdoor hash is a hash H that can be "inverted" with a trapdoor, H^{-1}(td); since a compressing hash cannot be fully inverted, the inversion additionally relies on short hints: H^{-1}(td, hints).

Defining Trapdoor Hash

Alice holds an input x ∈ {0,1}^n; Bob wants to learn x[i] ("I want to learn x[i], give me some hints").

- The hash function: Alice computes a short hash h = H(x) ∈ {0,1}^λ and sends it to Bob. Input privacy: h hides x.
- Key generation: Bob generates (key, trapdoor) ← G(i) and sends key to Alice. Index privacy: key hides i.
- The "hinting" function: Alice computes a hint e ← E(key, x) and sends it to Bob, so that he can recover x[i].
- Decoding: Bob computes x[i] ← D(trapdoor, h, e).
- Efficiency: small hints, i.e. high rate.

More generally, Bob can ask for several positions x[i_1], ..., x[i_m]: he generates (key_j, trapdoor_j) ← G(i_j) for each j and sends key_1, ..., key_m. Alice replies with the hash h and hints e_1, ..., e_m, and Bob decodes x[i_j] ← D(trapdoor_j, h, e_j). The hash remains a single short string, and each hint should be small (high rate).

To summarize, TDH = (H, G, E, D):
- H(x): short hash h ∈ {0,1}^λ of the input x ∈ {0,1}^n. Input privacy: h hides x.
- G(i): outputs (key, trapdoor). Index privacy: key hides i.
- E(key, x): outputs the hint e.
- D(trapdoor, h, e): recovers x[i].

Rate: 1/|e|. Optimally, rate = 1, i.e. a one-bit hint.

Main technical contribution: rate-1 TDH from DDH, QR, LWE, DCR. We also consider TDH for general functions of x.
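A minimal sketch of this interface and of the two-message flow between Bob and Alice, assuming Python-style signatures (the names and types are illustrative, not the paper's notation); the DDH sketches below give a toy instantiation.

```python
from typing import List, Protocol, Tuple

class TrapdoorHash(Protocol):
    def G(self, i: int) -> Tuple[object, object]: ...                # (key, trapdoor); key hides i
    def H(self, x: List[int]) -> object: ...                         # short hash h; h hides x
    def E(self, key: object, x: List[int]) -> object: ...            # small hint e
    def D(self, trapdoor: object, h: object, e: object) -> int: ...  # recover x[i]

def retrieve_bit(tdh: TrapdoorHash, x: List[int], i: int) -> int:
    key, trapdoor = tdh.G(i)          # Bob -> Alice: index key
    h, e = tdh.H(x), tdh.E(key, x)    # Alice -> Bob: hash and hint
    return tdh.D(trapdoor, h, e)      # Bob decodes x[i]
```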

Trapdoor Hash from DDH

A similar technique was used to construct IBE [DG17], laconic OT [CDGGMP17] and trapdoor functions [GH18, GGH19].

- A multiplicative abelian group 𝔾 of prime order p, with a public generator g ∈ 𝔾. For all g_1, g_2 ∈ 𝔾: g_1 · g_2 = g_2 · g_1 ∈ 𝔾.
- The DDH (Decisional Diffie-Hellman) assumption: for uniform a, b, c ∈ Z_p and an element g ∈ 𝔾,

  (g^a, g^b, g^ab) ≡ (g^a, g^b, g^c).
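A toy illustration of the two DDH tuples (the tiny safe-prime group and parameter choices are assumptions for this demo; a real instantiation uses a cryptographically large group):

```python
import random

# Toy group: p = 2q + 1 is a safe prime and g = 4 generates the subgroup of order q.
p, q, g = 1019, 509, 4

a, b, c = (random.randrange(1, q) for _ in range(3))
real = (pow(g, a, p), pow(g, b, p), pow(g, a * b, p))   # (g^a, g^b, g^ab)
rand = (pow(g, a, p), pow(g, b, p), pow(g, c, p))       # (g^a, g^b, g^c)
# DDH: in a suitably large group, no efficient distinguisher tells these two apart.
print(real, rand)
```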

Public parameters: a 2 × n matrix of uniform group elements,

  g_{1,0}  g_{2,0}  g_{3,0}  ...  g_{n,0}
  g_{1,1}  g_{2,1}  g_{3,1}  ...  g_{n,1}     ∈ 𝔾^{2×n}.

Alice holds an input x ∈ {0,1}^n.

Hash function: H(x) = ∏_j g_{j,x[j]}. For each position j, Alice takes the public-parameter element in column j and row x[j] (e.g. for x = 1 0 0 ··· 1 she takes g_{1,1}, g_{2,0}, g_{3,0}, ..., g_{n,1}), multiplies them all together, and sends the single group element h = H(x) ∈ 𝔾 to Bob.

Input privacy: statistical*.
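A minimal toy sketch of this hash (illustration only; the tiny group, the function names, and the bit-list encoding of x are assumptions for the demo):

```python
import random

p, q, g = 1019, 509, 4        # toy safe-prime group; g generates the order-q subgroup

def setup(n):
    """Public parameters: a 2 x n matrix of uniform group elements."""
    return [[pow(g, random.randrange(1, q), p) for _ in range(n)] for _ in range(2)]

def H(pp, x):
    """Hash h = prod_j g_{j, x[j]}: a single group element, regardless of n."""
    h = 1
    for j, bit in enumerate(x):
        h = (h * pp[bit][j]) % p
    return h

pp = setup(8)
print(H(pp, [1, 0, 0, 1, 1, 0, 1, 1]))
```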

Key generation: to learn x[i], Bob samples a uniform trapdoor t ∈ Z_p and sends Alice the key

  key =  g_{1,0}^t  ...  g_{i,0}^t      ...  g_{n,0}^t
         g_{1,1}^t  ...  g_{i,1}^t · g  ...  g_{n,1}^t

i.e. the public parameters raised to the power t, with the single entry at position (i, 1) additionally multiplied by g.

Index privacy: assuming DDH, the key is indistinguishable from a uniform matrix in 𝔾^{2×n}, so it hides i.
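A toy key-generation sketch continuing the demo above (pp is the public-parameter matrix from the hashing sketch; the names and tiny group are assumptions):

```python
import random

p, q, g = 1019, 509, 4

def G(pp, i):
    """Key for index i: pp raised to a secret power t, with g planted at entry (i, 1)."""
    t = random.randrange(1, q)                                         # trapdoor
    key = [[pow(pp[b][j], t, p) for j in range(len(pp[0]))] for b in range(2)]
    key[1][i] = (key[1][i] * g) % p                                    # only entry (i, 1) differs
    return key, t
```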

Hinting: Alice computes the hint

  e = E(key, x) = ∏_j key_{j,x[j]} ∈ 𝔾,

the same product as in the hash, but over the key entries instead of the public parameters, and sends h and e to Bob.

Rate: 1/|e| = 1/λ (one group element of hint per recovered bit).
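The corresponding toy hinting function (illustration only; key is the matrix produced by G in the sketch above):

```python
def E(key, x):
    """Hint e = prod_j key_{j, x[j]}: again a single group element."""
    p = 1019                          # toy modulus, matching the sketches above
    e = 1
    for j, bit in enumerate(x):
        e = (e * key[bit][j]) % p
    return e
```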

How can Bob recover x[i], given the hash h, the hint e, and the trapdoor t?

Trapdoor Hash at the Bar: Decoding

Observation: every key entry equals the corresponding public-parameter element raised to the power t, except key_{i,1} = g_{i,1}^t · g. Therefore

  e = ∏_j key_{j,x[j]} = (∏_j g_{j,x[j]})^t · g^{x[i]} = h^t · g^{x[i]}.

Decoding: using the hash h and the trapdoor t, Bob checks which candidate the hint matches:

  e = h^t      →  x[i] = 0,
  e = h^t · g  →  x[i] = 1.
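Putting the pieces together, an end-to-end toy run of this construction (the definitions repeat the sketches above so the block is self-contained; the tiny group and names are assumptions, not the paper's parameters):

```python
import random

p, q, g = 1019, 509, 4                      # toy safe-prime group (insecure, demo only)

def setup(n):
    return [[pow(g, random.randrange(1, q), p) for _ in range(n)] for _ in range(2)]

def H(pp, x):
    h = 1
    for j, bit in enumerate(x):
        h = (h * pp[bit][j]) % p
    return h

def G(pp, i):
    t = random.randrange(1, q)
    key = [[pow(pp[b][j], t, p) for j in range(len(pp[0]))] for b in range(2)]
    key[1][i] = (key[1][i] * g) % p
    return key, t

def E(key, x):
    e = 1
    for j, bit in enumerate(x):
        e = (e * key[bit][j]) % p
    return e

def D(t, h, e):
    # e = h^t * g^{x[i]}, so compare against h^t (bit 0) and h^t * g (bit 1).
    return 0 if e == pow(h, t, p) else 1

n, i = 8, 3
x = [random.randrange(2) for _ in range(n)]
pp = setup(n)
key, t = G(pp, i)
h, e = H(pp, x), E(key, x)
assert D(t, h, e) == x[i]
print("recovered x[i] =", D(t, h, e))
```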

Rate-1/λ Trapdoor Hash

- For applications in Setting II (sublinear secure RAM computation), rate-1/λ TDH is sufficient*.
- For applications in Setting I (rate-1 OT), we need rate-1 TDH, i.e. TDH where the hint is a single bit.

Rate-1 Trapdoor Hash from DDH

So far the hint e = ∏_j key_{j,x[j]} is a full group element, and Bob decodes by testing whether e = h^t (then x[i] = 0) or e = h^t · g (then x[i] = 1). To reach rate 1, Alice should send a single bit instead of e.

Attempt I: send only the least significant bit LSB(e). Bob compares it with LSB(h^t) and LSB(h^t · g). This fails when LSB(h^t) = LSB(h^t · g), which happens with probability 1/2.

Goal: an encoding Φ : 𝔾 → {0,1} such that, with high probability, Φ(h^t) ≠ Φ(h^t · g). Alice then sends the single bit Φ(e), and Bob decodes by comparing it with Φ(h^t) and Φ(h^t · g).

Attempt II: Φ(e) = LSB of the number of steps (multiplications by g) separating e from 1. Clearly Φ(h^t) ≠ Φ(h^t · g) for all t ∈ Z_p, since the two counts differ by exactly one step. Problem: computing Φ is not efficient (this is DLOG).

Idea: distributed discrete log [BGI16, DKK18].

Attempt III: using a PRF, define many random "reference points": the elements z such that PRF(z) = 0. Let Φ(e) = LSB of the number of steps needed to reach the closest reference point.

If the reference points are dense enough, Φ is efficient to compute. If they are sparse enough, Φ(h^t) ≠ Φ(h^t · g) with high probability.
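A toy sketch of such an encoding (illustration only: a hash-based predicate stands in for the PRF, and the density, group, and names are assumptions for the demo):

```python
import hashlib

p, g = 1019, 4          # toy group, as in the sketches above
DENSITY = 8             # roughly one reference point every DENSITY elements

def is_reference(z):
    """Stand-in for 'PRF(z) = 0': marks about a 1/DENSITY fraction of the group."""
    return hashlib.sha256(str(z).encode()).digest()[0] % DENSITY == 0

def phi(e):
    """LSB of the number of steps from e to the closest reference point."""
    for steps in range(p):
        if is_reference(e):
            return steps % 2
        e = (e * g) % p
    raise RuntimeError("no reference point found; use a denser predicate")

# If h^t and h^t*g stop at the same reference point, their step counts differ by
# exactly 1, so phi(h^t) != phi(h^t * g); decoding fails only when a reference
# point separates the two, i.e. with probability roughly 1/DENSITY.
```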

Conclusion and Further Research

We introduce Trapdoor Hash:
- A simple primitive with constructions from various standard assumptions.
- A powerful primitive with significant applications in communication-efficient protocols.

Applications in Setting I (rate-optimal protocols for OT and other functions):
- Constructions from new standard assumptions.
- More general functionalities?
- Constructions from general assumptions?

Applications in Setting II (sublinear protocols for secure RAM computation):
- First constructions from number-theoretic assumptions.
- Can we get constructions as efficient as lattice-based schemes?

Other applications of trapdoor hash or similar techniques?

Thanks for listening!

