Cryptography and Privacy Preserving Operations, Lecture 2: Pseudo-randomness (PowerPoint transcript)

1
Cryptography and Privacy Preserving Operations
Lecture 2 Pseudo-randomness
  • Lecturer: Moni Naor
  • Weizmann Institute of Science

2
Recap of Lecture 1
  • Key idea of cryptography: use computational
    intractability to your advantage
  • One-way functions are necessary and sufficient to
    solve the two guard identification problem
  • Notion of Reduction between cryptographic
    primitives
  • Amplification of weak one-way functions
  • Things are a bit more complex in the
    computational world (than in the information
    theoretic one)
  • Encryption easy when you share very long strings
  • Started with the notion of pseudo-randomness

3
Is there an ultimate one-way function?
  • If f1: {0,1}* → {0,1}* and f2: {0,1}* → {0,1}* are
    guaranteed to
  • be polynomial time computable, and
  • at least one of them is one-way,
  • then we can construct a function g: {0,1}* → {0,1}*
    which is one-way:
  • g(x1, x2) = (f1(x1), f2(x2))
  • If a 5n^2 time one-way function is guaranteed to
    exist, can construct an O(n^2 log n) one-way
    function g
  • Idea: enumerate Turing Machines and make sure each
    runs 5n^2 steps
  • g(x1, x2, …, xlog(n)) = (M1(x1), M2(x2), …,
    Mlog(n)(xlog(n)))
  • If a one-way function is guaranteed to exist,
    then there exists a 5n^2 time one-way function
  • Idea: concentrate on the prefix

4
Conclusions
  • Be careful what you wish for
  • Problems with the resulting one-way function:
  • Cannot learn about behavior on large inputs from
    small inputs
  • The whole rationale of considering asymptotic
    results is eroded
  • Construction does not work for non-uniform
    one-way functions

5
The Encryption problem
  • Alice wants to send a message m ∈ {0,1}^n to Bob
  • Set-up phase: a shared secret is established
  • They want to prevent Eve from learning anything
    about the message

[Diagram: Alice sends m to Bob over a channel observed by Eve]
6
The encryption problem
  • Relevant both in the shared key and in the public
    key setting
  • Want to use the key many times
  • Also add authentication
  • Other disruptions by Eve

7
What does "learn" mean?
  • Whatever knowledge Eve has of m should remain the
    same, e.g.:
  • Probability of guessing m
  • Min entropy of m
  • Probability of guessing whether m is m0 or m1
  • Probability of computing some function f of m
  • Ideally: the message sent is independent of the
    message m
  • Implies all the above
  • Shannon: achievable only if the entropy of the
    shared secret is at least as large as the entropy
    of the message m
  • If there is no special knowledge about m,
  • then the shared secret must be at least as long as m
  • Achievable: the one-time pad.
  • Let r ∈R {0,1}^n
  • Think of r and m as elements in a group
  • To encrypt m, send z = r+m
  • To decrypt z, compute m = z-r
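The one-time pad above can be sketched with XOR as the group operation (a common instantiation; the slide allows any group):

```python
import secrets

def otp_encrypt(m: bytes, r: bytes) -> bytes:
    # One-time pad: ciphertext z = r + m is the bitwise XOR of pad
    # and message when the group is ({0,1}^n, XOR).
    assert len(r) == len(m)
    return bytes(mi ^ ri for mi, ri in zip(m, r))

def otp_decrypt(z: bytes, r: bytes) -> bytes:
    # In the XOR group every element is its own inverse, so m = z - r
    # is again just XOR with the pad.
    return otp_encrypt(z, r)

r = secrets.token_bytes(5)           # pad chosen uniformly at random
z = otp_encrypt(b"hello", r)
assert otp_decrypt(z, r) == b"hello"
```

Because r is uniform and used once, z is uniform regardless of m, which is exactly the independence property above.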

8
Pseudo-random generators
  • Would like to stretch a short secret (seed) into
    a long one
  • The resulting long string should be usable in
    any case where a long string is needed
  • In particular as a one-time pad
  • Important notion: Indistinguishability
  • Two probability distributions that cannot be
    distinguished
  • Statistical indistinguishability: distance
    between probability distributions
  • New notion: computational indistinguishability

9
Computational Indistinguishability
  • Definition: two sequences of distributions {Dn}
    and {D'n} on {0,1}^n are computationally
    indistinguishable if
  • for every polynomial p(n) and sufficiently large
    n, for every probabilistic polynomial time
    adversary A that receives input y ∈ {0,1}^n and
    tries to decide whether y was generated by Dn or
    D'n:
  • |Prob[A=0 | Dn] - Prob[A=0 | D'n]| < 1/p(n)
  • Without the restriction to probabilistic polynomial
    time tests: equivalent to the variation distance
    being negligible:
  • Σ β∈{0,1}^n |Prob[Dn = β] - Prob[D'n = β]| < 1/p(n)

10
Pseudo-random generators
  • Definition: a function g: {0,1}* → {0,1}* is said
    to be a (cryptographic) pseudo-random generator if
  • It is polynomial time computable
  • It stretches the input: |g(x)| > |x|
  • denote by l(n) the length of the output on
    inputs of length n
  • If the input is random, the output is
    indistinguishable from random:
  • For any probabilistic polynomial time adversary A
    that receives input y of length l(n) and tries to
    decide whether y = g(x) or y is a random string
    from {0,1}^l(n), for any polynomial p(n) and
    sufficiently large n,
  • |Prob[A=rand | y=g(x)] - Prob[A=rand | y ∈R
    {0,1}^l(n)]| < 1/p(n)
  • Important issues:
  • Why is the adversary bounded by polynomial time?
  • Why is the indistinguishability not perfect?

11
Pseudo-random generators
  • Definition: a function g: {0,1}* → {0,1}* is said
    to be a (cryptographic) pseudo-random generator if
  • It is polynomial time computable
  • It stretches the input: |g(x)| > |x|
  • denote by l(n) the length of the output on
    inputs of length n
  • If the input (seed) is random, then the output is
    indistinguishable from random:
  • For any probabilistic polynomial time adversary A
    that receives input y of length l(n) and tries to
    decide whether y = g(x) or y is a random string
    from {0,1}^l(n), for any polynomial p(n) and
    sufficiently large n,
  • |Prob[A=rand | y=g(x)] - Prob[A=rand | y ∈R
    {0,1}^l(n)]| < 1/p(n)
  • Want to use the output of a pseudo-random generator
    whenever long random strings are used
  • Especially encryption
  • We have not defined the desired properties yet.

Anyone who considers arithmetical methods of
producing random numbers is, of course, in a
state of sin.

J. von Neumann
12
Important issues
  • Why is the adversary bounded by polynomial time?
  • Why is the indistinguishability not perfect?

13
Construction of pseudo-random generators
  • Idea: given a one-way function, there is a hard
    decision problem hidden in it
  • If balanced enough, it looks random
  • Such a problem is a hardcore predicate
  • Possibilities:
  • Last bit
  • First bit
  • Inner product

14
Hardcore Predicate
  • Definition: let f: {0,1}* → {0,1}* be a function.
    We say that h: {0,1}* → {0,1} is a hardcore
    predicate for f if
  • It is polynomial time computable
  • For any probabilistic polynomial time adversary A
    that receives input y=f(x) and tries to compute
    h(x), for any polynomial p(n) and sufficiently
    large n,
  • |Prob[A(y)=h(x)] - 1/2| < 1/p(n)
  • where the probability is over the choice of y and
    the random coins of A
  • Sources of hardcoreness:
  • not enough information about x
  • not of interest for generating pseudo-randomness
  • enough information about x, but it is hard to
    compute

15
Exercises
  • Assume one-way functions exist
  • Show that the last bit/first bit are not
    necessarily hardcore predicates
  • Generalization: show that for any fixed function
    h: {0,1}* → {0,1} there is a one-way function
    f: {0,1}* → {0,1}* such that h is not a hardcore
    predicate of f
  • Show a one-way function f such that given y=f(x)
    each input bit of x can be guessed with
    probability at least 3/4

16
Single bit expansion
  • Let f: {0,1}^n → {0,1}^n be a one-way permutation
  • Let h: {0,1}^n → {0,1} be a hardcore predicate for
    f
  • Consider g: {0,1}^n → {0,1}^(n+1) where
  • g(x) = (f(x), h(x))
  • Claim: g is a pseudo-random generator
  • Proof: can use a distinguisher for g to guess
    h(x)

[Diagram: distinguishing (f(x), h(x)) from (f(x), 1-h(x))]
17
Hardcore Predicate With Public Information
  • Definition: let f: {0,1}* → {0,1}* be a function.
    We say that h: {0,1}* x {0,1}* → {0,1} is a
    hardcore predicate for f if
  • h(x,r) is polynomial time computable
  • For any probabilistic polynomial time adversary A
    that receives input y=f(x) and public randomness
    r and tries to compute h(x,r), for any polynomial
    p(n) and sufficiently large n,
  • |Prob[A(y,r)=h(x,r)] - 1/2| < 1/p(n)
  • where the probability is over the choice of y, of r
    and the random coins of A
  • Alternative view: can think of the public
    randomness as modifying the one-way function f:
    f'(x,r) = (f(x), r).

18
Example weak hardcore predicate
  • Let h(x,i) = xi
  • I.e. h selects the ith bit of x
  • For any one-way function f, no polynomial time
    algorithm A(y,i) can have probability of success
    better than 1-1/2n of computing h(x,i)
  • Exercise: let c: {0,1}* → {0,1}* be a good error
    correcting code:
  • |c(x)| is O(|x|)
  • the distance between any two codewords c(x) and
    c(x') is a constant fraction of |c(x)|
  • It is possible to correct in polynomial time
    errors in a constant fraction of |c(x)|
  • Show that for h(x,i) = c(x)i and any one-way
    function f, no polynomial time algorithm A(y,i)
    can have probability of success better than a
    constant of computing h(x,i)

19
Inner Product Hardcore bit
  • The inner product bit: choose r ∈R {0,1}^n; let
  • h(x,r) = r · x = Σ xi ri mod 2
  • Theorem [Goldreich-Levin]: for any one-way
    function the inner product is a hardcore
    predicate
  • Proof structure:
  • Algorithm A' for inverting f
  • There are many x's for which A returns a correct
    answer (r · x) on a ½+ε fraction of the r's
  • Reconstruction algorithm R: take an algorithm A
    that guesses h(x,r) correctly with probability
    ½+ε over the r's and output a list of candidates
    for x
  • No use of the y info by R (except feeding it to A)
  • Choose from the list the/an x such that f(x)=y

The main step!
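The inner product bit of the slide can be computed directly; a minimal sketch:

```python
def inner_product_bit(x: int, r: int) -> int:
    # Goldreich-Levin hardcore bit h(x, r) = r . x = sum_i x_i r_i mod 2:
    # the parity of the bitwise AND of x and r.
    return bin(x & r).count("1") % 2

# x = 1011, r = 1101: matching 1-bits at positions 0 and 3, so parity 0.
assert inner_product_bit(0b1011, 0b1101) == 0
```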
20
Why list?
  • Cannot have a unique answer!
  • Suppose A has two candidates x and x'
  • On query r it returns at random either r · x
    or r · x'
  • Prob[A(y,r) = r · x] = ½ + ½·Prob[r · x = r · x'] = ¾

21
[Diagram: A - algorithm for guessing r · x; R - reconstruction
algorithm that outputs a list of candidates for x; A' - algorithm for
inverting f on a given y.  R feeds A the pairs (y,r1), (y,r2), …,
(y,rk), collects the guesses z1 = r1 · x, z2 = r2 · x, …, zk = rk · x,
and outputs candidates x1, x2, …, xk; A' checks whether f(xi) = y.]
22
Warm-up (1)
  • If A returns a correct answer on a 1-1/2n fraction
    of the r's:
  • Choose r1, r2, …, rn ∈R {0,1}^n
  • Run A(y,r1), A(y,r2), …, A(y,rn)
  • Denote the responses z1, z2, …, zn
  • If r1, r2, …, rn are linearly independent then
  • there is a unique x satisfying ri · x = zi for all
    1 ≤ i ≤ n
  • Prob[zi = A(y,ri) = ri · x] ≥ 1-1/2n
  • Therefore the probability that all the zi's are
    correct is at least ½
  • Do we need full independence of the ri's?
  • one-wise independence is sufficient
  • Can choose r ∈R {0,1}^n and set ri = r ⊕ ei
  • ei = 0^(i-1) 1 0^(n-i)
  • All the ri's are linearly independent
  • Each one is uniform in {0,1}^n
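Warm-up (1) can be sketched as follows; the demo oracle below is hypothetical and always correct, standing in for an A that errs on at most a 1/2n fraction of the r's:

```python
import secrets

def parity(v: int) -> int:
    return bin(v).count("1") % 2

def reconstruct(A, y, n):
    # Warm-up (1): query A on r and on each r XOR e_i.  Every query
    # point is uniform, and <x, r XOR e_i> XOR <x, r> = x_i, so if all
    # n+1 answers are correct (probability >= 1/2 by a union bound
    # when A errs on at most a 1/2n fraction), we recover x exactly.
    r = secrets.randbits(n)
    z0 = A(y, r)
    return sum((z0 ^ A(y, r ^ (1 << i))) << i for i in range(n))

# Demo with a perfect oracle that always answers <x, r>:
x_secret = 0b10110
A = lambda y, r: parity(x_secret & r)
assert reconstruct(A, None, 5) == x_secret
```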

23
Warm-up (2)
  • If A returns a correct answer on a 3/4+ε fraction
    of the r's:
  • Can amplify the probability of success!
  • Given any r ∈ {0,1}^n, procedure A'(y,r):
  • Repeat for j=1, 2, …
  • Choose r' ∈R {0,1}^n
  • Run A(y,r⊕r') and A(y,r'); denote the sum of the
    responses by zj
  • Output the majority of the zj's
  • Analysis:
  • Pr[zj = r · x] ≥ Pr[A(y,r')=r' · x ∧
    A(y,r⊕r')=(r⊕r') · x] ≥ ½+2ε
  • Does not work for ½+ε, since success on r' and
    r⊕r' is not independent
  • Each one of the events zj = r · x is independent
    of the others
  • Therefore by taking sufficiently many j's we can
    amplify to a value as close to 1 as we wish
  • Need roughly 1/ε^2 examples
  • Idea for improvement: fix a few of the r's
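Warm-up (2) can be sketched as a majority vote over random shifts; again the demo oracle is hypothetical and always correct:

```python
import secrets

def parity(v: int) -> int:
    return bin(v).count("1") % 2

def amplified(A, y, r, n, trials=25):
    # Warm-up (2): for uniform r', <x, r XOR r'> XOR <x, r'> = <x, r>.
    # If A succeeds on a 3/4+e fraction, each trial is correct with
    # probability >= 1/2+2e, and trials are independent, so the
    # majority vote converges to <x, r>.
    votes = 0
    for _ in range(trials):
        rp = secrets.randbits(n)
        votes += A(y, r ^ rp) ^ A(y, rp)
    return 1 if 2 * votes > trials else 0

# Demo with a perfect oracle:
x_secret = 0b1101
A = lambda y, r: parity(x_secret & r)
assert amplified(A, None, 0b1000, 4) == 1   # <x, 1000> = x_3 = 1
```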

24
The real thing
  • Choose r1, r2, …, rk ∈R {0,1}^n
  • Guess for j=1, 2, …, k the value zj = rj · x
  • Go over all 2^k possibilities
  • For all nonempty subsets S ⊆ {1,…,k}:
  • Let rS = ⊕ j∈S rj
  • The implied guess is zS = ⊕ j∈S zj
  • For each position xi:
  • for each nonempty S ⊆ {1,…,k} run A(y, ei ⊕ rS)
  • output the majority value of zS ⊕ A(y, ei ⊕ rS)
  • Analysis:
  • Each one of the vectors ei ⊕ rS is uniformly
    distributed
  • A(y, ei ⊕ rS) is correct with probability at least
    ½+ε
  • Claim: for every pair of nonempty subsets S ≠ T
    ⊆ {1,…,k},
  • the two vectors rS and rT are pair-wise
    independent
  • Therefore the variance is as in completely
    independent trials
  • If I is the number of correct A(y, ei ⊕ rS), then
    VAR(I) ≤ 2^k(½+ε)
  • Use Chebyshev's Inequality:
    Pr[|I-E(I)| ≥ λ√VAR(I)] ≤ 1/λ^2
  • Need 2^k = n/ε^2 to get the probability of error
    down to 1/n
  • So the process is successful simultaneously for all
    positions xi, i ∈ {1,…,n}

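The full reconstruction can be sketched as below. The demo again uses a hypothetical perfect oracle; with a real A that is correct on a ½+ε fraction of the r's, the parameters 2^k = n/ε^2 from the analysis apply:

```python
import secrets
from itertools import combinations

def parity(v: int) -> int:
    return bin(v).count("1") % 2

def gl_candidates(A, y, n, k):
    # "The real thing": guess z_j = <x, r_j> for k random r_j.  By
    # linearity, z_S = XOR_{j in S} z_j is determined for every
    # nonempty S, and the 2^k - 1 vectors r_S are pairwise
    # independent.  For each guess, decode each bit x_i by majority
    # vote over z_S XOR A(y, e_i XOR r_S).
    rs = [secrets.randbits(n) for _ in range(k)]
    subsets = [S for t in range(1, k + 1)
               for S in combinations(range(k), t)]
    out = []
    for guess in range(2 ** k):
        z = [(guess >> j) & 1 for j in range(k)]
        x = 0
        for i in range(n):
            votes = 0
            for S in subsets:
                rS, zS = 0, 0
                for j in S:
                    rS ^= rs[j]
                    zS ^= z[j]
                votes += zS ^ A(y, (1 << i) ^ rS)
            if 2 * votes > len(subsets):
                x |= 1 << i
        out.append(x)
    return out

# Demo: with a perfect oracle the correct guess reproduces x.
x_secret = 0b10011
A = lambda y, r: parity(x_secret & r)
assert x_secret in gl_candidates(A, None, 5, 3)
```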
25
Analysis
  • Number of invocations of A:
  • 2^k · n · (2^k - 1) = poly(n, 1/ε) ≈ n^3/ε^4
  •   (guesses · positions · subsets)
  • Size of the resulting list of candidates for x:
  • for each guess of z1, z2, …, zk there is a unique x
  • 2^k = poly(n, 1/ε) ≈ n/ε^2
  • Conclusion: single bit expansion of a one-way
    permutation is a pseudo-random generator

[Diagram: x ∈ {0,1}^n ↦ (f(x), h(x,r)) ∈ {0,1}^(n+1)]
26
Reducing the size of the list of candidates
  • Idea: bootstrap
  • Given any r ∈ {0,1}^n, procedure A'(y,r):
  • Choose r1, r2, …, rk ∈R {0,1}^n
  • Guess for j=1, 2, …, k the value zj = rj · x
  • Go over all 2^k possibilities
  • For all nonempty subsets S ⊆ {1,…,k}:
  • Let rS = ⊕ j∈S rj
  • The implied guess is zS = ⊕ j∈S zj
  • for each nonempty S ⊆ {1,…,k} run A(y, r ⊕ rS)
  • output the majority value of zS ⊕ A(y, r ⊕ rS)
  • For 2^k = 1/ε^2 the probability of error is, say,
    1/8
  • Fix the same r1, r2, …, rk for subsequent
    executions:
  • They are good for 7/8 of the r's
  • Run warm-up (2)
  • Size of the resulting list of candidates for x is
    1/ε^2

27
Application Diffie-Hellman
  • The Diffie-Hellman assumption:
  • Let G be a group and g an element in G.
  • Given g, a=g^x and b=g^y it is hard to find c=g^xy
  • for random x and y: the probability of a poly-time
    machine outputting g^xy is negligible
  • More accurately: a sequence of groups
  • Don't know how to verify, given c, whether it is
    equal to g^xy
  • Exercise: show that under the DH Assumption,
  • given a=g^x, b=g^y and r ∈ {0,1}^n, no polynomial
    time machine can guess r · g^xy with advantage
    1/poly
  • for random x, y and r

28
Application: if subset sum is one-way, then it is a
pseudo-random generator
  • Subset sum problem: given
  • n numbers 0 ≤ a1, a2, …, an ≤ 2^m
  • a target sum y
  • find a subset S ⊆ {1,…,n} with Σ i∈S ai = y
  • Subset sum one-way function f: {0,1}^(mn+n) →
    {0,1}^(mn+m):
  • f(a1, a2, …, an, x1, x2, …, xn) =
  • (a1, a2, …, an, Σ i=1..n xi ai mod 2^m)
  • If m<n then we get out fewer bits than we put in.
  • If m>n then we get out more bits than we put in.
  • Theorem: if for m>n subset sum is a one-way
    function, then it is also a pseudo-random
    generator
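The subset sum function itself is easy to write down; a minimal sketch:

```python
import secrets

def subset_sum_f(a, x, m):
    # f(a_1..a_n, x_1..x_n) = (a_1, ..., a_n, sum_{i: x_i=1} a_i mod 2^m).
    # The a_i's pass through; only the m-bit sum is "new" output.
    return list(a), sum(ai for ai, xi in zip(a, x) if xi) % (2 ** m)

n, m = 8, 12                        # m > n: more bits out than in
a = [secrets.randbelow(2 ** m) for _ in range(n)]
x = [secrets.randbelow(2) for _ in range(n)]
_, y = subset_sum_f(a, x, m)        # candidate pseudo-random output
assert subset_sum_f([3, 5, 7], [1, 0, 1], 4)[1] == 10   # (3+7) mod 16
```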

29
Subset Sum Generator
  • Idea of proof: use the distinguisher A to compute
    r · x
  • For simplicity do the computation mod P for a
    large prime P
  • Given r ∈ {0,1}^n and (a1, a2, …, an, y):
  • Generate a new problem (a'1, a'2, …, a'n, y'):
  • Choose c ∈R ZP
  • Let a'i = ai if ri=0 and a'i = ai+c mod P if ri=1
  • Guess k ∈R {0,…,n} - the value of Σ xi ri,
  • the number of locations where x and r are both 1
  • Let y' = y+ck mod P
  • Run the distinguisher A on (a'1, a'2, …, a'n, y')
  • output what A says XORed with parity(k)
  • Claim: if k is correct, then (a'1, a'2, …, a'n,
    y') is pseudo-random
  • Claim: for any incorrect k, (a'1, a'2, …, a'n,
    y') is random:
  • y' = z + (k-h)c mod P, where z = Σ i=1..n xi a'i
    mod P and h = Σ xi ri
  • Therefore the probability of guessing r · x
    correctly is 1/n(½+ε) + (n-1)/n(½) = ½+ε/n

[Diagram: Prob[A=0 | pseudo] = ½+ε, Prob[A=0 | random] = ½;
correct k → pseudo-random, incorrect k → random]
30
Interpretations of the Goldreich-Levin Theorem
  • A tool for constructing pseudo-random generators
  • The main part of the proof:
  • A mechanism for translating "general confusion"
    into randomness
  • Diffie-Hellman example
  • List decoding of Hadamard Codes
  • works in the other direction as well (for any
    code with good list decoding)
  • List decoding, as opposed to unique decoding,
    allows getting much closer to the distance of the
    code
  • Explains unique decoding when prediction was
    3/4+ε
  • Finding all linear functions agreeing with a
    function given as a black box
  • Learning all Fourier coefficients larger than ε
  • If the Fourier coefficients are concentrated on a
    small set, we can find them
  • True for AC0 circuits
  • Decision Trees

31
Composing PRGs
  • Composition:
  • Let
  • g1 be an (l1, l2)-pseudo-random generator
  • g2 be an (l2, l3)-pseudo-random generator
  • Consider g(x) = g2(g1(x))
  • Claim: g is an (l1, l3)-pseudo-random generator
  • Proof: consider three distributions on {0,1}^l3:
  • D1: y uniform in {0,1}^l3
  • D2: y=g(x) for x uniform in {0,1}^l1
  • D3: y=g2(z) for z uniform in {0,1}^l2
  • Assume, toward contradiction, that there is a
    distinguisher A between D1 and D2
  • A must either
  • distinguish between D1 and D3 - can use A
    to distinguish g2
  • or
  • distinguish between D2 and D3 - can use A
    to distinguish g1

(by the triangle inequality)
32
Composing PRGs
  • When composing
  • a generator secure against advantage ε1
  • and
  • a generator secure against advantage ε2,
  • we get security against advantage ε1+ε2
  • When composing the single bit expansion generator
    n times:
  • the loss in security is at most ε/n
  • Hybrid argument: to prove that two distributions
    D and D' are indistinguishable:
  • suggest a collection of distributions D = D0, D1,
    …, Dk = D' such that
  • if D and D' can be distinguished, there is a
    pair Di and Di+1 that can be distinguished.
  • A difference of ε between D and D' means ε/k
    between some Di and Di+1
  • Use such a distinguisher to derive a contradiction

33
From single bit expansion to many bit expansion
[Diagram - internal configuration, input, output: seed x and public r;
internal states x, f(x), f^(2)(x), f^(3)(x), …, f^(m)(x); output bits
h(x,r), h(f(x),r), h(f^(2)(x),r), …, h(f^(m-1)(x),r)]
  • Can make r and f^(m)(x) public
  • But not any other internal state
  • Can make m as large as needed
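The iteration above can be sketched generically; the toy f and h below are placeholders (not one-way and not a proven hardcore predicate), used only to show the data flow:

```python
def expand(f, h, x, m):
    # Slide 33: iterate f on the seed, outputting one hardcore bit per
    # step: h(x), h(f(x)), ..., h(f^(m-1)(x)).  The final state
    # f^(m)(x) may be made public; intermediate states stay secret.
    bits = []
    for _ in range(m):
        bits.append(h(x))
        x = f(x)
    return bits, x

# Toy instantiation (for illustration only): an affine permutation of
# Z_16 with the low bit as the candidate predicate.
f = lambda x: (5 * x + 3) % 16
h = lambda x: x & 1
bits, final = expand(f, h, 7, 10)
assert len(bits) == 10 and all(b in (0, 1) for b in bits)
```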

34
Exercise
  • Let {Dn} and {D'n} be two distributions that are:
  • Computationally indistinguishable
  • Polynomial time samplable
  • Suppose that y1, …, ym are all sampled according
    to Dn, or all sampled according to D'n
  • Prove: no probabilistic polynomial time machine
    can tell, given y1, …, ym, whether they were
    sampled from Dn or D'n

35
Existence of PRGs
  • What we have proved:
  • Theorem: if pseudo-random generators stretching
    by a single bit exist, then pseudo-random
    generators stretching by any polynomial factor
    exist
  • Theorem: if one-way permutations exist, then
    pseudo-random generators exist
  • A harder theorem to prove:
  • Theorem [HILL]: if one-way functions exist, then
    pseudo-random generators exist
  • Exercise: show that if pseudo-random generators
    exist, then one-way functions exist

36
Next-bit Test
  • Definition: a function g: {0,1}* → {0,1}* is said
    to pass the next bit test if:
  • It is polynomial time computable
  • It stretches the input: |g(x)| > |x|
  • denote by l(n) the length of the output on
    inputs of length n
  • If the input (seed) is random, then the output
    passes the next-bit test:
  • For any prefix 0 ≤ i < l(n), for any probabilistic
    polynomial time adversary A that receives the
    first i bits of y = g(x) and tries to guess the
    next bit, for any polynomial p(n) and sufficiently
    large n,
  • |Prob[A(y1,y2,…, yi) = yi+1] - 1/2| < 1/p(n)
  • Theorem: a function g: {0,1}* → {0,1}* passes the
    next bit test if and only if it is a pseudo-random
    generator

37
Next-block Unpredictability
  • Suppose that the function G maps a given seed
    into a sequence of blocks y1, y2, …:
  • let l(n) be the number of blocks produced from
    a seed of length n
  • If the input (seed) is random, then the output
    passes the next-block unpredictability test:
  • For any prefix 0 ≤ i < l(n), for any probabilistic
    polynomial time adversary A that receives the
    first i blocks of y = G(x) and tries to guess the
    next block yi+1, for any polynomial p(n) and
    sufficiently large n,
  • Prob[A(y1,y2,…, yi) = yi+1] < 1/p(n)
  • Exercise: show how to convert a next-block
    unpredictable generator into a pseudo-random
    generator.
38
Pseudo-Random Generators: concrete version
  • Gn: {0,1}^m → {0,1}^n
  • A cryptographically strong pseudo-random sequence
    generator - if it passes all polynomial time
    statistical tests
  • (t,ε)-pseudo-random - no test A running in time t
    can distinguish with advantage ε

39
Three Basic issues in cryptography
  • Identification
  • Authentication
  • Encryption
  • Solve in a shared key environment

[Diagram: A and B share the key S]
40
Identification - Remote login using a
pseudo-random sequence
  • A and B share a key S ∈ {0,1}^k
  • In order for A to identify itself to B:
  • Generate the sequence Gn(S)
  • For each identification session - send the next
    block of Gn(S)
41
Problems...
  • More than two parties
  • Malicious adversaries - may add noise
  • Coordinating the location (block number)
  • Better approach: Challenge-Response

42
Challenge-Response Protocol
  • B selects a random location and sends it to A
  • A sends back the value at that location

[Diagram: B asks A "What's this?" for a random location]
43
Desired Properties
  • Very long string - prevents repetitions
  • Random access to the sequence
  • Unpredictability - cannot guess the value at a
    random location
  • even after seeing values at many parts of the
    string of the adversary's choice.
  • Pseudo-randomness implies unpredictability
  • Not the other way around for blocks

44
Authenticating Messages
  • A wants to send message M ∈ {0,1}^n to B
  • B should be confident that A is indeed the sender
    of M
  • One-time application:
  • S = (a,b)
  • where a,b ∈R {0,1}^n
  • To authenticate M: supply a·M + b
  • Computation is done in GF[2^n]
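A sketch of the one-time authentication. Note the slide computes a·M + b in GF[2^n]; this sketch uses arithmetic modulo a prime instead (an assumption for brevity), which supports the same pairwise-independence argument:

```python
import secrets

P = 2 ** 61 - 1   # a prime; the slide works in GF[2^n], but any
                  # field gives the same one-time security argument

def keygen():
    # S = (a, b) with a, b chosen uniformly at random
    return secrets.randbelow(P), secrets.randbelow(P)

def tag(S, M: int) -> int:
    # One-time authentication tag: a*M + b over the field
    a, b = S
    return (a * M + b) % P

S = keygen()
t = tag(S, 12345)
assert tag((2, 3), 10) == 23
```

Because (a, b) is used only once, the pair (M, tag) reveals nothing useful for forging a tag on any M' ≠ M.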

45
Problems and Solutions
  • Problems - same as for identification
  • If a very long random string is available -
  • can use it for one-time authentication
  • Works even if it is only random looking

[Diagram: A and B take the next pair (a,b) from the shared string]
46
Encryption of Messages
  • A wants to send message M ∈ {0,1}^n to B
  • only B should be able to learn M
  • One-time application:
  • S = a
  • where a ∈R {0,1}^n
  • To encrypt M:
  • send a ⊕ M

47
Encryption of Messages
  • If a very long random looking string is available -
  • can use it as in one-time encryption

[Diagram: A and B take the next block from the shared string]
48
Pseudo-random Functions
  • Concrete Treatment:
  • F: {0,1}^k x {0,1}^n → {0,1}^m
  •       key       Domain      Range
  • Denote Y = FS(X)
  • A family of functions Fk = {FS | S ∈ {0,1}^k} is
    (t,ε,q)-pseudo-random if it is
  • Efficiently computable - with random access
  • and…

49
(t,ε,q)-pseudo-random
  • The tester A can choose adaptively:
  • X1 and get Y1 = FS(X1)
  • X2 and get Y2 = FS(X2)
  • …
  • Xq and get Yq = FS(Xq)
  • Then A has to decide whether
  • FS ∈R Fk or
  • FS ∈R R(n → m) = {F | F: {0,1}^n → {0,1}^m}

50
(t,ε,q)-pseudo-random
  • For a function F chosen at random from
  • (1) Fk = {FS | S ∈ {0,1}^k}
  • (2) R(n → m) = {F | F: {0,1}^n → {0,1}^m}
  • For all t-time machines A that choose q
    locations and try to distinguish (1) from (2):
  • |Prob[A = 1 | F ∈R Fk]
  •  - Prob[A = 1 | F ∈R R(n → m)]| ≤ ε

51
Equivalent/Non-Equivalent Definitions
  • Instead of the next bit test: for X ∈ {X1,X2,…,Xq}
    chosen by A, decide whether a given Y is
  • Y = FS(X) or
  • Y ∈R {0,1}^m
  • Adaptive vs. Non-adaptive
  • Unpredictability vs. pseudo-randomness
  • A pseudo-random sequence generator g: {0,1}^m →
    {0,1}^n
  • = a pseudo-random function on a small domain
    {0,1}^(log n) → {0,1} with key in {0,1}^m

52
Application to the basic issues in cryptography
  • Solution using a shared key S:
  • Identification:
  • B to A: X ∈R {0,1}^n
  • A to B: Y = FS(X)
  • B verifies
  • Authentication:
  • A to B: Y = FS(M)
  • replay attack!
  • Encryption:
  • A chooses X ∈R {0,1}^n
  • A to B: <X, Y = FS(X) ⊕ M>
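The three uses above can be sketched with a stand-in PRF; keying HMAC-SHA256 with the shared secret is an illustrative assumption here, not part of the lecture:

```python
import hashlib
import hmac
import secrets

def F(S: bytes, X: bytes) -> bytes:
    # Stand-in for the shared-key PRF F_S (assumption: HMAC-SHA256).
    return hmac.new(S, X, hashlib.sha256).digest()

def identify(S: bytes) -> bool:
    # Identification: B challenges with random X, A answers F_S(X),
    # B recomputes and verifies.
    X = secrets.token_bytes(16)          # B's challenge
    Y = F(S, X)                          # A's response
    return hmac.compare_digest(Y, F(S, X))

def encrypt(S: bytes, M: bytes):
    # Encryption: A picks fresh X and sends <X, F_S(X) XOR M>.
    X = secrets.token_bytes(16)
    pad = F(S, X)[:len(M)]
    return X, bytes(p ^ m for p, m in zip(pad, M))

def decrypt(S: bytes, X: bytes, C: bytes) -> bytes:
    pad = F(S, X)[:len(C)]
    return bytes(p ^ c for p, c in zip(pad, C))

S = secrets.token_bytes(32)
X, C = encrypt(S, b"attack at dawn")
assert identify(S) and decrypt(S, X, C) == b"attack at dawn"
```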

53
Goal
  • Construct an ensemble {Fk | k ∈ L} such that
  • for any {tk, 1/εk, qk | k ∈ L} polynomial in k,
    for all but finitely many k's,
  • Fk is a (tk, εk, qk)-pseudo-random family

54
Construction
  • Construction via Expansion
  • Expand n or m
  • Direct constructions

55
Effects of Concatenation
  • Given l functions F1, F2, …, Fl, decide whether
    they are
  • l random and independent functions
  • OR
  • FS1, FS2, …, FSl for S1, S2, …, Sl ∈R {0,1}^k
  • Claim: if Fk = {FS | S ∈ {0,1}^k} is
    (t,ε,q)-pseudo-random,
  • cannot distinguish the two cases
  • using q queries
  • in time t' = t - l·q
  • with advantage better than l·ε

56
Proof Hybrid Argument
  • i=0:  FS1, FS2, …, FSl                     p0
  • i:    R1, R2, …, Ri-1, FSi, FSi+1, …, FSl  pi
  • i=l:  R1, R2, …, Rl                        pl
  • |pl - p0| ≥ ε  ⇒  ∃i: |pi+1 - pi| ≥ ε/l

57
...Hybrid Argument
  • Can use this i to distinguish whether
  • FS ∈R Fk or FS ∈R R(n → m):
  • Generate FSi+1, …, FSl
  • Answer queries to the first i-1 functions at random
    (consistently)
  • Answer queries to FSi using the (black box) input
  • Answer queries to functions i+1 through l with
    FSi+1, …, FSl
  • Running time of the test - t' + l·q

58
Doubling the domain
  • Suppose F(n): {0,1}^k x {0,1}^n → {0,1}^m
    is (t,ε,q)-p.r.
  • Want F(n+1): {0,1}^k x {0,1}^(n+1) → {0,1}^m
    which is (t,ε',q)-p.r.
  • Use G: {0,1}^k → {0,1}^2k which is (t,ε)-p.r.:
  • G(S) = G0(S) G1(S)
  • Let FS(n+1)(b x) = FGb(S)(n)(x)

59
Claim
  • If G is (t+q, ε1)-p.r. and F(n) is (t+2q, ε2, q)-p.r.,
    then F(n+1) is (t, ε1+2ε2, q)-p.r.
  • Proof: three distributions:
  • (1) F(n+1)
  • (2) FS0(n), FS1(n) for independent S0, S1
  • (3) Random

(advantage Δ ≤ ε1 + 2ε2)
60
...Proof
  • Given that (1) and (3) can be distinguished with
    advantage ε1 + 2ε2, then either
  • (1) and (2) can be distinguished with advantage ε1:
  • G can be distinguished with advantage ε1
  • or
  • (2) and (3) can be distinguished with advantage 2ε2:
  • F(n) can be distinguished with advantage ε2
  • Running time of the test - t + q

61
Getting from G to F(n)
  • Idea: use a recursive construction:
  • FS(n)(bn bn-1 … b1)
  • = FGb1(S)(n-1)(bn-1 bn-2 … b1)
  • = Gbn(Gbn-1(… Gb1(S)) …)
  • Each evaluation of FS(n)(x): n invocations of G
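The recursion unrolls into a loop over the input bits; SHA-256 as the length-doubling G below is an illustrative stand-in, not a proven generator:

```python
import hashlib

def G(S: bytes):
    # Length-doubling generator G(S) = G0(S) || G1(S).  SHA-256 with a
    # domain-separation byte is only a stand-in assumption here.
    return (hashlib.sha256(S + b"\x00").digest(),
            hashlib.sha256(S + b"\x01").digest())

def ggm(S: bytes, bits):
    # GGM evaluation: walk down the tree, replacing the seed by the
    # G0 or G1 half according to each input bit -- n invocations of G
    # per evaluation, consumed here in the order the bits are given.
    for b in bits:
        S = G(S)[b]
    return S

assert ggm(b"seed", [0]) == G(b"seed")[0]
assert ggm(b"seed", [0, 1]) == G(G(b"seed")[0])[1]
```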

62
Tree Description
[Tree diagram: root S, children G0(S) and G1(S), then G0(G0(S)), … .
Each leaf corresponds to an input X; the label on the leaf, e.g.
G1(G0(G0(S))), is the value of the pseudo-random function.]
63
Security claim
  • If G is (t+qn, ε)-p.r.,
  • then F(n) is (t, ε' = n·q·ε, q)-p.r.
  • Proof: hybrid argument by levels:
  • Di:
  • truly random labels for the nodes at level i,
  • pseudo-random from level i down
  • Each Di is a collection of q functions
  • ∃i: |pi+1 - pi| ≥ ε'/n = q·ε

64
Hybrid
[Diagram: hybrid Di - the top i levels of the tree hold truly random
seeds (S0, S1, …); the remaining n-i levels are computed with G, e.g.
G0(S0), G1(G0(S0)).]
65
Proof of Security
  • Can use this i to distinguish the concatenation of
    q sequence generators G from random.
  • The concatenation is (t, q·ε)-p.r.
  • Therefore the construction is (t, ε', q)-p.r.

66
Disadvantages
  • Expensive - n invocations of G
  • Sequential
  • Deterioration of ε
  • But it does the job!
  • From any pseudo-random sequence generator we can
    construct a pseudo-random function.
  • Theorem: one-way functions exist if and only if
    pseudo-random functions exist.

67
Applications of Pseudo-random Functions
  • Learning Theory - lower bounds:
  • cannot PAC-learn any class containing
    pseudo-random functions
  • Complexity Theory - impossibility of natural
    proofs for separating classes.
  • Any setting where a huge shared random string is
    useful
  • Caveat: what happens when the seed is made
    public?

68
References
  • Blum and Micali, SIAM J. Computing, 1984
  • Yao
  • Blum, Blum and Shub, SIAM J. Computing, 1988
  • Goldreich, Goldwasser and Micali, J. of the ACM,
    1986

69
...References
  • Books:
  • O. Goldreich, Foundations of Cryptography - a
    book in three volumes.
  • Vol 1, Basic Tools, Cambridge, 2001
  • Pseudo-randomness, zero-knowledge
  • Vol 2, about to come out
  • (Encryption, Secure Function Evaluation)
  • Other volumes at
    www.wisdom.weizmann.ac.il/oded/books.html
  • M. Luby, Pseudorandomness and Cryptographic
    Applications, Princeton University Press

70
References
  • Web material/courses:
  • S. Goldwasser and M. Bellare, Lecture Notes on
    Cryptography,
  • http://www-cse.ucsd.edu/mihir/papers/gb.html
  • Wagner/Trevisan, Berkeley:
  • www.cs.berkeley.edu/daw/cs276
  • Ivan Damgard and Ronald Cramer, Cryptologic
    Protocol Theory:
  • http://www.daimi.au.dk/ivan/CPT.html
  • Salil Vadhan, Pseudorandomness:
  • http://www.courses.fas.harvard.edu/cs225/Lectures-2002/
  • Naor, Foundations of Cryptography and Estonian
    Course:
  • www.wisdom.weizmann.ac.il/naor