Title: Forward-Security in the Limited Communication Model
1. Forward-Security in the Limited Communication Model
Warsaw University and CNR Pisa
2. This talk
- A brief introduction to the Limited Communication Model.
- The talk is mostly based on the following papers:
  - [D06a] S. Dziembowski, Intrusion-Resilience via the Bounded-Storage Model, TCC 2006
  - [D06b] S. Dziembowski, On Forward-Secure Storage, accepted to CRYPTO 2006
3. Idea of the Limited Communication Model
- Main idea:
  - Construct cryptographic protocols whose secrets are so large that they cannot be efficiently stolen.
  - (details follow)
4. Plan
- Motivation and introduction
- Protocols for
  - session-key generation
  - entity authentication
  - secure storage
- Connections to other areas
5. Disclaimer
- This is a new area; work in progress!
6-7. (Figure: an adversary who learns the secret key K can impersonate the user to the bank.)
8. Question
- Can cryptography prevent such an attack?
9. Can we prevent this attack?
- If we have trusted hardware, then this attack is preventable.
- This hardware can be, e.g.:
  - a token,
  - a list of one-time passwords.
- Note that human-memorized passwords don't help (because the virus can record the keystrokes).
10. Assume there is no trusted hardware
- (Trusted hardware costs money and is not easy to use.)
- Can we have any security in this case?
- It looks quite hopeless...
- (If the adversary can take control over our machine, then he can do whatever he wants.)
11. The contribution of [D06a]
- We propose a cheap method that makes the adversary's task significantly harder.
- Idea:
  - Make the secret key K so large that the adversary cannot retrieve it.
- We consider:
  - entity authentication,
  - session-key generation.
12. The model (1/3)
- We assume that:
  - the secret key K is stored on a machine that can be infected by a virus;
  - the virus can perform an arbitrary computation on K, but the output U of this computation is shorter than K;
  - the adversary can retrieve U.
13. The model (2/3)
- As long as the machine is infected, the virus has full control over it.
- We want to have security in the periods when the virus is not controlling the machine.
- This is called intrusion resilience.
- We assume that the virus cannot modify the data on the machine.
14. The model (3/3)
- What else can the adversary do?
- She can perform active attacks: eavesdrop, substitute, and fabricate messages.
15. A tool: the Bounded-Storage Model
- It turns out that this is related to the Bounded-Storage Model (BSM) [Maurer 1992].
- In the BSM, the security of the protocols is based on the assumption that one can broadcast more bits than the adversary can store.
- The computing power of the adversary may be unlimited!
16. Some cryptographic tools
17. Symmetric encryption
(Figure: a key generation algorithm gives the same secret key K to both parties; the sender transmits the ciphertext C = Encr(K, M); Eve observes C but should learn nothing about M.)
18. Indistinguishability
- What does it mean that Eve has no information about M?
- To say "she cannot guess M" is not enough...
- Consider the following game:
  - The adversary A selects messages M0, M1 and sends them to the oracle.
  - The oracle selects a random bit b and a random key K, and sends C = E(K, Mb) to the adversary.
  - The adversary outputs a guess b'.
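As a concrete illustration, the game above can be sketched in Python; here a one-time pad stands in for the cipher and the adversary is a pair of callbacks (all names below are mine, not from the talk):

```python
import secrets

def otp_encrypt(key: bytes, msg: bytes) -> bytes:
    # stand-in cipher for the game: a one-time pad
    return bytes(k ^ m for k, m in zip(key, msg))

def ind_game(adversary, encrypt, msg_len=16):
    """One round of the indistinguishability game from this slide.
    Returns True iff the adversary's guess b' equals the oracle's bit b."""
    m0, m1 = adversary["choose"](msg_len)       # A selects M0, M1
    b = secrets.randbelow(2)                    # the oracle picks a random bit b
    key = secrets.token_bytes(msg_len)          # ... and a fresh random key K
    c = encrypt(key, (m0, m1)[b])               # C = E(K, Mb)
    return adversary["guess"](c) == b           # A must guess b from C alone

# A naive adversary that always answers 0 wins with probability ~1/2,
# which is exactly what security demands: nobody should do much better.
naive = {"choose": lambda n: (bytes(n), b"\xff" * n),
         "guess": lambda c: 0}
wins = sum(ind_game(naive, otp_encrypt) for _ in range(2000))
```

Security of a scheme means that every efficient adversary wins this game with probability only negligibly better than 1/2.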
19. One-time pad encryption
- Disadvantage: the key is as long as the message!
- [Shannon 1949]: this is optimal, unless we limit the power of the adversary in some way.
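A minimal sketch of the one-time pad, illustrating both the scheme and its disadvantage (the key must be at least as long as the message):

```python
import secrets

def otp(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts
    assert len(key) >= len(data), "the pad must be at least message-length"
    return bytes(k ^ d for k, d in zip(key, data))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # fresh random pad, never reused
ct = otp(key, msg)
assert otp(key, ct) == msg            # decryption recovers the message
```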
20. Message Authentication Codes
- Observation: encryption does not guarantee the integrity of a message (example: the one-time pad).
- To ensure integrity, one has to use MACs (message authentication codes).
- Eve can modify the transmitted messages.
- The sender transmits (M, MAC(K, M)); the receiver verifies the MAC and accepts only if it is correct.
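A short illustration with HMAC-SHA256, a standard MAC (the talk does not prescribe a particular MAC; this is just an example):

```python
import hashlib, hmac, secrets

key = secrets.token_bytes(32)
msg = b"transfer 100 EUR to account 42"
tag = hmac.new(key, msg, hashlib.sha256).digest()     # send (M, MAC(K, M))

# the receiver recomputes the tag and compares in constant time
assert hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).digest())

# a message modified by Eve no longer verifies
forged = b"transfer 999 EUR to account 66"
assert not hmac.compare_digest(tag, hmac.new(key, forged, hashlib.sha256).digest())
```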
21. Public-key encryption
(Figure: a key generation algorithm outputs Bob's public key e and Bob's secret key d; the sender transmits the ciphertext C = Encr(e, M).)
22. How to limit the power of the adversary
- Classical cryptography: limit the adversary to polynomial time.
  - Disadvantage: we don't know how to prove any security here.
- Information-theoretic cryptography: assume
  - quantum communication,
  - bounded storage, or
  - noisy channels.
  - Advantage: provable security!
23. The Bounded-Storage Model (BSM)
(Figure: the parties share a short initial key Y; a huge randomizer R is broadcast and then disappears; the adversary knows U = h(R); a key X is derived from Y and R. Eve shouldn't be able to distinguish X from random.)
24. Power of the adversary
- Note
- The only resource that we bound is memory.
- The computing power is unlimited!
25. BSM: previous results
- Several key-expansion functions f were proven secure [DR02, DM04b, Lu04, Vad04].
- Of course, their security depends on the bound on the memory of the adversary.
- We call a function s-secure if it is secure against an adversary that has memory of size s.
26. The scheme of [DM02]
(Figure: the randomizer is a long random bit string; the key selects a subset of its positions, and the derived key is the XOR of the selected bits.)
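A toy sketch of a key expansion in this spirit: the short key Y is a list of bit positions in the huge public randomizer R, and the derived bit is the XOR of the bits at those positions. (Illustrative only; the actual [DM02] function and its security parameters are in the cited papers.)

```python
import secrets

def bsm_derive(R, positions):
    """XOR together the bits of the randomizer R at the given positions."""
    bit = 0
    for pos in positions:
        bit ^= (R[pos // 8] >> (pos % 8)) & 1
    return bit

R = secrets.token_bytes(1 << 20)                        # "huge" randomizer (1 MiB toy)
Y = [secrets.randbelow(8 * len(R)) for _ in range(64)]  # short secret key: 64 positions
X = bsm_derive(R, Y)                                    # one derived key bit
```

An adversary who stored only a bounded function of R is unlikely to know all 64 selected bits; the s-security results quantify exactly this.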
27. End of the introduction to cryptography
28. How is the BSM related to our model?
- It seems that the assumptions are opposite.
29. Entity authentication: the problem
- The user can verify the authenticity of the bank, but the bank cannot verify the authenticity of the user.
30. Entity authentication: the solution
(Figure: a challenge-response protocol in which the bank verifies the user's reply.)
- The communication is done via the channel C.
31. Security of the entity authentication protocol (1/3)
- Clearly, as long as the adversary is controlling Alice's machine, she can impersonate her.
- But what happens when the adversary loses control of the user's machine?
32. Security of the entity authentication protocol (2/3)
33. Security of the entity authentication protocol (3/3)
- What about active attacks?
- Since the communication is done via the channel C, the only thing that the adversary can do is act as a wire.
34. Session-key generation
- Entity authentication protocols without key generation are often not very useful.
- It is much more practical to have a session-key generation protocol.
35. Session-key generation
(Figure: Alice and Bob share a long-term key K and run the protocol.)
36. Intrusion-resilient session-key generation
- Compromised sessions: those during which the adversary had a virus installed on the machine of at least one of the users.
(Timeline figure: compromised and non-compromised sessions alternate over time.)
- Clearly, the key of a compromised session leaks to the adversary.
- We want the keys generated in non-compromised sessions to remain secret!
37. Intrusion resilience = backward + forward security
38. Forward-secure session-key generation (standard method)
- (Encr, Decr): a public-key encryption scheme.
- Long-term key: a key K for a MAC.
(Figure: the receiver decrypts the session key Z from C using an ephemeral secret key d, then erases d.)
39. Our protocol
- Outline:
  - We achieve forward security in the standard way.
  - Only the backward security is novel.
- Challenge:
  - How to generate the key K (for authentication) in a backward-secure way?
40. A (slightly wrong) idea
41. Security
42. Security proof: attempt (1/2)
43. Security proof: attempt (2/2)
44. How the adversary can influence the outcome of the protocol
(Figure: a long random bit string, as on slide 26, illustrating the adversary's influence on the derived bits.)
45. Repairing the protocol
- How to repair the protocol?
- Our idea: add hashing.
46. The Random Oracle Model
- We model the hash function as a random oracle containing a truly random function.
- The oracle can be queried by the honest users and by the adversary.
47. Finalizing the proof
- So the adversary has no information about Ka and Kb.
- If Ka = Kb, we are done!
- Otherwise, Ka and Kb are independent. Alice and Bob will detect this when they use Ka and Kb in a MAC scheme.
48. Improvements
- The random-oracle assumption was removed by Cash et al., Cryptology ePrint Archive, Report 2005/409.
- They also introduced the name Limited Communication Model.
49. Independent work
- Giovanni Di Crescenzo, Richard Lipton, and Shabsi Walfish, Perfectly Secure Password Protocols in the Bounded Retrieval Model, TCC 2006.
- Main difference: the adversary can retrieve only individual bits.
50. Example
- The function f of [DM02] was proven secure when the memory of the adversary has a size of around 8% of the length of the randomizer.
- In our case the players need to store 2 randomizers, so the protocol is secure if the adversary cannot retrieve 4% of the key.
- Example: if |K| = 5 GB, then we can allow her to retrieve 200 MB.
- This can probably be improved significantly...
51. Practicality?
- Note: the trusted server can generate the key pseudorandomly and just store the seed.
52. The contribution of [D06b]: Forward-Secure Storage (FSS)
- C = Encr(K, M) is stored; M = Decr(K, C).
- Eve can compute any value U = h(C) with |U| << |C| and retrieve U.
- Even if she later learns K, she should learn nothing about M.
53. How realistic is the scenario that the key K leaks?
- The encryption scheme can be broken. We model this by granting the adversary unlimited computing power. This is called the information-theoretic model.
- The user can lose the key, especially if she uses it for a longer period. In this case we assume that the computing power is limited (polynomial). This is called the computational model. (A weaker model, but it allows more efficient solutions.)
54. Formal security definition of the FSS (1/2)
- Consider the following game between an adversary A and an oracle:
  - A selects two messages M0, M1 and sends them to the oracle.
  - The oracle selects a random bit b and a random key K, computes C = Encr(K, Mb), and sends it to the adversary A.
  - A stores an arbitrary U = h(C) (with |U| << |C|).
  - The oracle sends K to the adversary.
  - A outputs a guess b'.
55. Formal security definition of the FSS (2/2)
- We require that the adversary has no chance of guessing b correctly with probability significantly better than 0.5.
- (In the computational case we will also assume that the adversary performs only poly-time computations.)
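The game can be sketched in Python. To see why a definition of this shape has teeth, the sketch also shows that a plain stream cipher (a stand-in of mine, not a scheme from the talk) loses the game every time: the adversary stores a short prefix of C, and once K leaks she decrypts it and identifies Mb.

```python
import hashlib, secrets

def stream_encrypt(key, msg):
    # an ordinary stream cipher: a stand-in, and NOT a secure FSS (see below)
    ks = hashlib.shake_256(key).digest(len(msg))
    return bytes(k ^ m for k, m in zip(ks, msg))

def fss_game(encrypt, adversary, msg_len=1024, mem_bound=16):
    """One round of the FSS game; True iff the adversary guesses b."""
    m0, m1 = adversary["choose"](msg_len)
    b = secrets.randbelow(2)
    key = secrets.token_bytes(32)
    c = encrypt(key, (m0, m1)[b])
    u = adversary["compress"](c)          # phase 1: store U = h(C), |U| << |C|
    assert len(u) <= mem_bound
    return adversary["guess"](u, key, m0, m1) == b   # phase 2: K is revealed

# against a stream cipher the adversary always wins: the stored 16-byte
# prefix of C plus the leaked key K decrypt a prefix that identifies Mb
adv = {
    "choose":   lambda n: (b"A" * n, b"B" * n),
    "compress": lambda c: c[:16],
    "guess":    lambda u, k, m0, m1: 0 if bytes(
        x ^ y for x, y in zip(hashlib.shake_256(k).digest(16), u)) == m0[:16] else 1,
}
wins = sum(fss_game(stream_encrypt, adv) for _ in range(200))   # wins == 200
```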
56. Information-theoretic solution: a wrong idea
57. Can it work?
- By Shannon's theorem it cannot be correct!
- Why?
- Because the key is shorter than the message...
- So it can be broken even without the assumption that the key leaks.
58. A better idea
- Key: K = (Y, Z).
- Ciphertext of a message M: C = (R, f(Y, R) xor Z xor M), where R is a fresh huge randomizer and f is a BSM key-expansion function.
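A toy rendering of this scheme, under the assumption that the ciphertext has the form C = (R, f(Y, R) xor Z xor M) with key K = (Y, Z); the function f below is a simplistic block-XOR stand-in for a real BSM key-expansion function:

```python
import secrets

BLK = 32   # message length in this toy (the real scheme is parameterized)

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def f(Y, R):
    # simplistic BSM-style expansion: XOR together the BLK-byte blocks of the
    # randomizer R whose indices are listed in the short key component Y
    out = bytes(BLK)
    for i in Y:
        out = xor(out, R[i * BLK:(i + 1) * BLK])
    return out

def fss_encrypt(key, msg):
    Y, Z = key
    R = secrets.token_bytes(1024 * BLK)   # fresh huge randomizer, part of C
    return R, xor(xor(f(Y, R), Z), msg)   # C = (R, f(Y,R) xor Z xor M)

def fss_decrypt(key, c):
    Y, Z = key
    R, body = c
    return xor(xor(f(Y, R), Z), body)

key = ([secrets.randbelow(1024) for _ in range(8)], secrets.token_bytes(BLK))
msg = b"the vault combination is 42!".ljust(BLK)
assert fss_decrypt(key, fss_encrypt(key, msg)) == msg
```

Intuitively, an adversary who stored only a bounded function of C cannot reconstruct f(Y, R) even after learning (Y, Z), because R is gone.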
59. Security proof (1/4)
- Suppose A is an adversary that breaks the FSS scheme, i.e., wins the distinguishing game with probability 0.5 + a, for some non-negligible a.
- We construct an adversary B that breaks the function f in the following sense: B gets a string X, equal either to f(Y, R) or to a random string, and guesses which of the two cases occurred.
- The adversary B does this by simulating the adversary A.
60. Security proof (2/4)
(Figure: the reduction.)
- A sends M0, M1 to B.
- B selects a random string W, obtains the randomizer R (in the BSM game), sends (R, W) to A, and stores the internal state of A.
- B obtains K (in the FSS game) and the challenge string X.
- B sends (K, Z) to A.
- If A guesses b correctly, then B outputs "X = f(Y, R)"; otherwise B outputs "X is random".
61. Security proof (3/4)
- Observation 1: If X = f(Y, R), then the adversary A guesses b correctly with probability at least 0.5 + a.
- Observation 2: If X is random, then the adversary A guesses b correctly with probability exactly 0.5.
- Proof:
  - From the point of view of the adversary, the random variables (K, Z = X xor Mb) and b are independent! QED
62. Security proof (4/4)
- With probability 0.5 the challenge is X = f(Y, R), and with probability 0.5 it is random.
- If X = f(Y, R): A is right with probability 0.5 + a, wrong with probability 0.5 - a.
- If X is random: A is right with probability 0.5, wrong with probability 0.5.
- B is right when X = f(Y, R) and A is right, or when X is random and A is wrong, so B succeeds with probability 0.5 * (0.5 + a) + 0.5 * 0.5 = 0.5 + a/2.
63. A problem with the information-theoretic scheme
- The secret key needs to be larger than the message!
- What if we want the key to be shorter?
64. Computational FSS (with a short key): Idea 1
- (Encr, Decr): an IT-secure FSS.
- Idea 1: use a short secret key K and expand it pseudorandomly to a longer key K'.
(Figure: K -> cryptographic pseudorandom generator -> K'; store Encr(K', M).)
65. (A tool: cryptographic PRGs)
- A cryptographic pseudorandom generator is a function that expands a short seed s into a much longer string x, in such a way that:
  - x cannot be distinguished from a random string (in poly-time),
  - (assuming that s was chosen uniformly at random).
66. Idea 1: intuition
- Idea 1 should work because, from the point of view of the adversary, K' is indistinguishable from uniform.
- It turns out that this intuition is wrong...
- (although probably most PRGs work well here)
67. Example (1/3)
- Suppose that we stored on a computer:
  - a long string R = (R1, ..., Rt) of length t,
  - an encrypted index i in {1, ..., t}: E(K, i) (where E is some encryption scheme).
- An adversary gets access to this computer and can retrieve t/2 bits. Later, she learns K. Can she now compute Ri?
- With probability 0.5: trivial. It turns out this can be done with probability 1!!! (for a very weird encryption function E)
68. A tool: Private Information Retrieval (PIR)
- The user has an index i in {1, ..., t}; the database has a string R = (R1, ..., Rt).
- The user generates and stores some secret N and sends the query Q(N, i).
- The database replies with the answer A(Q(N, i), R).
- From A(Q(N, i), R) and N, the user computes Ri.
- Properties:
  - the database doesn't learn i,
  - |A(Q(N, i), R)| << t.
69. PIR: implementation
- See, e.g.: Eyal Kushilevitz and Rafail Ostrovsky, Replication Is Not Needed: Single Database, Computationally-Private Information Retrieval, FOCS '97.
70. The PIR of [KO97]
71. Quadratic residues
- Z*N: the group of numbers in {1, ..., N} relatively prime to N.
- A number a is a quadratic residue modulo N (written a in QR(N)) if there exists b such that a = b^2 mod N.
- Suppose N = pq, where p and q are large primes.
(Figure: Z*N splits into QR(N) and QNR(N).)
72. Facts about quadratic residues
- Given the factorization p, q of N it is easy to:
  - generate random elements of QR(N) and QNR(N),
  - decide membership in QR(N) and QNR(N).
- The second task is hard without the knowledge of the factorization of N.
- A product ab is a quadratic non-residue iff exactly one of a, b is a quadratic residue.
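These facts can be checked numerically for a small prime (the scheme itself uses a composite N = pq with the factorization hidden; Euler's criterion below plays the role of the easy membership test):

```python
import random

p = 1019   # a small prime for illustration; the scheme uses N = p*q, p and q secret

def is_qr(a):
    # Euler's criterion: a is a QR mod p iff a^((p-1)/2) = 1 (mod p)
    return pow(a % p, (p - 1) // 2, p) == 1

qrs  = [a for a in range(1, p) if is_qr(a)]
qnrs = [a for a in range(1, p) if not is_qr(a)]
assert len(qrs) == len(qnrs) == (p - 1) // 2    # the two classes split evenly

# the product rule: ab is a non-residue iff exactly one factor is a residue
assert not is_qr(random.choice(qrs) * random.choice(qnrs))   # QR  * QNR -> QNR
assert is_qr(random.choice(qrs) * random.choice(qrs))        # QR  * QR  -> QR
assert is_qr(random.choice(qnrs) * random.choice(qnrs))      # QNR * QNR -> QR
```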
73. The PIR of [KO97]: actions of the user
- Basic idea: arrange the string R into a square s x s matrix, where s = sqrt(t); the index i lies in some row d and column c.
- The user, on input i:
  - generates a random N (we will work in Z*N),
  - produces a1, ..., as such that only ac is a QNR,
  - sends N, a1, ..., as to the database.
74. The PIR of [KO97]: actions of the database
- For each row of R the database computes a product of the aj, squaring aj exactly when the corresponding bit is 1, e.g.:
  - b1 = a1 * a2^2 * a3^2 * a4 * a5^2 * a6
  - b2 = a1^2 * a2 * a3 * a4 * a5^2 * a6^2
  - b3 = a1 * a2^2 * a3^2 * a4^2 * a5 * a6
- The result bd is a QR iff Ri = 1.
- The database sends b1, ..., bs to the user.
75. The PIR of [KO97]: final remarks
- The user just looks at bd and decides that:
  - Ri = 1 if bd is a QR,
  - Ri = 0 otherwise.
- Security follows from the fact that the database cannot distinguish QRs from QNRs.
76. The PIR of [KO97]: the end
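The protocol of slides 73-75 can be run end-to-end on a toy instance. The primes, matrix size, and helper names below are mine; with the factorization the user can test quadratic residuosity, which the database cannot:

```python
import random

def legendre(a, pr):
    # Euler's criterion: 1 if a is a QR mod the prime pr, pr-1 otherwise
    return pow(a % pr, (pr - 1) // 2, pr)

def is_qr(a, p, q):
    # membership in QR(N) is easy given the factorization N = p*q
    return legendre(a, p) == 1 and legendre(a, q) == 1

def sample(p, q, want_qr):
    # rejection-sample a QR, or a non-residue mod both factors
    N = p * q
    while True:
        a = random.randrange(2, N)
        if a % p == 0 or a % q == 0:
            continue
        if want_qr and is_qr(a, p, q):
            return a
        if not want_qr and legendre(a, p) == p - 1 and legendre(a, q) == q - 1:
            return a

def pir_query(p, q, s, col):
    # one QNR at the target column, QRs elsewhere; without p, q the
    # database cannot tell which entry is the QNR, so it learns nothing
    return [sample(p, q, want_qr=(j != col)) for j in range(s)]

def pir_answer(N, R, query):
    # per slide 74: square a_j exactly when the corresponding bit is 1
    answers = []
    for row in R:
        b = 1
        for a, bit in zip(query, row):
            b = b * (a * a if bit == 1 else a) % N
        answers.append(b)
    return answers

p, q = 1019, 1031                     # toy primes; real PIR needs huge ones
R = [[1, 0, 1],
     [0, 0, 1],
     [1, 1, 0]]                       # the database, as a 3x3 bit matrix
col = 2                               # the user wants column 2
ans = pir_answer(p * q, R, pir_query(p, q, 3, col))
recovered = [1 if is_qr(b, p, q) else 0 for b in ans]   # b_d is a QR iff bit = 1
```

The user gets the whole target column while sending and receiving only s values each, i.e., about sqrt(t) group elements.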
77. Example (2/3)
- Setup: a long string R = (R1, ..., Rt) of length t, and an encrypted index E(K, i).
- We construct the (weird) encryption scheme (E, D).
- Ingredients:
  - (E', D'): some ordinary encryption scheme with a key K,
  - a PIR scheme.
- The ciphertext E(K, i) contains the PIR query Q(N, i), for the user's secret N, together with an ordinary encryption of the index under K.
- This is secure...
78. Example (3/3)
- The adversary simulates the PIR database: she applies the query Q(N, i) to the long string R = (R1, ..., Rt) and stores the short answer A(Q(N, i), R).
- Later, when she learns K (and hence the user's secret N), she computes Ri from A(Q(N, i), R).
79. Idea 2
- Recall Idea 1: expand the key K with a cryptographic pseudorandom generator to K' and store Encr(K', M), where (E, D) is some encryption scheme.
- Idea 2: pick a random message M'; store Encr(K, M') together with E(M', M), i.e., use the random message M' as the key for the encryption scheme (E, D).
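A sketch of Idea 2, with two loud assumptions of mine: the IT-secure FSS component is modelled by a plain one-time pad (which discards the actual FSS property) and E is a SHAKE-based stream cipher. The sketch shows only the shape of the hybrid, not its security:

```python
import hashlib, secrets

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def stream(key, n):
    # E: a SHAKE-based stream cipher keyed by M' (a stand-in choice)
    return hashlib.shake_256(key).digest(n)

def idea2_encrypt(K, M):
    # pick a short random message M', "FSS-encrypt" it under K (modelled here
    # by a one-time pad), and use M' as the key of E to encrypt the real M
    M_prime = secrets.token_bytes(len(K))
    return xor(K, M_prime), xor(stream(M_prime, len(M)), M)

def idea2_decrypt(K, C):
    c1, c2 = C
    M_prime = xor(K, c1)
    return xor(stream(M_prime, len(c2)), c2)

K = secrets.token_bytes(32)                    # short secret key
M = b"a long secret document ... " * 40        # long message
assert idea2_decrypt(K, idea2_encrypt(K, M)) == M
```

The point of the construction is that the FSS component only ever encrypts a short random M', so its key need not be longer than the real message.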
80. Idea 2: security proof (sketch)
- Suppose we have an adversary A that breaks the scheme.
- Consider a modified scheme: (Encr(K, L), E(M', M)), where L is an independent random message.
- From the security of (E, D), A cannot break the modified scheme.
- Hence from A we can construct an adversary that breaks the underlying FSS scheme (Encr, Decr)!
81. A complexity-theoretic view on encryption
- The adversary knows C, knows that M = M0 or M = M1, and wants to decide whether there exists a K such that C = Encr(K, M0).
- This is an NP language!
- Observe that if |M| >> |K|, then the probability that, for random M0, M1 and a random K0, there exists K1 such that Encr(K0, M0) = Encr(K1, M1) is negligible.
(Figure: over the key space, the sets of ciphertexts reachable from M0 and from M1 barely overlap.)
82. A complexity-theoretic view
- Classical encryption exists => P != NP.
- In the case of FSS the adversary:
  - stores some information about C (compresses it),
  - later obtains the witness K.
- FSS exists => P != NP and there exist NP problems that are incompressible.
83. Compressibility of NP instances
- This notion was recently studied in:
  - Danny Harnik, Moni Naor, On the Compressibility of NP Instances and Cryptographic Applications, ECCC Report TR06-022, 2006.
- See also:
  - Bella Dubrov, Yuval Ishai, On the Randomness Complexity of Efficient Sampling, STOC 2006.
84. The contribution of [HN06]
- They describe a hierarchy of incompressible NP languages.
- They show several implications, for cryptography and for complexity theory, of the assumption that certain languages are incompressible.
85. The idea of [HN06]
- Def.: An NP language L is compressible to L' if there exists a poly-time algorithm Z such that:
  - Z(x) in L' <=> x in L,
  - the length of Z(x) is polynomial in |w| and in log |x|, where w is the witness of x.
- Compression is witness-retrievable if one can compute (in poly-time) the witness for Z(x) from w and Z(x).
- (Observe that we need witness-retrievability in our case!)
86. The end