Robust Declassification - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Robust Declassification


1
Robust Declassification
  • Steve Zdancewic
  • Andrew Myers
  • Cornell University

2
Information Flow Security
Information flow policies are a natural way to specify precise, system-wide,
multi-level security requirements. Enforcement mechanisms are often too
restrictive and prevent desired policies, so information flow controls provide
declassification mechanisms to accommodate intentional leaks. But this makes it
hard to understand end-to-end system behavior.
3
Declassification
Declassification (downgrading) is the intentional
release of confidential information.
Policy governs use of declassification operation.
4
Password Example
(Diagram: a public query and a confidential password flow into the Password Checker, which produces a public result.)
This system includes declassification.
5
Attack: Copy data into the password
(Diagram: the attacker copies other high-security data into the confidential password slot; a public query to the Password Checker then yields a leaked result.)
The attacker can launder data through the password checker.
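To make the laundering concrete, here is a minimal Python sketch. Everything in it is hypothetical: `declassify` stands for the single intended release point (the boolean match result), not an API of any particular system, and `stolen_secret` stands for the unrelated high-security data the attacker copies into the password slot.

```python
# Illustrative sketch of the laundering attack (hypothetical names throughout).

def declassify(value):
    # The one intended release point: policy allows leaking only whether
    # the guess matched the stored password.
    return value

def check_password(public_query, confidential_password):
    # Declassifies one bit of information about the password per query.
    return declassify(public_query == confidential_password)

# Intended use: the observer learns only whether the guess was right.
ok = check_password("hunter2", confidential_password="hunter2")

# The attack: copy unrelated high-security data into the password slot,
# then probe it with public queries. Each query launders information about
# the secret through the declassification inside check_password.
stolen_secret = 1234                      # stands for any high-security datum
for guess in range(10000):
    if check_password(guess, confidential_password=stolen_secret):
        leaked = guess                    # the secret is now public
        break
```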
6
Robust Declassification
Goal: Formalize the intuition that an attacker should not be able to abuse the
downgrade mechanisms provided by the system to cause more information to be
declassified than intended.
7
How to Proceed?
  • Characterize what information is declassified.
  • Make a distinction between intentional and
    unintentional information flow.
  • Explore some of the consequences of robust
    declassification.

8
A Simple System Model
A system is a pair (S, →): a set of states S together with a transition relation → on S.
9
Views of a System
A view of (S, →) is an equivalence relation on S.
Example: S = String × Integer, "integer component is visible":
(x, i) ≈_I (y, j)  iff  i = j
("attack at dawn", 3) ≈_I ("retreat", 3)
("attack at dawn", 3) ≉_I ("retreat", 4)
10
Example Views
Example: S = String × Integer
"string component is visible":  (x, i) ≈_S (y, j)  iff  x = y
"integer is even or odd":       (x, i) ≈_E (y, j)  iff  i mod 2 = j mod 2
"complete view":                (x, i) ≈_⊤ (y, j)  iff  (x, i) = (y, j)
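One concrete way to work with these views is to induce each equivalence relation from a "key" function: two states are related exactly when their keys agree. The short Python sketch below uses that encoding (the function names are my own) and checks the instances from the previous slide.

```python
# Views on S = String x Integer, each induced by a key function:
# two states are equivalent iff their keys coincide.

def related(key, s, t):
    return key(s) == key(t)

view_I   = lambda s: s[1]        # "integer component is visible"
view_S   = lambda s: s[0]        # "string component is visible"
view_E   = lambda s: s[1] % 2    # "integer is even or odd"
view_TOP = lambda s: s           # "complete view"

assert related(view_I, ("attack at dawn", 3), ("retreat", 3))
assert not related(view_I, ("attack at dawn", 3), ("retreat", 4))
assert related(view_E, ("attack at dawn", 3), ("retreat", 5))    # both odd
```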
11
Passive Observers
A view induces an observation of a trace:
t1 = ("x", 1) → ("y", 1) → ("z", 2) → ("z", 3);  t1 through view ≈_I:  1 → 1 → 2 → 3
t2 = ("a", 1) → ("b", 2) → ("z", 2) → ("c", 3);  t2 through view ≈_I:  1 → 2 → 2 → 3
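In the same key-function encoding, observing a trace is just mapping the key over it. A tiny sketch reproducing the two observations on this slide:

```python
def observe(key, trace):
    # The trace as seen through the view induced by `key`.
    return [key(s) for s in trace]

view_I = lambda s: s[1]          # integer component is visible

t1 = [("x", 1), ("y", 1), ("z", 2), ("z", 3)]
t2 = [("a", 1), ("b", 2), ("z", 2), ("c", 3)]

assert observe(view_I, t1) == [1, 1, 2, 3]
assert observe(view_I, t2) == [1, 2, 2, 3]
```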
12
Observational Equivalence
  • The induced observational equivalence is S[≈].
  • s S[≈] s' if the traces from s look the same as the traces from s'
    through the view ≈ (slides 36-41 spell this out; a small sketch follows below).
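Slides 36-41 give the construction behind S[≈]: collect the traces from each state, view them modulo ≈, and compare the resulting observation sets up to stuttering. The sketch below is only a finite-depth approximation of that construction, for a finite system given as a successor map; the actual definition ranges over all (possibly infinite) traces.

```python
def traces(step, s, depth):
    # All paths from s with at most `depth` transitions: a finite-depth
    # approximation of the trace set Trc_s(S).
    succs = step.get(s, set())
    if depth == 0 or not succs:
        return [[s]]
    return [[s] + rest for t in succs for rest in traces(step, t, depth - 1)]

def stutter_reduce(seq):
    # Collapse adjacent repeats: the observer cannot count stuttering steps.
    out = []
    for x in seq:
        if not out or out[-1] != x:
            out.append(x)
    return out

def observations(step, key, s, depth):
    # Obs_s(S, ~): stutter-reduced observations of the (bounded) traces from s.
    return {tuple(stutter_reduce([key(t) for t in tr]))
            for tr in traces(step, s, depth)}

def obs_equiv(step, key, s1, s2, depth=6):
    # Approximation of s1 S[~] s2: the traces from s1 and s2 look the same
    # through the view induced by `key`.
    return observations(step, key, s1, depth) == observations(step, key, s2, depth)
```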

13
Simple Example
(Diagram: a small example system with a view ≈ of its states.)
14
Simple Example
(Diagram: the induced observational equivalence S[≈], a refinement of ≈.)
15
Why Did We Do This?
  • ≈ is a view of S indicating what an observer sees directly.
  • S[≈] is a view of S indicating what an observer learns by watching S evolve.

Need some way to compare them.
16
An Information Lattice
The views of a system S form a lattice. Order: ≈_A ⊑ ≈_B means that ≈_B reveals
at least as much information as ≈_A (as relations, ≈_B ⊆ ≈_A).
17
Lattice Order
≈_⊤ = ≈_I ⊔ ≈_S
(Hasse diagram of the example views: ≈_⊤ at the top, ≈_I and ≈_S below it, ≈_E below ≈_I.)
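In the key-function encoding, the join of two views (the least view above both) is induced by pairing their keys: states are related only when both views relate them. The check below, over a small assumed sample of states, illustrates the point of this slide: joining ≈_I and ≈_S gives the complete view ≈_⊤.

```python
from itertools import product

def join(key1, key2):
    # Join of two views: related only if related by both views.
    return lambda s: (key1(s), key2(s))

view_I   = lambda s: s[1]
view_S   = lambda s: s[0]
view_TOP = lambda s: s

states = list(product(["attack at dawn", "retreat"], [1, 2, 3]))
key_join = join(view_I, view_S)

for s, t in product(states, repeat=2):
    # ~I joined with ~S relates exactly the same pairs as the complete view.
    assert (key_join(s) == key_join(t)) == (view_TOP(s) == view_TOP(t))
```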
18
Information Learned via Observation
Lemma: ≈ ⊑ S[≈]
An observer only learns information by watching the system (can't lose information).
(Lattice diagram: S[≈] sits at or above ≈.)
19
Natural Security Condition
Definition: A system S is ≈-secure whenever ≈ = S[≈].
Closely related to standard definitions of non-interference style properties.
(Lattice diagram: ≈ and S[≈] coincide.)
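Reusing the `obs_equiv` sketch from slide 12, the ≈-security condition can be tested (up to the depth bound) by asking whether watching the system ever separates two states the view already identifies. The tiny example system below is my own and is deliberately insecure: the hidden first component flows into the visible second component.

```python
def is_secure(states, step, key, depth=6):
    # ~-security, approximately: whenever the view identifies s and t,
    # watching the system must not tell them apart (i.e. ~ = S[~]).
    return all(obs_equiv(step, key, s, t, depth)
               for s in states for t in states
               if key(s) == key(t))

# A leaky example: the secret first component flows into the public second one.
states   = [("a", 0), ("b", 0), ("a", 1), ("b", 2)]
step     = {("a", 0): {("a", 1)}, ("b", 0): {("b", 2)}}
view_pub = lambda s: s[1]               # only the second component is visible

assert not is_secure(states, step, view_pub)   # ("a",0) and ("b",0) get separated
```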
20
Declassification
Declassification is intentional leakage of information. It implies that ≈ ≠ S[≈].
We want to characterize unintentional declassification.
21
A Simple Attack Model
An ≈_A-attack is a system A = (S, →_A) such that ≈_A = A[≈_A].
≈_A is the attacker's view. →_A is a set of additional transitions introduced by
the attacker. ≈_A = A[≈_A] means "fair environment": the attack itself uses only
information visible through ≈_A.
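Note that the "fair environment" condition is exactly ≈_A-security of the attack system itself, so the `is_secure` sketch from slide 19 can be reused to test whether a proposed set of attack transitions is a legitimate ≈_A-attack.

```python
def is_fair_attack(states, step_A, key_A, depth=6):
    # ~A = A[~A]: the attacker's transitions must not depend on (and hence
    # reveal) anything the attacker's own view cannot already see.
    return is_secure(states, step_A, key_A, depth)
```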
22
Attacked Systems
Given a system S = (S, →) and an attack A = (S, →_A), the attacked system is
S∪A = (S, → ∪ →_A).
(Diagram: a state s with both system and attack transitions.)
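In the successor-map encoding used in the earlier sketches, building the attacked system is just a union of transition maps; a one-function sketch:

```python
def attacked(step, step_A):
    # S u A: the system's transitions together with the attacker's.
    combined = {s: set(ts) for s, ts in step.items()}
    for s, ts in step_A.items():
        combined.setdefault(s, set()).update(ts)
    return combined
```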
23
More Intuition
S[≈_A] describes the information intentionally declassified by the system: a
specification for how S ought to behave.
(S∪A)[≈_A] describes the information obtained by an attack A.
24
Example Attack
(Diagram: an attack, given by the attacker's view ≈_A and attack transitions →_A.)
25
Example Attack
Attack transitions affect the system.
(Diagram: the attack transitions →_A added to the system, as seen through ≈_A.)
26
Example Attack
The attacked system may reveal more.
(Diagram: (S∪A)[≈_A] distinguishes more states than S[≈_A].)
27
Robust Declassification
A system S is robust with respect to attack A whenever (S∪A)[≈_A] ⊑ S[≈_A].
(Lattice diagram relating ≈_A, S[≈_A], and (S∪A)[≈_A].)
Intuition: the attack reveals no additional information.
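Combining the earlier sketches (`obs_equiv` from slide 12 and `attacked` from slide 22), robustness against a particular attack can be tested, again only up to the depth bound: the attack must not let the ≈_A-observer separate states that the unattacked system keeps indistinguishable.

```python
def is_robust(states, step, step_A, key_A, depth=6):
    # (S u A)[~A] below-or-equal S[~A]: every pair of states indistinguishable
    # by watching S must remain indistinguishable by watching S u A.
    step_SA = attacked(step, step_A)
    return all(obs_equiv(step_SA, key_A, s, t, depth)
               for s in states for t in states
               if obs_equiv(step, key_A, s, t, depth))
```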
28
Secure Systems are Robust
Theorem: If S is ≈_A-secure then S is robust with respect to all ≈_A-attacks.
Intuition: S doesn't leak any information to an ≈_A-observer, so there are no
declassifications to exploit.
29
Characterizing Attacks
Given a system S and a view ≈_A, for what ≈_A-attacks is S robust?
  • Need to rule out attack transitions like this (an integrity property).

(Diagram: an attack transition crossing between equivalence classes of S[≈].)
30
Integrity of Policy
Attack circumvents or alters the policy decision
made by the system.
31
Providing Robustness
  • Identify a class of relevant attacks
    - Example: the attacker may modify / copy files
    - Example: an untrusted host may send bogus requests
  • Try to verify that the system is robust against that
    class of attacks.
    - If the proof fails, the system is insecure.
  • Possible to provide enforcement mechanisms
    against certain classes of attacks
    - Protecting the integrity of the downgrade policy

32
Future Work
This is a simple model; extend the definition of robustness to richer models.
How does robustness fit in with intransitive non-interference?
We're putting the idea of robustness into practice in a distributed version of Jif.
33
Conclusions
It's critical to prevent the downgrading mechanisms in a system from being abused.
Robustness is an attempt to capture this idea in a formal way.
It suggests that integrity and confidentiality are linked in systems with
declassification.
34
Correction to Paper
Bug-fixed version of the paper available at http://www.cs.cornell.edu/zdance
35
A Non-Attack
Uses information the attacker can't see!
(Diagram: attack transitions →_A that depend on state the attacker's view ≈_A
cannot distinguish, so this is not a valid ≈_A-attack.)
36
Observational Equivalence
Are s and s' observationally equivalent with respect to ≈? Is s S[≈] s'?
(Diagram: two states s and s' of the system.)
37
Observational Equivalence
Take the sets of traces starting at s and s': Trc_s(S) and Trc_s'(S).
(Diagram: the traces unfolding from s and from s'.)
38
Observational Equivalence
Look at the traces modulo ≈.
(Diagram: each state along the traces is replaced by its ≈-equivalence class.)
39
Observational Equivalence
Look at the traces modulo ≈ (continued).
(Diagram: the same traces, now drawn up to ≈.)
40
Observational Equivalence
Obs_s(S, ≈) and Obs_s'(S, ≈): the sets of observations of the traces from s and s'.
(Diagram: the two observation sets.)
41
Observational Equivalence
Obs_s(S, ≈) vs. Obs_s'(S, ≈): are they stutter equivalent?
Yes! So s S[≈] s'.