Transcript and Presenter's Notes

Title: Evaluating System Security: THE ORANGE BOOK etc


1
Evaluating System Security: THE ORANGE BOOK etc
  • Ravi Sandhu, edited by Duminda Wijesekera

2
Evaluating system security
  • A formal security evaluation requires
  • The system's functional requirements
  • The system's assurance requirements
  • A methodology to determine if the system meets
    these requirements
  • A measure of evaluation
  • Referred to as a level of trust
  • A formal evaluation methodology
  • A technique used to provide a measure of trust
    based on the security requirements and evidence
    that the system has met them

3
Evaluation methods
  • Subject the product to evaluation throughout its
    development life cycle
  • Take the finished product to a testing laboratory
  • Obtain a certificate of trustworthiness
  • Historical development
  • Many standards
  • TCSEC 1983-1999 (The Orange Book)
  • ITSEC 1991-2001
  • Federal criteria 1992
  • FIPS 140-1 of 1994 and FIPS 140-2 of 2001
  • The Common Criteria, 1998-present
  • Other commercial efforts

4
The Orange Book
  • TCSEC: Trusted Computer System Evaluation
    Criteria
  • Uses
  • Evaluation classes in an ordered structure
  • Identifies functional and assurance criteria
  • Both embedded in prose descriptions of named
    classes
  • Does not match current requirements
    engineering methodologies
  • Example: life-cycle models

5
Orange book notations
  • Reference monitor: an abstract machine that
    mediates all access control decisions
  • Reference validation mechanism (RVM): an
    implementation of a reference monitor
  • Security kernel: the hardware and software that
    implement a reference monitor
  • Trusted Computing Base (TCB): all protection
    mechanisms that enforce the security policy
  • Target of Evaluation (ToE): the subject of
    evaluation (system or product)
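A minimal sketch of how these pieces fit together, in Python (the names AccessRequest, ReferenceMonitor and the triple-based policy are illustrative assumptions, not Orange Book terminology): every access decision is funnelled through one check, which is exactly the job the RVM/security kernel perform inside the TCB.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    subject: str   # e.g. a user or process
    obj: str       # e.g. a file or device
    right: str     # e.g. "read", "write"

class ReferenceMonitor:
    """Abstract machine that mediates every access-control decision.
    A reference validation mechanism (RVM) implements this interface;
    the security kernel is the hardware/software that realizes it."""

    def __init__(self, policy):
        # policy: set of (subject, obj, right) triples that are allowed
        self._policy = policy

    def check(self, req: AccessRequest) -> bool:
        # Complete mediation: every access path must pass through here.
        return (req.subject, req.obj, req.right) in self._policy

# Usage: the TCB routes every open/read/write through the monitor.
rm = ReferenceMonitor({("alice", "/etc/motd", "read")})
print(rm.check(AccessRequest("alice", "/etc/motd", "read")))    # True
print(rm.check(AccessRequest("alice", "/etc/shadow", "read")))  # False
```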

6
TCSEC Functional Requirements - 1
  • DAC requirements
  • Object reuse requirements
  • Counter the threat of an attacker gathering
    information from cache and disk
  • MAC requirements
  • Bell-LaPadula (BLP) properties (see the sketch
    below)
  • Label hierarchies
  • Label requirements
  • For subjects and objects
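A sketch of the MAC check behind the BLP properties just listed, assuming hypothetical clearance levels and category sets: "no read up" (simple security property) and "no write down" (*-property) both reduce to a label-dominance test.

```python
from dataclasses import dataclass

# Hypothetical linear ordering of hierarchical levels.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

@dataclass(frozen=True)
class Label:
    level: str
    categories: frozenset  # non-hierarchical compartments

def dominates(a: Label, b: Label) -> bool:
    """a dominates b iff a's level is at least b's and a's categories contain b's."""
    return LEVELS[a.level] >= LEVELS[b.level] and a.categories >= b.categories

def may_read(subject: Label, obj: Label) -> bool:
    # Simple security property: no read up.
    return dominates(subject, obj)

def may_write(subject: Label, obj: Label) -> bool:
    # *-property: no write down.
    return dominates(obj, subject)

s = Label("SECRET", frozenset({"NATO"}))
o = Label("CONFIDENTIAL", frozenset({"NATO"}))
print(may_read(s, o), may_write(s, o))  # True False
```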

7
TCSEC Functional Requirements - 2
  • Identification and authentication requirements
  • Access control granularity: groups, users, rows
  • Trusted path requirements
  • Provide a protected communication path between the
    TCB and the user process
  • Audit requirements
  • What data must be audited, what an audit record
    must contain, and what events must be audited (a
    sketch of one record follows)
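A sketch of what one audit record could carry; the field names are illustrative assumptions, since the criteria dictate what kinds of information must be captured (date/time, subject, event, object, outcome, labels) rather than a concrete format.

```python
import json
import time

def audit_record(subject, event, obj, outcome, label=None):
    """Build one audit record: who did what to which object, when,
    and with what outcome. (Illustrative fields, not mandated names.)"""
    return {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "subject": subject,          # authenticated user or process identity
        "event": event,              # e.g. "login", "open", "delete"
        "object": obj,               # resource acted upon
        "outcome": outcome,          # "success" or "failure"
        "object_label": label,       # sensitivity label, relevant at B1 and above
    }

# Usage: append one record per security-relevant event to a TCB-protected log.
print(json.dumps(audit_record("alice", "open", "/etc/shadow", "failure", "SECRET")))
```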

8
ORANGE BOOK CLASSES- 1
  • A1 Verified Design
  • B3 Security Domains
  • B2 Structured Protection
  • B1 Labeled Security Protection
  • C2 Controlled Access Protection
  • C1 Discretionary Security Protection
  • D Minimal Protection

(The classes are ordered from highest security, A1, down to no
security, D.)
9
Security Classes C Details 1
  • C1 - Discretionary security protection: minimal
    protection; covers testing and documentation
  • C2 - Controlled access protection: C1 + object
    reuse + audit
  • Most common for commercial products
  • Many OS vendors provide C2 protection

10
Security Classes B Details -2
  • B1 - Labeled security protection: MAC and labeling
    support
  • Some testing
  • An informal model of the security policy
  • Many OS vendors provide B1 additions
  • B2 - Structured protection: expanded labeling,
    trusted path for login, principle of least privilege
  • Covert channel analysis
  • A formal security model
  • Configuration management

11
Security Classes B3 and A Details - 3
  • B3 - Security domains: full reference validation
  • Trusted path requirements
  • Constrained, disciplined code development
  • Modularity, layering and data hiding during
    design
  • A - Verified protection: functionally similar to
    B3, but with more stringent assurance. Uses formal
    methods in
  • Design specification analysis
  • Covert channel analysis

12
Orange Book Criteria
  • Security policy
  • Accountability
  • Assurance
  • Documentation

13
Security Policy
                                      C1  C2  B1  B2  B3  A1
  Discretionary Access Control         +   +  nc  nc   +  nc
  Object Reuse                         0   +  nc  nc  nc  nc
  Labels                               0   0   +   +  nc  nc
  Label Integrity                      0   0   +  nc  nc  nc
  Exporting Labeled Information        0   0   +  nc  nc  nc
  Labeling Human-Readable Output       0   0   +  nc  nc  nc
  Mandatory Access Control             0   0   +   +  nc  nc
  Subject Sensitivity Labels           0   0   0   +  nc  nc
  Device Labels                        0   0   0   +  nc  nc

Key: 0 = no requirement, + = added requirement,
nc = no change
14
Accountability
                                      C1  C2  B1  B2  B3  A1
  Identification and Authentication    +   +   +  nc  nc  nc
  Audit                                0   +   +   +   +  nc
  Trusted Path                         0   0   0   +   +  nc

Key: 0 = no requirement, + = added requirement,
nc = no change

15
Assurances
                                      C1  C2  B1  B2  B3  A1
  System Architecture                  +   +   +   +   +  nc
  System Integrity                     +  nc  nc  nc  nc  nc
  Security Testing                     +   +   +   +   +   +
  Design Spec. and Verification        0   0   +   +   +   +
  Covert Channel Analysis              0   0   0   +   +   +
  Trusted Facility Management          0   0   0   +   +  nc
  Configuration Management             0   0   0   +  nc   +
  Trusted Recovery                     0   0   0   0   +  nc
  Trusted Distribution                 0   0   0   0   0   +

Key: 0 = no requirement, + = added requirement,
nc = no change

16
Documentation
                                      C1  C2  B1  B2  B3  A1
  Security Features User's Guide       +  nc  nc  nc  nc  nc
  Trusted Facility Manual              +   +   +   +   +  nc
  Test Documentation                   +  nc  nc   +  nc   +
  Design Documentation                 +  nc   +   +   +   +

Key: 0 = no requirement, + = added requirement,
nc = no change

17
Analyzing covert channels
  • B1: No requirement
  • B2: Covert storage channels
  • B3: Covert channels (i.e. storage and timing
    channels)
  • A1: Formal methods
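A toy Python illustration of a covert storage channel of the kind a B2 analysis must find; the shared lock-file name is a hypothetical shared resource. The high-level party signals bits by toggling an attribute (the file's existence) that the low-level party can observe without ever reading labeled data; a timing channel is the analogue in which the signal is response time rather than a stored attribute.

```python
import os
import tempfile

# Hypothetical shared resource: the mere existence of this file name is
# one bit of storage visible across levels, even though its contents are
# never read through any DAC/MAC-mediated channel.
CHANNEL = os.path.join(tempfile.gettempdir(), "cc_demo.lock")

def high_send(bit: int) -> None:
    """High party: encode one bit per time slot by creating/removing the file."""
    if bit:
        open(CHANNEL, "w").close()
    elif os.path.exists(CHANNEL):
        os.remove(CHANNEL)

def low_receive() -> int:
    """Low party: observe the attribute (existence), not the contents."""
    return 1 if os.path.exists(CHANNEL) else 0

message = [1, 0, 1, 1]
received = []
for bit in message:        # in reality sender and receiver are separate processes
    high_send(bit)
    received.append(low_receive())
print(received)            # [1, 0, 1, 1] -- information leaked via a storage attribute
```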

18
System analysis
  • C1: The TCB shall maintain a domain for its own
    execution that protects it from tampering
  • C2: The TCB shall isolate the resources to be
    protected
  • B1: The TCB shall maintain process isolation
  • B2: The TCB shall be internally structured into
    well-defined, largely independent modules
  • B3: The TCB shall incorporate significant use of
    layering, abstraction and data hiding
  • A1: No change
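A small sketch, under assumed module names, of the structuring that B2/B3 ask for: each TCB module hides its internal state, and higher layers depend only on a narrow, well-defined interface.

```python
class LabelStore:
    """One well-defined TCB module: its internal table is hidden (data hiding),
    so other layers can reach it only through the interface below."""
    def __init__(self):
        self.__labels = {}                 # private, name-mangled state

    def set_label(self, obj: str, label: str) -> None:
        self.__labels[obj] = label

    def get_label(self, obj: str) -> str:
        return self.__labels.get(obj, "UNCLASSIFIED")

class AccessLayer:
    """A higher TCB layer: depends on LabelStore's interface, not its internals."""
    def __init__(self, store: LabelStore):
        self._store = store

    def label_of(self, obj: str) -> str:
        return self._store.get_label(obj)

store = LabelStore()
store.set_label("/etc/shadow", "SECRET")
print(AccessLayer(store).label_of("/etc/shadow"))   # SECRET
```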

19
Design specification and verification
  • C2: No requirement
  • B1: Informal or formal model of the security
    policy
  • B2: Formal model of the security policy that is
    proven consistent with its axioms
  • DTLS (descriptive top-level specification) of
    the TCB
  • B3: A convincing argument shall be given that the
    DTLS is consistent with the model
  • A1: FTLS (formal top-level specification) of the
    TCB
  • A combination of formal and informal techniques
    shall be used to show that the FTLS is consistent
    with the model
  • A convincing argument shall be given that the
    DTLS is consistent with the model
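As an example of what a "formal model of the security policy" looks like at these levels, the Bell-LaPadula properties referenced earlier can be written down formally (notation assumed here: S the subjects, O the objects, b the current access set, L(.) the labeling function, >= label dominance); the DTLS/FTLS must then be shown consistent with a model of this kind.

```latex
% Simple security property ("no read up"):
\forall s \in S,\ o \in O:\quad (s, o, \mathrm{read}) \in b \;\implies\; L(s) \geq L(o)

% *-property ("no write down"):
\forall s \in S,\ o \in O:\quad (s, o, \mathrm{write}) \in b \;\implies\; L(o) \geq L(s)
```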

20
Informal view of classes
  • C1, C2: Simple enhancement of existing systems.
    No breakage of applications
  • B1: Relatively simple enhancement of existing
    systems. Will break some applications.
  • B2: Relatively major enhancement of existing
    systems. Will break many applications.
  • B3: Failed A1
  • A1: Top-down design and implementation of a new
    system from scratch

21
TCSEC Evaluation Process
  • Design analysis
  • Rigorous analysis of system design based on
    documentation
  • Completeness and correctness criteria
  • Evaluators produce an initial product assessment
    report (IPAR)
  • Test analysis
  • Test coverage assessment
  • Executing vendor-supplied tests
  • Evaluators produce a final product assessment
    report (FAR)
  • Final review: results of each phase are
    presented to the technical review board (TRB)
  • Review IPAR and FAR and award a rating

22
Criticisms of Orange Book criteria
  • Mixes various levels of abstraction in a single
    document
  • Does not address integrity of data
  • Combines functionality and assurance in a single
    linear rating scale

23
Functionality vs. Assurance
  • Functionality is multi-dimensional
  • Assurance has a linear progression

24
NCSC titles for selected classes
  • Red: Trusted Network Interpretation
  • Lavender: Trusted Database Interpretation
  • Orange: Trusted Computer System Evaluation
    Criteria
  • Yellow: Guidance for Applying the Orange Book

25
International Efforts
  • ITSEC 1991-2001, developed by Western European
    countries
  • Provides six levels of trust (evaluation levels):
  • E1, E2, E3, E4, E5, E6.
  • A certification process was in place.

26
ITSEC Levels E1 to E3
  • E1: Security target, informal description of the
    architecture, testing
  • E2: Informal description of the detailed design
    of the ToE
  • E3: More stringent requirements on the detailed
    design; correspondence between source code and
    security requirements

27
ITSEC Levels E4 to E6
  • E4: Formal model of the security policy, structured
    approach to detailed system design, design-level
    vulnerability analysis
  • E5: Correspondence between the detailed design and
    the source code; source-code-level vulnerability
    analysis
  • E6: Extensive use of formal methods, formal proof
    that the architectural design is consistent with the
    security policy, partial mapping of the executable
    to the source code

28
ITSEC Process
  • Each country had its own process
  • Evaluation by a certified, licensed evaluation
    facility
  • Process
  • Security target
  • Binding of assurance requirements
  • Once the target was approved, it was tested
  • Rigid documentation requirements

29
ITSEC assurance requirements
  • Suitability of the requirements specification
  • Consistency
  • Coverage: can the threats be covered by the
  • Environmental assumptions
  • Security requirements
  • Binding of requirements and enforcement
  • Do the enforcement mechanisms correctly enforce the
    security policy?
  • Are the enforcement mechanisms mutually
    supportive?

30
In TCSEC but not in ITSEC
  • Architecture requirements
  • Tamper-proof reference monitors
  • Process isolation
  • Principle of least privilege
  • Well-defined user interfaces
  • System integrity
  • Approved formal methods (ITSEC also requires formal
    methods, but approves no specific ones)

31
In ITSEC but not in TCSEC
  • Requiring security assessment during design and
    development
  • Levels E2 onward require maintaining the
    mapping between requirements, design, detailed
    design and code
  • Procedures for delivery, generation and
    distribution
  • Secure startup and operations procedures
  • Many forms of vulnerability assessment (TCSEC has
    no design-level vulnerability assessment)
  • Assessment of cryptographic capabilities