Safety Critical Systems 5: Formal Verification and Testing

1
Safety Critical Systems 5: Formal Verification and
Testing
  • T 79.5303 Safety Critical Systems

2
Formal Verification/testing
3
Verification and validation
  • Verification is the process of determining that a
    system or module meets its specification.
  • Validation is the process of determining that a
    system is appropriate for its purpose.
  • Testing is a process used to verify or validate
    a system or its components.
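The distinction can be made concrete with a small sketch. The speed-limiter module and its spec below are hypothetical, not from the slides: verification checks the module against its written specification, while validation would ask whether a 120 km/h cap is actually appropriate for the system's purpose.

```python
# Hypothetical module with a written specification (illustrative only).

def limit_speed(requested_kmh: float, max_kmh: float = 120.0) -> float:
    """Spec: the output equals the requested speed, capped at max_kmh,
    and is never negative."""
    return min(max(requested_kmh, 0.0), max_kmh)

# Verification: the module meets its specification.
assert limit_speed(80.0) == 80.0      # within limit: passed through
assert limit_speed(200.0) == 120.0    # above limit: capped
assert limit_speed(-5.0) == 0.0       # negative request: clamped to zero
```

Validation cannot be expressed as assertions like these: it is a judgement about whether the specification itself fits the intended use.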

5
Prover Formal Verification
6
Prover
  • (Slides 6–13 present the Prover tool; no transcript is available.)
14
Prover iLock for Signalling
25
Testing
26
Testing in different stages of the V model
  • Testing is performed during various stages of
    system development.
  • Module testing: evaluation of a small function
    of the hardware/software.
  • System integration testing: investigates the
    correct interaction of modules.
  • System validation testing: checks that a complete
    system satisfies its requirements.
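A minimal sketch of the first two stages, using a hypothetical sensor-to-alarm chain (the functions and values are illustrative, not from the course material):

```python
# Hypothetical modules: a sensor reader and an alarm check, used to
# contrast module testing with integration testing.

def read_temperature(raw_counts: int) -> float:
    """Convert raw ADC counts to degrees Celsius."""
    return raw_counts * 0.1

def alarm_needed(temp_c: float, limit_c: float = 90.0) -> bool:
    """Raise the alarm when the temperature exceeds the limit."""
    return temp_c > limit_c

# Module testing: each small function evaluated in isolation.
assert abs(read_temperature(250) - 25.0) < 1e-9
assert alarm_needed(95.0) and not alarm_needed(85.0)

# Integration testing: the modules interacting correctly end to end.
assert alarm_needed(read_temperature(950))       # 95.0 C > 90.0 C limit
assert not alarm_needed(read_temperature(250))   # 25.0 C is safe
```

System validation testing would then exercise the complete system against its requirements in (a simulation of) its real environment.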

27
Forms of Testing
  • Dynamic testing - execution of the system or
    component in its natural or simulated environment.
  • Functional: test all functions.
  • Structural: test signals/test cases (glass box).
  • Random: sample the n-dimensional input space.
  • Static testing - reviews, inspections and
    walkthroughs; static code analysis for software.
  • Modelling - mathematical representation of the
    behaviour of a system or its environment.
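Random testing over an n-dimensional input space can be sketched as follows; the saturating adder and its bounds are hypothetical examples, and the oracle simply restates the spec being checked:

```python
import random

# Hypothetical component under test: an 8-bit saturating adder.
def saturating_add(a: int, b: int, lo: int = -128, hi: int = 127) -> int:
    return max(lo, min(hi, a + b))

random.seed(1)  # fixed seed for a reproducible test run
for _ in range(10_000):
    # Sample the 2-dimensional input space uniformly.
    a = random.randint(-1000, 1000)
    b = random.randint(-1000, 1000)
    out = saturating_add(a, b)
    # Oracle: the output stays in range and equals the clamped sum.
    assert -128 <= out <= 127
    assert out == max(-128, min(127, a + b))
```

In practice the oracle is the hard part: random inputs are cheap to generate, but each output must still be judged against the specification.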

28
Testing Methods
  • Black-box testing: requirements-based, with no
    information about what is inside the system.
  • White-box testing: uses detailed information about
    the system design to guide testing (open-view,
    glass box).
  • Gray-box testing: the internal structure is known,
    but not in detail.

29
Dynamic testing techniques
  • Dynamic testing standards: IEC 61508, BCS (British
    Computer Society) guidelines, Def Stan 00-55 and
    DO-178B.
  • Process simulation
  • Error seeding/guessing
  • Timing and memory tests
  • Performance/stress testing
  • Probabilistic testing: yields values for failure
    rates
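Error seeding can be illustrated with the usual ratio estimator (a sketch of the common Mills-style approach, not a method spelled out on the slide): seed a number of known faults, test, and scale the count of real faults found by the fraction of seeded faults discovered.

```python
# Mills-style fault-seeding estimate (illustrative sketch):
# if testing finds seeded_found of the seeded faults and real_found
# real faults, estimate real_total ~= real_found * seeded / seeded_found.

def estimate_real_faults(seeded: int, seeded_found: int,
                         real_found: int) -> float:
    if seeded_found == 0:
        raise ValueError("no seeded faults found; cannot estimate")
    return real_found * seeded / seeded_found

# Example: 20 faults seeded, 16 found, plus 8 real faults found.
total = estimate_real_faults(seeded=20, seeded_found=16, real_found=8)
print(total)          # 10.0 real faults estimated in total
print(total - 8)      # so roughly 2 estimated to remain
```

The estimate assumes seeded faults are found with the same probability as real ones, which is the main practical weakness of the technique.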

30
Test planning
  • Lifecycle phase    Activity                   Safety case
  • Requirements       Hazard identification      Analysis results
  • Test planning      Identify tests, integrity  Strategy for V&V
  • Req/Design/Test    Trace hazards to specs     Risk reduction
  • Req/Design         Define specs               Design analysis
  • Safety Functional Requirements are the actual
    safety-related functions which the system,
    sub-system or item of equipment is required to
    carry out. (CENELEC)

31
Simulator testing
  • Safety-critical standards, e.g. Def Stan 00-55,
    recommend that if a simulator is used to validate
    a safety-critical system then the simulator
    should itself be properly validated.
  • In industry, simulators are validated using ad
    hoc techniques and no guidelines on simulator
    validation are available.

32
Simulator testing
  • A modified lifecycle model illustrates the
    importance of environment simulation and helps to
    define the techniques which should be adopted.
  • This model expands the conventional V model to
    form a W model, where the left hand side
    represents the development of the product and the
    right hand side the development of the simulator
    used to test it.
  • The W lifecycle model defines a similar set of
    phases for the development of the environment
    simulator to those used in the development of the
    product itself. This does not necessarily imply
    that the amount of effort required in the former
    is equal to that in the latter.

33
The W Model of the Software Development
Lifecycle
34
Statistical software testing
  • A type of random testing (input -> output)
  • Provides a quantifiable metric of software
    integrity: probability of failure and
    reliability figures
  • A proper environment simulation is needed
  • A statistical method is needed to produce an
    estimate of the probability of failure and a
    measure of the confidence in that estimate

35
Safety Case / Lifecycle 1
36
Safety Case / Lifecycle 2
37
Test plan /activities
38
Definitions of Testability
  • The degree to which a system or component
    facilitates the establishment of test criteria
    and the performance of tests to determine whether
    those criteria have been met.
  • The effort required to apply a given testing
    strategy to a system.
  • The ease with which faults in a system can be
    made to reveal themselves during testing.

39
Enough Testing
  • How much testing do I need to do to prove that
    my system is safe?
  • An industrial project developed results which
    included the situation where failures were
    observed during testing. For example, a 99%
    confidence that the probability of failure on
    demand is smaller than 0.001 requires about
    5000 demands, all of which are successful.
  • Safety-critical system testing starts when
    normal industrial testing procedures have passed
    without a single failure.
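The figure of "about 5000 demands" follows from a standard argument, sketched below under the usual assumptions (independent demands, all of them failure-free): if each demand fails with probability p, then n failure-free demands occur with probability (1-p)^n, and requiring this to be at most 1-C gives n >= ln(1-C)/ln(1-p).

```python
import math

# Required number of consecutive failure-free demands to claim, with
# confidence C, that the probability of failure on demand is below p.
def demands_needed(p: float, confidence: float) -> int:
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

n = demands_needed(p=0.001, confidence=0.99)
print(n)   # 4603 failure-free demands, i.e. "about 5000"
```

A single failure during the run invalidates the claim, which is why safety-critical statistical testing only begins once ordinary testing passes without any failures.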

40
Testing
  • Home assignment
  • 12.7 Describe the characteristics of the three
    major categories of dynamic testing and give
    examples of techniques that fall within each
    group. State whether each group corresponds to a
    black-box or a white-box approach.
  • Please email your answer to herttua_at_uic.asso.fr
    by 24 April 2008.
  • References: KnowGravity, I-Logix, Contesse
    project, Prover