Model-Based Testing (presentation transcript)
1
Model-Based Testing
Jan Tretmans jan.tretmans_at_esi.nl
Embedded Systems Institute Eindhoven, NL
Radboud UniversityNijmegen, NL
2
Overview
  • Software testing
  • Model-based testing
  • Model-based testing with labelled transition
    systems
  • Testing of component-based systems
  • Model-based testing of components

3
Software Testing: What? How? Who? Sorts?
4
Paradox of Software Testing
  • Testing is
  • important
  • much practiced
  • 30-50% of project effort
  • expensive
  • time critical
  • not constructive (but sadistic?)
  • But also
  • ad-hoc, manual, error-prone
  • hardly any theory / research
  • no attention in curricula
  • not cool: if you're a bad programmer you might
    be a tester
  • Attitude is changing
  • more awareness
  • more professional

5
The Triangle Program [Myers]
  • A program reads three integer values. The three
    values are interpreted as representing the
    lengths of the sides of a triangle. The program
    prints a message that states whether the triangle
    is scalene, isosceles, or equilateral.

Write a set of test cases to test this program
6
The Triangle Program [Myers]
Test cases for
  • valid scalene triangle ?
  • valid equilateral triangle ?
  • valid isosceles triangle ?
  • 3 permutations of previous ?
  • side = 0 ?
  • negative side ?
  • one side is sum of others ?
  • 3 permutations of previous ?
  • one side larger than sum of others ?
  • 3 permutations of previous ?
  • all sides = 0 ?
  • non-integer input ?
  • wrong number of values ?
  • is the expected output specified for each
    test case ?
  • check behaviour after output was produced ?

7
The Triangle Program [Myers]
Test cases for
  • is it fast enough ?
  • doesn't it use too much memory ?
  • is it learnable ?
  • is it usable for intended users ?
  • is it secure ?
  • does it run on different platforms ?
  • is it portable ?
  • is it easily modifiable ?
  • is the availability sufficient ?
  • is it reliable ?
  • does it comply with relevant laws ?
  • doesn't it do harm to other applications ?
  • . . . . .

8
Testing A Definition
  • Software testing is
  • a technical process,
  • performed by executing / experimenting with a
    product,
  • in a controlled environment, following a
    specified procedure,
  • with the intent of measuring one or more
    characteristics / quality of the software product
  • by demonstrating the deviation of the actual
    status of the product from the required status /
    specification.

9
Testing and Software Quality
  • Quality: totality of characteristics of an
    entity (ISO 9126) that bear on its ability to
    satisfy stated and implied needs
  • Testing: measuring the quality of a
    product; obtaining confidence in the quality of
    the product
  • How to specify quality ?
  • explicit / implicit / legal requirements
  • How to classify quality ?
  • How to quantify quality ?
  • How to measure quality ?
  • How to obtain good tests ?

10
Quality There is more than Testing
requirement management
quality control
testing
software process
verification
reviewing
inspection
CMM
static analysis
certification
debugging
walk-through
QUALITY
11
Sorts of Testing
  • Many different types and sorts of testing
  • functional testing, acceptance testing, duration
    testing,
  • performance testing, interoperability testing,
    unit testing,
  • black-box testing, white-box testing,
  • regression testing, reliability testing,
    usability testing,
  • portability testing, security testing,
    compliance testing,
  • recovery testing, integration testing, factory
    test,
  • robustness testing, stress testing, conformance
    testing,
  • developer testing, acceptance, production
    testing,
  • module testing, system testing, alpha test,
    beta test
  • third-party testing, specification-based
    testing,

12
Sorts of Testing
[Diagram: three dimensions of testing - level of detail (unit, module,
integration, system), accessibility (white box, black box), and quality
characteristics (functionality, reliability, usability, efficiency,
maintainability, portability)]
  • Other dimensions
  • - phases in development
  • - who does it
  • goal of testing
  • . . . . .
13
Model-Based Testing
14
Towards Model-Based Testing Trends
  • Increase in complexity, and quest for higher
    quality software
  • More abstraction
  • less detail
  • model-based development: OMG's UML, MDA
  • Checking quality
  • practice: testing - ad hoc, too late,
    expensive, lots of time
  • research: formal verification - proofs, model
    checking, . . . with disappointing practical
    impact

Software bugs / errors cost the US economy
$59,500,000,000 yearly (≈ 50 billion) (www.nist.gov);
$22 billion of this could be eliminated
15
Towards Model-Based Testing Trends
Increase: testing effort grows exponentially with
system size; testing cannot keep pace with the
growing complexity and size of systems
16
Model-Based Testing
  • Model-based testing has the potential to combine
  • practice - testing
  • theory - formal methods
  • Model-Based Testing
  • testing with respect to a (formal) model /
    specification: state model, pre/post, CSP,
    Promela, UML, Spec#, . . . .
  • promises better, faster, cheaper testing
  • algorithmic generation of tests and test oracles;
    tools
  • formal and unambiguous basis for testing
  • measuring the completeness of tests
  • maintenance of tests through model modification

17
Types of Testing
[Diagram (repeated from slide 12): dimensions of testing - level of detail
(unit, module, integration, system), accessibility (white box, black box),
and quality characteristics (functionality, reliability, usability,
efficiency, maintainability, portability)]
18
Model-Based Testing: Formal Models
  • Use mathematics to model relevant parts of
    software
  • Precise, formal semantics no room for ambiguity
    or misinterpretation
  • Allow formal validation and reasoning about
    systems
  • Amenable to tools and automation
  • Examples
  • Z
  • Temporal Logic
  • First order logic
  • SDL
  • LOTOS
  • Promela
  • Labelled Transition Systems
  • Finite state machine
  • Process algebra
  • UML
  • ......

19
Automated Model-Based Testing
[Diagram: model-based, automated testing - a model, an IUT, and the
question: does the IUT conform to the model ?]
20
A Model-Based Development Process
informal ideas -> specification -> design -> code -> realization
(model-based)
21
Model-Based Testing: Formal Specification-Based
Functional Testing
Testing the functional behaviour of a black-box
implementation under test (IUT) with respect to a
model in a well-defined language, based on a formal
definition of correctness.
The specification / model is the basis for testing
and is assumed to be correct.
22
Approaches to Model-Based Testing
  • Several modelling paradigms
  • Finite State Machine
  • Pre/post-conditions
  • Labelled Transition Systems
  • Programs as Functions
  • Abstract Data Type testing
  • . . . . . . .

Labelled Transition Systems
23
Model-Based Testing with Labelled Transition
Systems
24
Model-Based Testing
with Transition Systems
25
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness imp
  • test cases
  • test generation
  • test execution
  • test result analysis

26
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness imp
  • test cases
  • test generation
  • test execution
  • test result analysis

27
Labelled Transition Systems
  • Labelled Transition System ⟨ S, L, T, s0 ⟩, with
    S a set of states, L a set of observable action
    labels, T ⊆ S × ( L ∪ {τ} ) × S the transition
    relation, and s0 ∈ S the initial state

28
Labelled Transition Systems
29
Labelled Transition Systems
L = { dub, kwart, coffee, tea, soup }
30
Labelled Transition Systems
Sequences of observable actions:
traces(s)  =  { ε,  dub,  dub·coffee,  dub·tea }
Reachable states:
s after dub  =  { S1, S2 }
s after dub·tea  =  { S4 }
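To make traces and after concrete, here is a small Python sketch (not part of the original slides). The transition set below is one LTS consistent with the values quoted above; the actual diagram is not in the transcript, and all names are illustrative.

# Python sketch of an LTS <S, L, T, s0> with 'after' and bounded 'traces'.
# The transitions are one possible reading of the example on this slide.
T = {("S0", "dub", "S1"), ("S0", "dub", "S2"),
     ("S1", "coffee", "S3"), ("S2", "tea", "S4")}
s0 = "S0"

def after(trace, trans=T, init=s0):
    """'init after trace': the set of states reachable via the trace."""
    states = {init}
    for action in trace:
        states = {t for (s, a, t) in trans if s in states and a == action}
    return states

def traces(max_len=3, trans=T, init=s0):
    """All observable action sequences up to max_len that the LTS can do."""
    result, frontier = {()}, {((), init)}
    for _ in range(max_len):
        frontier = {(tr + (a,), t) for (tr, s) in frontier
                    for (u, a, t) in trans if u == s}
        result |= {tr for (tr, _) in frontier}
    return result

print(sorted(traces()))          # [(), ('dub',), ('dub', 'coffee'), ('dub', 'tea')]
print(after(("dub",)))           # {'S1', 'S2'}
print(after(("dub", "tea")))     # {'S4'}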
31
Representation of LTS
  • Explicit: ⟨ {S0,S1,S2,S3}, {dub,coffee,tea},
    {(S0,dub,S1), (S1,coffee,S2), (S1,tea,S3)},
    S0 ⟩
  • Transition tree / graph
  • Language / behaviour expression:
    S := dub; ( coffee; stop [] tea; stop )

32
Labelled Transition Systems
stop
P where P := a; P
a; stop [] b; stop
a; b; stop
33
Labelled Transition Systems
Q where Q := a; ( b; stop [] Q )
a; b; stop [] a; c; stop
a; stop [] c; stop
a; stop [] b; stop
34
Example Parallel Composition
35
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness imp
  • test cases
  • test generation
  • test execution
  • test result analysis

36
Equivalences on Labelled Transition Systems
Observable Behaviour
Some transition systems are more equal than
others
37
Comparing Transition Systems
  • Suppose an environment interacts with the
    systems
  • the environment tests the system as a black box
    by observing and actively controlling it
  • the environment acts as a tester
  • Two systems are equivalent if they pass the same
    tests.

38
Comparing Transition Systems
S1 ≈ S2   ⇔   ∀ e ∈ E .  obs( e, S1 ) = obs( e, S2 )
39
Trace Equivalence
s1 ≈tr s2   ⇔   traces( s1 ) = traces( s2 )
Traces
40
(Completed) Trace Equivalence
[Diagram: example systems compared under trace equivalence ≈tr and
completed-trace equivalence ≈ctr]
41
Equivalences on Transition Systems
now you need to observe ?'s
test an LTS with another LTS, and undo, copy,
repeat as often as you like
test an LTS with another LTS, and try again
(continue) after failure
test an LTS with another LTS
observing sequences of actions and their end
observing sequences of actions
42
Equivalences Examples
43
Preorders on Transition Systems
implementation i
specification s
  • Suppose an environment interacts with the black
    box implementation i and with the specification
    s
  • i correctly implements s if all observations of
    i can be related to observations of s

44
Preorders on Transition Systems
implementation i ∈ LTS
specification s ∈ LTS
45
Trace Preorder
i ≤tr s   ⇔   traces(i) ⊆ traces(s)
46
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness imp
  • test cases
  • test generation
  • test execution
  • test result analysis

47
Input-Output Transition Systems
        input                        output
        dub, kwart                   coffee, tea
        from user to machine         from machine to user
        initiative with user         initiative with machine
        machine cannot refuse        user cannot refuse
        LI                           LU
LI ∩ LU = ∅        LI ∪ LU = L
LI = { ?dub, ?kwart }        LU = { !coffee, !tea }
48
Input-Output Transition Systems
IOTS( LI, LU )  ⊆  LTS( LI ∪ LU )
An IOTS is an LTS with inputs and outputs and with
always-enabled inputs:
for all states s, for all inputs ?a ∈ LI :   s --?a-->
LI = { ?dub, ?kwart }        LU = { !coffee, !tea }
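A Python sketch (assumed encoding, not from the slides) of the defining IOTS condition: every input in LI must be enabled in every state.

# Check input-enabledness of a transition set: every input in LI must be
# possible in every state (the IOTS requirement on this slide).
LI = {"?dub", "?kwart"}
LU = {"!coffee", "!tea"}

def is_input_enabled(states, transitions, inputs=LI):
    return all(any(src == s and act == a for (src, act, _) in transitions)
               for s in states for a in inputs)

# A machine that accepts both coins in both states (e.g. by ignoring a second
# coin with a self-loop) satisfies the condition:
S = {"idle", "paid"}
T = {("idle", "?dub", "paid"), ("idle", "?kwart", "idle"),
     ("paid", "?dub", "paid"), ("paid", "?kwart", "paid"),
     ("paid", "!coffee", "idle")}
print(is_input_enabled(S, T))      # True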
49
Example Models
Input-Output Transition Systems
50
Preorders onInput-Output Transition Systems
implementation i ∈ IOTS( LI, LU )
specification s ∈ LTS( LI, LU )
imp ⊆ IOTS( LI, LU ) × LTS( LI, LU )
Observing IOTS, where system inputs interact with
environment outputs, and vice versa.
51
Correctness: Implementation Relation ioco
i ioco s  ⇔def  ∀ σ ∈ Straces(s) :
    out( i after σ )  ⊆  out( s after σ )
52
Correctness: Implementation Relation ioco
i ioco s  ⇔def  ∀ σ ∈ Straces(s) :
    out( i after σ )  ⊆  out( s after σ )
  • Intuition
  • i ioco-conforms to s, iff
  • if i produces output x after trace σ,
    then s can produce x after σ
  • if i cannot produce any output after trace
    σ, then s cannot produce any output after σ
    ( quiescence δ )
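As a concrete illustration of this definition, the Python sketch below checks the ioco condition up to a bounded depth. It assumes systems without internal tau transitions, given as finite sets of transitions; the function and variable names are illustrative, not taken from any tool.

# Bounded-depth sketch of the ioco check: out(i after sigma) must be a subset
# of out(s after sigma) for every suspension trace sigma of the spec.
DELTA = "delta"                                   # observation of quiescence

def enabled(trans, state):
    return {(a, t) for (s, a, t) in trans if s == state}

def out(trans, states, LU):
    """Outputs producible from a set of states, plus delta if some state
    in the set is quiescent (enables no output)."""
    o = {a for s in states for (a, _) in enabled(trans, s) if a in LU}
    if any(not any(a in LU for (a, _) in enabled(trans, s)) for s in states):
        o.add(DELTA)
    return o

def after1(trans, states, act, LU):
    """One step of the suspension automaton: delta keeps the quiescent states."""
    if act == DELTA:
        return {s for s in states
                if not any(a in LU for (a, _) in enabled(trans, s))}
    return {t for s in states for (a, t) in enabled(trans, s) if a == act}

def ioco(imp, i0, spec, s0, LI, LU, depth=5):
    """Approximate ioco by exploring suspension traces of the spec up to 'depth'."""
    def walk(i_states, s_states, d):
        if not out(imp, i_states, LU) <= out(spec, s_states, LU):
            return False                          # forbidden output or quiescence
        if d == 0:
            return True
        for act in LI | LU | {DELTA}:
            s_next = after1(spec, s_states, act, LU)
            if not s_next:                        # not a suspension trace of the spec
                continue
            if not walk(after1(imp, i_states, act, LU), s_next, d - 1):
                return False
        return True
    return walk({i0}, {s0}, depth)

spec = {("s0", "?dub", "s1"), ("s1", "!coffee", "s2"), ("s1", "!tea", "s3")}
imp  = {("i0", "?dub", "i1"), ("i1", "!coffee", "i2")}
print(ioco(imp, "i0", spec, "s0", {"?dub"}, {"!coffee", "!tea"}))   # True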

53
Implementation Relation ioco
54
Implementation Relation ioco
i ioco s  ⇔def  ∀ σ ∈ Straces(s) :
    out( i after σ )  ⊆  out( s after σ )
Example: an equation solver for y² = x
i ioco s
55
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness
  • test cases
  • test generation
  • test execution
  • test result analysis

56
Test Cases
Model of a test case: a transition system with
  • labels in L ∪ { θ }
  • quiescence label θ
  • tree-structured
  • finite, deterministic
  • sink states pass and fail
  • from each state
  • either one input !a and all outputs ?x
  • or all outputs ?x and θ
[Diagram: example test case tree over stimuli !dub, !kwart and
observations ?coffee, ?tea, θ (ranging over LU ∪ { θ }), with
verdict states pass and fail]
57
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness
  • test cases
  • test generation
  • test execution
  • test result analysis

58
Test Generation
i ioco s  ⇔def  ∀ σ ∈ Straces(s) :
    out( i after σ )  ⊆  out( s after σ )
59
Test Generation Algorithm
Algorithm: to generate a test case t(S) from a
transition system specification S, with S a
non-empty set of states ( initially S = s0 after ε ),
apply the following steps recursively,
non-deterministically:
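The actual derivation steps are shown graphically on the original slide; the Python sketch below is only a simplified, randomized rendering of the idea (linear instead of tree-shaped tests, no tau transitions, illustrative names).

# Randomized sketch of ioco-style test derivation from a spec: at each step,
# non-deterministically stop (pass), stimulate with an input the spec allows,
# or observe and record which outputs / quiescence the spec allows.
import random

DELTA = "delta"

def spec_outs(spec, S, LU):
    o = {a for (s, a, _) in spec if s in S and a in LU}
    if any(not any(q == s and b in LU for (q, b, _) in spec) for s in S):
        o.add(DELTA)
    return o

def derive_test(spec, init, LI, LU, max_len=6):
    S, test = {init}, []
    for _ in range(max_len):
        step = random.choice(["stop", "stimulate", "observe"])
        inputs = [a for (s, a, _) in spec if s in S and a in LI]
        if step == "stop" or (step == "stimulate" and not inputs):
            break
        if step == "stimulate":
            act = random.choice(inputs)
            test.append(("stimulate", act))
        else:
            allowed = spec_outs(spec, S, LU)
            test.append(("observe, allow", allowed))
            act = random.choice(sorted(allowed))   # follow one allowed branch
            if act == DELTA:
                S = {s for s in S
                     if not any(q == s and b in LU for (q, b, _) in spec)}
                continue
        S = {t for (s, a, t) in spec if s in S and a == act}
    return test

spec = {("s0", "?dub", "s1"), ("s1", "!coffee", "s2"), ("s1", "!tea", "s3")}
print(derive_test(spec, "s0", {"?dub", "?kwart"}, {"!coffee", "!tea"}))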
60
Test Generation Example
test
61
Test Generation Example
  • Equation solver for y² = x

test
To cope with non-deterministic behaviour, tests
are not linear traces, but trees
62
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness
  • test cases
  • test generation
  • test execution
  • test result analysis

63
Test Execution
Test execution: all possible parallel
executions (test runs) of test t with
implementation i, each ending in pass or fail
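A rough Python sketch of one such test run (assumed encoding matching the test shape of the derivation sketch above, IUT without tau transitions): the tester applies the stimuli and checks every observation against the allowed set; an observation outside that set yields fail.

# Simulate one test run of a derived test against an implementation given as
# a transition set.  "i fails t" iff some run ends in fail, so repeated runs
# approximate exploring all test runs of a non-deterministic IUT.
import random

DELTA = "delta"

def run_test(test, iut, init, LU):
    states = {init}
    for kind, payload in test:
        if kind == "stimulate":                # tester sends an input to the IUT
            states = {t for (s, a, t) in iut if s in states and a == payload}
        else:                                  # tester observes output or quiescence
            producible = [a for (s, a, _) in iut if s in states and a in LU]
            obs = random.choice(producible) if producible else DELTA
            if obs not in payload:
                return "fail"
            if obs == DELTA:
                continue
            states = {t for (s, a, t) in iut if s in states and a == obs}
    return "pass"

# Example: an IUT that answers a dub with tea only.
iut = {("i0", "?dub", "i1"), ("i1", "!tea", "i2")}
test = [("stimulate", "?dub"), ("observe, allow", {"!coffee"})]
print(run_test(test, iut, "i0", {"!coffee", "!tea"}))    # fail: !tea not allowed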
64
Test Execution Example
[Diagram: two test runs of test t with implementation i, via i' and i'']
i fails t
65
Validity of Test Generation
  • For every test t generated with the algorithm we
    have:
  • Soundness: t will never fail with a correct
    implementation:  i ioco s  implies  i passes t
  • Exhaustiveness: each incorrect implementation
    can be detected with a generated test t:
    i not ioco s  implies  ∃ t :  i fails t

66
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness ioco
  • test cases
  • test generation
  • test execution
  • test result analysis

67
Model-Based Testing with LTS
  • Involves
  • specification model
  • implementation IUT models of IUTs
  • correctness ioco
  • test cases
  • test generation
  • test execution
  • test result analysis

68
Test Generation Tools
Some Model-Based Testing Approaches and Tools
  • AETG
  • Agatha
  • Agedis
  • Autolink
  • Conformiq
  • Cooper
  • Cover
  • G∀st
  • Gotcha
  • Leirios
  • TestComposer
  • TGV
  • TorX
  • TorXakis
  • Uppaal-Tron
  • Tveda
  • . . . . . .
  • Phact/The Kit
  • QuickCheck
  • Reactis
  • RT-Tester
  • SaMsTaG
  • SpecExplorer
  • Statemate
  • STG
  • TestGen (Stirling)
  • TestGen (INT)

TorX
69
A Tool for Transition Systems Testing TorX
  • On-the-fly test generation and test execution
  • Implementation relation ioco
  • Mainly applicable to reactive systems / state
    based systems
  • specification languages: LOTOS, Promela, FSP,
    Automata

70
TorX Case Studies
  • Conference Protocol - academic
  • EasyLink TV-VCR protocol - Philips
  • Cell Broadcast Centre component - LogicaCMG
  • Rekeningrijden Payment Box protocol - Interpay
  • V5.1 Access Network protocol - Lucent
  • Easy Mail Melder - LogicaCMG
  • FTP Client - academic
  • Oosterschelde storm surge barrier-control - LogicaCMG
  • DO/DG dose control - ASML/Tangram
  • Laser interface - ASML/Tangram
  • The new Dutch electronic passport - Min. of Int. Aff.

71
TorX Case Study Lessons
  • model construction
  • difficult: missing or bad specs
  • leads to detection of design errors
  • not supported by integrated environment yet
  • research on test based modelling
  • adapter/test environment
  • development is cumbersome
  • specific for each system
  • longer and more flexible tests
  • full automation: test generation, execution,
    analysis
  • no notion of test selection or specification
    coverage yet
  • only random coverage or user guided test purposes

72
Model-Based Testing, Verification, and the Test
Assumption
73
Formal Testing with Transition Systems
Test assumption:  ∀ IUT ∈ IMP . ∃ iIUT ∈ IOTS .
    ∀ t ∈ TTS .  IUT passes t  ⇔  iIUT passes t
gen : LTS → ℘( TTS )
Proof of soundness and exhaustiveness:  ∀ i ∈ IOTS .
    ( ∀ t ∈ gen(s) . i passes t )  ⇔  i ioco s
74
Comparing Transition Systems: An Implementation
and a Model
IUT ≈ iIUT   ⇔   ∀ e ∈ E .  obs( e, IUT ) = obs( e, iIUT )
75
Formal Testing Test Assumption
Test assumption:  ∀ IUT . ∃ iIUT ∈ MOD . ∀ t ∈ TEST .
    IUT passes t  ⇔  iIUT passes t
[Diagram: the IUT and its model iIUT both executing test t]
76
Completeness of Formal Testing
IUT passes Ts
    ⇔def  ∀ t ∈ Ts .  IUT passes t
    ⇔     ∀ t ∈ Ts .  iIUT passes t
          [ test assumption:  ∀ t ∈ TEST . IUT passes t ⇔ iIUT passes t ]
    ⇔     iIUT imp s
          [ proof obligation:  ∀ i ∈ MOD . ( ∀ t ∈ Ts . i passes t ) ⇔ i imp s ]
    ⇔     IUT conf-to s
          [ definition of IUT conf-to s ]
77
Variations of ioco
78
Variations on a Theme
  • i ioco s    ⇔  ∀ σ ∈ Straces(s) :   out( i after σ ) ⊆ out( s after σ )
  • i ≤ior s    ⇔  ∀ σ ∈ ( L ∪ {δ} )* :  out( i after σ ) ⊆ out( s after σ )
  • i ioconf s  ⇔  ∀ σ ∈ traces(s) :    out( i after σ ) ⊆ out( s after σ )
  • i iocoF s   ⇔  ∀ σ ∈ F :            out( i after σ ) ⊆ out( s after σ )
  • i uioco s   ⇔  ∀ σ ∈ Utraces(s) :   out( i after σ ) ⊆ out( s after σ )
  • i mioco s multi-channel ioco
  • i wioco s non-input-enabled ioco
  • i eco e environmental conformance
  • i sioco s symbolic ioco
  • i (r)tioco s (real) timed tioco (Aalborg,
    Twente, Grenoble, Bordeaux,. . . .)
  • i iocor s refinement ioco
  • i hioco s hybrid ioco
  • i qioco s quantified ioco
  • . . . . . .

79
Underspecification uioco
i ioco s  ⇔  ∀ σ ∈ Straces(s) :  out( i after σ ) ⊆ out( s0 after σ )
[Diagram: specification with states s0, s1, s2]
out( s0 after ?b ) = ∅, but ?b ∉ Straces(s):
under-specification - anything is allowed after ?b
out( s0 after ?a·?a ) = { !x } and ?a·?a ∈ Straces(s),
but from s2, ?a·?a is under-specified:
anything allowed after ?a·?a ?
80
Underspecification uioco
i uioco s  ⇔  ∀ σ ∈ Utraces(s) :  out( i after σ ) ⊆ out( s0 after σ )
Now s is under-specified in s2 for ?a: anything
is allowed.
ioco ⊆ uioco
81
Underspecification uioco
i uioco s  ⇔  ∀ σ ∈ Utraces(s) :  out( i after σ ) ⊆ out( s0 after σ )
[Diagram: states s0, s1, s2 with inputs ?a, ?b]
Alternatively, via a chaos process χ for
under-specified inputs
82
Testing of Component-Based Systems
83
Component-Based Development
  • Software .......
  • is very complex
  • has many quality issues
  • is very expensive to develop
  • but we increasingly depend on it !
  • Contribution to a solution: Component-Based
    Development
  • Lego approach: building components and
    combining components into systems
  • divide and conquer
  • common in many other engineering
    disciplines: construction, electronics, cars

84
Component-Based Development
[Diagram: a system composed of multiple components]
85
Component-Based Development
  • Potential benefits
  • independence of component and system development
  • master complexity
  • third party development, outsourcing
  • specialization
  • reuse
  • standard components
  • reduce system development cost and time
  • improve quality
  • substitution
  • choice and replacements from other suppliers
  • update components

86
Component-Based Development
  • Prerequisites
  • good components
  • precise specification
  • correct implementation
  • infrastructure and glue to connect components
  • standardization of interfaces
  • at component supplier side, and
  • at component user side
  • Good components
  • independent from system
  • reusable: components that fit in many systems
  • substitutable: system must work with
    substituted components

87
Components and Systems
environment
  • system: (autonomous) entity interacting with
    other entities (= environment)
  • system boundary: common frontier between system
    and environment
  • function of system: what the system is intended
    to do
  • behaviour: what the system does to implement
    its function; a sequence of states and actions

system
88
Components and Systems
  • structure: internal composition that enables the
    system to generate behaviour
  • a set of components
  • with possible interactions
  • component: yet another system

environment
component A
component B
system
89
Components and Systems
  • user: part of the environment who or that is
    interested in the function of the system; a role
    of a system
  • service: behaviour of a system as perceived and
    received by its user
  • provider: role of a system in delivering a service
  • system can simultaneously be provider and user

environment
user
component A
component B
system
90
Components and Systems
  • service interface: part of the boundary where the
    service is delivered
  • use interface: part of the boundary of the user
    at which the user receives the service
  • service specification: specification of all
    possible behaviours that shall be perceivable at
    the service interface

environment
user
use interface
service interface
component A
use interface
service interface
component B
system
91
Testing Component-Based Systems
  • Testing of
  • components
  • interactions (integration)
  • whole system
  • where later detection of errors is more costly

system test
driver for 1
component A
driver for 2
  • Traditional approaches
  • bottom-up (with drivers)
  • top-down (with stubs)
  • big-bang
  • mixed

stub for 1
component B
system
92
Testing Component-Based Systems
  • Testing challenges
  • reusability: testing in many/all different
    environments (cf. Ariane)
  • substitutability: system must accept substitute
    components
  • independence
  • component tester does not know the use environment
  • integration tester has no code (black-box), no
    expertise about the component, cannot repair and
    maintain the component
  • Functionality is compositional, to a certain
    extent
  • But emerging and non-functional properties are
    not
  • performance, memory usage, reliability, security
  • Components developed once, but tested often ?

93
Testing Component-Based Systems
  • Testing of components A and B
  • testing B
  • does B provide the correct service
  • testing A
  • does A provide the correct service
  • does A use the service of B correctly
  • Disadvantages of traditional approaches
  • bottom-up: A is not tested with a different B
  • top-down: B is not tested with a different A

provided service
component A
use service
provided service
component B
system
94
Model-Based Testing of Components
95
Model-Based Testing of Components
  • Testing component A
  • ideal
  • access to all interfaces
  • formal specification / model of complete behaviour
    at all interfaces
  • coordinated testing at all interfaces based on a
    formal model, e.g. with ioco
  • often in practice
  • only a specification of the provided service available
  • unspecified how the component should use other
    components

driver
tester
IUTcomponent A
stub
96
Model-Based Testing of Components
  • Testing component A
  • the specification of A only prescribes the provided
    service of A, typically
  • what A provides to its user
  • what A expects from its user
  • ideal for bottom-up testing at the service interface
    of A
  • but not for testing A in isolation

test based onmodel of A
model of A
component A
component B
system
97
Model-Based Testing of Components
  • Testing component A
  • suppose also such a specification of B is
    available that prescribes
  • what B provides to its user A
  • what B expects from its user A
  • use the specification of B to test whether A behaves
    according to the expectations of B, using what B
    provides
  • model of B as an intelligent stub
  • requires something other than ioco !

test based onmodel of A
model of A
component A
model of B
test based onmodel of B
component B
98
Model-Based Testing of Components
  • Formalization of components
  • (in a client-server like setting)
  • labelled transition systems
  • what are labels ?
  • input enabled ?
  • labels
  • all interactions between components must be
    modelled atomically as labels
  • method calls
  • method returns
  • methods called
  • methods returned
  • (assuming no side-effects)

99
Model-Based Testing of Components
Specifications:
  ideal:     sA ∈ LTS( LI, LU )
  practice:  sA ∈ LTS( LIp, LUp ),   sB ∈ LTS( LIu, LUu )
LI  =  LIp ∪ LIu  =  provided method calls  ∪  used method returns
LU  =  LUp ∪ LUu  =  provided method returns  ∪  used method calls
100
Model-Based Testing of Components
Input-enabledness:  ∀ s of IUT,  ∀ ?a ∈ LI :  s --?a-->  ?
For method returns ?
No !
101
Implementation Relation wioco
(u)ioco domain-extended, conservatively, for
non-input-enabled implementations: wioco
i uioco s  ⇔def  ∀ σ ∈ Utraces(s) :
    out( i after σ )  ⊆  out( s after σ )
i wioco s  ⇔def  ∀ σ ∈ Utraces(s) :
    out( i after σ )  ⊆  out( s after σ )   and
    in( i after σ )   ⊇  in( s after σ )
in( s after σ )  =  { a? ∈ LI  |  s after σ  must  a? }
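A small Python sketch of the extra wioco clause (assumed encoding, illustrative names; 's after σ must a?' is read here as: a? is enabled in every state of the reached set): besides the ioco output check, the implementation must accept every input the specification requires.

# Sketch of the additional wioco condition in(i after sigma) >= in(s after sigma):
# 'must' inputs are the inputs enabled in every state of the reached state set.
def must_inputs(trans, states, LI):
    return {a for a in LI
            if all(any(q == s and b == a for (q, b, _) in trans) for s in states)}

def wioco_input_clause(imp, i_states, spec, s_states, LI):
    """True iff the implementation accepts at least the inputs the spec requires."""
    return must_inputs(spec, s_states, LI) <= must_inputs(imp, i_states, LI)

spec = {("s0", "?dub", "s1"), ("s0", "?kwart", "s2")}
imp  = {("i0", "?dub", "i1")}
print(wioco_input_clause(imp, {"i0"}, spec, {"s0"}, {"?dub", "?kwart"}))  # False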
102
Implementation Relation wioco
  • Test implementation of A IA
  • with respect to model of A MA
  • according to implementation relation wioco
  • IA wioco MA  ⇔def  ∀ σ ∈ Utraces(MA) :
    out( IA after σ )  ⊆  out( MA after σ )
  • and  in( IA after σ )  ⊇  in( MA after σ )

test based onmodel of A
wioco
model of A
component A
component B
103
Implementation Relation wioco
104
Implementation Relation wioco
105
Implementation Relation wioco
[Diagram: example models with actions ?dub, !coffee, !tea, !choc]
106
Implementation Relation wioco
[Diagram: example models s with actions ?dub, !coffee, !tea, !choc]
107
Implementation Relation eco
(u)ioco for testing with a specification of the
environment: eco
i uioco s  ⇔def  ∀ σ ∈ Utraces(s) :
    out( i after σ )  ⊆  out( s after σ )
i eco e  ⇔def  ∀ σ ∈ Utraces(e) ∩ L* :
    uit( i after σ )  ⊆  in( e after σ )   and
    in( i after σ )   ⊇  uit( e after σ )
uit( i after σ )  =  out( i after σ ) \ { δ }
in( e after σ )   =  { a? ∈ LI  |  e after σ  must  a? }
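Analogously, a Python sketch of the two eco clauses after one trace (same assumptions and reading of 'must' as in the wioco sketch above; uit is out without quiescence):

# Sketch of the eco conditions after one trace: the component's actual outputs
# must be accepted by the environment model, and the component must accept
# whatever the environment model may send.
def uit(trans, states, labels_out):
    """Producible outputs of a state set (out without the quiescence delta)."""
    return {a for (s, a, _) in trans if s in states and a in labels_out}

def must_in(trans, states, labels_in):
    """Inputs enabled in every state of the set."""
    return {a for a in labels_in
            if all(any(q == s and b == a for (q, b, _) in trans) for s in states)}

def eco_clauses(imp, i_states, env, e_states, LI, LU):
    # LI / LU are the component's inputs / outputs; for the environment model
    # the roles are mirrored: it receives LU and sends LI.
    return (uit(imp, i_states, LU) <= must_in(env, e_states, LU)
            and must_in(imp, i_states, LI) >= uit(env, e_states, LI))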
108
Implementation Relation eco
  • Test implementation of A IA
  • with respect to model of B MB
  • according to implementation relation eco

test based onmodel of A
wioco
component A
IA eco MB  ⇔def  ∀ σ ∈ Utraces(MB) ∩ L* :
    uit( IA after σ )  ⊆  in( MB after σ )   and
    in( IA after σ )   ⊇  uit( MB after σ )
eco
model of B
test based onmodel of B
component B
109
Implementation Relation eco
110
Implementation Relation eco
[Diagram: eco example with actions !dub, ?coffee]
111
Implementation Relation eco
[Diagram: eco example with actions !dub, ?coffee, ?tea, ?choc]
112
Test Generation Algorithm for eco
Algorithm: to generate a test case t(E) from a
transition system specification E, with E a
non-empty set of states ( initially E = e0 after ε ),
apply the following steps recursively,
non-deterministically:
t(E), for !a ∈ uit(E) : . . .
113
Test Generation for eco
114
wioco and eco
  • Testing with wioco and eco can be performed
    concurrently but it still is partial testing !
  • coordination and dependence between
    actions at both interfaces is not tested
  • Example: A shall look up information in a
    database (component B); A queries B, but does
    not use this information and instead invents the
    information itself.
  • This is not the ideal testing of component A in
    isolation, but only a step in that direction !
  • More research required ........

model-based wioco
IUT component A
model-based eco
115
Compositional Testing
i1 ioco s1   and   i2 ioco s2
⇒ ?
i1 || i2   ioco   s1 || s2
116
Compositional Testing
i1 ioco s1   and   i2 ioco s2
⇒ ?
i1 || i2   ioco   s1 || s2
If s1, s2 are input enabled - s1, s2 ∈ IOTS - then
ioco is preserved !
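The composition written i1 || i2 and s1 || s2 here is plain parallel composition; a Python sketch of the product construction (assumed encoding: shared labels synchronise, all other labels interleave) is:

# Product of two transition systems: shared labels synchronise, others interleave.
def compose(t1, init1, t2, init2, shared):
    states, trans, todo = {(init1, init2)}, set(), [(init1, init2)]
    while todo:
        p, q = todo.pop()
        steps = [(a, (t, q)) for (s, a, t) in t1 if s == p and a not in shared]
        steps += [(a, (p, t)) for (s, a, t) in t2 if s == q and a not in shared]
        steps += [(a, (u, v)) for (s, a, u) in t1 if s == p and a in shared
                  for (r, b, v) in t2 if r == q and b == a]
        for a, nxt in steps:
            trans.add(((p, q), a, nxt))
            if nxt not in states:
                states.add(nxt)
                todo.append(nxt)
    return states, trans

# A user that inserts a dub composed with the coffee machine, synchronising
# on all actions:
user    = {("u0", "dub", "u1"), ("u1", "coffee", "u2"), ("u1", "tea", "u2")}
machine = {("s0", "dub", "s1"), ("s1", "coffee", "s2")}
S, T = compose(user, "u0", machine, "s0", {"dub", "coffee", "tea"})
print(sorted(T))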
117
Concluding
  • Testing can be formal, too (M.-C. Gaudel,
    TACAS'95)
  • Testing shall be formal, too
  • A test generation algorithm is not just another
    algorithm
  • Proof of soundness and exhaustiveness
  • Definition of test assumption and implementation
    relation
  • For labelled transition systems
  • (u/w)ioco, eco for expressing conformance between
    imp and spec
  • sound and exhaustive test generation algorithms
  • tools generating and executing tests: TGV,
    TestGen, Agedis, TorX, . . . .

118
Perspectives
  • Model based formal testing can improve the
    testing process
  • model is precise and unambiguous basis for
    testing
  • design errors found during validation of model
  • longer, cheaper, more flexible, and provably
    correct tests
  • easier test maintenance and regression testing
  • automatic test generation and execution
  • full automation: test generation, execution,
    analysis
  • extra effort of modelling compensated by better
    tests

119
Thank You