Title: Path selection criteria
1 Path selection criteria
- Tor Stålhane, Wande Daramola
2 Why path selection criteria
- Doing white box testing (control flow testing, data flow testing, coverage testing) using the test-all-paths strategy can be a tedious and expensive affair. The strategies discussed here are alternative ways to reduce the number of paths to be tested.
- As with all white box tests, it should only be used for small chunks of code, say less than 200 lines.
3 Data flow testing - 1
- Data flow testing is a powerful tool for detecting improper use of data values due to coding errors, for example:

  main() {
      int x;           /* x is declared but never given a value */
      if (x > 42) ...  /* x is used before it has been defined  */
  }
4 Data flow testing - 2
- Variables that contain data values have a defined life cycle: they are created, they are used, and they are killed (destroyed).
- Scope:

  {                // begin outer block
      int x;       // x is defined as an integer within this outer block
      ...          // x can be accessed here
      {            // begin inner block
          int y;   // y is defined within this inner block
          ...      // both x and y can be accessed here
      }            // y is automatically destroyed at the end of this block
      ...          // x can still be accessed, but y is gone
  }                // x is automatically destroyed
5 Static data flow testing
- Variables can be used
  - in computations
  - in conditionals
- Possibilities for the first occurrence of a variable along a program path:
  - d: the variable does not yet exist and is then defined (d)
  - u: the variable does not yet exist but is used (u)
  - k: the variable does not yet exist but is killed or destroyed (k)
6 define, use, kill (duk) - 1
- We define three usages of a variable:
  - d: define the variable
  - u: use the variable
  - k: kill the variable
- A large part of those who use this approach consider only define and use (du).
- Based on these usages we can define a set of patterns that indicate potential problems.
7 duk - 2
- We have the following nine patterns:
  - dd: define and then define again - error
  - dk: define and then kill - error
  - ku: kill and then use - error
  - kk: kill and then kill again - error
  - du: define and then use - OK
  - kd: kill and then redefine - OK
  - ud: use and then redefine - OK
  - uk: use and then kill - OK
  - uu: use and then use - OK
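- A hedged illustration: the C sketch below shows how some of these patterns can show up in code. It is not from the slides; the variable names are invented, and malloc/free are used to model define and kill for heap data.

  #include <stdlib.h>

  /* Illustrative sketch (not from the slides): duk patterns in C. */
  void duk_examples(void)
  {
      int x = 1;              /* d: x is defined                                    */
      x = 2;                  /* dd: defined again with no use in between -
                                 suspicious, perhaps a programming error            */
      int y = x + 1;          /* du: define and then use - OK                       */

      char *buf = malloc(16); /* d: buf's data is defined (allocated)               */
      free(buf);              /* k: buf's data is killed (freed)                    */
      buf[0] = 'a';           /* ku: kill and then use - error (use after free)     */
      free(buf);              /* kk: kill and then kill again - error (double free) */

      (void)y;                /* suppress unused-variable warnings                  */
  }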
8 Example - Static data flow testing
For each variable within the module we will examine the define-use-kill patterns along the control flow paths.
9 Example contd
Consider variable x as we traverse the left and then the right path:
- d (define): correct, the normal case
- dd (define-define): suspicious, perhaps a programming error
- du (define-use): correct, the normal case
The two paths yield the patterns ddu and du.
10 duk examples (x) - 1
[Control flow graph for x: one path defines x twice before using it (ddu); the other defines x once and then uses it (du).]
11 Example contd
Consider variable y:
- u (use): major blunder
- ud (use-define): acceptable
- du (define-use): correct, the normal case
- uk (use-kill): acceptable
The two paths yield the patterns udk and uduk.
12 duk examples (y) - 2
[Control flow graph for y: one path uses y, defines it and kills it (udk); the other uses y, defines it, uses it again and kills it (uduk).]
13 Example contd
Consider variable z:
- k (kill): programming error
- ku (kill-use): major blunder
- uu (use-use): correct, the normal case
- ud (use-define): acceptable
- kk (kill-kill): probably a programming error
- kd (kill-define): acceptable
- du (define-use): correct, the normal case
The two paths yield the patterns kkduud and kuuud.
14 duk examples (z) - 3
[Control flow graph for z: the two paths yield the patterns kkduud and kuuud.]
15 Dynamic data flow testing - Test strategy 1
- Based on the three usages we can define a total of seven testing strategies. We will have a quick look at each.
- All definitions (AD): test cases cover each definition of each variable for at least one use of the variable.
- All predicate uses (APU): there is at least one path from each definition of each variable to a p-use of the variable.
16 Test strategy 2
- All computational uses (ACU): there is at least one path from each definition of each variable to a c-use of the variable.
- All p-uses/some c-uses (APUC): there is at least one path from each definition of each variable to each p-use of the variable. If any variable definitions are not covered this way, cover a c-use instead.
17 Test strategy 3
- All c-uses/some p-uses (ACUP): there is at least one path from each definition of each variable to each c-use of the variable. If any variable definitions are not covered this way, cover a p-use instead.
- All uses (AU): there is at least one path from each definition of each variable to each c-use and each p-use of the variable.
18 Test strategy 4
- All du-paths (ADUP): test cases cover every simple sub-path from each variable definition to every p-use and c-use of that variable.
- Note that the kill usage is not included in any of the test strategies.
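- To make the definition/c-use/p-use terminology concrete, here is a small illustrative C function. It is not from the slides; the function and variable names are assumptions made only for this sketch.

  /* Illustrative sketch (not from the slides): definitions, c-uses and
     p-uses of variables in a small C function. */
  int scale(int a, int b)     /* a and b are defined when the function is entered */
  {
      int r = 0;              /* definition of r                                  */
      if (a > 10) {           /* p-use of a: a appears in a predicate             */
          r = a * b;          /* c-uses of a and b; redefinition of r             */
      } else {
          r = b;              /* c-use of b; redefinition of r                    */
      }
      return r;               /* c-use of r                                       */
  }

- Under these assumptions, two test cases, one call with a > 10 and one with a <= 10, reach every definition-to-use pair in the sketch, which is what the all-uses (AU) strategy asks for here.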
19 Application of test strategies 1
20 Application of test strategies 2
[Control flow graph annotated with defines, kills, p-uses and c-uses of the variables x, y and z, used to compare the paths selected by the ACU and APUC strategies.]
21 Relationship between strategies
The higher up in the hierarchy, the better the test strategy.
22 Acknowledgement
- The material on the duk patterns and testing strategies is taken from a presentation by L. Williams at North Carolina State University.
- Available at http://agile.csc.ncsu.edu/testing/DataFlowTesting.pdf
- Further reading
- An Introduction to Data Flow Testing, Janvi Badlaney et al., 2006
- Available at ftp://ftp.ncsu.edu/pub/tech/2006/TR-2006-22.pdf
23 Use of coverage measures
24 Model 1
- We will use the following notation:
  - c: a coverage measure
  - r(c): reliability
  - 1 - r(c): failure rate
  - r(c) = 1 - k·exp(-b·c)
- Thus, we also have that
  - ln[1 - r(c)] = ln(k) - b·c
25 Model 2
- The equation ln[1 - r(c)] = ln(k) - b·c has the same form as Y = α + β·X.
- We can thus use linear regression to estimate the parameters k and b as follows:
  - Use linear regression to estimate α and β
  - We then have
    - k = exp(α)
    - b = -β
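- As a hedged illustration of this estimation step, the C sketch below fits α and β by ordinary least squares on (c, ln[1 - r]) pairs and recovers k and b. The sample observations are made-up values, assumed only for demonstration.

  #include <math.h>
  #include <stdio.h>

  /* Illustrative sketch (not from the slides): estimate k and b in
     r(c) = 1 - k*exp(-b*c) by fitting Y = ln(1 - r) against X = c,
     where Y = alpha + beta*X, k = exp(alpha) and b = -beta. */
  int main(void)
  {
      /* Made-up (coverage, reliability) observations for demonstration. */
      double c[] = {0.20, 0.40, 0.60, 0.80, 0.95};
      double r[] = {0.30, 0.55, 0.72, 0.85, 0.93};
      int n = 5;

      double sx = 0, sy = 0, sxx = 0, sxy = 0;
      for (int i = 0; i < n; i++) {
          double x = c[i];
          double y = log(1.0 - r[i]);   /* Y = ln(1 - r(c)) */
          sx += x; sy += y; sxx += x * x; sxy += x * y;
      }
      double beta  = (n * sxy - sx * sy) / (n * sxx - sx * sx); /* slope     */
      double alpha = (sy - beta * sx) / n;                      /* intercept */

      printf("k = %.3f, b = %.3f\n", exp(alpha), -beta);
      return 0;
  }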
26 Coverage measures considered
- We have studied the following coverage measures:
  - Statement coverage: percentage of statements executed
  - Branch coverage: percentage of branches executed
  - LCSAJ: Linear Code Sequence And Jump
27 Statement coverage
28 Graph summary
29 Equation summary
- Statements
  - -ln(F) = 6.5 + 6.4·C_statement, R²(adj) = 85.3%
- Branches
  - -ln(F) = 7.5 + 6.2·C_branch, R²(adj) = 82.6%
- LCSAJ
  - -ln(F) = 6.5 + 6.4·C_LCSAJ, R²(adj) = 77.8%
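- As a hedged illustration, the short C sketch below plugs an assumed statement coverage of 0.9 into the fitted relation -ln(F) = 6.5 + 6.4·C_statement to obtain a predicted failure rate; both the coverage value and the reading of the equation are assumptions made for demonstration.

  #include <math.h>
  #include <stdio.h>

  /* Illustrative sketch: predicted failure rate from the fitted
     statement-coverage equation -ln(F) = 6.5 + 6.4*C.
     The coverage value is an assumed example. */
  int main(void)
  {
      double C = 0.90;                   /* statement coverage (as a fraction) */
      double F = exp(-(6.5 + 6.4 * C));  /* solve -ln(F) = 6.5 + 6.4*C for F   */
      printf("C = %.2f  =>  predicted F = %.2e\n", C, F);
      return 0;
  }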
30 Usage patterns 1
- Not all parts of the code are used equally often. When it comes to reliability, we get the greatest effect by having high coverage for the code that is used most often.
- This also explains why companies or user groups disagree so much when discussing the reliability of a software product.
31 Usage patterns 2
32 Usage patterns 3
- As long as we do not change our input space usage pattern, we will experience no further errors.
- New user groups with new ways of using the system will experience new errors.
33 Usage patterns 4
[Figure: an input domain containing input spaces A and B, with X marks scattered over the domain.]
34 Extended model 1
- We will use the following notation:
  - c: coverage measure
  - r(c): reliability
  - 1 - r(c): failure rate
  - r(c) = 1 - k·exp(-a·p·c)
  - p: the strength of the relationship between c and r; p will depend on the coupling between coverage and faults
  - a: a scaling constant
35 Extended model 2
[Figure: reliability as a function of coverage, showing the residual unreliability that remains even at full coverage.]
36 Extended model - comments
- The following relation holds:
  - ln[1 - r(c)] = ln(k) - a·p·c
- Strong coupling between coverage and faults will increase the effect of test coverage on the reliability.
- Weak coupling will leave a residual reliability gap that cannot be closed by more testing, only by increasing the coupling factor p and thus changing the usage pattern. Even at full coverage the failure rate is k·exp(-a·p), which stays close to k when p is small.
37 Bishop's coverage model 1
- Bishop's model for predicting remaining errors is different from the models we have looked at earlier. It has a
  - simpler relationship between the number of remaining errors and coverage
  - more complex relationship between the number of tests and the achieved coverage
38 Bishop's coverage model 2
- We will use f = P(executed code fails). Thus, the number of observed errors will depend on three factors:
  - whether the code is executed: C
  - whether it fails during execution: f
  - the coupling between coverage and faults: p
- N0 - N(n) = F(f, C(n, p))
- C(n) = 1 - 1/(1 + k·n·p)
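- As a hedged illustration, the C sketch below simply evaluates the coverage-growth expression C(n) = 1 - 1/(1 + k·n·p) for increasing numbers of tests; the values of k and p are assumed purely for demonstration.

  #include <stdio.h>

  /* Illustrative sketch: coverage growth C(n) = 1 - 1/(1 + k*n*p).
     The parameter values below are assumptions for demonstration. */
  int main(void)
  {
      double k = 0.05;   /* assumed scaling constant        */
      double p = 0.5;    /* assumed coverage-fault coupling */

      for (int n = 10; n <= 10000; n *= 10) {
          double C = 1.0 - 1.0 / (1.0 + k * n * p);
          printf("n = %5d  ->  C(n) = %.3f\n", n, C);
      }
      return 0;
  }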
39 Bishop's coverage model 3
- Based on the assumptions and the expression previously presented, we find that [equation shown on the slide].
- If we use the expression on the previous slide to eliminate C(n), we get [equation shown on the slide].
40 A limit result
- It is possible to show that the following relation holds under a rather wide set of conditions: [equation shown on the slide].
- The initial number of defects N0 must be estimated, e.g. based on experience from earlier projects, as the number of defects per KLOC.
41 An example from telecom