Lesson 09: Software Verification, Validation and Testing
1
Lesson 09: Software Verification, Validation and
Testing
  • Includes
  • Software Testing Techniques
  • Intro to Testing

Includes materials adapted from Pressman,
Software Engineering: A Practitioner's Approach,
Fifth Edition, McGraw-Hill, 2000
2
Software Testing
Testing is the process of exercising a program
with the specific intent of finding errors prior
to delivery to the end user.
From Pressman, Software Engineering: A
Practitioner's Approach, Fifth Edition,
McGraw-Hill, 2000
3
We Design Test Cases to...
  • have high likelihood of finding errors
  • exercise the internal logic of software
    components
  • exercise the input and output to uncover errors
    in program function, behavior, and performance

Goal is to find maximum number of errors with the
minimum amount of effort and time!
4
Testing is a Destructive Activity
  • designing and executing test cases to break or
    demolish the software.
  • Must change your mindset during this activity

The objective is to find errors; therefore, errors
found are good, not bad. Tell that to a manager!
5
Testing Objectives
  • Execute a program with intent of finding an
    error
  • Good test case has high probability of finding
    an as-yet undiscovered error
  • Successful test case finds an as-yet
    undiscovered error

Successful testing uncovers errors.
6
Testing demonstrates ...
  • Software functions work as specified
  • Behavioral and performance requirements appear
    to be met
  • Data collected is an indicator of reliability
    and quality

TESTING CANNOT SHOW THE ABSENCE OF ERRORS AND
DEFECTS. Testing only shows that errors and
defects are present.
7
Basic Principles of Testing
  • All testing should be traceable to requirements
  • Plan testing long before testing begins. Plan
    and design tests during design before any code
    has been generated.
  • Pareto Principle - 80% of errors are found in
    20% of components
  • Start small and progress to large. First test
    individual components (unit test), then on
    clusters of integrated components (integration
    test), then on whole system
  • Exhaustive testing not possible but we can
    assure that all conditions have been exercised
  • All testing should not be done by developer -
    need independent 3rd party

8
Testability
  • Operability: it operates cleanly
  • Observability: the results of each test case are
    readily observed
  • Controllability: the degree to which testing can
    be automated and optimized
  • Decomposability: the scope of testing can be
    controlled
  • Simplicity: reduce complex architecture and logic
    to simplify tests
  • Stability: few changes are requested during
    testing
  • Understandability: the design and documents are
    well understood

Testability refers to how easily the product can be
tested. Design software with testability in
mind.
9
What Testing Shows
  • Errors
  • Requirements conformance
  • Performance
  • An indication of quality

10
Who Tests the Software?
  • Developer: understands the system but will test
    gently and is driven by delivery
  • Independent Tester: must learn about the system
    but will attempt to break it and is driven by
    quality

11
Software Testing
  • Black Box Testing Methods
  • White Box Testing Methods
  • Strategies for Testing

12
Black Box Testing
  • Based on specified function, on the requirements
  • Tests conducted at the software interface
  • Demonstrates that the software functions are
    operational, input is properly accepted, output
    is correctly produced, and integrity of external
    info is maintained
  • Uses the SRS as basis for construction of tests
  • Usually performed by independent group

13
White-Box Testing
Our goal is to ensure that all statements and
conditions have been executed at least once.
14
White Box Testing -- I
  • Based on internal workings of a product
    requires close examination of software
  • Logical paths are tested by providing test cases
    that exercise specific sets of conditions and/or
    loops
  • Check status of program by comparing actual
    results to expected results at selected points in
    the software

Exhaustive path testing is impossible
15
Exhaustive Testing
(Figure: flow graph of a small program containing a loop that executes up to 20 times)
There are 10^14 possible paths! If we execute one
test per millisecond, it would take 3,170 years to
test this program!!
16
White Box Testing -- II
  • Logic errors and incorrect assumptions usually
    occur with special case processing
  • Our assumptions about flow of control and data
    may lead to errors that are only uncovered during
    path testing
  • We make typing errors; some are uncovered by the
    compiler (syntax, type checking) BUT others are
    only uncovered by testing. A typo may be on an
    obscure path

Black box testing can miss these types of errors
17
Selective Testing
(Figure: the same flow graph, loop < 20 times, with one selected path highlighted)
18
Software Testing Techniques Testing Analysis
19
Test Case Design
  • Uncover errors in a complete manner with a
    minimum of effort and time!

20
Basis Path Testing -- I
  • A white box testing technique - McCabe
  • Use this technique to derive a logical measure of
    complexity
  • Use as a guide for defining a basis set of
    execution paths
  • Test cases derived to execute the basis set are
    guaranteed to execute every statement at least
    one time during testing

21
Cyclomatic Complexity
  • This is a quantitative measure of the logical
    complexity of a program.
  • Used in conjunction with basis set testing it
    defines the number of independent paths in the
    basis set
  • It provides an upper bound for the number of
    tests that ensure all statements have been
    executed at least once.
  • See http://www.mccabe.com/pdf/nist235r.pdf for a
    more detailed paper on McCabe's Cyclomatic
    Complexity.

22
Basis Path Testing -- II
First, we compute the cyclomatic
complexity:

number of simple decisions + 1
or
number of enclosed areas + 1

(Figure: a flow graph with three decisions and three enclosed areas A, B, C)
In this case, V(G) = 4
23
Cyclomatic Complexity
A number of industry studies have indicated that
the higher the V(G), the higher the probability
of errors.
(Figure: plot of modules against V(G); modules in the high-V(G) range are more error prone)
24
Basis Path Testing -- III
Next, we derive the independent paths.
Since V(G) = 4, there are four paths:

Path 1: 1,2,3,6,7,8
Path 2: 1,2,3,5,7,8
Path 3: 1,2,4,7,8
Path 4: 1,2,4,7,2,...,7,8 (the ... implies
insertion of path 1, 2, or 3 here)

Finally, we derive test cases to exercise these
paths.
25
Creating Flow Graphs
  • A circle (node) represents one or more statements
  • Arrows (edges) represent flow of control. Each
    must terminate in a node.
  • A region is an area bounded by edges and nodes.
    The area outside the flow graph is included as a
    region.

26
Calculating Cyclomatic Complexity from Flow Graph
  • Count the number of regions
  • V(G) = E - N + 2
  • where E = number of edges
  • N = number of nodes
  • V(G) = P + 1
  • where P = number of predicate nodes (2 or more
    edges leave the node)
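
The two formulas above can be checked against each other on a small example. Below is a minimal Python sketch using a hypothetical flow graph (an if/else inside a loop); the node numbering and edge list are invented for illustration:

```python
from collections import Counter

# Hypothetical flow graph: an if/else inside a loop, given as directed
# edges between numbered nodes (node 5 is the exit).
edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 1), (4, 5)]
nodes = {n for edge in edges for n in edge}

# V(G) = E - N + 2
v_from_edges = len(edges) - len(nodes) + 2

# V(G) = P + 1, where a predicate node has 2 or more outgoing edges
out_degree = Counter(src for src, _ in edges)
predicates = sum(1 for d in out_degree.values() if d >= 2)
v_from_predicates = predicates + 1

# Both formulas give the same cyclomatic complexity
assert v_from_edges == v_from_predicates == 3
```

With V(G) = 3, the basis set for this graph contains three independent paths.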

27
Basis Path Testing Notes
  • You don't need a flow chart or graph, but the
    picture helps when you trace program paths
  • Count each simple logical test as 1; compound
    tests count as 2 or more (depending on the number
    of tests)
  • Basis Path Testing should be applied to critical
    modules.
  • Some development environments will automate the
    calculation of V(G)

28
Deriving Test Cases
  • Using design or code as a foundation, draw a
    corresponding flow graph
  • Determine the cyclomatic complexity
  • Identify the basis set of linearly independent
    paths
  • Prepare test cases that will force execution of
    each path in the basis set
  • Exercise - create flow graph from example

29
Graph Matrices
  • Software tools exist that use a graph matrix to
    derive the flow graph and determine the set of
    basis paths
  • Square matrix whose size equals the number of
    nodes on the flow graph
  • Each node is identified by number and each edge
    by letter
  • Can add link weight for other more interesting
    properties (e.g. processing time, memory
    required, etc.)

30
Control Structure Testing
  • Basis path testing is not enough
  • Must broaden testing coverage and improve quality
    of testing
  • Condition Testing
  • Data Flow Testing
  • Loop Testing

31
Condition Testing --I
  • Exercise the logical conditions in a program
    module
  • A condition is a Boolean variable or a relational
    expression
  • Compound conditions are composed of two or more
    simple conditions
  • Detects errors in conditions AND also in the rest
    of the program. If a test set is effective for
    conditions, it is likely also effective for other
    errors.

32
Condition Testing --II
  • Branch Testing - test each True and False branch
    at least once
  • Domain Testing - 3 or 4 tests for a relational
    expression. Test for greater than, equal to, and
    less than. Also include a test that makes the
    difference between the 2 values as small as possible.
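
As a concrete illustration of domain testing, here is a small Python sketch (the function name and the integer-input assumption are ours, not from the slides) that generates the 3-4 test values for a relational expression such as a > b:

```python
def domain_test_values(b):
    """Candidate values of a for testing the expression 'a > b':
    greater than, equal to, and less than, plus a value that makes
    the difference between the two operands as small as possible
    (assuming integer inputs, the minimal difference is 1)."""
    return {
        "greater": b + 5,
        "equal": b,
        "less": b - 5,
        "minimal_diff": b + 1,
    }

cases = domain_test_values(10)
```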

33
Data Flow Testing
  • Selects test paths according to the locations of
    definitions and uses of variables in the program.
  • Can't be used for a large system, but can be
    targeted at suspect areas of the software
  • Useful for selecting test paths containing nested
    if and loop statements

34
Loop Testing
  • White box technique focuses on validity of loop
    constructs
  • Four different types of loops
  • Simple loops
  • Nested loops
  • Concatenated loops
  • Unstructured loops - should redesign to reflect
    structured constructs

35
Loop Testing
(Figure: the four loop types: simple loops, nested loops, concatenated loops, unstructured loops)
36
Loop Testing Simple Loops
Minimum conditions for simple loops:
1. skip the loop entirely
2. only one pass through the loop
3. two passes through the loop
4. m passes through the loop, where m < n
5. (n-1), n, and (n+1) passes through the loop

where n is the maximum number of allowable passes
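
The five conditions above can be collected into a small helper. This is a sketch; the function name and the choice of m are ours:

```python
def simple_loop_pass_counts(n, m=None):
    """Pass counts to exercise for a simple loop with at most n passes:
    skip, one pass, two passes, some typical m < n, and n-1, n, n+1.
    Assumes n > 4 when m is not given."""
    if m is None:
        m = n // 2  # any typical value with 2 < m < n
    return [0, 1, 2, m, n - 1, n, n + 1]
```

For a 20-pass loop like the one in the earlier flow-graph example, this yields [0, 1, 2, 10, 19, 20, 21].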
37
Loop Testing Nested Loops
Nested Loops
  • 1. Start at the innermost loop. Set all outer
    loops to their minimum values.
  • 2. Test the min+1, typical, max-1, and max values
    for the innermost loop while holding the outer
    loops at minimum values.
  • 3. Move out one loop and set it up as in step 2,
    holding all inner loops at typical values, until
    the outermost loop has been tested.

Concatenated Loops
If the loops are independent of each other, then
treat them as simple loops. Otherwise treat them as
nested loops.
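
The nested-loop steps above can be sketched as code that enumerates the loop-count configurations to test. This is our illustration, not Pressman's: index 0 is the outermost loop, and min/typical/max bounds are supplied per loop:

```python
def nested_loop_test_configs(mins, typicals, maxes):
    """Enumerate loop-count configurations per the nested-loop strategy:
    test each loop at min+1, typical, max-1, and max, innermost first,
    holding outer (not yet tested) loops at minimum values and inner
    (already tested) loops at typical values."""
    configs = []
    n = len(mins)
    for i in range(n - 1, -1, -1):  # innermost loop first
        for count in (mins[i] + 1, typicals[i], maxes[i] - 1, maxes[i]):
            config = list(typicals)   # inner loops at typical values
            for j in range(i):        # outer loops at minimum values
                config[j] = mins[j]
            config[i] = count         # the loop under test
            configs.append(tuple(config))
    return configs

# two nested loops: outer runs 1..10 times, inner runs 1..20 times
cases = nested_loop_test_configs([1, 1], [5, 8], [10, 20])
```

For the two-loop example this produces 8 configurations, four for each loop.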

38
Black-Box Testing
(Figure: black-box view of the system: requirements, input, and events go in; output comes out)
Also called behavioral testing
39
Black Box Testing
  • Does not replace white box testing
  • A complementary approach
  • Focuses on functional requirements of the
    software
  • Tries to find following types of errors
  • incorrect or missing functions
  • interface errors
  • errors in data structures or database access
  • behavior or performance errors
  • initialization or termination errors

40
Black Box Testing
  • Done during later stages of testing
  • Tests designed to answer the following questions
  • How is functional validity tested?
  • How are system behavior and performance tested?
  • What classes of input will make good test cases?
  • Is the system sensitive to certain input values?
  • How are the boundaries of a data class isolated?
  • What data rates and data volume can the system
    tolerate?
  • What effect will specific combinations of data
    have on system operation?

41
Equivalence Partitioning
  • Black box method that divides the input domain of
    a program into classes of data from which test
    cases can be derived
  • Strive to design a test case that uncovers
    classes of errors and reduces the total number of
    test cases that must be developed and run. E.g.
    incorrect processing of all character data

42
Equivalence Partitioning
(Figure: the input domain partitioned into equivalence classes: user queries, mouse picks, data, output formats, prompts, errors)
43
Sample Equivalence Classes
Valid data
  • User supplied commands
  • Responses to system prompts
  • Filenames
  • Computational Data
  • physical parameters
  • bounding values
  • initiation values
  • Output data formatting
  • Responses to error messages
  • Graphical data (e.g. mouse picks)

Invalid data
  • Data outside bounds of the program
  • Physically impossible data
  • Proper value supplied in the wrong place

44
Equivalence Class Definition Guidelines
  • Input condition specifies a range: one valid and 2
    invalid classes defined
  • Input condition requires a specific value: one
    valid and 2 invalid classes defined
  • Input condition specifies a member of a set: one
    valid and one invalid class defined
  • Input condition is Boolean: one valid and one
    invalid class defined
  • E.g. a prefix is a 3-digit number not beginning
    with 0 or 1. Input condition, range: specified
    value > 200. Input condition, value: 4-digit length.
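
The prefix example above can be turned into concrete equivalence classes. A sketch (the function name and return strings are ours); one representative test value per class is enough:

```python
def classify_prefix(value):
    """Equivalence classes for a 3-digit prefix that must not begin
    with 0 or 1, i.e. must lie in the range 200..999."""
    if isinstance(value, int) and 200 <= value <= 999:
        return "valid"
    if isinstance(value, int) and 0 <= value <= 199:
        return "invalid: begins with 0 or 1"
    return "invalid: not a 3-digit number"

# one representative test value per equivalence class
assert classify_prefix(555) == "valid"
assert classify_prefix(123) == "invalid: begins with 0 or 1"
assert classify_prefix(1000) == "invalid: not a 3-digit number"
```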

45
Boundary Value Analysis
  • More errors occur at the boundaries of the input
    domain
  • BVA leads to selection of test cases that
    exercise the boundaries
  • Guidelines
  • Input in range a..b: select a, b, and values just
    above and just below a and b
  • Input specifies a number of values: select min and
    max, plus values just above and below min and max
  • Use the same guidelines for output conditions
  • Boundaries on data structures (e.g. an array with
    100 entries): test at the boundary
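
For an integer range a..b, the first guideline can be written down directly. A sketch (the helper name is ours):

```python
def boundary_values(a, b):
    """Boundary value analysis for an input required to lie in the
    integer range a..b: the bounds themselves plus the values just
    inside and just outside each bound."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

# e.g. an array index constrained to 1..100
tests = boundary_values(1, 100)
```

For the 1..100 example this yields [0, 1, 2, 99, 100, 101].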

46
Software Testing Strategies
47
Testing Goals - Review
  • Goal is to discover as many errors as possible
    with minimum effort and time
  • A destructive activity - the people who constructed
    the software are now asked to test it
  • They have a vested interest in showing the software
    is error-free, meets requirements, and will meet
    budget and schedule
  • This works against thorough testing
  • Therefore, should the developer do no testing?
    Should all testing be done independently, with
    testers getting involved only when developers are
    finished with construction?

48
Testing Strategies ...
  • In the past, only defense against programming
    errors was careful design and the intelligence of
    the programmer
  • Now we have modern design techniques and formal
    technical reviews to reduce the number of initial
    errors in the code
  • In Chapter 17 we discussed how to design
    effective test cases; now we discuss the strategy
    we use to execute them.
  • Strategy is developed by project manager,
    software engineer, and testing specialists. It
    may also be mandated by customer.

49
Why is Testing Important?
  • Testing often accounts for more effort than any
    other sw engineering activity
  • If done haphazardly, we
  • waste time
  • waste effort
  • errors sneak through
  • Therefore need a systematic approach for testing
    software
  • Work product is a Test Specification (Test Plan)

50
What is a Test Plan?
  • a road map describing the steps to be conducted
  • specifies when the steps are planned and then
    undertaken
  • states how much effort, time, and resources will
    be required
  • must incorporate test planning, test case
    design, test execution, and data collection and
    evaluation

Should be flexible for customized testing but
rigid enough for planning and management tracking.
51
Strategic Issues
  • Specify requirements in a quantifiable manner so
    the requirement can be tested.
  • State testing objectives explicitly
  • Understand potential users and develop profiles
  • Develop testing plan - in increments quickly
  • Build robust software with error checking
  • Use effective Formal Technical Reviews (FTRs) to
    find errors early - saves time and money
  • Conduct FTRs on tests and test strategy
  • Develop continuous improvement - collect metrics

52
Testing Strategy
(Figure: the testing strategy spiral: unit test at the component level; integration test as components are integrated; validation test at the requirements level; system test with system elements tested as a whole)
53
Verification and Validation
  • Verification - ensure the software correctly
    implements the specified function
  • Are we building the product right?
  • Validation - ensure the software is traceable to
    requirements
  • Are we building the right product?
  • An Independent Test Group (ITG) performs V&V and
    works closely with the developer to fix errors as
    they are found
  • The ITG starts at the beginning of the project and
    continues through the finish
  • The ITG reports to an organization apart from the
    software development group

54
Comparison of Testing Types
Eliminate duplication of testing between
different groups to save time and money
55
Unit Testing
(Figure: module to be tested, with test cases applied)
Types of testing:
  • interface
  • local data structures
  • boundary conditions
  • independent paths (basis paths)
  • error handling paths
56
Unit Test Environment
(Figure: unit test environment: a driver feeds test cases to the module under test, and stubs stand in for its subordinate modules; results are collected. The module is checked for its interface, local data structures, boundary conditions, independent paths, and error handling paths.)
Testing is simplified if the unit has only one
function (high cohesion) - fewer test cases
57
Drivers and Stubs
  • Driver - a main program that accepts test case
    data, passes such data to the module, and prints
    relevant results
  • Stub - replaces modules that are subordinate to the
    unit under test; uses the subordinate module's
    interface, may do minimal data manipulation, prints
    verification of entry, and returns control to the
    module undergoing testing.
  • Writing drivers and stubs is overhead
  • Sometimes you can't adequately unit test with
    simple overhead software - then wait until
    integration (drivers and stubs may be used there)
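
A minimal driver/stub pair might look like the following sketch. The module under test, its subordinate module, and all names here are hypothetical:

```python
# Module under test: depends on a subordinate 'fetch_record' module
# that is not yet integrated.
def lookup_name(key, fetch_record):
    record = fetch_record(key)
    return record["name"] if record else None

# Stub: uses the subordinate module's interface, does minimal data
# manipulation, prints verification of entry, and returns control.
def fetch_record_stub(key):
    print(f"stub entered: fetch_record({key!r})")
    return {"name": "placeholder"} if key == 42 else None

# Driver: accepts test case data, passes it to the module under test,
# and prints relevant results.
def driver():
    results = []
    for key, expected in [(42, "placeholder"), (7, None)]:
        actual = lookup_name(key, fetch_record_stub)
        print(f"lookup_name({key}) -> {actual!r}, expected {expected!r}")
        results.append(actual == expected)
    return results

outcomes = driver()
```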

58
Types of Computation Errors
  • Misunderstood or incorrect arithmetic precedence
  • Mixed Mode operations
  • Incorrect initialization
  • Precision inaccuracy
  • Incorrect symbolic representation of an expression

59
Types of Control Flow Errors
  • Comparison of different data types
  • Incorrect logical operators or precedence
  • Expectation of equality when precision error
    makes equality unlikely
  • Incorrect comparison of variables
  • Improper or nonexistent loop termination
  • Failure to exit
  • Improperly modified loop variables

60
Error Handling Evaluation
  • Error conditions must be anticipated, and error
    handling must reroute or cleanly terminate
    processing ("antibugging")
  • Typical antibugging errors:
  • Error description is unintelligible
  • Error noted doesn't match error encountered
  • Error condition causes system intervention
  • Exception condition processing is incorrect
  • Error description doesn't provide enough info

Make sure error handling is tested!
61
Integration Testing Strategies
  • Options
  • The big bang approach
  • OR
  • An incremental construction strategy
  • Top Down
  • Bottom Up
  • Sandwich

62
What is Integration Testing?
  • Take unit tested components and build a program
    structure by joining the components while testing
    to find errors associated with interfaces between
    components
  • Data can be lost across an interface, one module
    can have an inadvertent adverse effect on another,
    etc.
  • The program is constructed and tested in small
    increments. Errors are easier to isolate,
    interfaces are more likely to be tested
    completely, and a systematic test approach is
    applied.
  • The software gains maturity as modules are
    integrated.

63
Top Down Integration
(Figure: module hierarchy with A on top; B, F, G below it; C under B; D, E under C)
  • The top module A is tested with stubs for B, F, G
  • Stubs are replaced one at a time with real
    components, "depth first"
  • As new modules are integrated, some subset of
    tests is re-run - regression
What would be replaced next?
64
Bottom-Up Integration
(Figure: the same module hierarchy, integrated from the bottom up; D and E form a cluster)
  • Worker modules are grouped into builds that
    perform a specific subfunction and are integrated
  • Drivers are removed and builds are combined,
    moving upward one at a time, "depth first"
65
Top-down vs Bottom Up Integration
  • Top Down
  • Stubs replace low level modules which normally
    supply data
  • Therefore may delay some testing (not good)
  • Simulate the actual module in the stub (high
    overhead)
  • Verifies major control early
  • Bottom up
  • First integrate the low-level modules that supply
    data
  • The program doesn't exist as an entity until the
    last module is integrated
  • Easier test case design
  • Don't need stubs - do need drivers.

66
Sandwich Testing
(Figure: the same module hierarchy, tested from both ends toward the middle)
  • Top modules are tested with stubs
  • Worker modules are grouped into builds (clusters)
    and integrated
67
Critical Modules
  • Identify critical modules, target them for early
    testing, and focus regression testing on them.
    Critical modules:
  • Address several software requirements
  • Have a high level of control (high in the software
    structure)
  • Are complex or error prone (high V(G))
  • Have definite performance requirements

68
High Order Testing
  • Validation Test - Test Plan outlines classes of
    tests to be performed. Test Procedures have
    specific test cases.
  • After each test case runs, either passes or have
    deviation which is recorded as a Software Trouble
    Report (STR)
  • Resolution of STRs is monitored
  • Alpha and Beta Testing - Alpha at the developer's
    site and Beta at the customer's site
  • System Test - tests to verify system elements
    have been properly integrated and perform
    required functions
  • This was performed as a System Level Acceptance
    Test (SLAT) at IBM.



69
Debugging: A Diagnostic Process
70
Debugging Process
  • Debugging is a consequence of testing
  • Debugging effort is combination of the time
    required to diagnose the symptom and determine
    the cause of the error AND the time required to
    correct the error and conduct regression tests.
  • Regression test is a selective re-running of
    tests to assure that nothing has been broken when
    fix or modification was implemented.

71
Symptoms & Causes
  • Symptom and cause may be geographically remote
  • Symptom may disappear when another error is
    fixed
  • Symptom may be caused by a nonerror (e.g.
    roundoff)
  • Symptom may be caused by human error that is not
    easily traced
  • Symptom may be caused by timing problems
  • Conditions may be hard to duplicate (real-time
    applications)
  • Symptom may be intermittent - common with
    embedded systems
  • Symptom may be due to causes distributed across a
    number of tasks on different processors.

(Figure: a symptom observed in one place, its cause somewhere else)
72
Consequences of Bugs
(Figure: bug severity/damage scale: mild, annoying, disturbing, serious, extreme, catastrophic, infectious)
Bug categories: function-related bugs,
system-related bugs, data bugs, coding bugs,
design bugs, documentation bugs, standards
violations, etc.
73
Debugging Techniques
  • brute force / testing
  • backtracking
  • cause elimination
74
Brute Force Debugging
  • Let the computer find the error - memory dumps,
    run-time traces, WRITE statements all over
    program
  • Most common and least efficient method for
    isolating cause of error
  • Wasted effort and time
  • Think first!

75
Backtracking Debugging
  • Begin at the site where the symptom is uncovered;
    the source code is traced backward manually until
    the cause is found.
  • As the number of LOC increases, the number of
    backward paths becomes unmanageably large
  • Fairly common debugging approach - successful for
    small programs

76
Cause Elimination - Debugging
  • Data related to the error occurrence is organized
    to isolate potential causes
  • Cause hypothesis is devised and data used to
    prove or disprove the hypothesis
  • Or, a list of all possible causes is developed
    and tests run to eliminate each.

77
Debugging Final Thoughts
  • Think about the symptom you are seeing
  • Use tools such as dynamic debugger to gain more
    insight about the bug.
  • Get help from somebody else if you are stuck.
    Just talking to another person can help you see
    the cause of the bug.
  • Every time you touch existing code, you run the
    risk of injecting errors. Therefore ALWAYS run
    regression tests on all fixes.
  • Ask the following questions: Is the bug also in
    another part of the program? How can we prevent
    the bug in the first place?

78
Webliography
  • Check the Webliography for some interesting
    cases of software bugs such as the Therac
    radiation bug and other information about testing.