1
Testing Strategies
  • Jason I. Hong
  • CS169 - Software Engineering, Fall 1998

2
Types of Testing
3
Types of Testing
  • Scope
  • Unit Testing: tests a subroutine, module, or class
  • Integration Testing: tests multiple units together
  • System Testing: tests the entire system as a whole
  • Some Testing Goals
  • Functional Testing: tests the functions of the system
  • Performance Testing: tests system performance
  • primarily execution time, memory footprint
  • Reliability Testing: certifies system reliability
  • predictable execution time, low number of failures, etc.

4
Black Box Testing
  • Define test cases based only on specifications
  • Try all boundary conditions (see the sketch after this list)
  • Go from empty to full, full to empty
  • Array bounds
  • Empty or null values
  • Should be done by someone other than the module
    programmer
  • Needs a shift in mindset away from programming
  • Programmers don't want to find bugs; testers do
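
A minimal sketch (not from the slides) of black-box boundary-condition tests written only against a stated specification. The BoundedBuffer class, its capacity, and the test class name are hypothetical stand-ins included only so the example is self-contained.

class BoundedBuffer {
    private final Object[] items;
    private int size;

    BoundedBuffer(int capacity) { items = new Object[capacity]; }

    // Spec: rejects null; throws IllegalStateException when full.
    void add(Object item) {
        if (item == null) throw new IllegalArgumentException("null item");
        if (size == items.length) throw new IllegalStateException("full");
        items[size++] = item;
    }

    // Spec: throws NoSuchElementException when empty.
    Object remove() {
        if (size == 0) throw new java.util.NoSuchElementException("empty");
        return items[--size];
    }

    int size() { return size; }
}

public class BoundedBufferBlackBoxTest {
    public static void main(String[] args) {
        BoundedBuffer buf = new BoundedBuffer(3);

        // Boundary: go from empty to full.
        for (int i = 0; i < 3; i++) buf.add("item" + i);
        check(buf.size() == 3, "buffer should be full after 3 adds");

        // Boundary: adding to a full buffer must fail.
        check(throwsException(() -> buf.add("overflow")), "add on full buffer should throw");

        // Boundary: and back from full to empty.
        for (int i = 0; i < 3; i++) buf.remove();
        check(buf.size() == 0, "buffer should be empty after 3 removes");

        // Boundary: removing from an empty buffer must fail.
        check(throwsException(buf::remove), "remove on empty buffer should throw");

        // Boundary: null values are rejected by the specification.
        check(throwsException(() -> buf.add(null)), "adding null should throw");

        System.out.println("All black-box tests passed.");
    }

    // Runs r and reports whether it threw; the tester only looks at observable behavior.
    private static boolean throwsException(Runnable r) {
        try { r.run(); return false; } catch (RuntimeException e) { return true; }
    }

    private static void check(boolean ok, String message) {
        if (!ok) throw new AssertionError("FAILED: " + message);
    }
}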

5
White Box Testing
  • Define test cases by looking at code
  • Not as useful as code review or code coverage
  • Useful for regression testing

6
Code Coverage
  • Create test cases that execute every block of
    code at least once
  • Complete code coverage does not imply correct code (see the sketch below)
  • Ideally, we would like to test every path but
    this is exponential
  • e.g. all possibilities in switch statements
  • Useful for regression testing
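
As a small illustration (not from the slides) of why complete coverage does not imply correct code: the two test cases below execute every statement of max(), yet both pass and the bug survives, because statement coverage says nothing about inputs that were never tried.

public class CoverageIsNotCorrectness {

    // Intended to return the larger of a and b.
    static int max(int a, int b) {
        if (a > b) {
            return a;
        }
        return a;   // BUG: should be "return b"
    }

    public static void main(String[] args) {
        // These two cases execute every statement in max() (100% statement coverage)
        // and both pass, because a == b in the second case masks the bug.
        System.out.println(max(3, 2) == 3);   // true: takes the "a > b" branch
        System.out.println(max(2, 2) == 2);   // true: takes the other branch

        // An input the suite never tried still fails:
        System.out.println(max(1, 5));        // prints 1, not the expected 5
    }
}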

7
Code Review
  • Read through someone else's code
  • Extremely effective at finding defects
  • Look for common errors (an annotated example follows this list)
  • Possible unhandled error conditions
  • Potential array out of bounds (e.g. gets)
  • Bad or dangerous type casting
  • Memory leaks (e.g. alloc w/o dealloc)
  • Too many global variables (high complexity)
  • Not cleaning up after self (e.g. goto, return)
  • ...
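
The hypothetical snippet below (not from the slides) deliberately contains defects of the kinds listed above, annotated the way a reviewer might mark it up; the class name and file-reading logic are illustrative only.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ConfigLoader {
    static String firstLine(String path) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader(path));
        String line = in.readLine();
        if (line == null) {
            return "";       // REVIEW: early return skips in.close(); not cleaning up after self
        }
        in.close();          // REVIEW: close() is also skipped if readLine() throws (resource leak);
                             //         wrap the read in try/finally or try-with-resources
        return line.trim();
    }

    public static void main(String[] args) throws IOException {
        // REVIEW: unhandled error condition; args may be empty, so args[0] can be out of bounds
        System.out.println(firstLine(args[0]));
    }
}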

8
Regression Testing
  • Maintain a suite of tests (a minimal runner is sketched after this list)
  • Build the product periodically and run the test suite
  • Automatically verify that the output is correct
  • Find reason for failure (test wrong or code
    wrong)
  • Good metric for quality
  • Can always tell if going forward or not
  • Requires significant process maturity from
    company
  • Continually add new tests as bugs are fixed
  • Ensures that old bugs do not reappear
  • Forces discipline on developers
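
A minimal sketch of such an automated suite, assuming the pattern above: run every recorded case, compare against known-good output, and fail the build on any mismatch. The formatPrice() function, the recorded outputs, and the "added when a bug was fixed" notes are hypothetical.

import java.util.LinkedHashMap;
import java.util.Map;

public class RegressionSuite {

    // Stand-in for whatever the product really computes.
    static String formatPrice(int cents) {
        return String.format("$%d.%02d", cents / 100, cents % 100);
    }

    public static void main(String[] args) {
        // Expected outputs recorded when each case was added (e.g. when a bug was fixed).
        Map<Integer, String> expected = new LinkedHashMap<>();
        expected.put(0, "$0.00");       // added when a zero-amount bug was fixed
        expected.put(5, "$0.05");       // added when a missing-leading-zero bug was fixed
        expected.put(1999, "$19.99");   // ordinary case

        int failures = 0;
        for (Map.Entry<Integer, String> test : expected.entrySet()) {
            String actual = formatPrice(test.getKey());
            if (!actual.equals(test.getValue())) {
                failures++;
                System.out.println("FAIL: input=" + test.getKey()
                        + " expected=" + test.getValue() + " actual=" + actual);
            }
        }
        System.out.println(failures == 0 ? "All regression tests passed."
                                         : failures + " regression(s) detected.");
        System.exit(failures == 0 ? 0 : 1);   // nonzero exit fails the automated build
    }
}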

9
Stress Testing
  • Place a large load on the product (typically a server); see the sketch after this list
  • Each client sends multiple requests
  • Some valid requests, some junk requests
  • Look for
  • Increasing memory footprint (memory leak?)
  • Running out of resources (sockets, file
    descriptor)
  • Increased response time (slows over time?)
  • Incorrect results (possible race conditions?)
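
One way to drive such a load is sketched below, assuming an HTTP server under test; the URL, thread count, and request mix are placeholders rather than anything from the slides.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.atomic.AtomicLong;

public class StressClient {
    static final AtomicLong requests = new AtomicLong();
    static final AtomicLong errors = new AtomicLong();
    static final AtomicLong totalMillis = new AtomicLong();

    public static void main(String[] args) throws Exception {
        final String base = args.length > 0 ? args[0] : "http://localhost:8080/search?q=";
        Thread[] clients = new Thread[20];
        for (int i = 0; i < clients.length; i++) {
            final int id = i;
            clients[i] = new Thread(() -> {
                for (int n = 0; n < 500; n++) {
                    // Mix valid requests with junk to see how the server copes.
                    String query = (n % 5 == 0) ? "%%%garbage%%%" + id : "term" + n;
                    long start = System.currentTimeMillis();
                    try {
                        HttpURLConnection conn =
                                (HttpURLConnection) new URL(base + query).openConnection();
                        conn.setConnectTimeout(2000);
                        conn.setReadTimeout(2000);
                        try (InputStream in = conn.getInputStream()) {
                            while (in.read() != -1) { /* drain the response */ }
                        }
                    } catch (Exception e) {
                        errors.incrementAndGet();   // timeouts, refused connections, bad replies
                    }
                    totalMillis.addAndGet(System.currentTimeMillis() - start);
                    requests.incrementAndGet();
                }
            });
            clients[i].start();
        }
        for (Thread t : clients) t.join();

        // Watch these numbers over a long run: rising latency or error counts
        // (and a growing server memory footprint) are the warning signs above.
        System.out.println("requests=" + requests + " errors=" + errors
                + " avgMillis=" + (totalMillis.get() / Math.max(1, requests.get())));
    }
}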

10
User Testing
  • Perhaps the most important kind of testing
  • Can customers use the product easily?
  • How many errors do users make?
  • How long does it take users to learn the system?
  • Do customers like the product? Enjoy using it?
  • Without customers, you have no product.
  • Best strategy is iterative development
  • Expensive to change product at end, so build
    cheap prototypes and evaluate with users at
    beginning
  • Hire real designers! Programmers ≠ designers
  • Take CS160 for more information

11
Alpha and Beta Testing
  • Alpha testing
  • Product is released to very small and focused
    group
  • Beta testing
  • Product is released to wider range of people
  • Difficult to make significant changes once Alpha
    and Beta Testing is reached
  • Use for polishing, finding compatibility
    problems, fixing simple errors, and finding
    workarounds
  • Do not use for evaluating usability or quality,
    a common mistake still made by many
    companies

12
Some Statistics
  • Unit testing finds 10-50% of defects
  • System testing finds 20-60% of defects
  • Combined rate < 60% of defects
  • Informal code review finds 30-70% of defects
  • Formal inspection finds 60-90% of defects, net savings of 10-30%
  • Reviews are more cost effective than testing
  • Using reviews with testing helps software quality

See Rapid Development, pp. 72-75
13
Development Notes
  • Back-to-Back algorithms
  • Use two implementations and compare results
  • Microsoft Excel does this for cell calculations
  • UWash Kimera Verifier found bugs in Netscape,
    Sun, and Microsoft implementations of Java
  • Use Static Verification
  • Use tools that go through code statically
  • Have no errors or warnings on Lint and gcc -Wall
  • Self-Testing Code
  • Have each class be able to test itself
  • Each Java class can have its own main() method (see the sketch after this list)
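
The sketch below combines two of the ideas above: a class whose own main() tests it, using a back-to-back comparison against an independent reference implementation. The IntSorter class and its algorithms are illustrative, not from the slides.

import java.util.Random;

public class IntSorter {
    // "Production" implementation: insertion sort on a copy of the input.
    static int[] sort(int[] input) {
        int[] a = input.clone();
        for (int i = 1; i < a.length; i++) {
            int key = a[i], j = i - 1;
            while (j >= 0 && a[j] > key) { a[j + 1] = a[j]; j--; }
            a[j + 1] = key;
        }
        return a;
    }

    // Independent reference implementation used only for back-to-back comparison.
    static int[] referenceSort(int[] input) {
        int[] a = input.clone();
        java.util.Arrays.sort(a);
        return a;
    }

    // The class tests itself: run "java IntSorter" to exercise both implementations.
    public static void main(String[] args) {
        Random random = new Random(42);          // fixed seed so failures are reproducible
        for (int trial = 0; trial < 1000; trial++) {
            int[] data = new int[random.nextInt(50)];
            for (int i = 0; i < data.length; i++) data[i] = random.nextInt(100) - 50;

            int[] ours = sort(data);
            int[] reference = referenceSort(data);
            if (!java.util.Arrays.equals(ours, reference)) {
                System.out.println("MISMATCH on " + java.util.Arrays.toString(data));
                return;
            }
        }
        System.out.println("IntSorter self-test passed (1000 back-to-back trials).");
    }
}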

14
Development Notes (cont.)
  • Use good tools
  • Purify, test harnesses, code coverage analysis, etc.
  • Iterative Development
  • Plan a little, build a little, verify a little,
    and repeat
  • Fix defects as you find them, not at the end
  • If not, others will waste time on a known problem
  • 80/20 rule applies to defects
  • 80% of bugs are likely to be in 20% of the code
  • Use defect tracking to find this code
  • If a unit seems to have lots of errors, it may
    be easier just to rewrite from scratch!

15
Testing Process Notes
  • Begin test planning from the beginning
  • Start test cases once requirements are known
  • Use the User Manual and Specification Document
  • Microsoft uses one tester per developer
  • Space Shuttle people use ten testers per
    developer
  • Daily Build
  • Product is built every day and put through tests
  • Supports incremental development
  • Minimizes integration (prevents dis-integration)
  • Provides project visibility

16
Testing Process Notes (cont.)
  • Find the reason for the error and ask questions
  • Is it a common occurrence? Mistake? Sloppiness?
  • Programmers fix their own bugs at Microsoft, which prevents introducing new bugs elsewhere
  • How much will it cost to fix? Is it worth it?
  • How can bugs like this be prevented in the future?
  • Keep a database of bugs
  • Track all defects to closure
  • Graph open bugs and closed bugs over time

17
Testing Process Notes (cont.)
  • Remember that testing only shows the presence of defects, not their absence
  • Testing assesses quality; it does not assure quality
  • Have an environment of quality
  • Inktomi hangs a toy pig over a person's cubicle for a short while if he/she messes up
  • A Microsoft head once threw a chair out the window over a broken build
  • Lives depend on Space Shuttle software developers
  • Lives may depend on software you write!

18
Summary
  • Plan quality from the very beginning
  • Plan testing from the very beginning
  • People and resources needed, test cases
  • Preventing bugs is easier than finding bugs
  • Use assertions, preconditions, invariants, reviews (see the sketch after this list)
  • Use a variety of methods for reducing defects,
    assessing quality, and assuring quality
  • The earlier you find a defect, the cheaper and
    easier it is to correct
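
A small sketch of preconditions and a class invariant using Java's assert statement (enabled with "java -ea"); the Account class and its operations are illustrative only.

public class Account {
    private long balanceCents;

    // Precondition: deposits must be positive.
    void deposit(long cents) {
        assert cents > 0 : "deposit must be positive, got " + cents;
        balanceCents += cents;
        assert invariantHolds();
    }

    // Precondition: cannot withdraw more than the balance.
    void withdraw(long cents) {
        assert cents > 0 && cents <= balanceCents : "bad withdrawal: " + cents;
        balanceCents -= cents;
        assert invariantHolds();
    }

    // Class invariant: the balance never goes negative.
    private boolean invariantHolds() {
        return balanceCents >= 0;
    }

    public static void main(String[] args) {
        Account a = new Account();
        a.deposit(500);
        a.withdraw(200);
        System.out.println("balance = " + a.balanceCents);   // 300
        a.withdraw(10_000);   // violates the precondition; fails fast when run with -ea
    }
}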

19
References
  • Rapid Development by Steve McConnell
  • Code Complete by Steve McConnell
  • Software Engineering by Ian Sommerville
  • Dynamics of Software Development by Jim McCarthy
  • Software Management by Donald Reifer
  • Construx Software
  • Eric Brewer (CS169 Spring 1998)