Transcript and Presenter's Notes

Title: Software Testing


1
Software Testing
  • Mark Micallef
  • mmica01@um.edu.mt

2
This is boring
  • or is it?

3
What is quality?
4
What makes quality software?
  • There is no clear-cut answer
  • It depends on
  • Stakeholders
  • Type of system
  • Type of users
  • Quality is a multifaceted concept

5
Different Quality Scenarios
  • Online banking system
  • Security
  • Correctness
  • Reliability
  • Air Traffic Control System
  • Robustness
  • Real Time Responses
  • Educational Game for Children
  • User-friendliness

6
Important software process and product qualities
  • Correctness
  • Reliability
  • Robustness
  • Efficiency
  • User friendliness
  • Verifiability
  • Maintainability
  • Reusability
  • Portability
  • Understandability
  • Interoperability
  • Productivity
  • Timeliness
  • Visibility

7
The importance of Software Quality
  • Four Marines were killed when their Osprey
    crashed on December 11th 2000 on approach to the
    Marine Corps Air Station New River, North
    Carolina. An enquiry concluded that the crash
    was caused by the failure of a hydraulic system
    component compounded by an anomaly in the
    vehicle's computer software.

8
The importance of Software Quality
  • Between 1985 and 1987, seven people died while
    receiving radiation therapy from a medical linear
    accelerator at a Texas hospital. Investigations
    revealed that software controlling the apparatus
    caused the accidents. If the operator entered an
    unusual but nonetheless possible sequence of
    commands, the computer controls would put the
    machine's internals into an erroneous and very
    hazardous state, subjecting patients to a massive
    overdose.

9
The importance of Software Quality
  • In June 1996, the Ariane 5 satellite launcher
    malfunction was caused by a faulty software
    exception routine resulting from a bad 64-bit
    floating-point to 16-bit integer conversion.

10
The importance of Software Quality
  • Society is becoming increasingly dependent on
    computer systems
  • Cars are becoming computer controlled
  • Increase in use of e-commerce systems
  • Online banking
  • Aircraft are computer controlled

11
Quality Assurance vs Testing
(Diagram: Quality Assurance vs Testing)
12
Quality Assurance vs Testing
(Diagram: Quality Assurance vs Testing)
13
Quality Assurance
  • Multiple activities throughout the dev process
  • Development standards
  • Source-code control
  • Change/Configuration management
  • Release management
  • Testing
  • Quality measurement
  • Defect analysis
  • Training

14
Testing
  • Also consists of multiple activities
  • Unit testing
  • Whitebox Testing
  • Blackbox Testing
  • Data boundary testing
  • Code coverage analysis
  • Exploratory testing
  • Ad-hoc testing

15
Cost of problems
16
Origins of software defects
17
What is testing?
  • Testing is a process of executing a software
    application with the intent of finding errors and
    of verifying that it satisfies specified
    requirements (BS 7925-1)

18
Testing Axioms
  • Testing cannot show that bugs do not exist
  • It is impossible to test a program completely
  • Testing should start as early as possible in the
    software development life cycle
  • Software testing is a risk-based exercise.
    Testing is done differently in different
    contexts, e.g. safety-critical software is tested
    differently from an e-commerce site.
  • The more bugs you find, the more bugs there are.

19
Errors, Faults and Failures
  • Error: a human action that produces an incorrect
    result
  • Fault/defect/bug: an incorrect step, process or
    data definition in a computer program,
    specification, documentation, etc.
  • Failure: the deviation of the product from its
    expected behaviour. This is a manifestation of
    one or more faults.

20
Common Error Categories
  • Boundary-Related
  • Calculation/Algorithmic
  • Control flow
  • Errors in handling/interpreting data
  • User Interface
  • Exception handling errors
  • Version control errors

21
Testing Principles
  • All tests should be traceable to customer
    requirements
  • The objective of software testing is to uncover
    errors.
  • The most severe defects are those that cause the
    program to fail to meet its requirements.
  • Tests should be planned long before testing
    begins
  • Detailed tests can be defined as soon as the
    system design is complete
  • Tests should be prioritised by risk since it is
    impossible to exhaustively test a system

22
What do we test? When do we test it?
  • All artefacts, throughout the development life
    cycle.
  • Requirements
  • Are they complete?
  • Do they conflict?
  • Are they reasonable?
  • Are they testable?

23
What do we test? When do we test it?
  • Specifications
  • Do they accurately specify what is defined in the
    requirements?
  • Are they clear and unambiguous?
  • Design
  • Does this satisfy the specification?
  • Does it conform to the required criteria?
  • Will this facilitate integration with existing
    systems?

24
What do we test? When do we test it?
  • Implemented Systems
  • Does the system do what it is supposed to do?
  • Documentation
  • Is this documentation accurate?
  • Is it up to date?
  • Does it convey the information that it is meant
    to convey?

25
Summary from last lecture
  • Quality is a subjective concept
  • Testing is an important part of the software
    development process
  • Testing should be done throughout
  • Definitions

26
The Testing Process
  • Test Planning
  • Test Design and Specification
  • Test Implementation (if automated)
  • Test Execution
  • Test Result Analysis and Reporting
  • Test case status updates
  • Problem reports
  • Test Control, Management and Review

27
Test Planning
  • Test planning involves the establishment of a
    test plan
  • Common test plan elements
  • Testing activities and schedule
  • Testing task assignments
  • Selected test strategy and techniques
  • Required tools, environment, resources
  • Problem tracking and reporting
  • Exit criteria

28
Test Design and Specification
  • Review the test basis (requirements,
    architecture, design, etc)
  • Evaluate the testability of the requirements of a
    system
  • Identify test conditions and required test
    data
  • Design the test cases
  • Identifier
  • Short description
  • Priority of the test case
  • Preconditions
  • Execution
  • Post conditions
  • Design the test environment setup (Software,
    Hardware, Network Architecture, Database, etc)
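As an illustration (a hypothetical example, not from the deck), a test case written with the fields above might read:

  TC-042 (Priority: High) - Login with wrong password
  Preconditions: user account "alice" exists; login page displayed
  Execution: enter "alice" and an incorrect password; click Login
  Postconditions: access denied, error message shown, no session created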

29
Test Implementation
  • Only when using automated testing
  • Can start right after system design
  • May require some core parts of the system to have
    been developed
  • Use of record/playback tools vs writing test
    drivers

30
Test Execution
  • Verify that the environment is properly set up
  • Execute test cases
  • Record results of tests (PASS / FAIL / NOT
    EXECUTED)
  • Repeat test activities
  • Confirmation testing
  • Regression testing

31
Result Analysis and Reporting
  • Reporting problems
  • Short Description
  • Where the problem was found
  • How to reproduce it
  • Severity
  • Priority
  • Can this problem lead to new test case ideas?

32
Test Control, Management and Review
  • Exit criteria should be used to determine when
    testing should stop. Criteria may include
  • Coverage analysis
  • Faults pending
  • Time
  • Cost
  • Tasks in this stage include
  • Checking test logs against exit criteria
  • Assessing if more tests are needed
  • Writing a test summary report for stakeholders

33
Levels of testing
  • Component (Unit) Testing
  • Integration Testing
  • System Testing
  • Acceptance Testing

34
Component (Unit) Testing
  • Goal: to search for defects in each testable
    component and verify that it functions as
    intended
  • Created and owned by developers
  • May include
  • Functionality Testing
  • Resource Usage Testing
  • Increasing use of a test-first philosophy

35
Component (Unit) Testing
(Diagram: Component A, Component B, Component C)
36
Component (Unit) Testing
(Diagram: Component A, exposing calculateAge(String dob) and calculateYOB(int age))
37
Component (Unit) Testing
  • calculateAge("01/01/1985")
  • Should return 22
  • calculateAge("03/09/2150")
  • Should return ERROR
  • calculateAge("55/55/55")
  • Should return ERROR
  • calculateAge("Bob")
  • Should return ERROR
  • calculateAge("29/02/1987")
  • Should return ERROR

(Under test: calculateAge(String dob))
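These expectations translate directly into unit tests. A minimal JUnit sketch, assuming a hypothetical ComponentA class whose static calculateAge returns an int and signals the ERROR cases with an IllegalArgumentException (the slides do not name the failure mechanism):

import org.junit.Test;
import static org.junit.Assert.*;

public class ComponentATest {

    @Test
    public void validDateOfBirthReturnsAge() {
        // 22 assumes the tests run in 2007, as on the slide
        assertEquals(22, ComponentA.calculateAge("01/01/1985"));
    }

    @Test(expected = IllegalArgumentException.class)
    public void futureDateIsRejected() {
        ComponentA.calculateAge("03/09/2150");
    }

    @Test(expected = IllegalArgumentException.class)
    public void impossibleDateIsRejected() {
        ComponentA.calculateAge("55/55/55"); // no month or day 55
    }

    @Test(expected = IllegalArgumentException.class)
    public void nonDateInputIsRejected() {
        ComponentA.calculateAge("Bob");
    }

    @Test(expected = IllegalArgumentException.class)
    public void nonExistentLeapDayIsRejected() {
        ComponentA.calculateAge("29/02/1987"); // 1987 was not a leap year
    }
}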
38
Integration Testing
  • Goal: to test interfaces between different
    components of a system.
  • Created and owned by lead developers / test
    engineers
  • Components may include
  • Classes
  • Sub-systems
  • Databases
  • File Systems
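To make the contrast with unit testing concrete, here is a hedged sketch (all class names invented for illustration) of a test that exercises two components together instead of stubbing one out:

import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical collaborators, for illustration only.
class DateParser {
    int yearOf(String dob) {              // expects dd/mm/yyyy
        return Integer.parseInt(dob.substring(6));
    }
}

class AgeService {
    private final DateParser parser = new DateParser();
    int calculateAge(String dob, int currentYear) {
        return currentYear - parser.yearOf(dob);
    }
}

public class AgeServiceIntegrationTest {
    @Test
    public void serviceAndParserWorkTogether() {
        // No stub replaces DateParser: the real parsing code runs
        // as part of the test, so the interface between the two
        // components is what is being checked.
        assertEquals(22, new AgeService().calculateAge("01/01/1985", 2007));
    }
}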

39
Integration Testing
(Diagram: Components A, B and C integrated with a Database)
40
System Testing
  • Goal: to verify that the system meets its
    functional specification and all of its
    non-functional requirements.
  • Created and owned by test engineers, domain
    experts

41
System Testing
(Diagram: Components A, B and C together with the Database, tested as a whole)
42
Acceptance Testing
  • Goal: to establish confidence in the system or
    in specific non-functional characteristics of
    the system.
  • Owned by users and business representatives

43
YOUR ASSIGNMENT!!
  • Option 1: Research Assignment
  • Option 2: Practical Assignment

44
Research Assignment
  • E-commerce systems are becoming increasingly
    popular year after year
  • Important quality attributes in e-commerce
    systems
  • Security
  • Reliability
  • Performance
  • Usability and Navigability
  • Portability

45
Research Assignment
  • Define e-commerce systems and each of these
    quality attributes in the context of e-commerce.
  • Why do you think these quality attributes are
    important?
  • Research and create a report on how you would
    create tests to verify the level of three of
    these quality attributes.
  • What automated tools are available to help you
    test for these quality attributes?

46
Practical Assignment
  • Part 1: Unit Testing and Coverage Analysis
  • Part 2: System and Portability Test of Google's
    Calculator
  • You have to do BOTH of these
47
Practical Unit Test
  • Create a small calculator application
  • Addition
  • Subtraction
  • Multiplication
  • Division
  • Create unit tests for this application
  • JUnit for Java
  • NUnit for C#
  • Conduct coverage analysis on your unit tests
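A possible starting point for the unit tests, as a sketch (the Calculator class and its method names are assumptions; your own design may differ):

import org.junit.Test;
import static org.junit.Assert.*;

public class CalculatorTest {

    @Test
    public void addsTwoNumbers() {
        assertEquals(9, new Calculator().add(2, 7));
    }

    @Test
    public void subtractsTwoNumbers() {
        assertEquals(-5, new Calculator().subtract(2, 7));
    }

    @Test
    public void dividesTwoNumbers() {
        assertEquals(3, new Calculator().divide(6, 2));
    }

    @Test(expected = ArithmeticException.class)
    public void divisionByZeroIsRejected() {
        new Calculator().divide(1, 0);  // a boundary worth covering
    }
}

Running these under a coverage tool then shows which parts of the Calculator code the tests never reach.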

48
Practical Unit Test
(Mock-up: a calculator form with fields Num1 = 2 and Num2 = 7, an operator selector Op, a Calculate button, and a Result field showing 9)
49
Practical Unit Test
50
Practical System Test
51
Practical System Test
52
Practical System Test
53
Practical System and Portability Test
  • Choose a minimum of 5 test cases
  • Test for
  • Correct functionality and results
  • Correct automatic bracketing
  • Execute all tests on
  • Internet Explorer
  • Firefox
  • Report results per browser

54
Selenium RC
  • http://www.openqa.org/selenium-rc/
  • An automation tool for testing web applications
  • Supports multiple browsers
  • Supports multiple languages
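A minimal Java driver against the Selenium RC client API of that era, as a sketch (the Google locators "q" and "btnG" are assumptions about the page at the time):

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

public class GoogleCalculatorDriver {
    public static void main(String[] args) {
        // Swap "*firefox" for "*iexplore" to repeat the run on
        // Internet Explorer, as the assignment requires.
        Selenium selenium = new DefaultSelenium(
                "localhost", 4444, "*firefox", "http://www.google.com/");
        selenium.start();
        selenium.open("/");
        selenium.type("q", "2+7");          // the search box
        selenium.click("btnG");             // the search button
        selenium.waitForPageToLoad("30000");
        // Assertions on the calculator result would go here,
        // e.g. via selenium.getText(...) on the result element.
        selenium.stop();
    }
}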

55
Selenium RC
(Diagram: a test driver uses the Selenium client to talk to a Selenium Server on a PC running Firefox and on a PC running Internet Explorer)
56
Testing Techniques
57
Testing Techniques
(Diagram: testing techniques split into static and dynamic)
58
Static Testing
  • Testing artefacts without actually executing a
    system
  • Can be done from early stages of the development
    process
  • Can include
  • Requirement Reviews
  • Code walk-throughs
  • Generic code review
  • Enforcement of coding standards
  • Code-smell analysis

59
Typical faults found in reviews
  • Deviations from coding standards
  • Requirements defects
  • Design defects
  • Incorrect interface specifications
  • Insufficient maintainability
  • Lack of error checking

60
Verification vs Validation
Stolen from ISEB slides by Olena Sammut
61
Verification vs Validation
  • Verification: Are we building the system right?
  • Validation: Are we building the right system?

62
Types of Reviews
  • Buddy checking
  • Walkthroughs
  • Technical Review
  • Formal Inspections

63
Code Smells
  • An indication that something may be wrong with
    your code.
  • Pragmatic vs Puritan smells
  • A few examples (one is sketched after this list)
  • Very long methods
  • Duplicated code
  • Long parameter lists
  • Large classes
  • Unused variables / class properties
  • Shotgun surgery (one change leads to cascading
    changes)
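As a hedged illustration of one smell from the list, a long parameter list often hides a missing abstraction (names invented for this sketch):

class LabelPrinter {
    // Smell: six loosely related parameters that every caller
    // must supply in exactly the right order.
    void printLabel(String name, String street, String town,
                    String postCode, String country, boolean bold) {
        // ...
    }
}

class Address {
    String street, town, postCode, country;
}

class LabelPrinterRefactored {
    // After refactoring: the address fields travel as one object,
    // shrinking the parameter list and naming the concept.
    void printLabel(String name, Address address, boolean bold) {
        // ...
    }
}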

64
Dynamic Testing Techniques
  • Testing a system by executing it
  • Two main types
  • Black box testing
  • White box testing

65
Black box Testing
(Diagram: inputs enter an opaque box; only the outputs are observed)
  • Confirms that requirements are satisfied
  • Examples of black box techniques
  • Boundary Value Analysis
  • Error Guessing
  • State transition analysis
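Boundary value analysis, the first technique above, picks inputs at the edges of a valid range. A sketch, assuming for illustration that ages 1 to 104 count as valid (the rule used on the conditional-coverage slide later):

import org.junit.Test;
import static org.junit.Assert.*;

public class AgeBoundaryTest {
    // Test just inside and just outside each edge of the range.
    @Test public void belowLowerBound() { assertFalse(isRealistic(0));   }
    @Test public void atLowerBound()    { assertTrue(isRealistic(1));    }
    @Test public void atUpperBound()    { assertTrue(isRealistic(104));  }
    @Test public void aboveUpperBound() { assertFalse(isRealistic(105)); }

    private boolean isRealistic(int age) { // the assumed rule
        return age > 0 && age < 105;
    }
}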

66
White box Testing
(Diagram: inputs and outputs of a box whose internal code, e.g. method1(a, b), method2(a), while (x < 5) ..., is visible to the tester)
  • Examines code while it executes
  • Examples of white box techniques
  • Testing individual functions, libraries, etc.
  • Designing test cases based on your knowledge of
    the code
  • Monitoring the values of variables, time spent in
    each method, etc.
  • Code coverage analysis: which code is executing?

67
Test case design techniques
  • A good test case
  • Has a reasonable probability of uncovering an
    error
  • Is not redundant
  • Is neither too simple nor too complex
  • Various test case design techniques exist

68
Test to Pass vs Test to Fail
  • Test to pass
  • Minimally assures only that the software works
  • Software is not pushed to its limits
  • Test to fail
  • Assumes the software works when treated in the
    right way
  • Attempts to force errors

69
Various Testing Techniques
  • Experience-based
  • Ad-hoc
  • Exploratory
  • Specification-based
  • Functional Testing
  • Domain Testing

70
Experience-based Testing
  • Use of experience to design test cases
  • Experience can include
  • Domain knowledge
  • Knowledge of developers involved
  • Knowledge of typical problems
  • Two main types
  • Ad Hoc Testing
  • Exploratory Testing

71
Ad-hoc vs Exploratory Testing
  • Ad-hoc Testing
  • Informal testing
  • No preparation
  • Not repeatable
  • Cannot be tracked
  • Exploratory Testing
  • Informal
  • Actually involves test design and control
  • Useful when no specification is available
  • Notes are taken and progress tracked

72
Specification-Based Testing
  • Designing test-cases to test specific
    specifications and designs
  • Various categories
  • Functional Testing
  • Decomposes functionality and tests for it
  • Domain Testing
  • Random Testing
  • Equivalence Classes
  • Combinatorial testing
  • Boundary Value Analysis

73
Equivalence Classes
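The idea: partition the inputs into classes whose members should all behave the same, then test one representative per class. As an invented illustration using the calculateAge example from slide 37:

  • Valid past dates, e.g. "01/01/1985" (should return an age)
  • Well-formed but future dates, e.g. "03/09/2150" (ERROR)
  • Malformed dates, e.g. "55/55/55" or "29/02/1987" (ERROR)
  • Non-date strings, e.g. "Bob" (ERROR)

One test per class gives broad coverage without redundant cases.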
74
Coverage Analysis
  • Used to check how much of your code is actually
    being tested
  • Three main types
  • Statement coverage analysis
  • Branch Coverage Analysis
  • Condition Coverage Analysis

75
Statement vs Branch Coverage
void checkEvenNumber(int num) {
    if ((num % 2) == 1)
        System.out.print("not ");
    System.out.println("an even number");
}
  • In this example
  • You achieve 100% statement coverage with one test
    case
  • You achieve 100% branch coverage with 2 test cases
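Concretely, as a usage sketch of the method above:

checkEvenNumber(3);  // condition true: every statement executes,
                     // so one test already gives 100% statement coverage
checkEvenNumber(4);  // condition false: a second test is needed
                     // to cover both branches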

76
Conditional Coverage
void checkRealisticAge(int age) {
    if (age > 0 && age < 105)
        System.out.println("age is realistic");
    else
        System.out.println("not realistic");
}
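Condition coverage requires each atomic condition, not just the whole decision, to evaluate both true and false. A usage sketch:

checkRealisticAge(50);   // age > 0 true,  age < 105 true
checkRealisticAge(-5);   // age > 0 false (age < 105 is short-circuited)
checkRealisticAge(200);  // age > 0 true,  age < 105 false

The first two calls already give full branch coverage; the third is needed so that age < 105 is also observed false.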
77
Conclusion
  • This was a crash course in testing
  • Testing is an essential part of the development
    process
  • Even if you are not a test engineer, you have to
    be familiar with testing techniques
  • Good luck with your assignment and exams
  • Questions? mmica01@um.edu.mt

78
The end