Title: Testing Terminology
1. Testing Terminology
CEN 5076, Class 3, 09/19
- Review of Class 2
- Test Process Overview
- Risk Analysis
- The Testing Process
- Planning Activities
- Team Meeting
2. Laws of Testing (Handbook of SSE)
- Online debugging is more efficient than offline debugging (Sackman's law).
- Testing can show the presence but not the absence of errors (Dijkstra's law).
- A developer is unsuited to test his or her own code (Weinberg's law).
- Approximately 80 percent of defects come from 20 percent of modules (Pareto-Zipf-type law).
- Performance testing benefits significantly from system-level benchmarks (Gray-Serlin law).
3. Laws of Testing (Handbook of SSE)
- Usability is quantifiable (Nielsen-Norman law).
- Partition testing is more effective than random testing (Gutjahr's hypothesis).
- The adequacy of a coverage criterion can only be intuitively defined (Weyuker's hypothesis).
- The test suite needed to verify an arithmetic path expression can be determined (Endres-Glatthaar hypothesis).
- Suspicion-based testing can be more effective than most other approaches (Hamlet's hypothesis).
4. Taxonomy of OO Classes
- A Class Abstraction Technique (CAT) that supports implementation-based testing.
- Catalogs classes based on the characteristics of a class.
- Class characteristics for a given class C are the properties of the features in C and the relationships C has with other classes in the implementation. The properties of the features in C describe how criteria such as types, accessibility, shared class data, polymorphism, dynamic binding, deferred features, exception handling, and concurrency are represented in the attributes and routines of C.
- Clarke and Malloy, SEA '04.
5Taxonomy of OO Classes cont
Descriptors Type Nomenclature Attributes
Routines Families (Nested) New
(Constant) NA no type (Multi-Parents)
Recursive New P primitive type (Friend)
Concurrent Recursive P reference to P
(Has-Friend) Polymorphic Redefined U
user-defined type Generic Private
Concurrent U reference to U Concurrent
Protected Synchronized L library Abstract
Public Exception-R L reference to L
Inheritance-free Constant Exception-H A
any type (generics) Parent Static
Has-Polymorphic A reference to A External
Child - Non-Virtual m ltngt parameterized
type Internal Child - Virtual m ltn gt
reference to - - Deferred parameterized
type - - Private where m e U,
L - - Protected n is any combination of
- - Public P, P, U, U, L, L, A, A
- - Static -
6. Taxonomy of OO Classes (cont.)
7. Types of Inheritance
- Model inheritance: an is-a relation between abstractions in the model.
  - Subtype
  - View
  - Restriction
  - Extension
- Variation inheritance: expresses relations within the software itself rather than the model.
  - Functional
  - Type
  - Uneffecting
8. Types of Inheritance
- Software inheritance: describes a class by how it differs from another class.
  - Reification
  - Structure
  - Implementation
  - Facility (constant, machine)
- Bertrand Meyer '96.
9. Planning for Testing
- Testing requires considerable resources.
- Good planning and good management are required for the effective utilization of these resources.
- Plan a test process that complements your development process.
- Analyze the risks associated with verifying the required functionality.
- Develop test plans for the different levels and types of testing required for a comprehensive test process.
10. Testing Process Overview
- Testing is applied at various points during development.
- The development and testing processes have different goals and different measures of success.
- Development strives to build a product that meets a need.
- Testing strives to answer questions about the product, including whether the product meets the need that it is intended to meet.
11. Testing Process Overview
- Example:
  - The lower the defect rate (the ratio of test cases that fail to the total number used), the more successful the development process is considered to be (a worked sketch follows this list).
  - The higher the defect rate, the more successful the testing process is considered to be.
- The roles of development and testing are usually assigned to different people.
- Developers are responsible for some testing, e.g., unit testing and integration testing.
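As a worked illustration of the defect-rate measure above, here is a minimal Python sketch; the test outcomes and counts are hypothetical, for illustration only.

def defect_rate(outcomes):
    """outcomes: list of booleans, True if the test case failed."""
    if not outcomes:
        raise ValueError("no test cases were run")
    return sum(outcomes) / len(outcomes)

# Hypothetical run: 3 failures out of 20 test cases -> defect rate of 15%.
outcomes = [False] * 17 + [True] * 3
print(f"defect rate: {defect_rate(outcomes):.0%}")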
12. Testing Process Overview
- To ensure testing is carried out with the same vigor when it is done by developers, a buddy system is used.
- Buddy system: functionality is partitioned, and two developers take turns writing code and testing.
- The development and testing processes are in a feedback loop; recall the testing-model USDP diagram.
- Recall that in the USDP, increments of the system are usually developed.
13. Testing Process Overview
- Incremental approach:
  - Increment 1: Analysis, Design, Implementation, Testing.
  - Increment 2: Analysis, Design, Implementation, Testing.
  - ...
  - Increment n: Analysis, Design, Implementation, Testing.
14. Testing Process Overview
- The testing perspective must be considered, preferably by professional testers, when development methods and tools are selected.
- The form and quality of the requirements specification also affect the testing process.
- Product requirements are the source of test cases in system and acceptance testing.
- System testers should participate in the gathering and validation of the requirements; they need to understand the requirements, assess risks, and check for testability.
15. Testing Process Overview
- Testability:
  - The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met. (IEEE 610)
  - The degree to which a requirement is stated in terms that permit establishment of test criteria and performance of tests to determine whether those criteria have been met. (IEEE 610)
- Test criteria:
  - The criteria that a system or component must meet in order to pass a given test. (IEEE 610)
16. Testing Process Overview
- There are two types of testing criteria:
  - Test data selection criterion: a rule used to determine which test cases to select.
  - Test data adequacy criterion: a rule used to determine whether or not sufficient testing has been performed.
- A test data selection criterion serves as the basis for picking a test set to satisfy some goal, while a test data adequacy criterion checks whether a previously selected test set satisfies the goal (Weyuker '93). A sketch contrasting the two follows.
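To make the distinction concrete, here is a minimal Python sketch under criteria chosen purely for illustration: the selection criterion picks one input from each equivalence partition of a sign-sensitive function, and the adequacy criterion checks that every partition is represented in a given test set. The partitions and names are hypothetical, not from the source.

# Hypothetical equivalence partitions of the input domain of abs(x).
PARTITIONS = {
    "negative": lambda x: x < 0,
    "zero":     lambda x: x == 0,
    "positive": lambda x: x > 0,
}

def select_tests():
    """Selection criterion: build a test set with one input per partition."""
    return [-5, 0, 7]

def is_adequate(test_set):
    """Adequacy criterion: does a given test set cover every partition?"""
    return all(any(pred(x) for x in test_set) for pred in PARTITIONS.values())

print(is_adequate(select_tests()))  # True: all three partitions exercised
print(is_adequate([1, 2, 3]))       # False: negative and zero partitions missed

The same goal (cover every partition) drives both rules: one picks the tests up front, the other judges a test set after the fact.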
17. Testing Process Overview
- STEP testing technique (Hetzel '84):
  - Analysis: the product to be tested is examined to identify any special features that must receive particular attention and to determine the test cases that should be constructed.
  - Construction: the artifacts needed for testing are created. Test cases are translated into programming and scripting languages, or entered in a tool-specific language. The data sets required for testing are built.
18. Testing Process Overview
- STEP testing technique (Hetzel '84), cont.:
  - Execution and Evaluation: the most visible and recognized part of the test effort. Test cases are executed and the results examined to determine whether the software passed or failed the test suite.
- Test suites must be maintained, i.e.:
  - as requirements change, so does the test suite;
  - as problems are found by users, test cases are added to catch those problems.
19. Risk Analysis: A Tool for Testing
- Risk:
  - Anything that threatens the successful achievement of a project's goal.
  - An event that has some probability of occurring and, if it does, will cause some loss (a quantified sketch follows this list).
- Risk-based testing principle:
  - Test most heavily those portions of the system that pose the highest risk to the project, to ensure that the most harmful faults are identified.
- Risk analysis: a procedure for identifying risks and for identifying ways to prevent potential problems from becoming real.
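One common way to quantify the definition above is risk exposure: the probability of the event times the loss it would cause, with testing effort steered toward the highest-exposure items. A minimal Python sketch follows; the component names, probabilities, and loss figures are hypothetical.

def risk_exposure(probability, loss):
    """Risk exposure = probability of the event * loss if it occurs."""
    return probability * loss

# Hypothetical components with estimated failure probability and loss ($).
items = {
    "payment processing": (0.10, 500_000),
    "report formatting":  (0.30, 2_000),
    "user login":         (0.05, 100_000),
}

# Rank by exposure: test the highest-exposure components most heavily.
ranked = sorted(items.items(), key=lambda kv: risk_exposure(*kv[1]), reverse=True)
for name, (p, loss) in ranked:
    print(f"{name}: exposure = {risk_exposure(p, loss):,.0f}")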
20. Risk Analysis: A Tool for Testing
- Each project requires its own individual analysis; therefore it is important to apply testing strategies that make sense.
- Effective Software Testing: 50 Specific Ways to Improve Your Testing, E. Dustin, 2003 (see the updated online course syllabus).
- Some of these strategies include:
  - Select test-design techniques: numerous techniques are available.
21. Risk Analysis: A Tool for Testing
- Select testing tools: decide on the vendor-provided tools, how the tools will be used, and which team members will use them.
- Develop in-house test harnesses or scripts.
- Determine the test personnel and expertise required:
  - To write test harnesses and scripts, a developer must be included in the testing team.
  - Automation skills are required for capture/playback tools.
  - Domain expertise is usually required.
22. Risk Analysis: A Tool for Testing
- Determine testing coverage:
  - It is essential that testers understand the coverage required; e.g., it may be a contractual agreement in the SRD.
  - There might be a code coverage requirement, e.g., DoD.
  - Determine test coverage given the resources, schedules, tools, task at hand, and the risks of not testing an item.
  - Beizer estimates that from 2% to 80% of the application size can be code to support testing.
23. Risk Analysis: A Tool for Testing
- Establish release criteria:
  - Indicate when testing can be considered complete.
  - Try to state them in a quantifiable manner; e.g., use cases 1, 3, and 4 must be defect free.
- Set the testing schedule.
- Consider the testing phases:
  - Different test strategies apply to different test phases (see the V-model for the testing phases in the traditional model).
24. Risk Analysis: A Tool for Testing
- High-risk factors:
  - Short time-to-market:
    - Can prevent adequate testing.
    - Test strategies must be adapted to the time available.
  - New design process:
    - Introduction of new design tools and techniques.
    - Example: a change from the USDP to an eXtreme Programming approach, or vice versa.
  - New technology
  - Complexity:
    - Identify high-risk use cases during requirements analysis.
25. Risk Analysis: A Tool for Testing
- Frequency of use:
  - This functionality usually represents the core of the application.
- Untestable functional and non-functional requirements:
  - A reason to quantify non-functional requirements.
  - Must make all requirements testable.
- Other sources of risk:
  - Programming languages permit certain classes of errors and inhibit others, e.g., strong typing vs. weak typing.
26. Risk Analysis: A Tool for Testing
- Class Project:
  - Each use case should have a risk, frequency, and criticality associated with it.
  - Review each of the use cases to ensure the above properties are quantified and valid.
  - Produce a ranked list of use cases; focus on the implemented use cases (a ranking sketch follows this list).
  - Use the risk, frequency, and criticality to identify the testing effort (time and personnel) required for each use case during system testing.
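A minimal Python sketch of such a ranking, assuming each factor is scored on a 1-5 scale and the three scores are simply summed; the use-case names, scores, and weighting scheme are hypothetical, not prescribed by the source.

# Hypothetical use cases scored 1-5 for risk, frequency, and criticality.
use_cases = {
    "Place Order":    {"risk": 4, "frequency": 5, "criticality": 5},
    "Browse Catalog": {"risk": 2, "frequency": 5, "criticality": 2},
    "Export Report":  {"risk": 3, "frequency": 1, "criticality": 3},
}

def score(uc):
    """Combined score; a simple unweighted sum of the three factors."""
    return uc["risk"] + uc["frequency"] + uc["criticality"]

# Ranked list: allocate system-testing effort from the top down.
for name, uc in sorted(use_cases.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{score(uc):2d}  {name}")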
27. A Testing Process
- Dimensions of software testing (think of the attributes in italics as a continuum):
  - Who performs the testing?
    - Developer, tester, independent tester.
  - Which pieces will be tested?
    - Test nothing, test a sample, test everything.
    - Use a systematic approach (p. 81).
28. A Testing Process
- Dimensions of software testing, cont.:
  - When will testing be performed?
    - When components are developed; when all components are developed and integrated.
  - How will testing be performed?
    - Specification, implementation.
  - How much testing is adequate?
    - No testing, exhaustive testing.
29. Roles in the Testing Process
- The test plan should:
  - Identify the roles each person will be assigned.
  - Allocate time and effort for each role.
  - Schedule the time allocated for each part of the testing effort.
    - The development schedule drives much of the testing schedule.
  - Identify the resources needed for the testing effort, e.g., h/w, s/w, expertise.
30. Roles in the Testing Process
- Roles:
  - Unit tester: responsible for testing the individual classes (or clusters of classes) as they are produced (a minimal example follows this list).
  - Integration tester: responsible for testing a set of objects that are being brought together from different development sources, e.g., individuals or teams.
  - System tester: has domain knowledge and is responsible for independently verifying that the completed application satisfies the requirements.
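As a minimal illustration of the unit tester's task, this self-contained Python sketch tests a single class with the standard-library unittest framework; the Counter class and its test cases are hypothetical.

import unittest

class Counter:
    """A hypothetical class under test."""
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1
    def reset(self):
        self.value = 0

class CounterTest(unittest.TestCase):
    """Unit tests exercise one class in isolation as it is produced."""
    def test_starts_at_zero(self):
        self.assertEqual(Counter().value, 0)
    def test_increment_adds_one(self):
        c = Counter()
        c.increment()
        self.assertEqual(c.value, 1)
    def test_reset_clears_value(self):
        c = Counter()
        c.increment()
        c.reset()
        self.assertEqual(c.value, 0)

if __name__ == "__main__":
    unittest.main()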
31. Roles in the Testing Process
- Test manager: responsible for managing the test process, i.e., requesting, coordinating, and making effective use of the resources allocated.
- Other roles (see handout, Dustin):
  - Team lead: technical leadership for the test program, including the test approach.
  - Test engineers (usability, manual, automated, network, security): specialist testers in each of these areas.
  - Test environment specialist: installs test tools and establishes the test-tool environment.
32. Test Plan
- Class Project:
  - Roles for testing each model: test manager, test engineer (system, integration, class), tester, and minute keeper.
  - Keep a diary of all activities performed at each meeting.
  - Use the information in the handout to identify the duties/skills of each person on the team, e.g., p. 73, first entry in Table 13.2.
33. Roles in the Testing Process
- Each diary entry should contain the start time of the meeting, the end time of the meeting, the persons present, the location of the meeting, a description of the topics discussed (in bullet form), and the tasks assigned to team members.
- Roles in the team will be rotated as each USDP model is tested. Initially, the students who developed the applications will be the domain experts.
- Use an automated tool to generate a testing schedule, e.g., MS Project. Identify tasks, milestones, and deliverables for the testing activity.
34. A Detailed Set of Test Activities
- Fig. 3.11, Synopsis of testing activities, p. 88:
  - Domain analysis
  - Application analysis
  - Architectural design
  - Detailed design
  - Class implementation
  - Application implementation
35. Documentation
- IEEE 829 Standard Test Plan outline:
  - 1.0 Introduction: a high-level view of the testing; includes the type of testing, e.g., class, subsystem, system, acceptance, release.
  - 2.0 Test Items: defines the scope, the h/w and s/w to be tested.
  - 3.0 Tested Features: the parts of the s/w spec to be tested.
  - 4.0 Features Not Tested: includes features already tested.
  - 5.0 Testing Strategy and Approach
    - 5.1 Syntax
36. Documentation
    - 5.2 Description of Functionality
    - 5.3 Arguments for Test (includes preconditions)
    - 5.4 Expected Output
    - 5.5 Specific Exclusions
    - 5.6 Dependencies
    - 5.7 Test Case Success/Failure Criteria
  - 6.0 Pass/Fail Criteria for the Complete Test Cycle
  - 7.0 Entrance Criteria/Exit Criteria
  - 8.0 Test-Suspension Criteria and Resumption Criteria
37. Documentation
  - 9.0 Test Deliverables/Status Communication Vehicles
  - 10.0 Testing Tasks
  - 11.0 H/w and S/w Requirements
  - 12.0 Problem Determination and Correction Responsibilities
  - 13.0 Staffing and Training Needs/Assignments
  - 14.0 Test Schedules
  - 15.0 Risks and Contingencies
  - 16.0 Approvals
38. Documentation
- Visit the following sites for more information:
  - http://www.coleyconsulting.co.uk/IEEE829.htm
  - http://www.evolutif.co.uk/tkb/guidelines/ieee829/example.html
39. Project Test Plan
- Summarizes the testing strategy to be employed for the project.
- Fig. 3.15, p. 97, summarizes the activities that are required, the frequency with which each activity will be employed, and the entity responsible for each testing phase. Required for the class project!
- Fig. 3.16, p. 97, associates each of the testing phases with the specific strategy for that phase. Required for the class project!
40. Component Test Plan
- Defines the overall strategy and the specific test cases that will be used to test a certain component.
- One plan per significant component (see use cases).
- Fig. 3.17, p. 100, shows a template for the test plan. Required for the class project!
- Each section of the plan contains two types of guided information:
41. Component Test Plan
- Project criteria: standards that have been agreed upon and how thoroughly each component will be tested,
  - e.g., 100% of the postconditions on modifier methods should be tested.
- Project procedures: techniques that have been agreed upon as the best way to handle a task. Provides the details of the test strategies that were identified in the project plan,
  - e.g., constructing a Parallel Architecture for Class Testing (PACT) class for each component that will be tested (a sketch follows).
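A minimal Python sketch of the PACT idea: test-driver classes are arranged in a hierarchy parallel to the classes under test, so tests written for a parent class are inherited and rerun against each subclass. The Account/SavingsAccount classes and their tests are hypothetical illustrations, not from the source.

import unittest

# Hypothetical classes under test, forming a small inheritance hierarchy.
class Account:
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        self.balance += amount

class SavingsAccount(Account):
    def add_interest(self, rate):
        self.balance += self.balance * rate

# PACT: one tester class per class under test, mirroring the hierarchy.
class AccountTest(unittest.TestCase):
    def make_instance(self):
        return Account()          # factory method; subclass testers override it
    def test_deposit_increases_balance(self):
        a = self.make_instance()
        a.deposit(100)
        self.assertEqual(a.balance, 100)

class SavingsAccountTest(AccountTest):
    # Inherits test_deposit_increases_balance and reruns it on the subclass.
    def make_instance(self):
        return SavingsAccount()
    def test_add_interest(self):
        s = self.make_instance()
        s.deposit(100)
        s.add_interest(0.10)
        self.assertAlmostEqual(s.balance, 110)

if __name__ == "__main__":
    unittest.main()

The parallel hierarchy means every inherited behavior is retested in the context of each subclass, which is the reuse benefit PACT is meant to provide.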
42. Component Test Plan
- Sections of the template:
  - Objectives for the class: a prioritized list of objectives for the component.
  - Guided Inspection Requirements (see p. 99).
  - Building and Retaining Test Suites:
    - The process of creating test-driver classes.
    - The scheduled deadline for delivery of test cases.
    - The specification of the test driver.
    - The relative number of test cases (prioritized) in each of the following categories.
43. Component Test Plan
- Functional Test Cases:
  - The approach used to develop test cases from the specification.
  - The class invariant method: identify the types of objects being tested, based on the initial state of the object (a sketch follows this list).
- Structural Test Cases:
  - Information about test cases developed for code coverage and the code-review process.
  - How to use the required test-coverage tool.
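A minimal Python sketch of invariant-based functional testing, in which every test rechecks the class invariant after each operation; the BoundedStack class and its invariant are hypothetical. For the structural side, a suite like this could be run under a coverage tool (e.g., coverage.py via "coverage run -m unittest" followed by "coverage report") to measure how much of the code the functional cases already exercise.

import unittest

class BoundedStack:
    """Hypothetical class under test with a stated invariant:
    0 <= size <= capacity at all times."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
    def push(self, x):
        if len(self.items) >= self.capacity:
            raise OverflowError("stack is full")
        self.items.append(x)
    def pop(self):
        return self.items.pop()
    def invariant(self):
        return 0 <= len(self.items) <= self.capacity

class BoundedStackTest(unittest.TestCase):
    """Functional tests derived from the spec; every test rechecks the invariant."""
    def setUp(self):
        self.s = BoundedStack(capacity=2)
        self.assertTrue(self.s.invariant())   # invariant holds in the initial state
    def test_push_then_pop(self):
        self.s.push(1)
        self.assertTrue(self.s.invariant())
        self.assertEqual(self.s.pop(), 1)
        self.assertTrue(self.s.invariant())
    def test_push_beyond_capacity_raises(self):
        self.s.push(1)
        self.s.push(2)
        with self.assertRaises(OverflowError):
            self.s.push(3)
        self.assertTrue(self.s.invariant())   # invariant preserved after the failure

if __name__ == "__main__":
    unittest.main()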
44. Component Test Plan
- State-Based Test Cases:
  - The state representation for the object.
  - The approach used to generate test cases (a sketch follows this list).
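A minimal Python sketch of state-based test generation, assuming the object's behavior is modeled as a small state-transition table and one test is generated per transition, each checking the resulting state; the Door class and its states are hypothetical.

# Hypothetical class under test with explicit states.
class Door:
    def __init__(self):
        self.state = "closed"
    def open(self):
        if self.state == "closed":
            self.state = "open"
    def close(self):
        if self.state == "open":
            self.state = "closed"

# State representation: (start state, event, expected end state).
TRANSITIONS = [
    ("closed", "open",  "open"),
    ("open",   "close", "closed"),
    ("open",   "open",  "open"),    # self-loop: event has no effect
    ("closed", "close", "closed"),  # self-loop
]

def run_state_tests():
    """Generate and run one test per row of the transition table."""
    for start, event, expected in TRANSITIONS:
        d = Door()
        if start == "open":
            d.open()                # drive the object into the start state
        getattr(d, event)()         # apply the event
        assert d.state == expected, (start, event, d.state)
    print(f"all {len(TRANSITIONS)} transition tests passed")

run_state_tests()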
- Interaction Test Cases:
  - How dependencies will be handled.
  - The creation of stubs required to handle cycles (a stub sketch follows).
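A minimal Python sketch of an interaction test in which a dependency is replaced by a stub so the class under test can be exercised in isolation; unittest.mock is standard library, and the OrderService/PaymentGateway roles are hypothetical.

import unittest
from unittest.mock import Mock

class OrderService:
    """Hypothetical class under test; depends on a payment gateway."""
    def __init__(self, gateway):
        self.gateway = gateway
    def place_order(self, amount):
        # Interaction under test: the service must charge the gateway once.
        return self.gateway.charge(amount)

class OrderServiceTest(unittest.TestCase):
    def test_order_charges_gateway_once(self):
        gateway = Mock()                  # stub replaces the real dependency
        gateway.charge.return_value = "ok"
        service = OrderService(gateway)
        self.assertEqual(service.place_order(25), "ok")
        gateway.charge.assert_called_once_with(25)

if __name__ == "__main__":
    unittest.main()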
45. Use Case Test Plan
- Describes the system-level tests to be derived from a single use case.
- Incorporated by reference into both the integration and system test plans.
- Types of use cases:
  - High-level: abstract use cases that are the basis for being extended into end-to-end use cases.
  - Functional: sub-use cases that are aggregated into end-to-end, system-level use cases.
  - End-to-end: represents a complete transaction in the operation of the system.
46. Use Case Test Plan
- Other use cases:
  - Report: access information in the system, summarize it, and format it for presentation to the user.
  - Boundary: describe startup, shutdown, and exceptional conditions.
- Figs. 3.18, 3.19, and 3.20, pp. 102, 103, and 104.
47. Integration Test Plan
- Very important in an incremental development environment.
- Integrates individual classes into a cluster (component), components into subsystems, and subsystems into the system.
- In most cases there is a need for test drivers and stubs.
- The integration test plan provides information on the order of testing the individual classes, components, and subsystems.
48. Integration Test Plan
- Test cases span the parts of the system being integrated.
- More complex and comprehensive than the typical unit test.
- The format follows that of the system test plan (Fig. 3.21, p. 105).
49. System Test Plan
- Summarizes the individual use case test plans and provides information on additional types of testing.
- Note that most of the information required by the IEEE test plan format has already been captured in the individual test plans.