Quality Week 2000
Transcript and Presenter's Notes
1
Quality Week 2000
  • Experience-Based Approaches to Process Improvement
  • Otto Vinter
  • Project Manager Software Process Improvements, DELTA Software Engineering
  • Tel +45 4586 7722, Fax +45 4586 5898
  • otv@delta.dk
  • Software Engineering Mentor
  • Tel/Fax +45 4399 2662, Mobile +45 4045 0771
  • vinter@inet.uni2.dk, http://inet.uni2.dk/vinter

2
Normative Models for SPI
  • CMM
  • BOOTSTRAP
  • SPICE (ISO15504)
  • Defined comprehensive process
  • Levels of maturity (capability)
  • Key process areas (KPAs) for each level
  • Assessed by certified assessors

3
Fundamentals in Normative Models
Software Process Assessment and Improvement (The BOOTSTRAP Approach), ISBN 0-631-19663-3
4
Problems with Normative Models
  • Abstract model
  • Assessment by external body
  • Costs money
  • Organisational focus
  • Points out KPAs to be improved
  • Little help on precisely what to do
  • Raises a lot of expectations

5
Alternative Approaches to SPI
  • Experience-based improvement actions
  • Analyse problems from previous projects to
    extract knowledge on frequently occurring
    problems
  • Change the development process through the use of
    an optimum set of methods and tools available to
    prevent these problems from reappearing
  • Measure the impact of the changes in a real-life
    development project
  • Diffuse the results to the rest of the
    organisation

6
Characteristics of Alternative Approaches
  • Experience-based
  • No specific model
  • Hot-spot driven
  • Focus on prevention
  • One issue at a time (incremental)
  • Piloting at project level
  • Evolve rather than define (feed-back)
  • Fits CMM level 1-2 cultures (where most of
    us are)

7
Examples of Alternative Approaches
  • Analysis of
  • defects
  • progress reports, etc.
  • Structured/Selective interviews
  • project managers
  • project members
  • customers, etc.
  • Goal-Question-Metric paradigm (GQM)
  • Some frameworks for alternative approaches
  • Experience Factory (V. Basili et al.)
  • Product/Process Dependency Models (PROFES, www.ele.vtt.fi/profes)
  • but primarily you must find your own way
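The GQM paradigm named above derives the metrics to collect from explicit goals and questions. A minimal sketch of that goal-question-metric structure, with an invented goal and invented metrics (not taken from the presentation):

```python
# Hypothetical GQM tree: one improvement goal, the questions that would
# tell us whether we reached it, and the metrics that answer each question.
gqm = {
    "goal": "Reduce requirements-related defects in the next release",
    "questions": {
        "How many defects trace back to requirements?": [
            "count of bugs in the requirements category",
            "share of requirements bugs among all reported bugs",
        ],
        "Are requirements validated before implementation?": [
            "number of scenarios walked through with users",
            "number of usability-test sessions per iteration",
        ],
    },
}

def metrics_for(tree):
    """Flatten a GQM tree into the list of metrics to collect."""
    return [m for metrics in tree["questions"].values() for m in metrics]

print(len(metrics_for(gqm)))  # 4 metrics to collect
```

The point of the traversal is that no metric exists on its own: each one is collected only because some question, and ultimately the goal, demands it.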

8
Brüel & Kjær - Sound & Vibration Measurement
9
Defect Analysis from Error Logs
  • Definitions
  • Bugs are anything between serious defects and
    suggestions for improvements
  • Problem reporting starts in the integration phase
  • Error Logs Analysed
  • Embedded and PC Windows development projects
  • Project sizes approx. 5-7 person-years
  • In total approx. 1000 bugs analyzed in detail
  • Problem reports covered a period until 18 months
    after first release

10
Problem Report Distribution over Time
11
Defect Analysis Technique
  • Interview sessions
  • 1-2 developers and 1-2 process consultants
  • approx. 5 minutes per bug
  • Bug Categorisation
  • based on a bug taxonomy by Boris Beizer
  • Boris Beizer Software Testing Techniques, Van
    Nostrand Reinhold, 1990
  • comprehensive set of bug categories and
    statistics
  • Capture Subjective Information on the Bugs
  • where and how the bug occurred
  • quality factor (reliability, usability,
    functionality ...)
  • complexity of bug (correction cost)
  • what could prevent the bug

12
The Beizer Bug Taxonomy
  • 1. Requirements and Features
  • 2. Functionality as Implemented
  • 3. Structural Bugs
  • 4. Data
  • 5. Implementation (standards violation,
    and documentation)
  • 6. Integration
  • 7. System and Software Architecture
  • 8. Test Definition or Execution Bugs
  • 9. Other Bugs, Unspecified
  • Each category detailed to a depth of up to 4
    levels
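The categorisation step above can be sketched in a few lines: each interviewed bug gets a top-level Beizer category code plus the subjective prevention note, and the distribution over categories falls out of a tally. The sample bug data here is invented for illustration:

```python
from collections import Counter

# Top-level categories of the Beizer bug taxonomy, as listed on the slide.
BEIZER = {
    1: "Requirements and Features",
    2: "Functionality as Implemented",
    3: "Structural Bugs",
    4: "Data",
    5: "Implementation",
    6: "Integration",
    7: "System and Software Architecture",
    8: "Test Definition or Execution Bugs",
    9: "Other Bugs, Unspecified",
}

# Each bug is recorded as (category code, subjective prevention note).
bugs = [(1, "usability test"), (1, "scenario walkthrough"),
        (3, "unit test"), (3, "static analysis"), (6, "integration test")]

def distribution(bug_list):
    """Percentage of bugs falling in each top-level Beizer category."""
    counts = Counter(code for code, _ in bug_list)
    total = len(bug_list)
    return {BEIZER[code]: 100 * n / total for code, n in counts.items()}

print(distribution(bugs)["Requirements and Features"])  # 40.0
```

In practice each category is refined up to four levels deep, but the tally works the same way at any depth of the taxonomy.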

13
Problem Report Categorization
(Chart: distribution of problem reports by bug category)

14
1st Action: The Prevention of Errors through
Experience-driven Test Efforts (PET)
  • Results of the analysis of error logs
  • no special bug class dominates embedded software development
  • requirements problems, and requirements-related problems, are the prime bug cause (>36%)
  • problems due to lack of systematic unit testing are the second largest bug cause (22%)
  • Action: Improve Testing Processes
  • static analysis
  • source code complexity, data flow anomalies, coding standards
  • dynamic analysis
  • code coverage by test cases, cross-references from code to test cases
  • Funded by CEC. Final Report: http://www.esi.es/ESSI/Reports/All/10438
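The complexity metric behind the flowgraph slides that follow is McCabe's cyclomatic complexity, V(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components. A minimal sketch (the flowgraph encoding is an assumption, not the tool the project used):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's V(G) = E - N + 2P; P is 1 for a single routine."""
    return edges - nodes + 2 * components

def from_flowgraph(graph):
    """Compute V(G) from an adjacency mapping: node -> successor nodes."""
    nodes = set(graph) | {s for succs in graph.values() for s in succs}
    edges = sum(len(succs) for succs in graph.values())
    return cyclomatic_complexity(edges, len(nodes))

# A single if/else: entry branches to 'then' or 'else', both rejoin at exit.
# 4 nodes, 4 edges, so V(G) = 4 - 4 + 2 = 2 (two independent paths).
g = {"entry": ["then", "else"], "then": ["exit"], "else": ["exit"]}
print(from_flowgraph(g))  # 2
```

Every decision point adds one to V(G), which is why the flowgraphs on the next two slides contrast a unit around 10 with one around 20: the higher the count, the more test cases are needed to cover the independent paths.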

15
Static Flowgraph (McCabe 10)
16
Static Flowgraph (McCabe 20)
17
Test Coverage Strategy
  • Test coverage must take into consideration
  • perceived complexity
  • based on a visual inspection
  • criticality
  • product and project issues
  • developer experience
  • domain, product, and unit
  • bug and change request history
  • degree and type of reuse
  • Test cases for dynamic analysis are selected by
    normal testing techniques
  • equivalence partitioning, boundary value analysis
    etc.
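The two "normal testing techniques" named above can be sketched for a numeric input with a valid range [lo, hi]; the value-selection rules here are the textbook ones, not a prescription from the presentation:

```python
def boundary_values(lo, hi):
    """Boundary value analysis: the range limits, their immediate
    neighbours, and one interior value."""
    interior = (lo + hi) // 2
    return sorted({lo - 1, lo, lo + 1, interior, hi - 1, hi, hi + 1})

def partition_representatives(lo, hi):
    """Equivalence partitioning: one representative each from the
    invalid-low, valid, and invalid-high partitions."""
    return [lo - 1, (lo + hi) // 2, hi + 1]

print(boundary_values(1, 100))            # [0, 1, 2, 50, 99, 100, 101]
print(partition_representatives(1, 100))  # [0, 50, 101]
```

The coverage strategy on the slide then decides how many of these cases a given unit deserves, weighting by perceived complexity, criticality, developer experience, history, and reuse.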

18
Results of the Improved Testing Process
  • 46% Improvement in Testing Efficiency
  • Removing static bugs
  • Increasing unit branch coverage to > 85%
  • 75% Reduction in Production-Release Bugs
  • Compared to Trial-Release
  • 70% Requirements Bugs in Production-Release
  • Next Action: Improve Requirements Engineering

19
2nd Action: A Methodology for Preventing
Requirements Issues from Becoming Defects (PRIDE)
  • Results of the analysis of error logs
  • Requirements-related bugs: 51%
  • Usability issues dominate: 64%
  • External software (3rd party, MS products): 28%
  • Action: Introduce requirements techniques
  • Use Situations (Scenarios)
  • Relate demands to use situations. Describe tasks for each scenario.
  • Usability Test, Daily Tasks, Navigational Prototype
  • Check that the users are able to use the system for daily tasks, based on a navigational prototype of the user interface.
  • Funded by CEC. Final Report: http://www.esi.es/ESSI/Reports/All/21167

20
Scenarios
  • "The defining property of a scenario is that it projects a concrete description of activity that the user engages in when performing a specific task, a description sufficiently detailed so that design implications can be inferred and reasoned about."
  • John M. Carroll, Scenario-Based Design: Envisioning Work and Technology in System Development, Wiley 1995
  • Many names for the same concept
  • scenarios
  • use situations
  • work situations
  • work setting

21
Road Test Scenario
  • Road tests are done in the car when it is
    driving on special test roads. The purpose of the
    recordings is to identify noise sources,
    comparing them to earlier measurements, and
    eventually removing the noise through changes to
    the car design.
  • The engineer will record noises from various
    parts of the car when it is driving at various
    speeds, when it is turning, when it is braking,
    etc. The microphone will have to be mounted at
    various places not accessible from the driver's
    seat.
  • Usually, the engineer has a plan for what to
    measure, but circumstances may change so that he
    has to do something different and later find out
    what he actually did and which sounds relate to
    what.
  • Back at the lab the sounds will be analyzed by
    the engineer himself, or - in many cases -
    someone else.

22
Usability Test Environment
(Diagram: test setup with an experiment leader, user, and log keeper around a PC, microphone, and tape recorder)
23
Results of the 2nd Improvement Action
  • Product is selling steadily more than twice as
    many copies
  • compared to a similar product developed for a
    similar domain by the same team
  • Usability is exceptional in the market
  • Users' interaction with the product was totally changed as a result of the early usability tests
  • Almost a 3-fold increase in productivity for the development of the user interface
  • 27% reduction in problem reports

24
Improvement Approach Based on Project Manager
Perceived Problems
  • Interview project managers
  • 1 project manager and 3 process consultants
  • approx. 1 hour per interview
  • process-oriented interview guide
  • Analyze problems raised
  • classify the problems according to software
    process
  • present the issues at a project manager workshop
  • establish consensus on a few major issues
  • let each manager select an issue to improve
  • Monitor development projects
  • introduce and train the team on the selected
    issue
  • coach, and support regularly and on request
  • collect experience and results

25
Improvement Approach Based on Project Manager
Perceived Problems
  • Major Issues
  • iterative software development model
  • requirements
  • project monitoring (estimation, progress
    evaluation, etc.)
  • project conclusion (test, configuration
    management, etc.)
  • Improvement Actions
  • diffuse and adopt the test and requirements
    techniques
  • define new iterative development models
  • improve project monitoring
  • Results
  • improvement actions were performed with
    enthusiasm
  • project related actions completed successfully

26
3rd Action: Iterative Software Development Model
(Diagram: the model shown as three aligned views over the project timeline)
  • Phases: Specification, Development, Operation
  • Cycles: Specification cycles, Partial Implementation, Full Implementation, Stabilization
  • Customer views: Pre-Project, Customer Req. Spec. Validation, Intermediate Customer Validation, Final Customer Validation, Release Vers. 1.0
  • Development views: each cycle runs Spec., Design, Code, Test
  • Product views: Specification and user interface mock-up, Demo Prototype, Functionally Complete Model, Released Product
27
Iterative Software Development Model
  • Types of Iteration
  • user validation (usability test)
  • exploration (e.g. performance)
  • proof-of-concept (e.g. architecture)
  • well-defined functionality (increment)
  • Duration
  • time-boxed
  • less than 6 weeks, with 2 weeks spacing
  • full increments
  • approx. 3 months, with 2 weeks of stabilization
  • Essential for Success
  • user validation
  • stabilization

28
Results of the 3rd Improvement Action
  • drive and motivation of team increased
    dramatically
  • turbulence within and around team decreased
  • control over project progress increased
  • requirements creep was kept under control
  • resources used more efficiently
  • serious problems were uncovered early
  • quality improved step-by-step

29
Comparison of Recommendations
  • Columns: Defect Analysis | 1st Bootstrap Assessment | Project Mgr. Interviews | 2nd Bootstrap Assessment
  • Development Model
  • - iterations: x x x ?
  • - risk management: x
  • Requirements: x x x ?
  • Project Monitoring
  • - estimation: x x x
  • - time & resource usage: x x
  • - monitor progress: x x x
  • Project Conclusion
  • - configuration mgmt.: x x x
  • - testing: x x x (?)
  • - release criteria: x x
  • Reuse: x
  • Process Descriptions: x ?

30
Problem Diagnosis Results
  • Defect analysis from error logs
  • has established a basic process for testing and
    release
  • has improved our requirements process and
    products
  • Improvements from interviews
  • successful diffusion and adoption of previous
    improvement actions
  • established a new software development model
  • Impacts on maturity
  • development model, test, and requirement issues
    are no longer on the Bootstrap recommendation list

31
In Conclusion
  • Problem diagnosis approach
  • is a simple and effective way to find problems in
    the software development process
  • good starting point for process improvement
    programmes in companies
  • step-wise improvements with quick wins
  • changes assessment recommendations
  • However, normative models are needed
  • comprehensive framework
  • KPAs are important, levels are not
  • established assessments
  • effect of improvements