Title: Software Quality Management and Opportunity Trees
Slide 1: Software Quality Management and Opportunity Trees
- Barry Boehm, USC
- CS 510
- Fall 2001
Slide 2: Outline
- What is Software Quality?
- A Conceptual Framework for Achieving Quality
- The WinWin Spiral Model
- Win Win Negotiation Aids for Quality Requirements
- Software Quality Opportunity Trees
Slide 3: What Is Software Quality?
- Conformance to specifications?
- Accuracy, adaptability, affordability, availability, ...?
- Satisfying the customer?
- Something else?
Slide 4: Software Quality Is ...
- Conformance to specifications?
- Some difficulties: What if ...
- The specifications are wrong?
- The budget and schedule are way overrun?
- The code is inscrutable, scary to modify?
Slide 5: Software Quality Is ...
- Accuracy, adaptability, affordability, availability, ...?
- Some difficulties: What if ...
- These attributes conflict with each other?
- Customers can't specify how much of each they need?
- Tradeoffs are impractical to analyze or prioritize?
- Attribute levels can't be accurately measured?
Slide 6: Software Quality Is ...
- Satisfying the customer?
- Some difficulties: What if ...
- Customer is unhappy next month?
- Users are unhappy?
- Maintainers are unhappy, disempowered?
- Developers are burned out?
Slide 7: Hypothesis: Software Quality Is ...
- Making winners of all your system's stakeholders
- Users, customers, developers, maintainers, interoperators, the general public, and other significantly affected parties
Slide 8: Stakeholder/Attribute Relationship
Slide 9: Outline
- What is Software Quality?
- A Conceptual Framework for Achieving Quality
- The WinWin Spiral Model
- Win Win Negotiation Aids for Quality Requirements
- Software Quality Opportunity Trees
Slide 10: The Fundamental Success Condition
- Your project will succeed
- if and only if
- You make winners of all the critical stakeholders
- Usually: users, customers, developers, maintainers
- Sometimes: interfacers, testers, reusers, the general public
Slide 11: Win-Lose Evolves into Lose-Lose

Proposed Solution                                    Winner               Loser
Cheap, sloppy product ("buyer knows best")           Developer, Customer  User
Lots of bells and whistles (cost-plus)               Developer, User      Customer
Driving too hard a bargain (best and final offers)   Customer, User       Developer
Slide 12: Model-Clash Spider Web Chart
Slide 13: Theory W Management Steps
- 1. Identify success-critical stakeholders
- 2. Identify stakeholders' win conditions
- 3. Identify win-condition conflict issues
- 4. Negotiate top-level win-win agreements
- Invent options for mutual gain
- Explore option tradeoffs
- Manage expectations
- 5. Embody win-win agreements in specs and plans
- 6. Elaborate steps 1-5 until the product is fully developed
- Confront and resolve new win-lose, lose-lose risk items
Slide 14: Theory W Extension to Spiral Model
1. Identify next-level stakeholders
2. Identify stakeholders' win conditions
3. Reconcile win conditions; establish next-level objectives, constraints, and alternatives
4. Evaluate product and process alternatives; resolve risks
5. Define the next level of product and process, including partitions
6. Validate product and process definitions
7. Review, commitment
Slide 15: WinWin Negotiation Model
- An Issue involves one or more Win Conditions
- An Option addresses an Issue
- An Agreement adopts Options and covers Win Conditions
- Each artifact carries a rationale and attachments
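The four artifact types and their relationships can be sketched in code. This is an illustrative sketch only: the class and field names here are assumptions for exposition, not the WinWin tool's actual data model.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the WinWin negotiation model's artifacts;
# names and fields are assumptions, not the tool's API.
@dataclass
class WinCondition:
    text: str                                        # a stakeholder's desired outcome

@dataclass
class Issue:
    text: str
    involves: list = field(default_factory=list)     # conflicting WinConditions

@dataclass
class Option:
    text: str
    addresses: list = field(default_factory=list)    # Issues this option resolves

@dataclass
class Agreement:
    text: str
    adopts: list = field(default_factory=list)       # Options adopted
    covers: list = field(default_factory=list)       # WinConditions satisfied

# Example: two conflicting win conditions resolved by one agreement.
fast = WinCondition("Deliver in 6 months")
secure = WinCondition("Full security audit before release")
conflict = Issue("Audit time threatens schedule", involves=[fast, secure])
option = Option("Incremental audits per build", addresses=[conflict])
deal = Agreement("Adopt incremental audits", adopts=[option], covers=[fast, secure])
```

An agreement "closes" the negotiation when every win condition is covered by some agreement, which is what the EasyWinWin steps below work toward.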
Slide 16: EasyWinWin Online Negotiation Steps
Slide 17: People submit and share ideas about their win conditions using electronic discussion sheets
Slide 18: Team builds a clean list of win conditions and organizes win conditions into pre-defined buckets
Slide 19: (5) Prioritize win conditions
- Objective: scope the project, gain focus
- How: vote on Business Importance and Ease of Realization
- Result: prioritized win conditions
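One way to sketch the prioritization step, assuming averaged 1-10 stakeholder votes on the two criteria (the win conditions and scores below are made-up examples; the EasyWinWin tool itself presents the two scores on a grid rather than collapsing them into one ranking):

```python
# Hypothetical win conditions with averaged stakeholder votes (1-10)
# on (Business Importance, Ease of Realization); data is illustrative.
win_conditions = [
    ("Sub-second query response", 9, 3),
    ("Export reports to PDF",     5, 8),
    ("Single sign-on support",    8, 6),
]

# Rank high-importance items first, breaking ties by ease of realization.
prioritized = sorted(win_conditions, key=lambda wc: (wc[1], wc[2]), reverse=True)
for name, importance, ease in prioritized:
    print(f"importance {importance}, ease {ease}: {name}")
```

High-importance, hard-to-realize items (like the first one here) are exactly the ones that deserve early risk resolution in the spiral model.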
Slide 22: Outline
- What is Software Quality?
- A Conceptual Framework for Achieving Quality
- The WinWin Spiral Model
- Win Win Negotiation Aids for Quality Requirements
- Software Quality Opportunity Trees
Slide 23: Software Dependability Opportunity Tree
Slide 24: Software Defect Prevention Opportunity Tree
Defect Prevention:
- People practices: IPT, JAD, WinWin, ...; PSP, Cleanroom, dual development, ...; manual execution, scenarios, ...; staffing for dependability
- Standards: rqts., design, code, ...; interfaces, traceability, ...; checklists
- Languages
- Prototyping
- Modeling
- Simulation
- Reuse
- Root cause analysis
Slide 25: People Practices: Some Empirical Data
- Cleanroom (Software Engineering Lab)
- 25-75% reduction in failure rates
- 5% vs. 60% of fix efforts taking over 1 hour
- Personal Software Process / Team Software Process
- 50-75% defect reduction in a CMM Level 5 organization
- Even higher reductions for less mature organizations
- Staffing
- Many experiments find factor-of-10 differences in people's defect rates
Slide 26: Root Cause Analysis
- Each defect found can trigger five analyses:
- Debugging: eliminating the defect
- Regression: ensuring that the fix doesn't create new defects
- Similarity: looking for similar defects elsewhere
- Insertion: catching future similar defects earlier
- Prevention: finding ways to avoid such defects
- How many does your organization do?
Slide 27: Pareto (80-20) Phenomena
- 80% of the rework comes from 20% of the defects
- 80% of the defects come from 20% of the modules
- About half the modules are defect-free
- 90% of the downtime comes from less than 10% of the defects
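The first phenomenon can be illustrated with a small Pareto computation. The rework costs below are made-up numbers chosen only to show the shape of the analysis:

```python
# Illustrative rework costs (hours) for ten defects; the distribution is
# made up to mimic the 80-20 pattern seen in real defect data.
costs = sorted([120, 90, 15, 10, 8, 6, 5, 4, 2, 1], reverse=True)

total = sum(costs)
cumulative = []          # cumulative share of rework, costliest defects first
running = 0
for c in costs:
    running += c
    cumulative.append(running / total)

# The top 20% of defects (2 of 10) carry the bulk of the rework cost.
print(f"top 2 of 10 defects account for {cumulative[1]:.0%} of rework")
```

The same cumulative-share computation, applied to real problem-report data, produces curves like the TRW ones on the next slide.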
Slide 28: Pareto Analysis of Rework Costs
[Chart: % of cost to fix SPRs vs. % of software problem reports (SPRs), for TRW Project A (373 SPRs) and TRW Project B (1005 SPRs). Major rework sources were off-nominal architecture-breakers: A - network failover; B - extra-long messages.]
Slide 29: Software Defect Detection Opportunity Tree
Defect Detection and Removal (rqts., design, code):
- Automated analysis: completeness checking; consistency checking (views, interfaces, behavior, pre/post conditions); traceability checking; compliance checking (models, assertions, standards)
- Reviewing: peer reviews, inspections; architecture review boards; pair programming
- Testing: requirements and design; structural; operational profile; usage (alpha, beta); regression; value/risk-based; test automation
Slide 30: Orthogonal Defect Classification (Chillarege, 1996)
Slide 31: UMD-USC CeBASE Experience Comparisons (http://www.cebase.org)
- Under specified conditions, ...
- Technique Selection Guidance (UMD, USC)
- Peer reviews are more effective than functional testing for faults of omission and incorrect specification
- Peer reviews catch 60% of the defects
- Functional testing is more effective than reviews for faults concerning numerical approximations and control flow
- Technique Definition Guidance (UMD)
- For a reviewer with an average experience level, a procedural approach to defect detection is more effective than a less procedural one
- Readers of a software artifact are more effective at uncovering defects when each uses a different and specific focus
- Perspective-based reviews catch 35% more defects
Slide 32: Factor-of-100 Growth in Software Cost-to-Fix
[Chart: relative cost to fix a defect (log scale, 1 to 1000) vs. phase in which the defect was fixed (requirements, design, code, development test, acceptance test, operation). Larger projects (IBM-SSD, GTE, SAFEGUARD) show roughly 100:1 growth from requirements to operation; smaller projects show a flatter curve.]
Slide 33: Reducing Software Cost-to-Fix: CCPDS-R (Royce, 1998)
- Integration during the design phase
- Demonstration-based evaluation
- Configuration baseline change metrics
Slide 34: COQUALMO (Constructive Quality Model)
- Inputs: software size estimate; software product, process, computer, and personnel attributes; defect removal capability levels
- COCOMO II produces the software development effort, cost, and schedule estimate
- COQUALMO's Defect Introduction Model and Defect Removal Model produce the number of residual defects and the defect density per unit of size
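The residual-defect arithmetic behind the two models can be sketched as follows. All rates and removal fractions here are illustrative stand-ins: the real COQUALMO calibrates them from rated project, process, and personnel attributes.

```python
# Sketch of COQUALMO's two-model structure with illustrative constants.
size_ksloc = 10
introduced_per_ksloc = {"requirements": 10, "design": 20, "code": 30}

# Defect Removal Model: each activity removes a fraction of what remains.
removal_fraction = {"automated_analysis": 0.3, "peer_reviews": 0.5, "testing": 0.6}

# Defect Introduction Model: defects injected across artifact types.
introduced = size_ksloc * sum(introduced_per_ksloc.values())

residual = introduced
for activity, fraction in removal_fraction.items():
    residual *= 1 - fraction      # surviving defects after this activity

print(f"residual defects: {residual:.0f}, "
      f"density: {residual / size_ksloc:.1f} per KSLOC")
```

The multiplicative structure is the point: improving any one removal activity cuts the residual count, but high delivered quality needs all three profiles working together.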
Slide 35: Defect Impact Reduction Opportunity Tree
Decrease Defect Impact, Size (Loss):
- Value/risk-based defect reduction: business case analysis; Pareto (80-20) analysis; V/R-based reviews; V/R-based testing; cost/schedule/quality as independent variable
- Graceful degradation: fault tolerance; self-stabilizing SW; reduced-capability modes; manual backup; rapid recovery
Slide 36: Software Dependability Opportunity Tree
Decrease Defect Risk Exposure:
- Decrease defect probability (loss): defect prevention; defect detection and removal
- Decrease defect impact, size (loss): value/risk-based defect reduction; graceful degradation
- Continuous improvement: CI methods and metrics; process, product, and people technology
Slide 37: The Experience Factory Organization (exemplar: NASA Software Engineering Lab)
Project Organization:
- 1. Characterize, 2. Set goals, 3. Choose process (using environment characteristics and tailorable knowledge, consulting from the Experience Base)
- 4. Execute process (following execution plans; feeding data and lessons learned back)
Experience Factory:
- 5. Analyze (project analysis, process modification)
- 6. Package (generalize, tailor, formalize, disseminate products, lessons learned, and models into the Experience Base)
- Provides project support
Slide 38: Goal-Model-Question-Metric Approach to Testing
- Note: no one-size-fits-all solution
Slide 39: GMQM Paradigm: Value/Risk-Driven Testing
- Goal: minimize the adverse effects of defects
- Models: business-case or mission models of value
- By feature, quality attribute, or delivery time
- Attribute tradeoff relationships
- Questions: Which HDC techniques best address high-value elements? How well are project techniques minimizing adverse effects?
- Metrics: risk exposure reduction vs. time
- Business-case-based earned value
- Value-weighted defect detection yields by technique
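The last metric, value-weighted defect detection yield by technique, can be sketched with a made-up defect log. Both the techniques and the at-risk values below are illustrative assumptions:

```python
# Illustrative defect log: (technique that found the defect, business
# value that would have been at risk had the defect escaped).
found = [
    ("peer_review", 50), ("peer_review", 5), ("testing", 200),
    ("testing", 10), ("peer_review", 100), ("testing", 2),
]

# Value-weighted yield: the share of total at-risk value each technique
# caught, rather than a raw defect count that treats all defects as equal.
total_value = sum(value for _, value in found)
yield_by_technique = {}
for technique, value in found:
    yield_by_technique[technique] = yield_by_technique.get(technique, 0) + value

for technique in sorted(yield_by_technique):
    share = yield_by_technique[technique] / total_value
    print(f"{technique}: {share:.0%} of at-risk value caught")
```

Under a defect-count metric the two techniques here look identical (three defects each); the value weighting is what reveals which one is protecting the high-value elements.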
Slide 40: Dependability Attributes and Tradeoffs
- Robustness: reliability, availability, survivability
- Protection: security, safety
- Quality of service: accuracy, fidelity, performance assurance
- Integrity: correctness, verifiability
- Attributes are mostly compatible and synergistic
- Some conflicts and tradeoffs:
- Spreading information: survivability vs. security
- Fail-safe: safety vs. quality of service
- Graceful degradation: survivability vs. quality of service
Slide 41: Resulting Reduction in Risk Exposure
[Chart: risk exposure from defects, RE = P(L) x S(L), vs. time to ship (amount of testing). Value/risk-driven testing under an 80-20 value distribution reduces risk exposure faster than defect-density acceptance testing that treats all defects as equal.]
Slide 42: 20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: cumulative % of value for correct customer billing vs. customer type, across about 15 customer types; a small fraction of the customer types accounts for most of the value.]
Slide 43: Cost, Schedule, Quality: Pick Any Two?
[Diagram: triangle linking C, Q, and S]
Slide 44: Cost, Schedule, Quality: Pick Any Two?
- Consider C, S, Q as the independent variable
- Feature set as the dependent variable
[Diagram: two C-Q-S triangles]
Slide 45: C, S, Q as Independent Variable
- Determine the Desired Delivered Defect Density (D4)
- Or a value-based equivalent
- Prioritize desired features
- Via QFD, IPT, stakeholder win-win
- Determine the core capability
- 90% confidence of achieving D4 within cost and schedule
- Balance parametric models and expert judgment
- Architect for ease of adding next-priority features
- Hide sources of change within modules (Parnas)
- Develop the core capability to the D4 quality level
- Usually in less than the available cost and schedule
- Add next-priority features as resources permit
- Versions of this approach were used successfully on 17 of 19 USC digital library projects
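The core-capability step can be sketched as a greedy pass over prioritized features. The budget, feature names, priorities, cost estimates, and the buffer standing in for "90% confidence" are all illustrative assumptions, not calibrated values:

```python
# Sketch of treating cost as the independent variable and the feature
# set as dependent: fix the budget, then commit features in priority
# order only while the buffered estimate still fits.
budget = 100  # staff-weeks available (illustrative)
features = [  # (name, priority from stakeholder win-win, estimated cost)
    ("core billing",  1, 40),
    ("audit trail",   2, 25),
    ("report export", 3, 20),
    ("custom themes", 4, 30),
]

margin = 1.25  # crude buffer standing in for 90%-confidence estimation
core_capability, committed = [], 0
for name, priority, cost in sorted(features, key=lambda f: f[1]):
    if (committed + cost) * margin <= budget:
        core_capability.append(name)
        committed += cost

print("core capability:", core_capability)
# Remaining features are added later, as resources permit.
```

Because the core set is committed well under budget, it can usually be delivered at the D4 quality level with schedule to spare, and the deferred features absorb the estimation risk instead of the quality target.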
Slide 46: Rawls' Theory of Justice (1971)
- Fair rules of conduct
- Principles of justice
- Participants and obligations
- Provider (developer)
- Buyer (acquirer)
- User(s)
- Penumbra (general public)
- Negotiate mutually satisfactory (win-win) agreements
Slide 47: Rawls' Theory of Justice - II
- Fair rules of conduct
- Negotiation among interested parties
- Veil of ignorance (about what affects whom)
- Rationality
- Principles
- Least advantaged: don't increase harm to them
- Harm = probability x magnitude (risk exposure)
- Risking harm: don't risk increasing harm
- Don't use low-threat software in a high-threat context
- Publicity test: defensible with honor before an informed public
- Use for difficult cost-benefit tradeoffs
Slide 48: Obligations of the Software Provider
Slide 49: Obligations of the Software Buyer
Slide 50: Obligations of the Software User
Slide 51: Obligations of the Software Penumbra
Slide 52: Penumbra Negotiation Example: Fire Dispatching System
- Dispatch to minimize the value of property loss
- Neglects safety and least-advantaged property owners
- English-only dispatcher service
- Neglects least-advantaged immigrants
- Minimal recordkeeping
- Reduces accountability
- Tight budget: design for the nominal case
- Neglects reliability, safety, and crisis performance
Slide 53: Conclusions
- The stakeholder win-win approach provides:
- A tailorable definition of software quality
- Procedures for negotiating quality-attribute tradeoffs
- A process (the WinWin Spiral Model) for achieving quality software products
- Integration of ethical issues into daily practice
- The WinWin groupware system provides:
- Collaborative support for the stakeholder win-win approach
- Rapid achievement of a shared system vision among stakeholders
- Increased trust