1
Software Quality Management
Software Quality Management Processes: V&V of Critical Software Systems
Ian Hirst
2
Agenda
  • review
  • risk-based V&V
  • software systems criticality
  • the criticality analysis & risk assessment (CARA) method
  • impact & risk driver categories & values
  • CARA steps
  • CARA implementation recommendations
  • CARA benefits

3
Review
  • many software projects fail to meet their objectives
  • lack of objective quality evidence is common
  • a complex solution requires significant test planning and management effort and a complex set of testing activities
  • complete testing is usually not possible
  • testing should focus on evaluating the success of the project and the quality of the delivered solution.

4
Risk-based V&V
  • Risk = probability of occurrence × impact
  • V&V is primarily a risk management activity
  • risks can be associated with both products & processes
  • high-impact elements are critical elements
  • V&V aims to reduce or eliminate uncertainty by providing evidence of the capability & quality of software systems
  • V&V on all software is neither necessary nor financially feasible
  • risk-based V&V is a targeted activity.
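The risk definition on this slide can be expressed directly. A minimal sketch (the function name and the example numbers are illustrative, not from the slides):

```python
def risk_exposure(probability: float, impact: float) -> float:
    """Risk = probability of occurrence x impact (slide 4)."""
    return probability * impact

# A high-impact element (impact 4) with a moderate likelihood of error.
print(risk_exposure(0.4, 4))  # 1.6
```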

5
Software systems criticality
  • Criticality is a measure of the impact of errors on
    • system performance and operations
    • safety / security
    • cost & schedule
  • Risk is a measure of the likelihood of errors, based on
    • complexity
    • maturity of technology
    • requirements definition & stability
    • testability
    • developer experience.

6
The Criticality Analysis & Risk Assessment
(CARA) method
  • CARA is a formalised methodology which evolved from a US Air Force V&V initiative
  • CARA provides a systematic procedure for rank-ordering development program elements with respect to well-defined scoring factors associated with criticality & risk drivers
  • CARA is a means of evaluating the risk exposure for software or systems and sizing the V&V effort
  • CARA has been applied to space systems including:
    • space shuttle software and critical mission support software
    • space station flight software.

7
The criticality & risk driver categories & values
  • error impact categories & values:
    • Catastrophic (4)
    • Critical (3)
    • Marginal (2)
    • Negligible (1)
  • risk driver categories & values:
    • Complexity (high = 3, moderate = 2, low = 1)
    • Maturity of technology (high = 3, moderate = 2, low = 1)
    • Requirements definition & stability (high = 3, moderate = 2, low = 1)
    • Testability (high = 3, moderate = 2, low = 1)
  • refer to tables 14.3 & 14.4.
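The scoring tables above can be captured as simple lookup tables. A minimal sketch; the dictionary layout and names are illustrative, only the category names and values come from the slide:

```python
# Error impact categories & values (slide 7).
ERROR_IMPACT = {"catastrophic": 4, "critical": 3, "marginal": 2, "negligible": 1}

# Each risk driver is rated high/moderate/low on the same 3/2/1 scale.
RISK_LEVEL = {"high": 3, "moderate": 2, "low": 1}

RISK_DRIVERS = [
    "complexity",
    "maturity of technology",
    "requirements definition & stability",
    "testability",
]

print(ERROR_IMPACT["catastrophic"], RISK_LEVEL["moderate"])  # 4 2
```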

8
The criticality categories & values
9
The risk driver categories & values
10
CARA steps
  • Step 1: Identify software functions
  • Step 2: Establish the evaluation team
  • Step 3: Develop the CARA evaluation criteria
  • Step 4: Perform criticality analysis & risk assessment
  • Step 5: Set V&V analysis level (VAL) thresholds
  • Step 6: Estimate software size
  • Step 7: Generate V&V effort estimates
  • Step 8: Evaluate effort estimate results
  • Step 9: Revise V&V scope.

11
Step 1: Identify software functions
  • Collect systems information (from CONOPS, specs, business cases)
  • Identify the required software capabilities (functions and performance)
  • Build a scoring matrix
  • A solution decomposition / PBS-based structure is useful
  • Values may be assigned at any appropriate level of abstraction (requirements, requirement groups, components, subsystems, etc.)
  • Identify related systems domains / areas of specialisation.

12
Step 2: Establish the evaluation team
  • Engage system domain experts
  • Engage development process experts
  • Establish management & team processes.

13
Step 3: Develop the CARA evaluation criteria
  • Collect evaluation criteria from similar domains
  • Develop an understanding of the mission the system is to perform
  • Tailor the criticality evaluation criteria in terms of what is catastrophic, critical, or of moderate impact to users, customers, and acquirers of the system
  • Tailor the risk evaluation criteria (by inclusion of additional drivers)
  • Identify criticality area or risk driver weightings, if necessary
  • Review the criteria with the customer.

14
Step 4: Perform criticality analysis & risk
assessment
  • Perform criticality analysis:
    • consider the system's components, their interactions, failure modes & effects, and concepts of operations
    • rate the functions according to the criteria and scoring rationale
  • Perform risk analysis:
    • review software system development, test and verification plans
    • review development methods, testing approach, reuse plans, organisational interfaces, integration requirements, risks & risk mitigation techniques
    • rate the functions according to the criteria and scoring rationale
  • Calculate CARA scores: n = (criticality × W) × (risk × W)
  • Rank elements in score order.
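The score calculation and ranking in the final two bullets can be sketched as follows. The uniform default weights and the example function names are assumptions; the slide's own worked examples are unweighted (n = criticality × risk):

```python
def cara_score(criticality: int, risk: int,
               w_crit: float = 1.0, w_risk: float = 1.0) -> float:
    """CARA score n = (criticality x W) x (risk x W); weights default to 1."""
    return (criticality * w_crit) * (risk * w_risk)

# Hypothetical software functions with (criticality, risk) ratings.
functions = {"guidance": (4, 3), "telemetry": (2, 2), "logging": (1, 1)}

# Rank elements in descending score order.
ranked = sorted(functions, key=lambda f: cara_score(*functions[f]), reverse=True)
print(ranked)  # ['guidance', 'telemetry', 'logging']
```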

15
Step 5: Set V&V analysis level (VAL) thresholds
  • Functions with higher CARA scores receive higher VALs
  • Example VAL thresholds (CARA score → VAL):
    • 1 ≤ CARA ≤ 2: None
    • 2 < CARA ≤ 5: Limited
    • 5 < CARA ≤ 8: Focused
    • 8 < CARA ≤ 12: Comprehensive
  • e.g. 1: safety impact of 4, complexity risk of 3, n = 12 (unweighted)
  • e.g. 2: cost impact of 1, maturity risk of 2, n = 2 (unweighted).
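The example thresholds can be encoded as a small lookup function. Treating the upper bounds as inclusive is an assumption, chosen so that the slide's worked examples (n = 12 and n = 2) map to Comprehensive and None respectively:

```python
def val_for_score(n: float) -> str:
    """Map a CARA score (1..12 unweighted) to a V&V analysis level."""
    if n <= 2:
        return "None"
    if n <= 5:
        return "Limited"
    if n <= 8:
        return "Focused"
    return "Comprehensive"

print(val_for_score(12), val_for_score(2))  # Comprehensive None
```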

16
Step 6: Estimate software size
  • This step can be performed at any time before step 7
  • The size measurement drives the V&V workload
  • V&V work = f(no. of requirements, external interfaces, output products)
  • An alternative size measure is the developer's software size estimate (e.g. SLOC).
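The workload-based size measure might be sketched as a weighted count. The slide gives only the functional form; the linear combination and the weights here are purely illustrative assumptions:

```python
def vv_size(n_requirements: int, n_interfaces: int, n_outputs: int,
            w_req: float = 1.0, w_if: float = 2.0, w_out: float = 1.5) -> float:
    """V&V work = f(no. of requirements, external interfaces, output products).
    The linear form and weights are assumptions for illustration only."""
    return w_req * n_requirements + w_if * n_interfaces + w_out * n_outputs

print(vv_size(100, 10, 20))  # 150.0
```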

17
Step 7: Generate V&V effort estimates
(including an independent V&V effort estimate)
  • Apply V&V productivity factors to the size estimates (these may vary according to VALs, software complexity & size, development methods, development types (initial production, block update), domains, developer maturity and experience, and V&V agent experience)
  • Apply program & project management (schedule and effort estimation) standards & conventions.
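Applying productivity factors per VAL might look like the following sketch. The factor values are hypothetical placeholders; the slide only says that factors vary with VAL and other project attributes:

```python
# Hypothetical productivity factors (effort hours per size unit) by VAL.
PRODUCTIVITY = {"None": 0.0, "Limited": 0.5, "Focused": 1.0, "Comprehensive": 2.0}

def vv_effort(size: float, val: str) -> float:
    """Effort estimate = size estimate x productivity factor for the assigned VAL."""
    return size * PRODUCTIVITY[val]

print(vv_effort(150.0, "Focused"))  # 150.0
```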

18
Step 8: Evaluate effort estimate results
  • Review the results with the customer
  • If the prescribed VALs for the software functions and the associated costs are acceptable, generate the critical functions list (this defines the V&V scope and priorities).

19
Step 9: Revise V&V scope
  • If the results are not acceptable, use the independent V&V estimate to re-scope the effort
  • VAL threshold adjustments may be used to aid breadth vs. depth trade-offs
  • VAL selective exceptions/adjustments may be used, e.g. for safety-critical functions with a score of 3 or more, apply focused V&V (instead of limited).
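The selective-exception rule from the final bullet can be sketched as an override on the threshold-based assignment. The function name and the restriction to upgrading "Limited" assignments are illustrative assumptions:

```python
def adjusted_val(base_val: str, cara_score: float, safety_critical: bool) -> str:
    """Selective exception: safety-critical functions scoring 3 or more
    receive focused V&V instead of limited (slide 19 example)."""
    if safety_critical and cara_score >= 3 and base_val == "Limited":
        return "Focused"
    return base_val

print(adjusted_val("Limited", 4, True))  # Focused
```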

20
CARA implementation recommendations
  • CARA scoring must be done by domain experts
  • CARA scoring must be done in a peer review environment
  • Project management should participate in scoring activities
  • CARA training should be provided to all personnel
  • Automation tools are necessary for large projects
  • Capturing the scoring rationale is important
  • CARA should be repeated at least once per major development milestone.

21
CARA benefits
  • The results can be used to support V&V planning & management:
    • assessment of overall and relative risks
    • allocation of fixed resources across a set of V&V objectives and tasks
    • assessment of the need for future V&V resources
    • establishing V&V importance levels / focus points / priorities
    • prioritisation of items for work sequencing
  • CARA establishes a structured approach to V&V which increases customer visibility into risk, risk mitigation and V&V activities.

22
Review
  • V&V is primarily a risk management activity
  • V&V on all software is neither necessary nor financially feasible
  • risk-based V&V is a targeted activity
  • the criticality analysis & risk assessment (CARA) method may be used to analyse, plan and justify a structured risk-based V&V program.

23
References
  • Sommerville, Software Engineering, 7th edition:
    • chapter 20, Critical Systems Development
    • chapter 24, Critical Systems Validation
  • handout: Determining the Required Level of IV&V Program, Boughton
  • Marvin V. Zelkowitz and Ioana Rus, "Understanding IV&V in a safety critical and complex evolutionary environment: the NASA space shuttle program", ICSE 23 (23rd International Conference on Software Engineering), pp. 349-357, 2001, IEEE Computer Society.