CS577a Fall 2003 September 15, 2003

1
CS577a Fall 2003 September 15, 2003
  • Software Quality Management 1: Introduction to
    Reviews
  • A. Winsor Brown

2
Goals of Presentation
  • Reviews per IEEE J-STD-016-1995, Standard for
    Information Technology, Software Life Cycle
    Processes, Software Development:
    Acquirer-Supplier Agreement
  • Introductions
  • You should learn about
  • Quality Management
  • What is it and where it fits
  • Principles behind it
  • How it's distinct from QA and QI
  • IV&V option for DEN-remote students
  • Peer Reviews as practiced in CS577

3
IEEE J-STD-016-1995
  • Standard for Information Technology, Software
    Life Cycle Processes, Software Development:
    Acquirer-Supplier Agreement
  • Is usable with any development strategy. It is
    structured to better accommodate incremental,
    evolutionary, and other development models than
    the traditional waterfall model. It is
    structured to avoid time-oriented dependencies
    and implications, provides alternatives to formal
    reviews (that can force a waterfall development
    model), and explains how to apply the standard
    across multiple builds or iterations.

4
Reviews per IEEE J-STD-016-1995
  • joint review: A process or meeting involving
    representatives of both the acquirer and the
    developer, during which project status, software
    products, and/or project issues are examined and
    discussed.
  • Joint technical and management reviews are
    joint acquirer/developer reviews
  • Joint technical reviews
  • Joint management reviews

5
Reviews per IEEE J-STD-016-1995
  • Joint technical review objectives
  • a) Review evolving software products, using as
    criteria the software product evaluation criteria
    in annex L; review and demonstrate proposed
    technical solutions; provide insight and obtain
    feedback on the technical effort; surface and
    resolve technical issues.
  • b) Review project status; surface near- and
    long-term risks regarding technical, cost, and
    schedule issues.
  • c) ...
  • d) ...
  • e) ...

6
Reviews per IEEE J-STD-016-1995
  • Joint management review objectives
  • a) Keep management informed about project status,
    directions being taken, technical agreements
    reached, and overall status of evolving software
    products.
  • b) Resolve issues that could not be resolved at
    joint technical reviews.
  • c) Arrive at agreed-upon mitigation strategies
    for near- and long-term risks that could not be
    resolved at joint technical reviews.
  • d) Identify and resolve management-level issues
    and risks not raised at joint technical reviews.
  • e) Obtain commitments and acquirer approvals
    needed for timely accomplishment of the project.

7
A little about me
  • Assistant Director, USC Center for Software
    Engineering
  • Teach Quality Management and related areas in
    CS577; CS511 -- Personal Software Process
    Project
  • Assisted with projects (a client, along with RAs,
    for many) over the entire year (including
    summer)!
  • Assisted with DEN-remote students' IV&V
  • Direct very large CSE efforts
  • eBASE system development
  • FAA Macro Software Engineering Assistance
  • FCS use of spiral model; system-of-systems
    estimating
  • Manage most of Dr. Boehm's doctoral students

8
Agenda
  • Reviews per IEEE J-STD-016-1995
  • Introductions
  • CS577a Quality Management?
  • What is it and where it fits
  • Principles behind it
  • How it's distinct from QA and QI
  • IV&V by DEN-remote students
  • Peer Reviews as practiced in CS577

9
CS577
  • Software Engineering of Large Systems
  • Not enough time to do a large system, but
  • Real client
  • Real deliverables
  • Teach and use best-practice techniques, tools,
    and approaches
  • MBASE: WinWin Spiral Model (risk-driven), COCOMO
    II
  • OO Analysis and Design
  • Quality Management
  • Configuration Management
  • Projects run and documented like Large Systems
    projects
  • Evolving

10
Quality Management
  • The Q tasks in QM
  • Quality Assessment
  • Quality Tracking
  • Quality Improvement
  • The pre-requisites
  • Configuration Management
  • Early Defect Finding (Identification) mechanisms
  • Defect and Effort Data submittal
  • Defect and Effort Data analysis and correction
    (not really possible in a course done across the
    years)

11
Configuration Management (CM)
  • Why?
  • Large Systems best practice
  • How to keep track of the pieces, and correctly
    associate defects, problems, or issues
    with an artifact
  • Allow IV&Vers truly remote access to the most
    up-to-date version
  • Why NOT just use CVS?
  • What about Word documents, Rose models, ...
  • What about remote access in the USC/ISD setting?
  • What about comparison and change tracking?

12
Software CM Concepts
  • Definition (one of many possible)
  • A disciplined approach to managing and tracking
    the development and maintenance of a software
    product.
  • Same issues:
  • Components: code, COTS products
  • Baselines: LCA, LCO, RLCA
  • Versions or models: target platform, versioned
    releases, current configuration (Microsoft
    nightly build)

13
Software CM Concepts
  • Small projects with one developer
  • Large scale software engineering projects with
    multiple developers working in parallel
  • Basic ideas (easily explored with Help)
  • Check in - Check out
  • Compare - Add
  • History
  • Identify in the LCP what technique will be used
  • Manual (e.g., zipped-up version-named files)
  • ClearCase
  • CVS
  • Advanced Ideas (used in 577b, if at all)
  • Branching
  • Merging
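The basic check-in / check-out / compare / history ideas above can be sketched in a few lines of Python. This is a toy illustration with hypothetical names, not any of the tools listed (ClearCase, CVS): each check-in stores a full snapshot keyed by a content hash, and "compare" just asks whether two versions differ.

```python
import hashlib
from datetime import datetime, timezone

class MiniRepo:
    """Toy sketch of check-in / check-out / compare / history."""
    def __init__(self):
        self.history = []    # (version, timestamp, digest, note)
        self.snapshots = {}  # digest -> full content

    def check_in(self, content: str, note: str = "") -> int:
        # Store a full snapshot; real tools store deltas.
        digest = hashlib.sha1(content.encode()).hexdigest()
        version = len(self.history) + 1
        self.snapshots[digest] = content
        self.history.append((version, datetime.now(timezone.utc), digest, note))
        return version

    def check_out(self, version: int) -> str:
        _, _, digest, _ = self.history[version - 1]
        return self.snapshots[digest]

    def compare(self, v1: int, v2: int) -> bool:
        """True if the two versions differ."""
        return self.history[v1 - 1][2] != self.history[v2 - 1][2]

repo = MiniRepo()
repo.check_in("draft 1 of the OCD", "initial")
repo.check_in("draft 2 of the OCD", "after review")
print(repo.check_out(1))   # "draft 1 of the OCD"
print(repo.compare(1, 2))  # True
```

Even the "manual" technique (zipped-up version-named files) provides the same three capabilities, just with the history kept in file names rather than a log.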

14
QM vs CM
  • Tracking assessments of products is of little
    value if you don't know what is in the
    product/program.
  • Quality Tracking: defects/issues
  • Quality Improvement: Tracking Assessments +
    Quality Tracking

15
Agenda
  • Reviews per IEEE J-STD-016-1995
  • Introductions
  • CS577a Quality Management?
  • What is it and where it fits
  • Principles behind it
  • How it's distinct from QA and QI
  • IV&V by DEN-remote students
  • Peer Reviews as practiced in CS577

16
Quality Model Types
  • All four types exist: Process, Product, Property,
    Success
  • Product
  • What's a defect?
  • Problem reports
  • Property
  • Defects over time: removal and residual
    injection rates
  • Defect Density
  • Success
  • Defect removal rate
  • Problem/Trouble Reports open over time
  • Process
  • Macro: defect injection and removal workflow
  • Micro: ETVX, defect removal techniques, etc.
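The property and success measures above are simple ratios, sketched here for concreteness (function names and sample numbers are illustrative, not course data):

```python
def defect_density(defects_found: int, size_ksloc: float) -> float:
    """Property model: defects per thousand source lines (KSLOC)."""
    return defects_found / size_ksloc

def removal_rate(defects_removed: int, defects_injected: int) -> float:
    """Success model: fraction of injected defects removed so far."""
    return defects_removed / defects_injected

print(defect_density(18, 12.0))  # 1.5 defects/KSLOC
print(removal_rate(45, 50))      # 0.9
```

Tracked over time, the residual (injected minus removed) is what the "defects over time" property model plots.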

17
Product Models Related to Quality
  • What's a defect?
  • An instance of non-conformance with the
    initiating requirements, standards, or exit
    criteria
  • Can exist in the accuracy/completeness of
    requirements, standards, and associated
    interface/reference documents
  • Determined ONLY by the responsible Author of an
    artifact
  • Typically start out as concerns in informal or
    agile reviews
  • What's an issue?
  • Concerns that can NOT be fixed by the author of
    the artifact under review
  • In developments with large numbers of people or
    cycles, issues are usually tracked to closure.

18
Defect Categories
  • Severity
  • Major
  • A condition that causes an operational failure,
    malfunction, or prevents attainment of an
    expected or specified result
  • Information that would lead to an incorrect
    response or misinterpretation of the information
    by the user
  • An instance of non-conformance that would lead to
    a discrepancy report if implemented as is
  • Minor
  • A violation of standards, guidelines, or rules,
    but would not lead to a discrepancy report
  • Information that is undesirable but would not
    cause a malfunction or unexpected results (bad
    workmanship)
  • Information that, if left uncorrected, may
    decrease maintainability

19
Defect Categories (continued)
  • Class
  • Missing
  • Information that is specified in the requirements
    or standard, but is not present in the document
  • Wrong
  • Information that is specified in the requirements
    or standards and is present in the document, but
    the information is incorrect
  • Extra
  • Information that is not specified in the
    requirements or standards but is present in the
    document

20
Defect Categories (continued)
  • Type
  • Unavoidable
  • Unavoidable defects (AKA changes) arise because
    the methods, techniques, or approaches being
    followed necessitate changes. Examples include
    changes arising from the dynamics of
    learning, exploration in IKIWISI situations, code
    or screen-content reorganizations taken on as an
    "afterthought", replacement of stubs or
    place-holders in code, etc. Such situations are
    often "planned for" and expected to occur.
  • Avoidable
  • Changes in analysis, design, code, or
    documentation arising from human error, which
    could be avoided through better analysis, design,
    training, etc. Examples include stub replacement
    that violates win conditions or requirements such
    as execution time or memory space; for instance,
    the replacement of a "stub" which breaks a
    critical timing constraint.

21
Defect Categories (continued)
Defect Categories

  Severity | Class   | Type
  ---------+---------+------------
  Major    | Missing | Avoidable
  Minor    | Wrong   | Unavoidable
           | Extra   |
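Each recorded defect carries one value from each of the three category dimensions. A minimal sketch of such a defect record in Python (the class and field names are my own, not the course's report forms):

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    MAJOR = "Major"
    MINOR = "Minor"

class DefectClass(Enum):
    MISSING = "Missing"
    WRONG = "Wrong"
    EXTRA = "Extra"

class DefectType(Enum):
    AVOIDABLE = "Avoidable"
    UNAVOIDABLE = "Unavoidable"

@dataclass
class Defect:
    """One defect record: artifact, description, and one value per dimension."""
    artifact: str
    description: str
    severity: Severity
    defect_class: DefectClass
    defect_type: DefectType

d = Defect("SSRD", "Timing requirement omitted",
           Severity.MAJOR, DefectClass.MISSING, DefectType.AVOIDABLE)
print(d.severity.value)  # "Major"
```

Using enumerations keeps the categories closed, so every logged defect is classifiable on all three axes.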
22
Quality Management Manifestations in MBASE
  • Quality Management = Quality Assurance + Quality
    Improvement
  • Quality Assurance and Quality Improvement
    require Quality Assessment (against property
    models)

23
Defect Identification/Removal Techniques (AKA
Assessment Methods)
  • Testing: formal and informal (debugging)
  • User Feedback: formal and informal
  • Reviews
  • External (technical): IV&V (Agile Artifact
    Review); see 9/5 webcast on IV&V
  • Internal: Peer Reviews
  • Desk Check; Personal (PSP) Review
  • Walkthroughs
  • Buddy Check: like an Agile Artifact Review, but
  • Informal/Internal reviews (various flavors)
  • Other inspection methods
  • Fagan's inspections
  • Management Reviews
  • Combined Management/Technical Reviews (like
    ARBs)

24
Defect Identification/Removal Techniques And
Recording Techniques/Tools
  • Process descriptions (steps, roles, etc.) to be
    available through the course webpage
  • Agile Internal/Informal Review: on-campus
    students
  • Agile Artifact Review: DEN-remote students
  • Quality Reports (form sets) to be available from
    the course webpage
  • Agile report forms (less paperwork): Agile
    Internal/Informal Review, Agile Artifact Review
  • All must be turned in for vetting and/or grading

25
Agile Internal/Informal Review
  • The main activities are the following:
  • Planning
  • Overview (optional)
  • Preparation
  • Review Meeting
  • Rework
  • Participants
  • Review Leader (recommended: quality focal point
    for CS577)
  • Reviewer(s): at least 1 person
  • Author

26
Agile Internal/Informal Review
  [Figure: process flow. Planning, Overview, and
  Preparation (producing a Problem List and Concern
  Log) lead to the Review, then Rework, ending with
  a Review Result Summary.]
27
Agile Internal/Informal Review
28
Agile Internal/Informal Review
29
Agile Internal/Informal Review
30
Agile Internal/Informal Review
31
ETVX Paradigm / Software Development Process /
Fagan's Inspection Relationships
  [Figure: a software development process, e.g.,
  specifying, designing, coding; tasks distributed
  to team members; Fagan's Inspection]
32
CS577 Model/Document Assessment
33
Life Cycle Of A CS577 Document
34
CS577 MBASE Defect Reporting Concepts
  • Range of defect identification and reporting
    mechanisms
  • One at a time: problem report system
  • Multiple issues/problems found by a single
    reviewer: Agile Artifact Review (only two types
    of forms: Issues/Concern and Defect List)
  • Agile Internal/Informal Review: three types of
    forms
  • Agile Formal Review: three different types of
    forms
  • Internal/Informal Review: four different types of
    bigger forms
  • Formal Review: four different types of bigger
    forms
  • Fagan's Inspection: five different types of forms

35
Team Defect Identification Techniques (AKA Peer
Reviews)
  • Apply to SSRD and SSAD; possibly OCD
  • Some form of team-based peer review
  • Agile Internal/Informal Review: three types of
    forms
  • Internal/Informal Review: four different types of
    bigger forms
  • What's the difference?
  • Amount of data captured by the team
  • Both have COQUALMO Peer Review levels between
  • Low: plus well-defined sequence of preparation,
    review, follow-up; informal review roles and
    procedures
  • Normal: all to the left, plus formal review roles
    and procedures applied to detailed design and
    code reviews

36
IV&V for DEN-remote Students
  • Independent Verification & Validation (IV&V)
    person on one of the campus-based projects.
    Details for the IV&V role are evolving, but the
    concept is fairly straightforward
  • You do all the same individual homework
    assignments, quizzes, and learning, including the
    use of the class tools
  • Rather than producing a project TEAM's artifacts
    or homework assignments, you will be
  • Reviewing them: Agile Artifact Review
  • Analyzing or evaluating them: specifics
    provided
  • You do a thorough technical evaluation of the
    document packages and models
  • Ideally, you participate telephonically in the
    ARB for your team

37
IV&V for DEN-remote Students (cont.)
  • Reviewing a project team's artifacts. As a
    result of the review:
  • IV&Ver may resolve simple questions directly with
    the Artifact's author
  • Send all concern logs to the team
    electronically, with a copy to the TAs
  • You may generate problem reports for open issues
    or for problems found outside the review
    session
  • Technical evaluation of the document packages and
    models
  • Made available prior to the Architecture Review
    Board
  • Submit concern logs for newly found concerns to
    the team
  • Submit evaluation report to CS577a
    instructional staff and team
  • Before team receives feedback from instructional
    staff
  • Before the scheduled ARB

38
IV&V for DEN-remote Students (cont.)
  • HINTS for the IV&Ver
  • Look at general and specific guidelines for
    package evaluation
  • Read LCP first (for any process tailoring)
  • Read OCD, SSRD, SSAD, FRD (in that order)
  • BUT order of IMPORTANCE for concerns is SSRD,
    SSAD, OCD, FRD, and LCP
  • Record your effort
  • Full effort while reading/reviewing documents
  • 50% to 90% of effort spent on analysis and
    generation of the evaluation report (i.e.,
    discount for learning)
  • Participate in the ARB for your team
  • Telephonically (we can/will call you)
  • Possibly supplemented by remote meeting
    capabilities, like NetMeeting

39
LCP
  • For Process Deviations/Tailoring
  • If your project differs from the standard CS577
    approach, document that in section 2.1 of the LCP
  • For Quality Management
  • Record quality assessment techniques planned for
    the current or next phase, as appropriate
  • Record quality assessment techniques applied
    during the current or previous phase, and
    where the quality reports can be found

40
Software Quality Plan Section of LCP
  • Identify what Assessment Activities will be used
    (by reference is best; use inline definition ONLY
    if necessary)
  • Identify your IV&Ver, if you have one
  • Identify what "Quality Reports" will be generated
  • For all assessment activities
  • For all post-facto changes
  • Identify when the Activities take place and/or
    Quality Reports are generated. This can be done
    relative to the project plan, rather than in an
    absolute (e.g., date) fashion.
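The items this plan section must identify can be pictured as a simple structure. This is only an illustrative sketch; the field names and sample values are hypothetical, not the LCP template's:

```python
# Hypothetical sketch of what an LCP Software Quality Plan section records.
quality_plan = {
    # Assessment activities, ideally identified by reference
    "assessment_activities": [
        {"name": "Agile Internal/Informal Review",
         "definition": "by reference to course webpage"},
    ],
    # The IV&Ver, if one is assigned to the project
    "ivver": "DEN-remote student (if assigned)",
    # Quality Reports generated by assessments and post-facto changes
    "quality_reports": [
        "Concern Log", "Defect List", "Review Result Summary",
    ],
    # Timing stated relative to the project plan, not absolute dates
    "schedule": "relative to project plan milestones",
}
print(sorted(quality_plan))
```

Stating the schedule relative to project-plan milestones keeps the plan valid even when absolute dates slip.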