Transcript and Presenter's Notes

Title: Software Engineering Process I Reviews


1
Software Engineering Process I  Reviews
  • INFO 636
  • Glenn Booker

2
Reviews
  • Conducting reviews of requirements, design, and
    code is one of the best ways to improve your
    work's quality and your productivity
  • Here we'll look at various types of reviews and
    how to document them

3
Reviews
  • Review types, in descending order of formality,
    include
  • Inspections
  • Walk-throughs
  • Personal reviews

4
Inspections
  • Inspections follow a structured procedure for
    evaluating a work product
  • Fagan inspections are among the best-known brands
    of inspection
  • Inspections start with preparation, where each
    participant reviews the work separately, and
    makes note of defects found

5
Inspections
  • Then there's an inspection meeting to discuss
    the findings of each participant and put
    together a cumulative list of defects
  • Then the work product owner fixes the defects,
    and puts together a report to say so, in the
    repair and report phase

6
Walk-throughs
  • Walk-throughs require little preparation, except
    by the work product owner
  • A presentation is given, and participants provide
    feedback during it
  • Follow-up is informal, with the work product
    owner responding to the comments received

7
Personal reviews
  • Personal review is the work product owner
    reviewing their own stuff
  • As compiling code has gotten trivially easy, many
    programmers have dropped reviewing their own
    work in the hopes that the computer will find
    their mistakes
  • Not a good strategy!

8
Target of Reviews
  • Any work product can be the subject of reviews
  • Any document
  • Requirements specification
  • Design models
  • Test plans
  • Internal project processes and procedures
  • Source code
  • Scripts too!

9
Commentary
  • For those taking INFO 637, the Team Software
    Process uses formal reviews extensively, so pay
    extra attention!
  • N track people - while the text obviously focuses
    on reviews related to code, keep in mind that
    these methods and tools for reviews can be used
    to plan and conduct reviews for anything

10
Why Review Software?
  • The history of the PSP has shown that most
    people
  • Initially spend much of their time (30-50%) in
    compiling and testing
  • By the end of this course, only about 10% of
    their time is spent testing
  • Good reviews are a key to reducing testing time

11
Review Efficiency
  • Finding and fixing defects is much faster to do
    in review than in testing
  • Humphrey found 8x faster fix time in review than
    testing
  • Code reviews are 3-5 times as efficient at
    finding defects as testing
  • Part of the reason is that testing only finds
    the symptoms of a defect, which then have to be
    traced back to the cause by debugging

12
Severity of Review
  • We don't mean to imply that every piece of code
    needs exhaustive review
  • Different approaches can be used, depending on
    the complexity, risk, and importance of the code
  • Hence you might use inspections for critical
    code, walk-throughs for typical code, and just a
    personal review for low-risk code

13
Review Principles
  • Any kind of review process typically follows
    three principles
  • Establish defined review goals
  • Follow a defined process for conducting a review
    (here, we'll use scripts)
  • Measure and improve your review process

14
Separate Design and Code Reviews
  • Design and code should be reviewed separately
  • Forces making a design before coding
  • It's hard to decipher the design from the code
  • Helps spot logic errors in design, and identify
    design improvements
  • Helps focus review scope

15
Design Reviews
  • Make your design reviewable
  • Follow a standard notation for design, such as
    UML, DFD, ERD, etc.
  • Make sure design addresses both functional and
    non-functional requirements
  • Follow personal design standards, hopefully in
    concert with organizational standards

16
Design Reviews
  • Follow a design review strategy
  • Look at various elements of the design
    systematically; don't try to assess it all at once
  • Design review strategy stages might include
  • Check for required program elements

17
Design Reviews
  • Examine overall program structure and flow
  • Check for logical completeness
  • Check for robustness - handling errors, etc.
  • Check parameters and types for methods and
    procedure calls
  • Check special variables, data types, and files,
    including aliases

18
Design Reviews
  • Check design against the requirements
  • More elaborate inspections might use
  • A traceability matrix to prove completeness (see
    the sketch below), or
  • Formal methods (Z, Larch) to show
    correctness mathematically
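Not from the text, but as a rough illustration of the traceability-matrix idea: a requirement-to-design mapping can be checked mechanically for uncovered requirements. The requirement IDs and design element names below are hypothetical.

    # Hypothetical requirement-to-design traceability check (illustration only).
    requirements = {"REQ-1", "REQ-2", "REQ-3"}

    # Which design elements claim to satisfy which requirements (made-up names).
    design_trace = {
        "LoginModule": {"REQ-1"},
        "ReportGenerator": {"REQ-2"},
    }

    covered = set().union(*design_trace.values())
    uncovered = requirements - covered
    if uncovered:
        # REQ-3 would be flagged here as having no design element
        print("Requirements with no design element:", sorted(uncovered))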

19
Measuring Reviews
  • Key basic measures for reviews are
  • Size of product being reviewed (in pages or LOC)
  • The review time, in minutes
  • The number of defects found
  • And, based on later work, the defects that
    weren't found by the review
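A minimal sketch (not one of the PSP forms) of how these basic measures might be captured per review; the field names are illustrative.

    from dataclasses import dataclass

    @dataclass
    class ReviewRecord:
        product_size: int      # pages or LOC reviewed
        review_minutes: float  # total review time, in minutes
        defects_found: int     # defects found by the review
        defects_missed: int    # defects found later (e.g. in test or use)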

20
Measuring Reviews
  • Derived metrics for reviews are
  • Review yield, the percent of defects found by
    review
  • Yield = 100 × (defects found) /
    (defects found + defects not found)
  • Number of defects found per kLOC or page
  • Number of defects found per hour of review time

21
Measuring Reviews
  • The number of LOC or pages reviewed per hour
  • Defect Removal Leverage (DRL)
  • The ratio of defects removed per hour for any
    two phases or activities
  • DRL(coding) = [Defects/hour (coding)] /
    [Defects/hour (design)]
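A minimal sketch of the derived measures above, assuming the counts come from records like the one sketched earlier; the numbers in the usage example are made up.

    def review_yield(found: int, missed: int) -> float:
        # Percent of the defects present that the review found
        return 100.0 * found / (found + missed)

    def defects_per_kloc(found: int, loc: int) -> float:
        return 1000.0 * found / loc

    def defects_per_hour(found: int, minutes: float) -> float:
        return found / (minutes / 60.0)

    def drl(rate_a: float, rate_b: float) -> float:
        # Defect Removal Leverage: defects/hour in one activity over another
        return rate_a / rate_b

    # Example: a code review removes 12 defects in 90 minutes,
    # a design review removes 4 defects in 60 minutes
    print(review_yield(12, 4))                                     # 75.0
    print(drl(defects_per_hour(12, 90), defects_per_hour(4, 60)))  # 2.0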

22
Checklists
  • Checklists are used to help make sure a process
    or procedure is followed consistently each time
  • A sample code review checklist for C++ is on
    page 242; variations can be developed for other
    languages
  • It has several blank columns so each module can
    be checked off separately
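Purely illustrative (the checklist items and module names below are invented, not the page 242 list): one way to mirror the "blank column per module" layout is a simple check-off table.

    # Invented checklist items and module names, for illustration only.
    checklist_items = ["Includes complete", "Variables initialized",
                       "Loop boundaries correct", "Return values checked"]
    modules = ["parser.cpp", "scanner.cpp"]

    # done[item][module] is ticked once that item has been checked for that module
    done = {item: {m: False for m in modules} for item in checklist_items}
    done["Variables initialized"]["parser.cpp"] = True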

23
Designing Checklists
  • Checklists should be designed so that you have to
    focus on only one topic at a time
  • Similar to reviewing a book for grammar versus
    plot development; it's hard to look for both at
    once
  • To use a checklist most effectively, completely
    review one module at a time

24
Using Checklists
  • Different strategies should be considered for
    different types of reviews
  • A design review for a large application is often
    best done from the top down
  • A code review often works better from the bottom
    up for your own code, but top down for someone else's

25
Building Checklists
  • Don't take the example on p. 242 as the ultimate,
    final, perfect, most-wonderful-of-all checklist
    that ever was (breathe!)
  • Study the kinds of problems you encounter (in
    your defect log) to see what you need to
    emphasize in your checklist

26
Building Checklists
  • The types of defects are given on page 260;
    again, consider adapting this to your needs and
    to other languages
  • One way to find your most common types of
    defects is to pool all your defect logs together
    and generate a Pareto chart by defect type (see
    the sketch below)
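A minimal sketch of the pooling step, assuming each defect-log entry records a defect type; the data here is invented. Sorting the counts in descending order gives the Pareto ordering to plot as a bar chart.

    from collections import Counter

    # Invented defect-log entries pooled from several programs; in practice
    # these would come from your own PSP defect logs.
    defect_logs = [
        {"type": "syntax"}, {"type": "interface"}, {"type": "syntax"},
        {"type": "data"}, {"type": "syntax"}, {"type": "function"},
    ]

    counts = Counter(entry["type"] for entry in defect_logs)
    for defect_type, count in counts.most_common():  # descending = Pareto order
        print(defect_type, count)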

27
Building Checklists
  • A refined defect type list is shown on page 262;
    you can use a Pareto diagram to figure out which
    kinds of defects you need to expand upon
  • This also connects to the coding standard
    developed ages ago; you can use lessons learned
    from defect analysis to help refine the coding
    standard

28
Review Before or After Compile?
  • A contentious issue in PSP is whether to review
    code before compile or after
  • A non-issue in some languages, which aren't
    compiled!
  • In Humphrey's experience, about 9% of all syntax
    errors aren't caught by the compiler, so don't
    expect it to catch everything

29
Reviews vs. Inspections
  • As a matter of courtesy, make sure a program or
    document is in pretty good shape before
    submitting it for review or inspection
  • Very formal inspections might require code to
    pass unit testing, and show test results as part
    of the inspection
  • Humphrey doesn't like testing before inspection,
    however

30
(P track) Report R4
  • Report R4 (p. 771) analyzes the defects from all
    the previous assignments
  • Tasks are
  • Develop a process and scripts to create your
    report
  • Follow that process and show the completed report

31
(P track) Report R4
  • Sample contents of the report should include, at
    a minimum
  • An analysis of estimating accuracy for size and
    time for the programs to date
  • Analysis of defects injected and removed, using
    table D23 as an example
  • Analysis of defects found by the compiler (if
    any), à la table C24

32
(P track) Report R4
  • Analysis of defect fix times, using table D22
    again
  • Develop a design checklist for use during design
    review
  • Develop a code checklist for use during code
    review
  • Discuss the results of the report, and set
    improvement goals for yourself

33
(P track) Report R4
  • Use graphs where possible, but don't forget to
    discuss the trends observed on them
  • A graph with no text is lonely
  • This report is the culmination of the PSP 1.x
    level of process, leading us to PSP 2