Metrics Programmes

Transcript and Presenter's Notes
1
Metrics Programmes
2
Metrics at MIEL
  • COQ - cost of quality
  • COPQ - cost of poor quality
  • PCE - phase containment effectiveness (common
    formulas are sketched after this list)
  • Post-release defects
  • Slippage, requirements volatility, estimation
    accuracy
  • Cycletime, effort (productivity), age of open and
    closed problems
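  The slide lists these measures without formulas; the following is a minimal Python sketch using commonly cited definitions (COQ as prevention + appraisal + failure cost, COPQ as the failure portion only, PCE as the share of a phase's defects caught within that phase). The definitions and example numbers are illustrative assumptions, not necessarily MIEL's exact ones.

    # Hedged sketch: common textbook definitions, not necessarily the ones used at MIEL.

    def coq(prevention, appraisal, internal_failure, external_failure):
        """Cost of quality: everything spent preventing, finding, and fixing defects."""
        return prevention + appraisal + internal_failure + external_failure

    def copq(internal_failure, external_failure):
        """Cost of poor quality: only the failure/rework portion of COQ."""
        return internal_failure + external_failure

    def pce(found_in_phase, escaped_from_phase):
        """Phase containment effectiveness: share of a phase's defects caught in that phase."""
        total = found_in_phase + escaped_from_phase
        return found_in_phase / total if total else 1.0

    # Illustrative numbers (person-hours and defect counts are made up).
    print(coq(prevention=120, appraisal=200, internal_failure=150, external_failure=80))  # 550
    print(copq(internal_failure=150, external_failure=80))                                # 230
    print(round(pce(found_in_phase=40, escaped_from_phase=10), 2))                        # 0.8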

3
MIEL Metrics Programme
  • Monthly management presentations - metrics-based,
    using tools
  • Integration with project management and
    development processes, tools
  • Process required metrics-based causal analysis →
    diagnostic value of metrics for the team
  • Estimation accuracy within +/- 20% → less pressure
    on the team
  • Could then be basis for continuous improvement
    focus
  • Still had lots of gaps and warts!

4
Monthly Presentation Chart
  • Code size, schedule, defects
  • original estimate, current estimate, actual to
    date
  • Requirements volatility, staffing vs. time
  • Actual and planned earned values (see the
    earned-value sketch after this list)
  • PCE (phase containment effectiveness) for each
    phase
  • Cycle time reduction - target, planned, current
  • COQ, COPQ (cost of poor quality)
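  As a hedged illustration of how the chart's numbers might be derived, this sketch computes planned value, earned value, and a simple estimate-drift figure; the task names, budgets, and percentages are invented for the example.

    # Illustrative earned-value arithmetic; all inputs are invented, not project data.
    tasks = [
        # (name, planned effort in person-days, planned % complete by now, actual % complete)
        ("design",  20, 1.00, 1.00),
        ("coding",  40, 0.50, 0.40),
        ("testing", 30, 0.00, 0.00),
    ]

    planned_value = sum(effort * planned for _, effort, planned, _ in tasks)  # a.k.a. BCWS
    earned_value  = sum(effort * actual  for _, effort, _, actual  in tasks)  # a.k.a. BCWP

    original_estimate, current_estimate = 90, 105  # person-days (made-up figures)
    estimate_drift = (current_estimate - original_estimate) / original_estimate

    print(f"planned value: {planned_value} person-days")  # 40.0
    print(f"earned value:  {earned_value} person-days")   # 36.0
    print(f"estimate drift: {estimate_drift:+.0%}")        # +17%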

5
Diagnosis examples
  • Slippage / staffing relationship
  • Low COQ, defect rate below target → lack of
    proper reviews / testing
  • High COQ, low defect rates → over-reviewing
  • Low COQ, high COPQ, high defect rates, poor PCE →
    need to improve review effectiveness
  • Relationship between requirements volatility and
    defects, between requirements volatility and
    slippage
  • Excessive code size, slippage → inadequate
    planning
  • High requirements volatility → improve
    requirements management (rules like these are
    sketched as threshold checks after this list)
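  A hedged sketch of how such diagnostic heuristics could be encoded as threshold checks; the thresholds and metric scales are illustrative assumptions, and on the slides the diagnosis was done by people reading the charts, not by a script.

    # Illustrative encoding of the diagnostic heuristics above.
    # Thresholds are invented; treat each flag as a prompt for discussion, not a verdict.

    def diagnose(coq, copq, defect_rate, pce, target_defect_rate=1.0):
        findings = []
        if coq < 0.10 and defect_rate < target_defect_rate:
            findings.append("Low COQ with low defect rate: reviews/testing may be lacking")
        if coq > 0.30 and defect_rate < target_defect_rate:
            findings.append("High COQ with low defect rate: possible over-reviewing")
        if coq < 0.10 and copq > 0.20 and defect_rate > target_defect_rate and pce < 0.7:
            findings.append("High COPQ and poor PCE: improve review effectiveness")
        return findings

    # Example: COQ/COPQ as fractions of total effort, defect rate per KLOC, PCE as a fraction.
    for finding in diagnose(coq=0.08, copq=0.25, defect_rate=2.5, pce=0.6):
        print(finding)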

6
Causal analysis
  • Defects classified by type
  • If the number of defects of a particular type
    rises, find the cause and add it to the defect
    checklists (see the tallying sketch after this
    list)
  • If high COPQ and slippage, do causal analysis for
    what went wrong, how to do things differently and
    prevent the problems
  • If significant slippage but all other data looks
    OK, then causal analysis for what caused slippage
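  A minimal sketch of the tallying step, assuming defects are already classified by type; the type names, baseline counts, and the "rising" rule are all illustrative.

    from collections import Counter

    # Illustrative defect records as (id, type) pairs; types and counts are made up.
    defects_this_month = [(101, "interface"), (102, "logic"), (103, "interface"),
                          (104, "interface"), (105, "requirements")]
    baseline = Counter({"interface": 1, "logic": 2, "requirements": 1})  # a typical month

    current = Counter(dtype for _, dtype in defects_this_month)
    for dtype, count in current.items():
        if count > baseline.get(dtype, 0):
            # A rise in one defect type is the trigger on the slide:
            # find the cause, then extend the defect/review checklists.
            print(f"{dtype}: {count} vs baseline {baseline.get(dtype, 0)} - run causal analysis")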

7
Key Ingredients
  • Selection of appropriate metrics
  • Matched to team and organization
  • Tool support
  • Interpretation and careful use of results
  • Lies, damn lies and statistics
  • Team commitment
  • Management commitment

8
Criteria for Selecting Metrics
  • Matched with organizational business and
    engineering processes
  • Ability to collect measurements
  • Quality of metric - to what extent does it
    correlate with the objective?
  • Interlocking sets of metrics
  • Value to team, to organization
  • Perceived value

9
Matched with Processes
  • Don't expect people to change their processes so
    you can get metrics!
  • e.g. do formal reviews so that we will have
    defect data
  • If team prefers informal reviews, will have
    difficulty getting meaningful reviews and good
    data
  • First change the process (if desirable!) or,
    better, design metrics that can be obtained with
    existing processes
  • Need deep understanding of organizational
    culture, existing processes, metrics theory
  • Even then, major metrics design challenge!

10
Ability to Collect
  • Minimize intrusion - the work to put in data
    should be absolutely minimal; ignore metrics that
    cannot be gathered easily
  • Many measurements can be obtained automatically by
    tools (see the collection sketch after this list)
  • Code size, reported defects (if defect reporting
    tool is used), number and scope of revisions
  • But the team will not use a tool (e.g. Clearcase)
    just because it provides metrics!
  • Workflow and process automation tools can help
    provide much of the data
  • Staffing profiles, phasewise cycletimes,
    slippage
  • Ideally, it should be easier to get work done in
    ways that provide the needed data
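  As a hedged illustration of non-intrusive collection, this sketch counts non-blank source lines in a directory tree; the root path and file extensions are placeholders, and a real programme would pull size, defect, and revision data from the configuration-management and defect-tracking tools themselves.

    from pathlib import Path

    # Illustrative automatic measurement: non-blank lines of code in a source tree.
    # The root path and extensions are placeholders for whatever the project uses.
    def count_loc(root="src", extensions=(".c", ".h", ".py")):
        total = 0
        for path in Path(root).rglob("*"):
            if path.is_file() and path.suffix in extensions:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    total += sum(1 for line in f if line.strip())
        return total

    if __name__ == "__main__":
        print(f"code size: {count_loc()} non-blank lines")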

11
More on Matched with Processes
  • This is why metrics are at CMM Level 4
  • Need to have strong organizational processes in
    place before we can get meaningful metrics
  • Metrics is really a part of process design!
  • Need to be fully integrated into organization
    processes, so that they are done consistently and
    persistently
  • Also metrics take a long time to show value
  • Without a high maturity level, metrics won't last
    long enough (relates to management commitment)

12
Quality of Metric
  • We try hard to come up with numbers that tell us
    something about the project
  • e.g. defect rates, cost of quality, cycletime
  • But there are many gaps in these
  • e.g. low defect rates may show good development
    practices, or poor testing, or low code density,
    or poor reporting, or less ambitious engineering,
    or a less successful product that did not get used
  • So are low defect rates a good thing?
  • Hard to translate across projects, because of so
    many variables

13
Interlocking Sets of Metrics
  • Solution - create a set of metrics that together
    paint a picture
  • Bad ways of improving one metric should impact
    another negatively
  • e.g. development defect rates, post-release
    defect rates, customer satisfaction, business
    results, cycletime, productivity! (And still we
    cannot catch underreporting!)

14
Value to Team, Organization
  • Metrics need very high value/cost ratio to
    succeed
  • Partly due to perceived value problems
  • Partly due to built-in obsolescence - any change
    in the environment can invalidate previous data
  • Argues for minimal set of metrics, very
    non-intrusive data collection
  • Need to address real problems faced by the
    organization and team, not what the metrics
    designer thinks is important!

15
Perceived Value
  • People need to see the value to be motivated
  • Problem - separation between cost and value
  • The one who puts in effort to provide data is
    different from the one who benefits (developer /
    manager, team / management, team A / team B)
  • Time separation - work now, gain later!
  • Address what is close to people's hearts
  • e.g. effort and cycletime metrics can help improve
    estimates, thereby reducing overtime and crunches
  • Avoid overselling - high expectation-setting can
    make successes seem like failures!

16
More on Value to Team, Organization
  • Ideally, value should be immediate
  • Raise in-project flags for action
  • e.g. show which modules are more defect-prone
  • e.g. show that there are lots of requirements
    changes
  • At Motorola, metrics provided an easy way to
    present project status to management
  • Management could also diagnose instantly,
    visually
  • Provide confidence to team, management!
  • Long-term value should be added incentive

17
Tool Support
  • Need tool support both for collection and
    analysis
  • Integration with development and workflow tools
  • Ideally, the users should not see the input end
    of the metric tools at all!
  • Tools enable users to do their own metrics
    analysis
  • Standardized output formats make it possible to
  • Problem - tools end up dictating work processes
  • How about just creating your own spreadsheet with
    built-in formulae? (a minimal sketch follows)
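  As a hedged illustration of the home-grown spreadsheet option, this sketch writes a small workbook with formula cells for PCE and defect density; it assumes the openpyxl package is available, and the column layout and sample figures are invented.

    # Minimal "roll your own spreadsheet" sketch; assumes openpyxl is installed.
    # Columns and sample figures are illustrative, not a prescribed chart layout.
    from openpyxl import Workbook

    wb = Workbook()
    ws = wb.active
    ws.title = "Monthly metrics"

    ws.append(["Phase", "Defects found", "Defects escaped", "KLOC", "PCE", "Defects/KLOC"])
    ws.append(["Design", 12, 3, 5.0, None, None])
    ws.append(["Coding", 30, 6, 5.0, None, None])

    # Built-in formulae so the team can edit the raw counts and see the ratios update.
    for row in (2, 3):
        ws[f"E{row}"] = f"=B{row}/(B{row}+C{row})"
        ws[f"F{row}"] = f"=(B{row}+C{row})/D{row}"

    wb.save("metrics.xlsx")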

18
Interpretation
  • What statistics reveal is interesting, but what
    they conceal is critical
  • The most important job of a metrics expert is to
    interpret metrics
  • Remember all the caveats in the metric - all the
    reasons why the data may not be what it appears
    to be
  • Any chart should be accompanied by an explanation
    that indicates what lies behind the data:
    outliers, exceptions, contributing factors,
    reasons for trends, significant causes, impact

19
Interpretation - example
  • A cycletime graph shows a continuously improving
    trend across 2-3 years, and a sudden large
    improvement this year
  • How similar were the projects? (bucketization -
    see the sketch after this list)
  • Has the way of measuring cycletime remained
    constant? (when clocks start, project/dev
    cycletime)
  • Was there some specific factor that caused a
    large improvement this year? (one project with
    high reuse)
  • Are the sample sizes significant?
  • What does the cycletime graph really cover? (does
    not count lead time for getting project started)
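  A hedged sketch of the bucketization question: instead of one aggregate trend line, group projects by a similarity attribute and look at per-bucket medians together with sample sizes. The years, size buckets, and cycletimes below are invented.

    from collections import defaultdict
    from statistics import median

    # Illustrative project records: (year, size bucket, cycletime in weeks) - all invented.
    projects = [
        (2003, "small", 20), (2003, "large", 48),
        (2004, "small", 18), (2004, "small", 17), (2004, "large", 45),
        (2005, "small", 16),  # only small projects this year: the "sudden improvement"?
    ]

    buckets = defaultdict(list)
    for year, size, weeks in projects:
        buckets[(year, size)].append(weeks)

    for (year, size), weeks in sorted(buckets.items()):
        # Tiny samples: report n alongside the median so nobody over-reads the trend.
        print(f"{year} {size:>5}: median {median(weeks)} weeks (n={len(weeks)})")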

20
Interpretation - thoughts
  • Very critical for the metrics expert not to
    believe the numbers
  • Be skeptical - develop a thorough understanding of
    what goes into the numbers, and what does not
  • Needs solid understanding of ground realities
  • Metrics role fits well with quality engineer role
  • Very subjective - a lot of value is contributed in
    the interpretation, and lots of expertise goes
    into it
  • Metrics is about isolating specific factors - an
    intrinsically impossible task

21
Developer Commitment
  • People need to enter accurate data for metrics to
    be valid
  • Will only do so if they have a commitment to
    metric fidelity
  • Never use metrics for evaluation! (of individuals
    or teams)
  • Doing so creates a built-in incentive to make the
    data look good
  • Education programs to create buy-in
  • Cover what value developers get from metrics

22
Management Commitment
  • Metrics take a long time to show value
  • Management must be seen to use metrics for team
    to believe they are important
  • However, managing to numbers has problems
  • If you manage (only) by numbers, all you manage
    is the numbers!
  • If metrics are bad, they must not lead to
    witch-hunts and a crisis mentality (same as the
    evaluation problem)