Title: EEE493'20 Software Process Metrics
1 EEE493.20 Software Process Metrics
Royal Military College of Canada Electrical and
Computer Engineering
- Dr. Terry Shepard
- shepard@rmc.ca
- 1-613-541-6000 ext. 6031
- Major JW Paul, Jeff.Paul@rmc.ca, 1-613-541-6000 ext. 6091
2 Why Software Metrics?
- Will they answer the question of how much
software weighs?
3 (No transcript)
4 The kilogram is the unit of mass; it is equal to the mass of the international prototype of the kilogram.
- The reason why "kilogram" is the name of a base unit of the SI is an artefact of history.
- Louis XVI charged a group of savants to develop a new system of measurement. Their work laid the foundation for the "decimal metric system", which has evolved into the modern SI. The original idea of the king's commission (which included such notables as Lavoisier) was to create a unit of mass that would be known as the "grave". By definition it would be the mass of a litre of water at the ice point (i.e. essentially 1 kg). The definition was to be embodied in an artefact mass standard.
- After the Revolution, the new Republican government took over the idea of the metric system but made some significant changes. For example, since many mass measurements of the time concerned masses much smaller than the kilogram, they decided that the unit of mass should be the "gramme". However, since a one-gramme standard would have been difficult to use as well as to establish, they also decided that the new definition should be embodied in a one-kilogramme artefact. This artefact became known as the "kilogram of the archives". By 1875 the unit of mass had been redefined as the "kilogram", embodied by a new artefact whose mass was essentially the same as the kilogram of the archives.
- The decision of the Republican government may have been politically motivated; after all, these were the same people who condemned Lavoisier to the guillotine. In any case, we are now stuck with the infelicity of a base unit whose name has a "prefix".
http://www1.bipm.org/fr/si/history-si/name_kg.html
5 http://encyclopedia.thefreedictionary.com/Kilogramme
- The international prototype of the kilogram seems to have lost about 50 micrograms in the last 100 years, and the reason for the loss is still unknown (reported in Der Spiegel, 2003, issue 26). The observed variation in the prototype has intensified the search for a new definition of the kilogram. Although it is equally accurate to state that all other objects in the universe have gained 50 micrograms per kilogram, this perspective is counterintuitive and defeats the purpose of a standard unit of mass.
6 Measuring is hard
- Even old, widely used, well-established metrics
  - have a complicated history
  - are still changing
- What is a metric?
  - the type of property being measured?
    - e.g. lines of code
  - the unit used to measure a property?
    - e.g. a standard for counting lines of code in Java
  - the result of measuring a property?
    - e.g. the number of lines of code in a particular program
- In software, the term "metric" is used in all three senses
  - most often in the first two senses
7 Four scales of measurement
- The four scales are defined by three qualities: magnitude, equal intervals, and zero
- Most basic is the nominal scale
  - represents only names
  - has none of the three qualities
  - e.g. a list of students, the set of names in a program
- Second is the ordinal scale
  - has magnitude only
  - values are ordered, but arithmetic operations are pointless
  - e.g. house numbers on a street, ranking of software versions by market position
- Third is the interval scale
  - possesses both magnitude and equal intervals
  - values can be usefully added or subtracted
  - e.g. temperature in degrees F, ranking of software versions by date of release
- Fourth is the ratio scale
  - contains all three qualities
  - values can be usefully multiplied or divided
  - e.g. temperature in kelvin, number of lines of code in a module
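The hierarchy above can be captured in a small sketch. The scale names and operator sets follow the slide; the dictionary and helper function are illustrative, not part of any standard library:

```python
# Operations that are meaningful on each of the four scales of measurement.
# Each scale inherits the operations of the one before it.
SCALE_OPERATIONS = {
    "nominal":  {"=="},                            # names only: test equality
    "ordinal":  {"==", "<"},                       # adds magnitude: ordering
    "interval": {"==", "<", "+", "-"},             # adds equal intervals
    "ratio":    {"==", "<", "+", "-", "*", "/"},   # adds a true zero
}

def is_meaningful(scale, op):
    """True if applying `op` to values on `scale` gives a meaningful result."""
    return op in SCALE_OPERATIONS[scale]
```

For example, dividing two Fahrenheit temperatures (interval scale) is not meaningful, while dividing two SLOC counts (ratio scale) is.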
8 A helpful table
9 The following table shows the width and height of the ISO A and B paper formats, as well as the ISO C envelope formats. The dimensions are in millimeters:

Size    A (w x h)    B (w x h)     C (w x h)
0       841 x 1189   1000 x 1414   917 x 1297
1       594 x 841    707 x 1000    648 x 917
2       420 x 594    500 x 707     458 x 648
3       297 x 420    353 x 500     324 x 458
4       210 x 297    250 x 353     229 x 324
5       148 x 210    176 x 250     162 x 229
6       105 x 148    125 x 176     114 x 162
7       74 x 105     88 x 125      81 x 114
8       52 x 74      62 x 88       57 x 81
9       37 x 52      44 x 62       40 x 57
10      26 x 37      31 x 44       28 x 40
10 The ISO paper size concept
- which scale is being used?
- why does software not use A0 size paper?
  - all other branches of engineering do...
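The A-series sizes follow one rule: each size is the previous one halved along its long edge, starting from A0 (841 x 1189 mm, about one square metre). A minimal sketch of that rule:

```python
def a_series(n):
    """Width and height in mm of ISO size An, derived by repeatedly
    halving A0 (841 x 1189 mm) along its long edge."""
    w, h = 841, 1189  # A0
    for _ in range(n):
        w, h = h // 2, w  # long edge is halved; old width becomes the new height
    return w, h
```

For example, `a_series(4)` reproduces the familiar A4 size, 210 x 297 mm.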
11 Is a process metric different from a product metric?
- What is the difference between them?
12 Product vs Process Metrics
- Product Metrics
  - Measures of the product at any stage of development
  - Complexity of the software design, size of the final program, number of pages of documentation
- Process Metrics
  - Measures of the software development process
  - Overall development time, type of methodology used, average level of experience of the programming staff
13 Properties of Process Metrics
- provide objective measures of
  - project progress
  - qualities of software products
- provide the means for estimating the cost and schedule to complete a project with desired levels of quality
  - accuracy improves over time
- each metric has two dimensions
  - a static value
  - a dynamic trend
- measuring change (trend) is the most important aspect of metrics for project management
- process metrics provide reality checks
14 So, what do we measure for Software?
15 Does Software have weight?
- Software can exert very large forces
- Its weight depends on how it is represented
  - paper, chips, circuit boards, chalk, CD, shrink wrap, ...
- Changing the software may not change its weight
- Is this useful or interesting?
- Choosing useful metrics for software is a hard problem
  - there are too many choices
  - we will let Walker (not Winston) Royce choose for us
16 Seven Core Process Metrics (Walker Royce)
- Management Indicators
  - Work and Progress
  - Budgeted Cost and Expenditures
  - Staffing and Team Dynamics
- Quality Indicators
  - Change Traffic and Stability
  - Breakage and Modularity
  - Rework and Adaptability
  - MTBF and Maturity
17 M1 - work and progress
- purpose
  - iteration planning
  - plan vs. actuals
  - management indicator
- perspectives
  - SLOC
  - function points
  - object points
  - scenarios
  - test cases
  - Software Change Orders (SCOs)
18 Work and Progress
[Chart: progress (%) vs. time (%), with the work plan curve. At the current time, planned progress is 40%, actual progress is 25%, giving a schedule variance of 15%.]
19 M2 - budgeted costs and expenditures
- purpose
  - financial insight
  - plans vs. actuals
  - management indicator
- perspectives
  - cost per month
  - full-time staff per month
  - % of budget expended
  - earned value calculations
20 Planned and Actual Cost ≠ Progress
[Chart: expenditure (%) vs. time (%), with the expenditure plan curve. At the current time, planned cost is 40% and actual cost is 15%, giving a cost variance of -25%. The chart asks: where is actual progress?]
21 Earned Value Example
The earned value at the current time is the planned cost of the actual progress. The negative cost variance represents an under-budget situation.
[Chart: expenditure (planned and actual) vs. time (%), with the expenditure plan curve. At the current time, planned cost is 40%, actual progress is 25%, and actual cost is 15%; schedule variance is 15%, cost variance is -10%.]
Dangerous assumption: does actual progress of 25% equate to planned cost of 25%?
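The slide's numbers can be checked with a short sketch. Note the sign conventions follow this slide (positive schedule variance = behind schedule, negative cost variance = under budget), not the more common EVM convention; the function name is illustrative:

```python
def earned_value_snapshot(planned_cost, actual_progress, actual_cost):
    """All inputs are percentages of the total budget/plan. Earned value is
    the planned cost of the work actually completed; with progress measured
    as a percentage of the plan, that is simply the actual-progress figure."""
    earned = actual_progress
    schedule_variance = planned_cost - earned   # positive: behind schedule
    cost_variance = actual_cost - earned        # negative: under budget
    return earned, schedule_variance, cost_variance
```

With the slide's figures (planned cost 40, actual progress 25, actual cost 15), this yields earned value 25, schedule variance 15, and cost variance -10, matching the chart.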
22 M3 - staffing and team dynamics
- purpose
  - resource plan vs. actuals
  - hiring rate
  - attrition rate
- perspectives
  - people per month added
  - people per month leaving
23 Staffing and Team Dynamics
[Chart: staffing vs. time, planned and actual curves.]
The two charts don't correspond ???
24 M4 - change traffic and stability
- purpose
  - iteration planning
  - management indicator of schedule convergence
- perspectives
  - SCOs opened versus SCOs closed
  - by type (0, 1, 2, 3, 4)
  - by release / subsystem / component
25 Change Traffic and Stability
[Chart: SCOs vs. time, cumulative open-SCO and closed-SCO curves.]
Does this graph represent stability?
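One way to make the stability question concrete is to track the open-SCO backlog over time. A sketch, assuming `opened` and `closed` are per-period counts (the function names and the convergence window are illustrative):

```python
from itertools import accumulate

def backlog(opened, closed):
    """Open-SCO backlog over time, from per-period opened and closed counts."""
    return [o - c for o, c in zip(accumulate(opened), accumulate(closed))]

def is_converging(opened, closed, window=3):
    """Schedule convergence: the backlog has fallen in each of the last
    `window` periods, i.e. SCOs are being closed faster than opened."""
    tail = backlog(opened, closed)[-window:]
    return all(later < earlier for earlier, later in zip(tail, tail[1:]))
```

A falling backlog is the signature of the healthy pattern on the chart: closed SCOs catching up with opened SCOs as the iteration converges.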
26 M5 - breakage and modularity
- purpose
  - convergence
  - software scrap
  - quality indicator (maintainability)
- perspectives
  - reworked SLOC per change
  - by type (0, 1, 2, 3, 4)
  - by release / subsystem / component
27 Breakage and Modularity
[Chart: breakage (%) vs. time, with separate curves for design changes and implementation changes; the pattern shown is labelled "Healthy".]
28 M6 - rework and adaptability
- purpose
  - convergence
  - software rework
  - quality indicator (maintainability)
- perspectives
  - average hours per change
  - by type (0, 1, 2, 3, 4)
  - by release / subsystem / component
29 Rework and Adaptability
[Chart: rework (hours/change) vs. time, with separate curves for design changes and implementation changes; the pattern shown is labelled "Sick".]
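Both M5 and M6 are averages over change orders. A minimal sketch, assuming each SCO record carries its reworked SLOC and rework hours (the records and field layout are hypothetical):

```python
# Hypothetical SCO records: (change kind, reworked SLOC, rework hours)
scos = [
    ("design", 120, 16),
    ("implementation", 40, 4),
    ("implementation", 60, 6),
]

def breakage_per_change(scos):
    """M5: average reworked SLOC per change (software scrap)."""
    return sum(sloc for _, sloc, _ in scos) / len(scos)

def rework_per_change(scos):
    """M6: average rework hours per change (adaptability)."""
    return sum(hours for _, _, hours in scos) / len(scos)
```

Plotting these averages per period, split by change kind, reproduces the design-change and implementation-change curves on the charts above.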
30 M7 - MTBF and maturity
- purpose
  - test coverage / adequacy
  - robustness for use
  - quality indicator (reliability)
- perspectives
  - failure counts
    - by release / subsystem / component
  - test hours until failure (MTBF)
    - by release / subsystem / component
31 MTBF and Maturity
[Chart: MTBF (hours/failure) vs. time, showing reliability growth.]
32 MTBF and Maturity
[Chart: MTBF (hours/failure) vs. time, showing reliability decay.]
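M7 itself is a simple ratio; maturity is a question about the trend of that ratio across releases. A sketch with illustrative numbers:

```python
def mtbf(test_hours, failures):
    """M7: mean time between failures = cumulative test hours / failure count."""
    return test_hours / failures

def reliability_trend(mtbf_by_release):
    """'growth' if MTBF rises release over release, 'decay' if it falls."""
    pairs = list(zip(mtbf_by_release, mtbf_by_release[1:]))
    if all(later > earlier for earlier, later in pairs):
        return "growth"
    if all(later < earlier for earlier, later in pairs):
        return "decay"
    return "mixed"
```

A rising MTBF series corresponds to the reliability-growth chart, a falling one to reliability decay.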
33 Characteristics of Good Metrics
34 Characteristics of Good Metrics
- meaningful to the customer, manager, and performer
- demonstrate a quantifiable (real) correlation between process changes and business performance
  - everything boils down to cost reduction, revenue growth, and margin increase
- objective and unambiguous
- display a trend
- a natural by-product of the process
- supported by automation
35 Another View
http://www.armysoftwaremetrics.org/about/issues.asp
- Army Software Management Issues and Measures
  - (6 Management Issues and the 14 Army Metrics)
- In order to better assess the maturity of software products, DISC4 requires PMs to address six management issues. The PMs can respond to this mandate by tracking the metrics that map to each of the issues. The six management issues and the metrics that may be collected to address each issue are listed below. Remember that the requirement is not to use all 14 measures, but to adequately address these issues.
36 ASMO - 6 Management Issues
- Schedule and progress regarding work completion
  - cost, schedule, and development progress
- Growth and stability regarding delivery of the required capability
  - requirements traceability, requirements stability, design stability, development progress, and complexity
- Funding and personnel resources regarding the work to be performed
  - cost and manpower
- Product quality regarding the delivered products
  - fault profiles, reliability, breadth of testing, complexity, and depth of testing
- Software development performance regarding the capabilities to meet program needs
  - software engineering environment
- Technical adequacy regarding software reuse, Ada, and use of standard data elements
  - computer resource utilization
37 ASMO - Army Metrics 1-7
- COST - tracks software expenditures (dollars spent vs. dollars allocated).
- SCHEDULE - tracks progress vs. schedule (event/deliverable progress).
- COMPUTER RESOURCE UTILIZATION - tracks planned vs. actual size (percent resource capacity used).
- SOFTWARE ENGINEERING ENVIRONMENT - rates the developer's environment (the developer's resources and software development process maturity).
- MANPOWER - indicates the developer's application of human resources to the development program and the developer's ability to maintain sufficient staffing to complete the project.
- DEVELOPMENT PROGRESS - indicates the degree of completeness of the software development effort. This metric can also be used to judge readiness to proceed to the next stage of software development.
- REQUIREMENTS TRACEABILITY - tracks requirements to code (percent requirements traced to design, code, and test cases).
38 ASMO - Army Metrics 8-14
- REQUIREMENTS STABILITY - tracks changes to requirements (user/developer requirements changes and their effects).
- COMPLEXITY - assesses code quality.
- BREADTH OF TESTING - tracks testing of requirements (percent functions/requirements demonstrated).
- DEPTH OF TESTING - tracks testing of code (degree of testing).
- FAULT PROFILES - tracks open vs. closed anomalies (total faults, total number of faults resolved, and the amount of time faults are open, by priority).
- RELIABILITY - monitors potential downtime (the software's contribution to mission failure).
- DESIGN STABILITY - tracks design changes and effects (changes to design and percent design completion).
39 References
- Royce, Walker. Software Project Management. Addison-Wesley, 1998. Chapter 13.
- Software Metrics - Carnegie Mellon Software Engineering Institute
  - http://www.sei.cmu.edu/publications/documents/cms/cm.012.html
- Army Software Metrics Office
  - http://www.armysoftwaremetrics.org/about/issues.asp