Development Process Metrics
Transcript and Presenter's Notes (source: https://www.cse.fau.edu)

1
Development Process Metrics
  • A look at development process metrics with a view
    to managing the development of secure software.
  • Dr. Michael VanHilst
  • 22 March 2007


3
ISO-15504 (level 2), CMM (level 4)
  • Measure how well the process is being followed
    (using the Goal-Question-Metric (GQM) approach).
  • Identify goals (e.g., reliable software products).
  • Analyze causal chains in the process.
  • Identify process steps that contribute to the
    goals.
  • Define measurements that assess whether the
    identified steps are followed.
  • Measurement can use indicators (presence of
    documents) or survey questions (e.g., "How
    committed was management in the review meeting?").
  • Measure and hold assessment meetings every two
    months.
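The goal-to-metric chain described above can be sketched as a small data structure. This is a minimal illustration of the GQM idea, not a tool from the slides; the goal, question, and metric instances are hypothetical examples.

```python
# Minimal Goal-Question-Metric (GQM) sketch: a goal is refined into
# questions, and each question is answered by one or more metrics.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float = 0.0

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: list = field(default_factory=list)

# Hypothetical instantiation for a "reliable software products" goal.
goal = Goal("Deliver reliable software products")
q = Question("Are code reviews held for every change?")
q.metrics.append(Metric("fraction_of_changes_reviewed", 0.85))
goal.questions.append(q)

def coverage(goal):
    """Check that every question has at least one metric defined."""
    return all(len(q.metrics) > 0 for q in goal.questions)
```

The `coverage` check mirrors the slide's point: a measurement program is only complete when each question in the causal chain is actually backed by a defined measurement.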

W. S. Humphrey, A Discipline for Software
Engineering, Addison-Wesley, January 1995.
Jarvinen, J., Hamann, D., and van Solingen, R.,
"On integrating assessment and measurement:
towards continuous assessment of software
engineering processes," IEEE Software Metrics
Symposium, 1999.
Birk, A., Derks, P., Hamann, D., Hirvensalo, J.,
Oivo, M., Rodenbach, E., van Solingen, R., and
Taramaa, J., "Applications of measurement in
product-focused process improvement: a comparative
industrial case study," IEEE Software Metrics
Symposium, 1998.
4
Personal Software Process
  • Requires developers to record detailed notes on
    how they spend their time.
  • Developed by Watts Humphrey at SEI; promoted as
    part of the Team Software Process within CMMI.
  • Experience shows dedication slacks off over
    time.
  • Studies by Johnson et al. find self-reports
    inaccurate and easily skewed.
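The manual time recording PSP asks for can be approximated in a few lines. The fields below follow the general shape of a PSP time log (phase, start/stop, interruption minutes), not Humphrey's exact forms; all numbers are invented.

```python
# Toy PSP-style time log: each entry records the phase worked on and
# the effort in minutes, with interruption time subtracted.
def log_entry(phase, start_min, stop_min, interrupt_min=0):
    return {"phase": phase, "delta": stop_min - start_min - interrupt_min}

def minutes_by_phase(entries):
    """Aggregate net effort minutes per development phase."""
    totals = {}
    for e in entries:
        totals[e["phase"]] = totals.get(e["phase"], 0) + e["delta"]
    return totals

entries = [
    log_entry("design", 0, 50, interrupt_min=10),    # 40 min net design
    log_entry("code",   60, 180),                    # 120 min coding
    log_entry("test",   190, 250, interrupt_min=5),  # 55 min net testing
]
totals = minutes_by_phase(entries)
```

The bookkeeping itself is trivial; the slide's point is that the hard part is getting developers to keep filling these logs accurately over months, which the Disney and Johnson study found they do not.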

W. S. Humphrey, A Discipline for Software
Engineering, Addison-Wesley, January 1995.
Anne M. Disney and Philip M. Johnson,
"Investigating Data Quality Problems in the PSP
(Experience Paper)," ACM SIGSOFT, 1998.
Philip M. Johnson, Hongbing Kou, Joy Agustin,
Christopher Chan, Carleton Moore, Jitender
Miglani, Shenyan Zhen, and William E.J. Doane,
"Beyond the Personal Software Process: Metrics
collection and analysis for the differently
disciplined," IEEE ICSE, 2003.
5
Causal Analysis (CMM level 5)
Dumke, R.R., Blazey, M., Hegewald, H., Reitz, D.,
Richter, K., "Causalities in Software Process
Measurement and Improvement," International
Conference on Software Process and Product
Measurement (MENSURA 2006), November 6-8, 2006.
Several other papers address process discovery,
inference, or compliance.
6
Low Level Non-Invasive Metrics
Instrument the IDE, configuration management,
office tools, and OS; report which app/file is in
the foreground. Used to look for indicators of
lowered productivity. Johnson's example looked at
contributing factors in build failures.
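A non-invasive sensor of this kind boils down to a stream of tool events that can be mined for indicators. The sketch below is a toy version of the idea (the event fields and the build-failure indicator are illustrative, not the telemetry format from Johnson's papers):

```python
# Toy event stream from instrumented tools (IDE, build system): each
# event records a timestamp, the emitting tool, and what happened.
events = [
    {"t": 0,   "tool": "ide",   "file": "main.c"},
    {"t": 60,  "tool": "ide",   "file": "main.c"},
    {"t": 120, "tool": "build", "result": "fail"},
    {"t": 180, "tool": "ide",   "file": "main.c"},
    {"t": 240, "tool": "build", "result": "pass"},
]

def build_failure_rate(events):
    """Fraction of builds that failed -- one possible low-level indicator."""
    builds = [e for e in events if e["tool"] == "build"]
    fails = sum(1 for e in builds if e["result"] == "fail")
    return fails / len(builds) if builds else 0.0

rate = build_failure_rate(events)
```

Because the events are captured automatically by the tools, no developer effort is required, which is exactly what distinguishes this approach from PSP's manual logs.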
Johnson, P.M., Kou, H., Paulding, M., Zhang, Q.,
Kagawa, A., and Yamashita, T., "Improving software
development management through software project
telemetry," IEEE Software, July 2005.
Andrea Janes, Marco Scotto, Alberto Sillitti, and
Giancarlo Succi, "A Perspective on Non Invasive
Software Management," IEEE Instrumentation and
Measurement Technology Conference (IMTC), 2006.
7
Siemens Measurement System
  • Four measurement categories: Time / Size /
    Quality / Cost.
  • Size metric undecided (IFPUG 4.2 vs. COSMIC FFP
    V2.2).
  • Four selected indicators:
  • Schedule / Budget compliance
  • Cost of Defect Correction
  • Feature changes
  • Defect Detection Profile
  • An annual Measurement Excellence Day helps keep
    participants in the SMS loyal to the program.
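Two of the four indicators can be computed directly from plan-vs-actual project data. The formulas below are one plausible reading of "schedule/budget compliance" and "cost of defect correction"; the slides do not define them precisely, and the project numbers are invented:

```python
# Two Siemens-style indicators computed from hypothetical project data.
def compliance(actual, planned):
    """Ratio of actual to planned; 1.0 means exactly on plan."""
    return actual / planned

project = {
    "planned_weeks": 20, "actual_weeks": 24,
    "planned_cost": 100_000, "actual_cost": 110_000,
    "defect_fix_hours": 300, "hourly_rate": 80,
}

schedule_compliance = compliance(
    project["actual_weeks"], project["planned_weeks"])   # 1.2 => 20% late
budget_compliance = compliance(
    project["actual_cost"], project["planned_cost"])     # 1.1 => 10% over
defect_correction_cost = (
    project["defect_fix_hours"] * project["hourly_rate"])
```

Ratios normalize projects of different sizes, which is what makes a single indicator comparable across a large program like SMS.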

8
Bias in the Data
  • Measures conformance to plans/predictions:
  • Time spent
  • Code written (LOC, FP)
  • Defects found / fixed
  • Analysis always blames the developer:
  • Developer productivity (slow, error-prone)
  • Process conformance (was the process followed?)
  • Won't drive process change or innovation

9
Shell Information Technology
  • Five metrics are collected (Tom Dekker):
  • Project Delivery Rate
  • Speed of Delivery
  • Defect Density
  • Time / Schedule
  • Effort / Cost
  • The target is that 90% of projects should be
    within schedule, within budget, and within planned
    scope.
  • Revised estimates based on industry benchmarks
    proved much better than the original values based
    on Shell's own history-based data; the benchmarks
    were less biased.
  • (Less biased, closer to business goals)
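The 90% target is a portfolio-level check: of all projects, what fraction finished within schedule, budget, and planned scope? A minimal sketch, with entirely invented project data:

```python
# Shell-style portfolio check against a "90% within plan" target.
projects = [
    {"on_schedule": True,  "on_budget": True,  "in_scope": True},
    {"on_schedule": True,  "on_budget": False, "in_scope": True},
    {"on_schedule": True,  "on_budget": True,  "in_scope": True},
    {"on_schedule": False, "on_budget": True,  "in_scope": True},
    {"on_schedule": True,  "on_budget": True,  "in_scope": True},
]

def fraction_meeting_all(projects):
    """Fraction of projects that met schedule, budget, and scope."""
    ok = sum(1 for p in projects
             if p["on_schedule"] and p["on_budget"] and p["in_scope"])
    return ok / len(projects)

frac = fraction_meeting_all(projects)  # 3 of 5 projects meet all three
```

Note the conjunction: a project slipping on any one of the three dimensions counts against the target, which is a stricter criterion than tracking each dimension separately.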

10
Typical Effort vs. Calendar Day
[Chart: effort per activity (develop, test, fix)
over calendar time, from project start through the
alpha and beta releases.]
11
Waste / Delay in Typical Effort
[Chart: the same effort curves (develop, test, fix)
annotated with the lag between when a fault is
caused, when it is found, and when it is fixed;
rework between requirements freeze, the alpha and
beta releases, and production is marked as waste.]
12
Toyota Product Development System
Design defects out of the process.
A product with many defects at first release will
still have many defects at final release.
[Chart: develop and test effort from start to
release.]
13
IS Security Design Methods
  • Much literature on requirements for security
  • Very little on the development process

Richard Baskerville, "Information systems security
design methods: implications for information
systems development," ACM Computing Surveys, 25, 4
(December 1993), pp. 375-414.
Chung, L., Nixon, B., "Dealing with non-functional
requirements: three experimental studies of a
process-oriented approach," Proceedings of
ICSE '95, pp. 25-37.
Eloff, M.M., von Solms, S.H., "Information Security
Management: An Approach to Combine Process
Certification and Product Certification,"
Computers and Security, Vol. 19, pp. 698-709.
T. Tryfonas, E. Kiountouzis, A. Poulymenakou,
"Embedding security practices in contemporary
information systems development approaches,"
Information Management & Computer Security, 9, 4
(Oct 2001), pp. 183-197.
14
A Toyota Security Method?
  • A product with many security flaws at first
    release will still have many flaws at final
    release.
  • We get what we measure. If we don't measure
    security, we won't get security. So how do we
    measure security?
  • Is there an idiot-proof or test-first equivalent
    for driving security flaws out in the development
    process?
  • Suggestion from Nelly: use automated code
    analysis for security anti-patterns, unsafe
    library calls, ...

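The suggested automated scan can start as simply as pattern-matching for known-unsafe C library calls. This is a deliberately naive sketch (real tools parse the code rather than grep it), and the pattern list is a small illustrative subset of the classic unbounded string functions:

```python
import re

# Known-unsafe C library calls (illustrative subset): these perform no
# bounds checking and are classic sources of buffer-overflow flaws.
UNSAFE_CALLS = re.compile(r"\b(gets|strcpy|strcat|sprintf)\s*\(")

def scan(source):
    """Return (line_number, line) pairs that use an unsafe call."""
    return [(n, line.strip())
            for n, line in enumerate(source.splitlines(), start=1)
            if UNSAFE_CALLS.search(line)]

code = '''#include <string.h>
void greet(char *name) {
    char buf[16];
    strcpy(buf, name);   /* flagged: no bounds check */
}'''
hits = scan(code)
```

Even this crude check yields a countable, trendable number per build, which is the property the slide is after: a security measure that can drive the process the way test results drive test-first development.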
Mylopoulos has a paper on a process for
transforming insecure code to secure code.