Title: ISBSG Benchmarking Process Standard
1. ISBSG Benchmarking Process Standard
- Dr Gargi Keeni
- NASSCOM Quality Forum (NR)
- February 3, 2006
Source: International Software Benchmarking Standards Group (ISBSG), v0.7, 2005
2. Contents
- Objective of this presentation
- ISBSG and NASSCOM
- Who is ISBSG
- Why ISBSG
- Our role in ISBSG
- Benchmarking process standard
- Introduction
- Scope
- Overview of this standard
- Establish and sustain benchmark commitment
- Identify information needs
- Establish benchmark parameters
- Plan the benchmark process
- Perform the benchmark process
- Evaluate the benchmark
- References
- Feedback on the Standard
- Discussions
3. Objective
- Create awareness about NASSCOM's role in ISBSG
- Have a say in setting these international standards
- Elicit your valuable feedback on the standard
- Collate the feedback and send it to ISBSG on behalf of NASSCOM Quality Forum (Northern Region)
- Identify early adopters who would like to pilot this standard within their own organization
Become leaders, not followers.
4. NASSCOM and ISBSG
5. Who is ISBSG?
- ISBSG is a joint initiative of the following organizations:
  - ASMA (Australian Software Metrics Association)
  - AEMES (Asociación Española de Métricas de Software, Spain)
  - CSPI (China SPI)
  - DASMA (Deutschsprachige Anwendergruppe für Software-Metrik und Aufwandschätzung, Germany)
  - FiSMA (Finnish Software Metrics Association)
  - GUFPI (Italian Users Group on Function Points)
  - IFPUG (International Function Point Users Group, USA)
  - JFPUG (Japanese Function Point Users Group)
  - KFPUG (Korean Function Point Users Group)
  - NESMA (Netherlands Software Metrieken Gebruikers Associatie)
  - NASSCOM (National Association of Software and Service Companies)
  - SwiSMA (Swiss Software Metrics Association, associate)
  - UKSMA (UK Software Metrics Association)
- ISBSG provides products and services that allow software developers to:
  - benchmark themselves against the world's best
  - more accurately estimate effort, time and cost
  - lower development risk
6. Why ISBSG?
- ISBSG can tell you what productivity is attained by other organizations with a profile similar to yours (e.g. organization type, business area, development environment).
- ISBSG offers a free data collection package for collecting the relevant metrics about your projects. This can be used to establish an in-house repository if required.
- Project details can be mailed to ISBSG to become part of the international repository and be compared with the world's best practice in IT development.
- ISBSG publishes an analysis of its database, The Software Metrics Compendium, which helps IT practitioners keep abreast of development trends.
- ISBSG releases a Data Disk of the repository data, which can be used to analyze the world's projects.
- All possible steps are taken to ensure the anonymity of project contributions. No breach of the confidentiality of data submissions has occurred since inception in the mid-1990s.
7. Our (NASSCOM) role in ISBSG
- Participate in ISBSG activities and meetings to establish the policies that govern the operations of ISBSG, and to define the data collected, its management, and the reports and other results that will be made available to practitioners around the world
- Promote submission of data to the ISBSG repository
- Develop material to create awareness about the uses to which ISBSG data may be put
"Leadership and learning are indispensable to each other." - John F. Kennedy
8. The Data Entry Flow
- ISBSG Repository Manager
  - Rates the project (A, B, C or D)
  - Clarifies open points, via the administrator, with the original supplier of the project data
  - Puts the project into the database
  - Produces a benchmarking report
  - Sends the report to the supplier
- ISBSG Administrator
  - Removes the name and all identification details
  - Allocates a random unique ID
  - Sends confirmation to the submitter, advising the unique ID
  - Sends the project data with the unique ID but no identification details
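The administrator's anonymization step described above can be sketched in code. This is an illustrative sketch only, not ISBSG's actual tooling: the field names and the `anonymize` helper are hypothetical.

```python
import uuid

# Hypothetical illustration of the administrator's step: identifying
# fields are stripped and a random unique ID is allocated before the
# record is passed on to the repository manager.
IDENTIFYING_FIELDS = {"organization_name", "contact_name", "contact_email"}

def anonymize(submission: dict) -> tuple:
    """Return (unique_id, record) with all identifying details removed."""
    unique_id = uuid.uuid4().hex  # random unique ID advised to the submitter
    record = {k: v for k, v in submission.items() if k not in IDENTIFYING_FIELDS}
    record["project_id"] = unique_id
    return unique_id, record

submission = {
    "organization_name": "Acme Corp",    # removed before storage
    "contact_email": "pm@acme.example",  # removed before storage
    "size_fp": 320,
    "effort_hours": 2400,
}
uid, record = anonymize(submission)
# 'record' now carries only project data plus the random ID
```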
9. Guidelines on data use
"If you look at 10,000 FP projects, I would say that solid CMM 3s and up are about 15% higher in productivity and 25% higher in quality. Additionally, these same organizations experience a 50% lower risk of failure." - Capers Jones, July 2005
- What is the ideal team size for your new project?
  - The team size analysis in chapter 8 will provide you with a good guide.
- Are CASE tools really the silver bullet to increased productivity?
  - Chapter 5 will give you the answer.
- Which languages deliver the best productivity?
  - Chapter 6 has pages of information on languages and how they compare.
- Is the anecdotal rule of $1,000 per function point accurate?
  - The cost analysis in chapter 9 will tell you.
10. ISBSG Benchmarking Process Standard v0.7
Since a benchmarking process is a specific instance of a measurement process, this process is described as a tailoring of ISO/IEC 15939:2002, Measurement process. It adopts the structure and major activities described within ISO/IEC 15939:2002, adapted to the specific needs of a benchmarking process. It is anticipated that the final document may become the basis of a future ISO standard.
11. Introduction
Benchmarking: "A continuous, systematic process for an organization to compare itself with those companies and organizations that represent excellence" (IFPUG)
- Benchmarking of software-related activities:
  - External benchmarking
    - How does the organization compare to industry standards?
    - Are we more or less effective in comparison to our competitors?
    - Are our processes effective, or do we need to launch an improvement initiative?
    - Is the outsource contract achieving the service level agreed in the contract?
  - Peer group benchmarks
    - Comparisons between divisions or sites within an organization
  - Year-on-year or internal benchmarking
    - Is our process improvement initiative proving effective? (baselining)
    - Is the outsource contract meeting the levels agreed in the contract?
    - Are all the divisions and sites in our organization performing at the same level?
    - Has the introduction of a new technology achieved the benefits expected?
    - What evidence is there to support the estimates that we are using?
- Continual improvement requires change within the organization. Evaluation of change requires benchmarking of performance and comparison.
- Benchmarks should have a clearly defined purpose, should lead to action, and should not be employed purely to accumulate information.
12. Scope
- This standard identifies the activities and tasks that are necessary to successfully identify, define, select, apply, and improve benchmarking for software development within an overall project or organizational benchmark structure.
- It provides guidance about the issues and considerations involved in data selection and comparison.
- It assists in interpreting the output of a benchmark exercise.
- It is intended to be used by software suppliers and acquirers:
  - Software suppliers include personnel performing management, technical and quality management functions in software development, maintenance, integration and product support organizations.
  - Software acquirers include personnel performing management, technical and quality management functions in software procurement and user organizations.
13. Purpose and outcome of the software benchmarking process
- Purpose of the software benchmarking process
  - Collect, analyze and report data relating to products developed and processes implemented within the organizational unit, in order to:
    - Support effective management of the processes
    - Objectively demonstrate the comparative performance of these processes
- Outcomes of the software benchmarking process
  - Organizational commitment for benchmarking will be established and sustained
  - Information objectives of technical and management processes will be identified
  - An appropriate set of questions, driven by the information needs, will be identified and/or developed
  - The benchmark scope will be identified
  - The required performance data will be identified
  - The required performance data will be measured, stored, and presented in a form suitable for the benchmark
  - The benchmark outcomes will be used to support decisions and provide an objective basis for communication
  - Benchmark activities will be planned
  - Opportunities for process improvements will be identified and communicated to the relevant process owner
  - The benchmark process and measures will be evaluated
14. Benchmarking Process
[Process diagram]
- Foundation processes: establish and sustain benchmark commitment; identify information needs; establish benchmark parameters.
- Core benchmark process: plan the benchmark process (define the questions to be answered, identify performance measures, plan metrics collection and storage); perform the benchmark process (collect the data, carry out the benchmark); evaluate and present the benchmark results; evaluate the benchmark process.
- Sponsors initiate the benchmark exercise, which draws on the benchmark experience base and the repository.
15. Benchmark information model: Productivity (example)
16. Benchmark information model: Schedule (example)
17. Limitations
- Benchmarking is an imprecise tool, as it is not possible to find directly comparable:
  - Organizations
  - Contracts
  - Years
- The method and underlying assumptions of benchmarking need to be transparent and auditable.
- As technology changes, the original benchmark measurements may no longer provide suitable comparisons, and there may be a need to re-establish the baseline measurement or the comparison group against which benchmarking is being conducted.
- It may be necessary to reconsider the terms of an outsourcing contract, as new technologies may render older agreements unfair to either side.
- Due to a lag-time factor, benchmark results can be anywhere from 6 months to a year old.
18. Establish and sustain benchmark commitment
- Accept requirements (the requirements for benchmarking)
- Maintain requirements
- Assign responsibility:
  - Benchmark user
  - Benchmark analyst
  - Benchmark librarian
- Assign resources
- Management commitment
- Communicate commitment
19. Identify information needs
Example questions:
- How do I predict the productivity of my project?
- How do I evaluate the quality of the software product compared to industry norms?
- How do I know the cost effectiveness and efficiency of my supplier compared to industry norms?
- What is the productivity of the unit?
- How does it compare with other organizational units?
Activities:
- Identify benchmark information needs, based on:
  - Benchmark goals
  - Constraints
  - Risks
  - Organizational problems
  - Project problems
- Prioritize information needs
- Select information needs
- Determine questions
Typical uses:
- Set a competitive range for the metrics baseline
- Demonstrate ongoing competitiveness and continuous improvement in pricing and service levels
- Identify process improvement opportunities
- Identify best practices
- Support decision making regarding outsourcing
- Establish market position
[Diagram: information needs such as Quality, Productivity, Time to market, Customer Satisfaction and Cost are decomposed into measures, e.g. Quality into rework, defects, usability and reliability; Cost into the cost to develop a system and the cost to develop and maintain a system.]
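Questions such as "what is the productivity of the unit?" reduce to simple ratios once size and effort are measured. A minimal sketch, assuming function points as the size measure and delivery rate in hours per function point as the productivity indicator (the figures and function names are hypothetical):

```python
def project_delivery_rate(effort_hours: float, size_fp: float) -> float:
    """Delivery rate: effort hours per function point (lower is better)."""
    return effort_hours / size_fp

def speed_of_delivery(size_fp: float, elapsed_months: float) -> float:
    """Function points delivered per elapsed month (higher is better)."""
    return size_fp / elapsed_months

# Hypothetical project: 400 FP delivered in 3,200 hours over 8 months
pdr = project_delivery_rate(3200, 400)  # 8.0 hours per FP
speed = speed_of_delivery(400, 8)       # 50.0 FP per month
```

Comparing these ratios against a benchmark dataset of similar projects is what turns them from internal metrics into answers to the questions above.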
20. Establish benchmark parameters
- Benchmark type
- Benchmark scope
  - Organizational unit:
    - Single project
    - Functional area
    - Whole enterprise
    - Single site
    - Multi-site organization
  - Consists of software projects, supporting processes, or both
- Stakeholders (internal/external):
  - Project managers
  - Information system managers
  - Head of quality management
- Benchmark frequency
21. Plan the benchmark process
- Define procedures (collection, storage, analysis and reporting)
  - Data collection: when a sample set is chosen, check:
    - Size of sample sets
    - Sample selection technique (random/representative)
  - Storage
  - Verification: the profile of the data collection set should adequately match the profile of the benchmark data collection set
  - Configuration management procedures
- Set criteria for evaluating information products
  - Criteria to determine whether the required data have been collected and analyzed with sufficient quality to satisfy the information needs
- Set criteria for evaluating the benchmark process
  - E.g. timeliness, efficiency
- Approve the benchmark process
  - Should include acceptance criteria and a process for dispute resolution
- Approval of planning
  - By management of the organization
- Acquire supporting technologies
- Characterize the organizational unit
  - Organizational processes
  - Interfaces among divisions
  - Organizational structure
- Select and define measures
  - Measures should have a clear link to the information needs
  - Document measures
- Select benchmarking supplier, considering:
  - Sampling techniques
  - Analysis techniques
  - Sample findings, reports and conclusions
  - Logistics
  - Required resource commitments
  - Ability to meet the project timetable
  - Cost
- Select benchmark dataset (from organizations of comparable size)
  - Segmentation by industry sector, application type, or business environment
  - Process maturity levels
  - Project profiles
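The two sample selection techniques named above (random vs. representative) can be sketched as follows. The project records and the `platform` attribute are hypothetical; a representative sample is implemented here as a stratified draw, proportional to each stratum's share of the full set.

```python
import random

# Hypothetical project set: 10 mainframe (MF) and 20 PC projects
projects = [{"id": i, "platform": "MF" if i % 3 == 0 else "PC"}
            for i in range(30)]

def random_sample(items, n, seed=0):
    """Simple random sample of n items."""
    return random.Random(seed).sample(items, n)

def representative_sample(items, n, key, seed=0):
    """Stratified sample: draw from each stratum in proportion to its size."""
    strata = {}
    for item in items:
        strata.setdefault(key(item), []).append(item)
    rng = random.Random(seed)
    picked = []
    for members in strata.values():
        k = round(n * len(members) / len(items))
        picked.extend(rng.sample(members, min(k, len(members))))
    return picked

sample = representative_sample(projects, 9, key=lambda p: p["platform"])
# The sample preserves the 1:2 MF/PC mix of the full set (3 MF, 6 PC)
```

A purely random sample may over- or under-represent a stratum by chance, which is why the planning step asks the sampling technique to be checked against the benchmark's profile.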
22. Perform the benchmark
- Integrate procedures (data collection and storage)
  - Data generation and collection need to be integrated into the relevant processes
  - Integrated data collection procedures need to be communicated to the data providers
  - Data analysis and reporting need to be integrated into the relevant processes
- Collect data (stored data)
  - Collected data need to be verified, e.g. using a checklist
  - The context information necessary to verify, understand or evaluate the data needs to be stored with the data
- Analyze data (data analysis and interpretations)
  - Analysis results are interpreted by benchmark analysts
  - Stakeholders need to review the results
  - Interpretations should take into account the context of the measures
  - Collected data may need to be normalized to ensure comparability
- Communicate information products
  - Information products need to be reviewed with the data providers and benchmark users
  - Information products need to be documented and communicated to data providers and benchmark users
  - Feedback should be provided to stakeholders
  - Feedback should be obtained from stakeholders to evaluate the information products and the benchmark process
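The normalization step mentioned in the analysis activity can be illustrated with a median-relative comparison: a project's delivery rate is expressed relative to the median of a comparable benchmark dataset, so projects of different sizes and contexts become comparable on one scale. The dataset and figures here are hypothetical.

```python
import statistics

# Hypothetical benchmark dataset: delivery rates (hours per function
# point) from comparable projects, e.g. same platform and business area.
benchmark_pdr = [6.5, 7.0, 8.0, 9.5, 12.0, 15.0, 18.0]

def normalized_pdr(project_pdr: float, dataset: list) -> float:
    """Express a project's delivery rate relative to the dataset median.
    1.0 means 'at the median'; below 1.0 means fewer hours per FP."""
    return project_pdr / statistics.median(dataset)

ratio = normalized_pdr(7.6, benchmark_pdr)  # median is 9.5, so ratio 0.8
```

The median is used rather than the mean because software project datasets are typically skewed by a few very large or very slow projects.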
23. Evaluate the benchmark
- Evaluate measures
  - Based on benchmark user feedback, information products, and performance measures
- Evaluate the benchmark process
  - Example criteria:
    - Timeliness
    - Efficiency
    - Defect containment
    - Stakeholder satisfaction
    - Process conformance
- Store the lessons learned from the evaluation in the benchmark experience base
- Identify potential improvements
[Diagram: evaluation results and benchmark users' feedback feed into improvement actions.]
24. References
- ISO/IEC 2382-1:1993, Data Processing - Vocabulary - Part 1: Fundamental Terms.
- ISO/IEC 2382-20:1990, Information Technology - Vocabulary.
- ISO 8402:1994, Quality management and quality assurance - Vocabulary.
- ISO 9001:1994, Quality Systems - Models for quality assurance in design/development, production, installation and servicing.
- ISO/IEC 12207:1995, Information Technology - Software Life Cycle Processes.
- ISO/IEC 9126:1991, Information Technology - Software Product Evaluation - Quality Characteristics and Guidelines for their Use.
- ISO/IEC 14143:1998, Information Technology - Software Measurement - Definition of Functional Size Measurement.
- ISO/IEC 14598-1:1996, Information Technology - Software Product Evaluation - Part 1: General Overview.
- ISO/IEC TR 15504-2:1998, Information Technology - Software Process Assessment - Part 2: A Reference Model for Processes and Process Capability.
- ISO/IEC TR 15504-9:1998, Information Technology - Software Process Assessment - Part 9: Vocabulary.
- ISO, International Vocabulary of Basic and General Terms in Metrology, 1993.
- ISO TR 10017:1999, Guidance on Statistical Techniques for ISO 9001:1994.
- F. Roberts, Measurement Theory with Applications to Decision Making, Utility, and the Social Sciences. Addison-Wesley, 1979.
25. Thank you
gargi_at_ieee.org
26. Your feedback and comments