Title: Practical Software Measurement Challenges and Strategies
1. Practical Software Measurement: Challenges and Strategies
Krishna Arul, GSG Scotland
2. Objectives
- Provide an overview of the challenges faced by software engineering teams in the software measurement arena
- Outline some of the strategies employed in GSG to meet these challenges
3. Overview
- About Motorola
- Organisation
- Global Software Group
- GSG Scotland
- CMM
- Concept of Six Sigma
- Metrics
- Define, Measure
- Data Set
- Analysis
- Tools / Methodology
- Improvements
- Conclusions
- Control
- Future Plans
4. Motorola's Global Software Group
Premier provider of Software Systems Solutions, Software Products, and Software Technologies to Motorola businesses and their customers worldwide
- Idea conceived in 1990 to support rapidly increasing software demand in Motorola products
- First Center established in Bangalore, India
- Currently >20 city locations around the world
- Independent organization partnering with Motorola business units and their customers
- Domain-focused centers of excellence
- Skills orientation
- Process focus
- Process/methodology improvement to control costs and cycle-time
- Current worldwide headcount >5,000
5. GSG Locations
Nanjing
St. Petersburg
Livingston
Chicago
Montreal
Hyderabad
Beijing
Seoul
Krakow
Chengdu
Kuala Lumpur
Perth
Turin
Cordoba
Phoenix
Ft Lauderdale
Bangalore
Singapore
Adelaide
6. GSG Scotland - Background
- Centre started in Dec 2000.
- Why Scotland?
- Large talent pool of qualified Engineers from Scottish Universities
- Synergy with (Motorola) Semiconductors and ISLI (Institute of System Level Integration)
- Proximity to, and ease of doing business with, main European markets
- Scotland is now the lead Automotive centre for GSG
- Focussed on embedded software applications, primarily automotive
- System-on-Chip (SoC) design team focussed on Motorola design needs post-Freescale split
- Achieved CMM L3 certification in May 2004; currently working towards Level 5 later this year
7. Six Sigma
[Chart: normal distribution curve with 3-sigma limits either side of the mean]
8. Software Six Sigma
- An overall business improvement methodology that drives consistently excellent products, services, designs and processes
- Metric: 3.4 DPMO (defects per million opportunities)
- Process Improvement
- Management System: E-Training, E-Processes, E-Tools, E-Tracking, E-Visibility
- Typical software processes operate at between 2.3 and 3.0 Sigma
- The best software processes operate at 4 to 5 Sigma
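The sigma figures on this slide (3.4 DPMO at Six Sigma; 2.3-3.0 sigma for typical software processes) follow from the standard normal tail with the conventional 1.5-sigma long-term shift. A minimal sketch using only the Python standard library:

```python
import math

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities for a given sigma level,
    using the conventional 1.5-sigma long-term shift."""
    z = sigma_level - shift
    # One-sided tail probability of the standard normal beyond z.
    tail = 0.5 * math.erfc(z / math.sqrt(2))
    return tail * 1_000_000

print(round(dpmo(6.0), 1))  # → 3.4 DPMO at Six Sigma
print(round(dpmo(3.0)))     # → 66807 DPMO, a typical software process
```

The 1.5-sigma shift is the usual Six Sigma convention for long-term drift; without it, six sigma would correspond to roughly 0.001 DPMO rather than 3.4.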
9. Strategies: Process Improvement
[Diagram: improvement cycle linking the CMM Model, Metrics, Analysis, and More Metrics]
10. DMAIC - Define
- DEFINE what is important to the organization
- But what is of paramount importance to GSG?
- Parameters chosen for Measurement and Analysis (Scorecard):
- CSS (Customer Satisfaction Survey)
- COQ and COPQ (Cost of Quality and Cost of Poor Quality)
- Productivity
- Estimation Accuracy
- Effort
- Schedule
- Size
11. DMAIC - Measure
- Data Source
- Annual release data from Motorola's Global Software Centres is used for analysis
- Tools
- WISE-TCS (Web-based Total Customer Satisfaction Tool)
- EPMS (Primavera)
- IQMEn / E-IQMEn (Quality Metrics Environment)
- JMP / Minitab for statistical analysis
- JMP - The Statistical Discovery Software
12. Customer Satisfaction Surveys
- Pre- and post-project surveys
- Criteria of satisfaction and importance
- Scorecard goal: 8.86 average, and 75% of projects with all high-importance areas rated at 8 or above
- Measured against a baseline
13. COQ and COPQ
- Cost Of Quality (COQ) is the sum of effort due to appraisal, prevention, internal failure and external failure, expressed as a percentage of total project effort.
- Cost Of Poor Quality (COPQ) is the sum of effort due to internal failure and external failure, expressed as a percentage of total project effort.
- COQ = COPQ + Appraisal Effort + Prevention Effort
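The COQ/COPQ definitions above can be sketched as a short calculation; the effort figures below are hypothetical:

```python
# Hypothetical effort figures for one project (staff-hours).
appraisal = 120         # reviews, inspections, testing
prevention = 80         # training, process definition
internal_failure = 60   # rework on faults found before release
external_failure = 20   # rework on faults found by the customer
total_effort = 2000

# COPQ = (internal + external failure) as a % of total effort.
copq = (internal_failure + external_failure) / total_effort * 100
# COQ = COPQ + appraisal and prevention effort, also as a %.
coq = copq + (appraisal + prevention) / total_effort * 100

print(f"COPQ = {copq:.1f}%")  # baseline goal: < 5%
print(f"COQ  = {coq:.1f}%")   # baseline goal: < 25%
```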
14. COQ and COPQ
- Current baseline based on Scorecard:
- Cost of Quality <25% (±10% for each project)
- Cost of Poor Quality <5%
15. Productivity
- Productivity is the ratio of Delta Code released to the customer (New + Deleted + Modified + Reused + Tool-generated) to the Total Project Effort.
- Productivity increase 10% → 0.62 KAELOC/SM (thousand assembly-equivalent lines of code per staff-month)
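The productivity ratio above can be sketched directly; the line counts here are hypothetical, chosen so the result lands near the slide's 0.62 KAELOC/SM figure:

```python
# Hypothetical delta-code counts (assembly-equivalent LOC) for one release.
new, deleted, modified, reused, tool_generated = 40_000, 5_000, 12_000, 8_000, 10_000
total_effort_staff_months = 121

# Delta code in KAELOC (thousands of assembly-equivalent LOC).
delta_kaeloc = (new + deleted + modified + reused + tool_generated) / 1000
productivity = delta_kaeloc / total_effort_staff_months

print(f"{productivity:.2f} KAELOC/SM")  # → 0.62 KAELOC/SM
```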
16. Estimation Accuracy
- Estimation Accuracy is the ratio of the Actual value to the Estimated value of a parameter.
- Size, Effort and Schedule estimates are not stand-alone metrics, but should be analysed in context with each other.
- Deviations of all three estimations outside the limits indicate that the collective action of the estimation, planning and control processes is not performing well.
17. Estimation Accuracy - Size
- The size estimation accuracy metric (ZEA) provides insight into the project's ability to estimate the size of the project
- ZEA is critical for embedded applications, where size is constrained by the target device
- Size Estimation Accuracy: ZEA = 100% ± 15%
18. Estimation Accuracy - Effort
- The effort estimation accuracy metric (EEA) provides insight into the project's ability to estimate the effort of the project
- EEA is critical for accurate cost estimation
- Effort Estimation Accuracy: EEA = 100% ± 15%
19. Estimation Accuracy - Schedule
- The schedule estimation accuracy metric (SEA) provides insight into the project's ability to estimate the schedule of the project
- SEA is critical for On Time Delivery
- Schedule Estimation Accuracy: SEA = 100% ± 15%
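ZEA, EEA and SEA all share one formula, actual/estimated against a 100% ± 15% band. A minimal sketch with hypothetical actual and estimated values:

```python
def estimation_accuracy(actual: float, estimated: float) -> float:
    """Accuracy as a percentage; 100% means a perfect estimate."""
    return actual / estimated * 100

def within_limits(accuracy: float, target: float = 100.0, tol: float = 15.0) -> bool:
    """True if the accuracy falls inside the 100% +/- 15% band."""
    return abs(accuracy - target) <= tol

# Hypothetical project: size in KAELOC, effort in staff-months, schedule in weeks.
zea = estimation_accuracy(actual=46, estimated=40)    # 115% -> on the boundary
eea = estimation_accuracy(actual=110, estimated=100)  # 110% -> within limits
sea = estimation_accuracy(actual=30, estimated=24)    # 125% -> outside limits

print([within_limits(a) for a in (zea, eea, sea)])  # → [True, True, False]
```

As the slides note, a single metric drifting out of band is less telling than all three drifting together, which points at the estimation and planning process itself.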
20. DMAIC - Analyze
- E-IQMEn, PSD and SMSC
- The Project Summary Database (PSD) is the output from E-IQMEn, and serves as the GSG organizational data repository.
- Software Metrics Summary Charts (SMSC) are integrated with the PSD, with no additional data input required by users.
21. Solution Design
22. DMAIC - Improve: Software Metrics Summary Charts - Organizational Health
23. DMAIC - Improve: Software Metrics Summary Charts - Organizational Health
24. DMAIC - Future Plans (1)
- Concentrate on ensuring that the data in E-IQMEn is:
- Accurate: the data should provide a true reflection of the status of GSG projects, both completed and active.
- Complete: the data should span all major metrics areas (effort, size and faults).
- Up to date: monthly updates to keep data current.
- Inclusive: E-IQMEn should contain as much GSG data as possible, for all types of GSG work.
25. DMAIC - Future Plans (2)
- Continuing work in this area:
- Redefinition of project categories to simplify (and orthogonalise) the schema.
- Building traceability through intermediate tools, if required, to allow for better tracking of GSG effort usage.
- Development of interfaces from E-IQMEn to other tools to facilitate once-only data entry.
26. DMAIC - Conclusions
- The Return On Investment of such analysis is very high to the organization in terms of the effort spent to perform it, i.e. in the selection of domains and types of projects.
- A strong measurement system for software data is established to ascertain whether any correlation exists between the domains, effort, errors and other parameters.
- Any cost avoidance is cost saving.
- Nobody said it was going to be easy.
27. Thank You