Title: Sumant Sahoo
Slide 1: Sumant Sahoo
Presented at the MINITAB User Summit 07
April 23, 2007
Slide 2: Global IT Service Provider
- Founded in 1990. Listed in Mumbai and London
- SEI CMMI Level 5, ISO 9001:2000, ISO 27001
- Resource count 2007: 8,000 employees (P)
- Revenues 2007: US$ 280 million (P)
- 141 clients worldwide, 40 of them Global 500 companies
[World map: headquarters, sales offices, and delivery centers across London, Montreal, Chicago, San Jose, New Jersey, Bad Homburg, Tokyo, Stockholm, Paris, Delhi, Mexico, Singapore, Mumbai, Sydney, Chennai, and Pune]
- Multiple channels deployed for higher reliability in the networks
- Multiple disaster recovery centers to ensure business continuity
- 4 offshore delivery centers
- 3 nearshore delivery centers
- Seamless integration of onshore and offshore delivery centers
Slide 3: Project Charter
Problem Statement
This project started in March 2004 to develop the parcel tracking application. The newly developed application would replace and automate the current semi-manual process. The initial scope was to develop 300 modules within a year, but the scope changed to 500-plus modules and the project is now planned to end in December 2007. From January 2006 to March 2006, the defect density w.r.t. effort for customer defects was 0.3 (Defects = 145, Effort = 483 PDs) and the defect density for internal defects was 0.38 (Defects = 185, Effort = 483 PDs).
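For reference, defect density here is defects per person-day of engineering effort (assuming PDS denotes person-days), so the baseline figures follow directly from the counts above:

DD (customer) = 145 defects / 483 PD ≈ 0.30 defects per PD
DD (internal) = 185 defects / 483 PD ≈ 0.38 defects per PD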
Goal Statement
The objective of this project is to reduce the overall defect density for the project by 50% within the next quarter.
Slide 4: Project Charter
Longitudinal Scope
Start point: Receipt of PDD/PS from onsite. End point: Implementation of the module at onsite.
- Defects identified in the modules developed by the joint team (client and Hexaware) at onsite for this project are in scope.
- Defects identified in PDD/PS review are out of scope.
Lateral Scope
Defects of the project captured in internal reviews, testing, and ITRB, and defects reported by the customer.
Assumptions / Constraints: All defects are logged and tracked to closure.
Slide 5: SIPOC
Suppliers: Onsite joint team (Hexaware and client); project team at offshore
Inputs: Process Definition Document (PDD) and/or Program Specs (PS); change requests; faults/errors reported by client; MMS defect logs; customer test logs
Process: Start -> Program Specs received from onsite -> Review PS and give suggestions -> CUT at offshore -> Peer review and unit testing -> ITRB and delivery to customer -> SIT/UAT at onsite, bugs reported -> Fix bugs and implement in production -> End
Outputs: PDD/PS review comments; source code; working application; fixed bugs; updated change requests; MMS defect dump; customer test logs
Customers: Client and Hexaware senior management; project team at offshore
Slide 6: High-Level Project Milestone
Slide 7: Design of Measurement System
Unit of Measurement
Defect density w.r.t. engineering effort spent on development, changes, and fixes for each module.
Specifications / Defect Definition
All defects identified internally (in code and UTP reviews, and independent ITRB testing) and externally (by the customer in the delivered code) for the initial development, change requests, and failed fixes of each selected module that has gone through the full life cycle.
Opportunities for Error
We will calculate the sigma level by computing process capability.
Slide 8: Data Collection Plan
Defects are collated manually from Excel sheets of defects reported by customers, UTPs, SIT plans, UAT plans, and the MMS defect dump (for internal review, testing, and ITRB defects). Effort data is collected from MMS / PlanArena timesheets and manually segregated into normal, change, and rework effort.
- Padma
- Pratosh
- Vishnu
- Punam
- Chaitanya
Data for 40 modules collected over a three-month period.
Data type: Continuous
- A. By defect:
  - Type (Internal, PSD)
  - Nature (Review, Testing, ITRB)
  - Severity (Fatal, Major, Minor)
  - Category (1 to 13)
- B. By effort breakup (Normal, Change, Rework)
- C. By module ID
Slide 9: Sample Data Sheet
Slide 10: Summary of Defect Density
On average, defect density is 0.48 per PD, but the median defect density is only 0.31. A few modules with very high DD pull the mean to the right, so for data of this kind the median is a better representation than the mean.
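A quick way to see this skew outside Minitab is to compare mean and median directly; a minimal Python sketch with hypothetical per-module defect densities (not the project's actual data):

import numpy as np

# Hypothetical per-module defect densities; a few very high values mimic
# the skew described above
dd = np.array([0.31, 0.48, 0.12, 0.95, 0.27, 0.60, 0.08, 1.40, 0.33, 0.22])

print("Mean:  ", round(float(dd.mean()), 2))        # pulled up by the extreme modules
print("Median:", round(float(np.median(dd)), 2))    # robust to the extremes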
Slide 11: Normality Test
The Anderson-Darling test was used to check the normality of defect density. The analysis shows that the defect density data is non-normal, as the p-value in the probability plot is less than 0.05. To calculate process capability, the data must be transformed to normality; the Box-Cox transformation was used to transform the defect density data to normal, as shown here.
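Outside Minitab, the same normality check and transformation can be sketched with scipy; the module values below are hypothetical placeholders, not the project data:

import numpy as np
from scipy import stats

# Hypothetical per-module defect densities (must be positive for Box-Cox)
dd = np.array([0.31, 0.48, 0.12, 0.95, 0.27, 0.60, 0.08, 1.40, 0.33, 0.22])

# Anderson-Darling normality test: reject normality if the statistic
# exceeds the critical value at the 5% significance level
ad = stats.anderson(dd, dist='norm')
print("A-squared =", round(ad.statistic, 3),
      "5% critical value =", ad.critical_values[2])

# Box-Cox transformation; lmbda is the estimated transformation parameter
transformed, lmbda = stats.boxcox(dd)
print("Box-Cox lambda =", round(lmbda, 3))

# Re-run the normality check on the transformed data
print("A-squared (transformed) =",
      round(stats.anderson(transformed, dist='norm').statistic, 3))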
Slide 12: Process Capability
Process capability analysis was carried out on the transformed data. The analysis found a long-term sigma level of 0.07; the short-term sigma level of the process is 1.57 after adding 1.5 to the long-term value.
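The relationship between the reported long-term and short-term values is the conventional 1.5-sigma shift. A rough Z.Bench-style sketch in Python, assuming an illustrative upper specification limit on defect density (the USL value below is not taken from the slide):

import numpy as np
from scipy import stats

# Hypothetical (transformed) defect-density sample and an illustrative USL
data = np.array([0.31, 0.48, 0.12, 0.95, 0.27, 0.60, 0.08, 1.40, 0.33, 0.22])
usl = 0.50

# Probability of exceeding the spec limit under a fitted normal model,
# converted back to a Z value (one-sided spec, Z.Bench style)
p_defect = 1 - stats.norm.cdf(usl, loc=data.mean(), scale=data.std(ddof=1))
z_long_term = stats.norm.ppf(1 - p_defect)

# Short-term sigma level adds the conventional 1.5-sigma shift
z_short_term = z_long_term + 1.5
print("Z (long term) =", round(z_long_term, 2),
      "Z (short term) =", round(z_short_term, 2))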
Slide 13: Control Chart - Defect Density
An I-MR control chart was prepared for defect density. The DD data for two modules falls above the UCL. Analysis shows that for those two modules no rework effort was recorded even though defects were entered.
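The I-MR limits themselves are simple to compute; a minimal sketch with hypothetical module-order data:

import numpy as np

# Hypothetical per-module defect densities in module order
dd = np.array([0.31, 0.48, 0.12, 0.95, 0.27, 0.60, 0.08, 1.40, 0.33, 0.22])

# Average moving range between consecutive modules
mr_bar = np.abs(np.diff(dd)).mean()

# Individuals-chart limits: centre +/- 2.66 * average moving range (3 / d2, d2 = 1.128)
centre = dd.mean()
ucl = centre + 2.66 * mr_bar
lcl = max(centre - 2.66 * mr_bar, 0.0)   # defect density cannot be negative

print("UCL =", round(ucl, 3), "LCL =", round(lcl, 3))
print("Modules beyond UCL:", np.where(dd > ucl)[0] + 1)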
Slide 14: Goal Setting t-test
As the p-value is less than 0.05, the target defect density of 0.24 is significantly better than the current process performance.
One-Sample T: Total Defect Density
Test of mu = 0.24 vs not = 0.24
Variable          N   Mean      StDev     SE Mean   95% CI                 T     P
Total Defect Den  56  0.376446  0.367524  0.049112  (0.278023, 0.474870)   2.78  0.007
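The same goal-setting test can be reproduced with a one-sample t-test in Python; the data below is a hypothetical stand-in for the 56 module values summarised in the output above:

from scipy import stats

# Hypothetical per-module defect densities (the slide's sample has n = 56,
# mean 0.376, standard deviation 0.368)
dd = [0.31, 0.48, 0.12, 0.95, 0.27, 0.60, 0.08, 1.40, 0.33, 0.22]

# Two-sided test of the mean against the 0.24 target
t_stat, p_value = stats.ttest_1samp(dd, popmean=0.24)
print("t =", round(t_stat, 2), "p =", round(p_value, 3))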
Slide 15: Pareto Chart - Defect Type and Severity
Slide 16: Pareto Chart - Defect Category
The cause-wise Pareto chart shows that more than 82% of defects belong to four types of causes: Error in Coding, Incomplete or erroneous Specification, Incomplete or erroneous testing, and Suggestions/Observations.
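A Pareto ranking like this is simply a sorted tally with cumulative percentages; a small sketch with hypothetical counts for the leading cause categories:

from collections import Counter

# Hypothetical defect counts per cause category
causes = Counter({
    "Error in Coding": 120,
    "Incomplete or erroneous Specification": 70,
    "Incomplete or erroneous testing": 50,
    "Suggestions/Observations": 35,
    "All other categories": 55,
})

total = sum(causes.values())
cumulative = 0
for cause, count in causes.most_common():
    cumulative += count
    print(f"{cause:40s} {count:4d}  {100 * cumulative / total:5.1f}%")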
Slide 17: Correlation Analysis - Total Defects vs. Total Effort and Rework Effort
Correlations: Rework Effort, Total Defects
Pearson correlation of Rework Effort and Total Defects = 0.499, P-Value = 0.000
Correlations: Total Effort in hrs, Total Defects
Pearson correlation of Total Effort in hrs and Total Defects = 0.576
Though there is a positive correlation between total defects and total effort, the correlation is not strong. Similarly, the correlation between total defects and rework effort is not strong: rework effort is low for some modules with many defects, and vice versa.
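The Pearson coefficients quoted above can be verified with scipy; the paired values below are hypothetical, not the project's module data:

from scipy import stats

# Hypothetical per-module rework effort (hours) and total defects
rework_effort = [12, 5, 20, 8, 30, 2, 15, 25, 10, 6]
total_defects = [4, 2, 9, 3, 12, 1, 5, 11, 6, 2]

r, p = stats.pearsonr(rework_effort, total_defects)
print("Pearson r =", round(r, 3), "p-value =", round(p, 3))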
Slide 18: Correlation - Normal Effort vs. Rework Effort
Correlations: Normal effort, Rework Effort
Pearson correlation of Normal effort and Rework Effort = 0.330, P-Value = 0.012
Again, there is a positive correlation between normal effort and rework effort, but the correlation is weak.
Slide 19: Box Plot - Defect Density
Many outliers are observed in the PSD defect density compared with the internal DD and the total defect density.
Slide 20: Matrix Plot - Internal Defects vs. PSD
The analysis shows that a significant correlation exists between PSD defects and defect density, and rework effort is higher with PSD defects. In the case of internal defects, however, the correlation, though positive, is not significant.
Slide 21: Cause-and-Effect Diagram
Effect: Large number of post-shipment defects
Cause categories (fishbone branches): Unstable Design, Design Team, Development Team, Development, Dev/Test Environment, Review/Testing Method, General
Causes identified:
- Clarity not available with design team at onsite
- Training for the team
- Bad/junk data
- Data difference in different regions
- Change in requirement by users
- Resource attrition
- Technical details
- Poor functional/technical knowledge
- Constraint of delivery dates
- Learning curve
- Error in documenting the design
- Inexperienced resources
- Database constraints not proper between regions
- Lack of awareness of design team
- Addition of new requirements
- Access to regions not available to all
- Continuously long work hours
- Difference between initial and actual estimation as per the estimation grid
- Assumptions not taken into account by tester/reviewer
- Manual testing is done
- Communication gaps between client and team
- All internal testing/review defects not logged
- Adequate time for ITRB not available
- Reusable components availability
- Implementation gaps in change management process
- Code performance
- Testing/review not done completely
- Effort spent on development and testing
- Module complexity
- Available time with senior members for reviews
- Review by seniors not done stringently
- Last-minute DB code changes
- Adherence to coding standards
Slide 22: Prioritization of Xs - Control/Impact Matrix
Axes: Control (in our control / out of our control) vs. Impact (high / medium / low)

In our control:
- Training for the team
- All internal testing / review defects not logged
- Adequate time for ITRB not available
- Testing / review not done completely
- Review by seniors not done stringently
- Available time with senior members for review
- Reusable components availability
- Last-minute code changes
- Adherence to coding standards
- Resource attrition
- Poor functional knowledge
- Continuously long work hours
- Assumptions document not taken into account by reviewer / tester
- Code performance issues
- Effort spent on development and testing
- Implementation gap in change management process
- Communication gap between client and team
- Module complexity

Out of our control:
- Learning curve
- Inexperienced resources
- Data differences in different regions
- Bad/junk data
- Database constraints not proper between regions
- Manual testing
- Difference between initial and actual estimation as per the estimation grid
- Change in requirements by users
- Constraints of delivery dates
- Addition of new requirements
- Error in documenting the design
- Lack of awareness of design team
- Last-minute database changes
- Clarity not available with design team at onsite
- Access to regions not available to all
Slide 23: Vital Few Causes
- Internal testing / review not done completely because of non-availability of senior members or time constraints
- Non-adherence to coding standards
- Non-availability of reusable components
- Poor functional knowledge
- Assumptions document not taken into account by reviewer / tester
- Communication gap between client and team
- Data issues
Slide 24: Improvement Context Analysis
Solutions for each root cause:
- Have a dedicated review / testing team
- Sample programs for different kinds of work
- Have an induction manual / process in place for new team members
- Ask each reviewer to state all assumptions clearly and get them validated by a senior resource before review and testing
- Get all communications validated by the client via mails / MOMs
- Test bed and junk data issues to be resolved with the help of the customer
Slide 25: Solution Implementation Plan
Slide 26: Sample Programs for Different Kinds of Work
Slide 27: Improvement Solutions
- Induction manual created for freshers and new joiners in the project
- Assumption document sent to the client with the deliveries after review/testing by a senior resource
- Detailed communication by mail rather than conveying errors verbally over the phone
- Test bed solution given to the client
Slide 28: Process Capability - Post Improvement
Process capability analysis was again carried out on the transformed data. The long-term sigma level is now 0.30; the short-term sigma level of the process is 1.80 after adding 1.5 to the long-term value.
Slide 29: Histogram - Before and After Improvement
Histograms were prepared for both pre- and post-improvement data. Both the mean and the standard deviation look better (reduced in value) post-improvement.
Slide 30: t-test - Pre vs. Post Improvement
Though both the average defect density and its standard deviation are reduced post-improvement, the improvement is not statistically significant (the p-value is greater than 0.05).
Two-Sample T-Test and CI: Before_total, After_total
               N   Mean   StDev  SE Mean
Before_total   58  0.479  0.658  0.086
After_total    33  0.347  0.463  0.081
Difference = mu (Before_total) - mu (After_total)
Estimate for difference: 0.132390
95% lower bound for difference: -0.064079
T-Test of difference = 0 (vs >): T-Value = 1.12  P-Value = 0.133  DF = 84
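The comparison above is a one-sided two-sample t-test without pooling variances; a sketch of the equivalent call in scipy, using hypothetical before/after samples rather than the slide's data:

from scipy import stats

# Hypothetical pre- and post-improvement defect densities (the slide's
# samples have n = 58 and n = 33)
before = [0.31, 0.48, 0.95, 0.60, 1.40, 0.33, 0.22, 0.27]
after = [0.19, 0.25, 0.40, 0.10, 0.31, 0.22]

# Welch's t-test, one-sided: is the "before" mean greater than the "after" mean?
# (the alternative= keyword requires SciPy 1.6 or newer)
t_stat, p_value = stats.ttest_ind(before, after, equal_var=False,
                                  alternative='greater')
print("t =", round(t_stat, 2), "p =", round(p_value, 3))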
Slide 31: Mann-Whitney Test - Pre vs. Post Improvement
Mann-Whitney Test and CI: Before_total, After_total
               N   Median
Before_total   58  0.3100
After_total    34  0.1920
Point estimate for ETA1-ETA2 is 0.0460
95.0 Percent CI for ETA1-ETA2 is (-0.0511, 0.1869)
W = 2821.0
Test of ETA1 = ETA2 vs ETA1 > ETA2 is significant at 0.1589
The test is significant at 0.1574 (adjusted for ties)
As the data is non-normal, the median is the better measure of central tendency, so the Mann-Whitney test was used to check the significance of the improvement. The median defect density is 0.19 post-improvement compared with 0.31 pre-improvement, a substantial reduction, although the test output above is significant only at the 0.16 level rather than at 0.05.
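The equivalent nonparametric comparison in Python; again the samples are hypothetical placeholders, not the project data:

from scipy import stats

# Hypothetical pre- and post-improvement defect densities
before = [0.31, 0.48, 0.95, 0.60, 1.40, 0.33, 0.22, 0.27]
after = [0.19, 0.25, 0.40, 0.10, 0.31, 0.22]

# One-sided Mann-Whitney U test: are the "before" values stochastically larger?
u_stat, p_value = stats.mannwhitneyu(before, after, alternative='greater')
print("U =", u_stat, "p =", round(p_value, 3))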
Slide 32: Control Charts
Control charts were prepared for both pre- and post-improvement data after removing all outliers. As the post-improvement defect density variation is reduced, the control limits have become narrower.
Slide 33: Trend Chart - Rework Effort
As a result of the reduction in defect density, rework effort has reduced drastically for all the modules, as the trend chart shows.
Slide 34: Monitoring Plan
- We will continue monitoring and tracking defect density for all the modules of the project
- After collection of post-shipment defects, we will again check for the improvement
- Defect density for all future modules will be computed and the data will be plotted on a control chart. If any module's defect density goes beyond the control limits, special causes will be identified and corrective/preventive action will be implemented
Slide 35: Project Summary
Define/Measure Phase: The project started in March 2004 to develop the parcel tracking application. The newly developed application would replace and automate the current semi-manual process. From January 2006 to March 2006, the defect density w.r.t. effort for customer defects was 0.48 and the defect density for internal defects was 0.38 (Defects = 185, Effort = 483 PDs). Goal: Defect density for defects reported by the customer is to be reduced by 50% in the next quarter. CTQ Measure: Defect density w.r.t. effort. Current DD = 0.48; current Z value = 1.57; target DD = 0.24.
Analyse Phase (C&E Diagram):
- Internal testing / review not done completely because of non-availability of senior members or time constraints
- Assumptions document not taken into account by reviewer / tester
- Communication gap between client and team
- Non-adherence to coding standards
- Non-availability of reusable components
- Poor functional knowledge
- Data issues
Improve Phase:
- Have a dedicated review / testing team
- Sample programs for different kinds of work
- Have an induction manual / process in place for new team members
- Ask each reviewer to state all assumptions clearly and get them validated by a senior resource before review and testing
- Get all communications validated by the client via mails / MOMs
- Test bed and junk data issues to be resolved with the help of the customer
Control Phase: Defect density for all future modules will be computed and the data will be plotted on a control chart. If any module's defect density goes beyond the control limits, special causes will be identified and corrective/preventive action will be implemented.
Implementation Plan
Process Capability
Slide 36: Process Improvement Results
Slide 37: Statistical Tools Used in MINITAB
- Box Plot: Helped to study the variation across different segmentations
- Process Capability Analysis: Helped to baseline the performance at various stages (in terms of Z-value or Cp/Cpk)
- Normality Test: Anderson-Darling test used to check data normality
- Box-Cox Transformation: Used to transform non-normal data to normal
- Pareto Chart: To identify the vital few factors from the trivial many
- Histogram: To see the central tendency and variation in the process
- Test of Hypothesis:
  - One-sample t-test: Used for goal setting
  - Two-sample t-test: Used as the significance test for improvement
- Correlation Analysis: To study the relation between two variables
- Mann-Whitney Test: Used as a median test for two samples
- Control Chart: To check for variation due to random causes or special causes
Slide 38: Thank You All!
Sumant Sahoo, Hexaware Technologies
sumants_at_hexaware.com
www.hexaware.com