Transcript and Presenter's Notes

Title: Synergy


1
Synergy
  • Group Members
  • Carol Cheng
  • Carol Yiu
  • Gary Ho
  • Bryant Young

2
Agenda
  • Define
  • Measure
  • Analysis
  • Improve
  • Control

3
SIPOC
Overview of process
4
Define
  • Determine the project's purpose and scope.
  • Obtain a good understanding of the current
    process and the customer's needs.
  • Recruit team members.

5
Company Charter
  • Purpose
  • Poor cross-communication between marketing and
    manufacturing can lead to product delays and
    excessive financial losses. Our purpose is to
    reduce document review time and improve the
    accuracy of computer configurations.
  • Importance
  • Avoid customer dissatisfaction, which can lead to
    long-term loss of business. Too many recalls of
    Synergy computers due to defective or unreliable
    setup configurations would damage the company's
    reputation and could cost it vendor partnerships.
  • Reduce financial losses and schedule delays
    attributed to configuration failures by doing
    extensive component and system checking.

6
Company Charter
  • Scope
  • Determine a solution to improve the accuracy of
    documents shared among the R&D, marketing and
    manufacturing departments.
  • System Management Department has final veto power
    for system configuration. By giving authority to
    SMD, disputes between marketing and manufacturing
    will be reduced and there will be fewer schedule
    slips from changing configurations back and
    forth.
  • Find a solution with existing resources. No
    additional employees or budget increases are
    allowed due to overhead constraints.
  • Measures
  • Record SMD configuration review time, the number
    of exchanges among departments, the number of
    changes proposed versus implemented, and monetary
    losses from configuration errors. Use statistical
    methods to compare weekly data against past data
    (a sketch of one such comparison follows).
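
A minimal sketch of one such weekly comparison, assuming hypothetical SMD
review times and the scipy library; it is an illustration, not part of the
original project:

```python
# Minimal sketch of the weekly comparison described above.
# The review-time samples are hypothetical placeholders.
from scipy import stats

# SMD review time per configuration document, in hours
past_weeks    = [6.5, 7.2, 5.9, 6.8, 7.0, 6.4]   # historical weekly averages
current_weeks = [5.1, 4.8, 5.5, 4.9, 5.3, 5.0]   # weekly averages after the change

# Welch's two-sample t-test: has the mean review time shifted?
t_stat, p_value = stats.ttest_ind(current_weeks, past_weeks, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Weekly review time differs significantly from past data.")
else:
    print("No significant change detected yet.")
```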

7
Company Charter
  • Deliverables
  • Come up with a solution for improving the
    accuracy of configuration document reviews. This
    solution may include a process change and/or
    software tools to automate the process.
  • A final presentation and report to upper
    management detailing the results of the solution
    by using supporting statistical data and random
    surveys from the departments involved.
  • Resources
  • Team leader: Gary Ho
  • Team members: Carol Cheng, Carol Yiu and Bryant
    Young
  • Team sponsor: Synergy
  • Coach: Professor Saeed
  • Full access to the financial, consumer and
    inventory databases to assist in the development
    of a better solution.
  • All employees within the departments involved.

8
Gantt Chart
9
CTQ Tree
10
Flow Chart
11
Process Flow Diagram
12
Measure
  • Collect data
  • Highlight contributing factors to the problem

13
Balanced Scorecard
14
Balanced Scorecard
15
Pareto Chart - Types of Errors
16
Pareto Chart Compatibility Issues
17
Internal Benchmarking Region Comparison
SM is currently responsible for the Europe region.
Below is a regional comparison of the types of
errors per SKU.
# of Errors per SKU     Latin America   North America   Asia Pacific   Europe
# of SKUs               12              80              150            300
Compatibility Issues    3               4               3.28           2.35
Input Issues            1.33            1               1.2            0.46
Availability Issues     2.58            0.65            0.4            0.24
Total # of Issues       6.91            5.65            4.88           3.05
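
A small sketch of how the comparison above can be tabulated and ranked; the
figures are copied from the table, and treating them as error rates per SKU is
an assumption:

```python
# Benchmarking figures copied from the table above (errors per SKU by region).
issues_per_sku = {
    "Latin America": {"Compatibility": 3.00, "Input": 1.33, "Availability": 2.58},
    "North America": {"Compatibility": 4.00, "Input": 1.00, "Availability": 0.65},
    "Asia Pacific":  {"Compatibility": 3.28, "Input": 1.20, "Availability": 0.40},
    "Europe":        {"Compatibility": 2.35, "Input": 0.46, "Availability": 0.24},
}

# Total issues per SKU for each region, best (lowest) first.
totals = {region: sum(kinds.values()) for region, kinds in issues_per_sku.items()}
for region, total in sorted(totals.items(), key=lambda item: item[1]):
    print(f"{region:15s} {total:4.2f}")
# Europe, SM's current region, comes out lowest (3.05), matching the table.
```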
18
Internal Benchmarking Region Comparison
19
Internal Benchmarking Comparing with Perfection
For more internal benchmarking, please refer to
Appendix.
20
Internal Benchmarking Comparing with Perfection
21
Analysis
  • Find the root causes of problems

22
Cause Effect Diagram
For affinity diagram of root causes, please
refer to Appendix.
23
Improve
  • Generate solutions to address root causes
  • Implement recommended solutions

24
Affinity Diagram
25
Project Recommendations
  • Project 1: SW Automation
  • Verification software for marketing
  • Verification software for SM
  • Project 2: Procedure
  • Increase frequency of marketing updates to
    bi-weekly
  • IT to provide updates regularly
  • IT to provide notification when a component name
    changes
  • Project 3: Training for marketing staff

26
Prioritization Matrices
Options vs. All Criteria
                   NPV    Duration   Change required   Complexity   Labor   Row Total   Relative Decimal Value
SW Automation      0.42   0.00       0.00              0.00         0.04    0.47        0.47
Procedure          0.01   0.02       0.07              0.02         0.01    0.13        0.13
Training for MKT   0.23   0.03       0.14              0.00         0.00    0.40        0.40
Grand Total                                                                 1.00
For details, please refer to Appendix.
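
The figures above are consistent with weighting each option's per-criterion
score (appendix matrices) by the criterion weights (also in the appendix); a
minimal sketch of that roll-up, under that assumption:

```python
# Criterion weights and per-criterion option scores, taken from the appendix slides.
criterion_weights = {"NPV": 0.67, "Duration": 0.05, "Change required": 0.21,
                     "Complexity": 0.02, "Labor": 0.05}

option_scores = {
    "SW Automation":    {"NPV": 0.64, "Duration": 0.01, "Change required": 0.02,
                         "Complexity": 0.01, "Labor": 0.79},
    "Procedure":        {"NPV": 0.02, "Duration": 0.40, "Change required": 0.35,
                         "Complexity": 0.79, "Labor": 0.20},
    "Training for MKT": {"NPV": 0.35, "Duration": 0.59, "Change required": 0.64,
                         "Complexity": 0.20, "Labor": 0.01},
}

# Weighted row total per option = sum over criteria of (criterion weight x option score).
for option, scores in option_scores.items():
    total = sum(criterion_weights[c] * scores[c] for c in criterion_weights)
    print(f"{option:17s} {total:.2f}")   # approx. 0.47, 0.13, 0.40 as in the matrix above
```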
27
Improved Flow Chart
28
Failure Mode and Effects Analysis (FMEA)
  • To anticipate potential failures of the new
    process flow

Process Step: Marketing Generates Report
  Potential Failure Mode: 1. Component incompatible  2. Components not ready
  Potential Effects of Failure: 1. Wrong assembly  2. Unable to build computers on time
  Severity: 8   Potential Causes: 1. Ignores constraints  2. Marketing has inadequate information
  Occurrence: 8   Current Controls: None   Detection: 2   RPN: 128
  Recommended Actions: 1. Training for marketing  2. Verification SW and IT updates
  Responsibility: IT, SM

Process Step: IT Updates Tool
  Potential Failure Mode: Inaccurate configurations because the tool is not updated
  Potential Effects of Failure: Incompatible configurations not checked
  Severity: 8   Potential Causes: 1. IT has no procedure in place  2. No time for IT to update
  Occurrence: 8   Current Controls: None   Detection: 7   RPN: 448
  Recommended Actions: Establish a procedure for IT to update the SW tool
  Responsibility: SM

Process Step: Marketing Runs Tool
  Potential Failure Mode: 1. Missing configuration constraint  2. False errors
  Potential Effects of Failure: 1. Incompatible configurations  2. Time wasted
  Severity: 8   Potential Causes: 1. Inaccurate constraints  2. Not enough time to update
  Occurrence: 6   Current Controls: None   Detection: 7   RPN: 336
  Recommended Actions: 1. Establish update procedure  2. Marketing tells IT of errors
  Responsibility: Marketing, SM, IT

Process Step: SM Generates Document and Checking
  Potential Failure Mode: Human error entering data
  Potential Effects of Failure: 1. Wrong configuration  2. Schedule delay
  Severity: 8   Potential Causes: Lack of time
  Occurrence: 6   Current Controls: None   Detection: 2   RPN: 96
  Recommended Actions: More time for verification and input
  Responsibility: SM

Process Step: SM Runs Tool
  Potential Failure Mode: 1. Names not matching  2. Components missing
  Potential Effects of Failure: Time wasted
  Severity: 6   Potential Causes: No procedure to update the tool
  Occurrence: 7   Current Controls: None   Detection: 7   RPN: 294
  Recommended Actions: Establish procedure
  Responsibility: SM, IT
For details, please refer to Appendix.
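
The RPN column follows the standard FMEA calculation, Severity x Occurrence x
Detection, which reproduces the values above; a minimal sketch:

```python
# FMEA risk priority number: RPN = Severity x Occurrence x Detection.
# Ratings are copied from the table above.
process_steps = [
    ("Marketing Generates Report",          8, 8, 2),
    ("IT Updates Tool",                     8, 8, 7),
    ("Marketing Runs Tool",                 8, 6, 7),
    ("SM Generates Document and Checking",  8, 6, 2),
    ("SM Runs Tool",                        6, 7, 7),
]

# Rank failure modes so the riskiest step is addressed first.
ranked = sorted(((sev * occ * det, step) for step, sev, occ, det in process_steps), reverse=True)
for rpn, step in ranked:
    print(f"RPN {rpn:3d}  {step}")
# "IT Updates Tool" tops the list at 448, matching the table.
```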
29
Improvement Plan Gantt chart
30
For details, please refer to Appendix.
31
PDCA
  • Plan
  1. Automated software
  2. Biweekly update
  3. Marketing training
  • Do
  1. Run a pilot of the automated tool
  2. Try out the biweekly update
  3. Train marketing
  • Check
  1. Check tools
  2. Compare # of human errors
  3. Compare # of compatibility errors
  • Act
  1. If actual meets expected, implement as standard procedure
  2. Else, determine the root cause and take corrective action

For details, please refer to Appendix.
32
Control
  • Establish tools to monitor the new process and
    ensure it is being followed (a monitoring sketch
    follows).
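
A minimal sketch of one such monitoring tool: a p-chart on the weekly
proportion of configuration documents with errors. The weekly counts are
hypothetical, and a p-chart is only one possible choice:

```python
import math

# Hypothetical weekly data: (documents reviewed, documents with configuration errors).
weeks = [(40, 3), (38, 2), (42, 4), (41, 2), (39, 12), (40, 3)]

# Centre line: overall error proportion across all weeks.
p_bar = sum(errors for _, errors in weeks) / sum(docs for docs, _ in weeks)

# Flag any week that falls outside the 3-sigma p-chart limits.
for week, (docs, errors) in enumerate(weeks, start=1):
    p = errors / docs
    sigma = math.sqrt(p_bar * (1 - p_bar) / docs)
    ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
    status = "investigate" if not (lcl <= p <= ucl) else "ok"
    print(f"Week {week}: p = {p:.3f}, limits = ({lcl:.3f}, {ucl:.3f}) -> {status}")
```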

33
Types of Control
  • Performance Monitoring
  • Training (Monthly)
  • Performance Review
  • Management
  • SOP (Standard Operating Procedures)

34
SOP
  • Marketing training
  • IT updates
  • Use of verification software by marketing
  • Use of verification software by SM

35
Questions
  • ???

36
Appendix
  • Internal Benchmarking
  • Affinity Diagram of Root Causes
  • Commitment Scale
  • Communication Plan
  • FMEA Details
  • Prioritization Details
  • PDCA Details
  • House of Quality

37
Internal Benchmarking Comparing with Perfection
38
Internal Benchmarking 2004 versus 2005
39
Internal Benchmarking 2004 versus 2005
40
Internal Benchmarking 2004 versus 2005
41
Affinity Diagram
42
Commitment Scale
43
Communication Plan
44
FMEA - Severity Scale
Rating   Criteria: a failure could ...
10       Injure a customer or employee
9        Be illegal
8        Render the product unfit for use
7        Cause extreme customer dissatisfaction
6        Result in partial malfunction
5        Cause a loss of performance likely to result in a complaint
4        Cause minor performance loss
3        Cause a minor nuisance that can be overcome with no loss
2        Be unnoticed; minor effect on performance
1        Be unnoticed and not affect the performance
Note: lower numbers are better.
45
FMEA - Occurrence Scale
Rating   Time Period               Probability
10       More than once per day    > 30%
9        Once every 3-4 days       < 30%
8        Once per week             < 5%
7        Once per month            < 1%
6        Once every 3 months       < 0.03%
5        Once every 6 months       1 per 10,000
4        Once per year             6 per 100,000
3        Once every 1-3 years      6 per 1 million
2        Once every 3-6 years      3 per 10 million
1        Once every 6-100 years    2 per billion
Note: lower numbers are better.
46
FMEA - Detection Scale
Rating   Definition
10       Defect caused by failure is not detectable
9        Occasional units are checked for defects
8        Units are systematically sampled and inspected
7        All units are manually inspected
6        Manual inspection with mistake-proofing modifications
5        Process is monitored via statistical process control (SPC) and manually inspected
4        SPC used, with an immediate reaction to out-of-control conditions
3        SPC as above, with 100% inspection surrounding out-of-control conditions
2        All units automatically inspected
1        Defect is obvious and can be kept from affecting the customer
Note: lower numbers are better.
47
Prioritization Matrices
Criteria           Relative Decimal Value
NPV                0.67
Duration           0.05
Change required    0.21
Complexity         0.02
Labor              0.05

48
Prioritization Matrices
NPV
NPV                SW Automation   Procedure   Training for MKT   Row Total   Relative Decimal Value
SW Automation      -               10          1                  11          0.64
Procedure          0.1             -           0.2                0.3         0.02
Training for MKT   1               5           -                  6           0.35
Grand Total                                                       17.3
49
Prioritization Matrices
Duration
Duration           SW Automation   Procedure   Training for MKT   Row Total   Relative Decimal Value
SW Automation      -               0.1         0.2                0.3         0.01
Procedure          10              -           0.1                10.1        0.40
Training for MKT   5               10          -                  15          0.59
Grand Total                                                       25.4
50
Prioritization Matrices
Change Required
Change Required    SW Automation   Procedure   Training for MKT   Row Total   Relative Decimal Value
SW Automation      -               0.2         0.1                0.3         0.02
Procedure          5               -           1                  6           0.35
Training for MKT   10              1           -                  11          0.64
Grand Total                                                       17.3
51
Prioritization Matrices
Complexity
Complexity         SW Automation   Procedure   Training for MKT   Row Total   Relative Decimal Value
SW Automation      -               0.1         0.2                0.3         0.01
Procedure          10              -           10                 20          0.79
Training for MKT   5               0.1         -                  5.1         0.20
Grand Total                                                       25.4
52
Prioritization Matrices
Labor
Labor              SW Automation   Procedure   Training for MKT   Row Total   Relative Decimal Value
SW Automation      -               10          10                 20          0.79
Procedure          0.1             -           5                  5.1         0.20
Training for MKT   0.1             0.2         -                  0.3         0.01
Grand Total                                                       25.4
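
In each matrix above, the relative decimal value equals the option's row total
divided by the grand total; a minimal sketch of that normalization, using the
NPV matrix as input:

```python
# Row totals from the NPV pairwise-comparison matrix above.
npv_row_totals = {"SW Automation": 11.0, "Procedure": 0.3, "Training for MKT": 6.0}

grand_total = sum(npv_row_totals.values())                 # 17.3
for option, row_total in npv_row_totals.items():
    print(f"{option:17s} {row_total / grand_total:.2f}")   # 0.64, 0.02, 0.35
```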
53
PDCA - Plan
Solution: Automated Software
  Metrics: # of errors caught; errors generated by the tool (tool base); # of errors not caught
  Results: Reduction in errors (accurate report); target is 0 errors
  Steps: Design (needs resources from IT); test the software; ready for use

Solution: Biweekly Update from Marketing
  Metrics: # of errors generated by marketing; # of human errors (inputting errors)
  Results: Reduction in errors (accurate report); target is to reduce errors by 50%
  Steps: Send report from marketing to SM on Mondays and Wednesdays; marketing should use the tool to check before sending to SM

Solution: Training for Marketing
  Metrics: # of errors generated by marketing; # of human errors (inputting errors)
  Results: Reduction in errors (accurate report); target is 0 errors
  Steps: Monthly training of marketing staff to understand the constraints

Note: SM is responsible for overseeing this process.
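
A small sketch of how the targets above might be evaluated during the Check
step, assuming hypothetical before/after error counts:

```python
# Hypothetical before/after error counts for each piloted solution.
pilots = {
    "Automated Software":             {"before": 20, "after": 0, "target": 1.00},
    "Biweekly Update from Marketing": {"before": 20, "after": 9, "target": 0.50},
    "Training for Marketing":         {"before": 20, "after": 2, "target": 1.00},
}

# A solution meets its target when the observed reduction reaches the target reduction.
for solution, d in pilots.items():
    reduction = (d["before"] - d["after"]) / d["before"]
    print(f"{solution:30s} reduction = {reduction:.0%}, target met: {reduction >= d['target']}")
```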
54
PDCA - Do
  • Run a pilot of the software tools for both
    marketing and SM
  • Try out the bi-weekly update to see if there are
    improvements
  • Test each component individually to check
    results accurately
  • Conduct monthly training of marketing staff

55
PDCA - Check
  • Check tools
  • Compare # of errors generated before and after
    using the tool
  • Evaluate # of errors incorrectly reported
  • Compare # of errors to expected (0 errors)
  • Compare # of human errors generated by SM to
    expected (50% reduction)
  • Number of configuration errors

56
PDCA - Act
  • If actual meets expected
  • Implement as standard procedure
  • Else
  • Determine the root cause and take corrective
    actions.

57
House of Quality
Customer requirements (rows) vs. technical requirements (columns):

Error-Free Document from Marketing:
  (M1) Knowledgeable Staff, (M2) Clear Procedure for Documentation Creation,
  (M3) Adequate Time, (M4) Consistency of Component Naming, (M5) Simple Format / Easy to Use
Error-Free Document from System Management:
  (S1) Knowledgeable Staff, (S2) Clear Procedure for Documentation Creation,
  (S3) Adequate Time, (S4) Consistency of Component Naming, (S5) Simple Format / Easy to Use
Communication:
  (C1) The Direct Point of Contact, (C2) Team Dynamic, (C3) Number of People Involved

Customer Requirement      Relative Importance   M1 M2 M3 M4 M5   S1 S2 S3 S4 S5   C1 C2 C3
Correct Configuration     5                      2  1  1  2  1    2  1  1  2  1    0  1 -1
Document on Time          4                      1  1  1  2  1    2  1  2  2  1    0  1 -1
Ease of Communication     1                      0  0  0  0  0    0  0  0  0  0    2  1 -1
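
The slides do not show a roll-up of this matrix, but a common House of Quality
step is to weight each relationship value by the row's relative importance and
sum down each column; a minimal sketch under that assumption:

```python
# Relationship matrix from the House of Quality above (column codes M1-C3 as labelled).
columns = ["M1", "M2", "M3", "M4", "M5", "S1", "S2", "S3", "S4", "S5", "C1", "C2", "C3"]
rows = {
    "Correct Configuration": (5, [2, 1, 1, 2, 1, 2, 1, 1, 2, 1, 0, 1, -1]),
    "Document on Time":      (4, [1, 1, 1, 2, 1, 2, 1, 2, 2, 1, 0, 1, -1]),
    "Ease of Communication": (1, [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 1, -1]),
}

# Weighted importance of each technical requirement (column):
# sum of (row's relative importance x relationship value).
scores = [sum(weight * values[i] for weight, values in rows.values())
          for i in range(len(columns))]
for name, score in sorted(zip(columns, scores), key=lambda pair: -pair[1]):
    print(f"{name}: {score}")
```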