Transcript and Presenter's Notes

Title: COSYSMO-IP


1
COSYSMO-IP: COnstructive SYStems Engineering Cost Model - Information Processing
PSM Users Group Conference, Keystone, Colorado, July 24-25, 2002
Dr. Barry Boehm and Ricardo Valerdi
University of Southern California, Center for Software Engineering
Version 3
2
Outline - Day 1
  • USC Center for Software Engineering
  • Background and update on COSYSMO-IP
  • Ops Con and EIA 632
  • Delphi Round 1 Results
  • Updated Drivers
  • Lessons Learned/Improvements
  • LMCO/INCOSE Comments
  • Q&A

3
Outline - Day 2
  • Review of yesterday's modified slides to clarify terminology
  • A few new slides to emphasize points
  • Review of current driver definitions
  • Definitions for two new cost drivers:
    • Technology Maturity
    • Physical system/information system tradeoff analysis complexity

4
Objectives of the Workshop
  • Agree on a Concept of Operation
  • Converge on scope of COSYSMO-IP model
  • Address definitions of model parameters
  • Discuss data collection process

5
USC Center for Software Engineering
  • 8 faculty/research staff, 18 PhD students
  • Corporate Affiliates program (TRW, Aerospace, Galorath, Raytheon, Lockheed, Motorola, et al.)
  • 17th International Forum on COCOMO and Software
    Cost Modeling October 22-25, 2002, Los Angeles,
    CA
  • Theme: Software Cost Estimation and Risk Management
  • Annual research review in March 2003


6
COSYSMO-IP: What is it?
The purpose of the COSYSMO-IP project is to develop an initial increment of a parametric model to estimate the cost of systems engineering activities during system development. The focus of the initial increment is on the cost of systems engineering for information processing systems or subsystems.

7
What Does COSYSMO-IP Cover?
  • Includes:
    • System engineering in the inception, elaboration, and construction phases, including test planning
    • Requirements development and specification activities
    • Physical system/information system tradeoff analysis
    • Operations analysis and design activities
    • System architecture tasks, including allocations to hardware/software and consideration of COTS, NDI, and legacy impacts
    • Algorithm development and validation tasks
  • Defers:
    • Physical system/information system operational test and evaluation, deployment
    • Special-purpose hardware design and development
    • Structure, power, and/or specialty engineering
    • Manufacturing and/or production analysis

8
Candidate COSYSMO Evolution Path
[Diagram: candidate evolution path across the life-cycle phases Inception, Elaboration, Construction, Transition, and Operational Test & Evaluation:]
  1. COSYSMO-IP - IP (sub)system
  2. COSYSMO-C4ISR - C4ISR system
  3. COSYSMO-Machine - physical machine system
  4. COSYSMO-SoS - system of systems (SoS)
9
Current COSYSMO-IP Operational Concept
[Diagram: size drivers (requirements, interfaces, scenarios, algorithms, and a volatility factor) and effort multipliers (application factors, team factors, schedule driver) feed the COSYSMO-IP model, which, together with calibration, produces effort and duration estimates. The WBS is guided by EIA 632.]
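The diagram does not state a functional form. As a minimal sketch only, assuming a COCOMO-like multiplicative form (the constants A and E and the driver names below are illustrative placeholders, not calibrated COSYSMO-IP values):

```python
# Minimal sketch of a COCOMO-style parametric estimator.
# ASSUMPTIONS: the multiplicative form and the constants A and E are
# placeholders; COSYSMO-IP's actual form and calibration were still
# open questions at the time of this workshop.

def estimate_effort(size, effort_multipliers, A=1.0, E=1.0):
    """Effort (person-months) = A * size^E * product of effort multipliers."""
    product = 1.0
    for em in effort_multipliers.values():
        product *= em
    return A * (size ** E) * product

# Example with made-up inputs: aggregated size of 120 weighted units and
# two hypothetical multiplier settings.
print(estimate_effort(120, {"requirements_understanding": 1.2,
                            "personnel_capability": 0.85}))
```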
10
EIA 632 / COSYSMO-IP Mapping

COSYSMO-IP Category              EIA 632 Requirement(s)
Supplier Performance             3
Technical Management             4-12
Requirements Definition          14-16
Solution Definition              17-19
Systems Analysis                 22-24
Requirements Validation          25-29
Design Solution Verification     30
End Products Validation (COTS)   33a

EIA 632 requirements not included in COSYSMO-IP: 1, 2, 13, 20, 21, 31, 32, 33b.
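For data-collection tooling, the mapping above transcribes directly into a lookup table; a sketch (the variable name and representation are ours):

```python
# EIA 632 requirements covered by each COSYSMO-IP category, transcribed
# from the table above. Requirements 1, 2, 13, 20, 21, 31, 32, and 33b
# are excluded from COSYSMO-IP.
EIA632_COVERAGE = {
    "Supplier Performance":         [3],
    "Technical Management":         list(range(4, 13)),   # 4-12
    "Requirements Definition":      list(range(14, 17)),  # 14-16
    "Solution Definition":          list(range(17, 20)),  # 17-19
    "Systems Analysis":             list(range(22, 25)),  # 22-24
    "Requirements Validation":      list(range(25, 30)),  # 25-29
    "Design Solution Verification": [30],
    "End Products Validation":      ["33a"],
}
```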
11
Activity Elements Covered by EIA 632, COCOMO II, and COSYSMO-IP
[Venn diagram: overlap of activity elements covered by COCOMO II and COSYSMO-IP]
When doing both COSYSMO-IP and COCOMO II, subtract the grey (overlapping) areas to prevent double counting.
12
Past, Present, and Future
[Timeline, 2001-2003: initial set of parameters compiled by Affiliates; first Delphi round performed; meeting at CCII Conference; PSM Workshop; Working Group meeting at ARR.]
13
Future Parameter Refinement Opportunities
[Timeline, 2003-2005: driver definitions; data collection (Delphi); first iteration of model; model calibration.]
14
Delphi Survey
  • Survey was conducted to:
    • Determine the distribution of effort across effort categories
    • Determine the range for size driver and effort multiplier ratings
    • Identify the cost drivers to which effort is most sensitive
    • Reach consensus from a sample of systems engineering experts
  • Distributed Delphi surveys to Affiliates and received 28 responses
  • 3 sections: Scope, Size, Cost
  • Also helped us refine the scope of the model elements
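The consensus statistics reported on the next slide (means and standard deviations over the 28 responses) reduce to elementary computations; a sketch with invented response values:

```python
import statistics

# Hypothetical Delphi responses (percent of SE effort) for one category;
# the values below are invented for illustration, not survey data.
responses = [5.0, 6.5, 4.0, 5.5, 7.0, 4.5]

print(f"mean = {statistics.mean(responses):.1f}%")
print(f"std. dev. = {statistics.stdev(responses):.2f}")  # sample std. dev.
```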

15
System Engineering Effort Distribution
Delphi Round 1 Results (percent of SE effort)

Category (EIA Requirement)           Delphi   Suggested   Std. Dev.
Supplier Performance (3)               5.2        5          3.05
Technical Management (4-12)           13.1       15          4.25
Requirements Definition (14-16)       16.6       15          4.54
Solution Definition (17-19)           18.1       20          4.28
Systems Analysis (22-24)              19.2       20          5.97
Requirements Validation (25-29)       11.3       15          4.58
Design Solution Verification (30)     10.5        5          6.07
End Products Validation (33a)          6.6        5          3.58
16
Delphi Round 1 Highlights (cont.)
Range of sensitivity for size drivers (relative effort)
[Bar chart: relative effort by size driver, with Requirements as the baseline of 1. Algorithms (6.48) and Interfaces (5.57) rank highest; the remaining drivers (TPMs, Modes, Scenarios, Platforms) fall between 2.10 and 2.54.]
17
Two Most Sensitive Size Drivers
Size Driver   Suggested Rel. Effort   Delphi Rel. Effort   Delphi Std. Dev.
Interfaces            4                     5.57                 1.80
Algorithms            6                     6.48                 2.09
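One plausible use of these relative-effort values is as weights that convert raw driver counts into requirement-equivalents, with Requirements as the unit baseline (per the note on slide 22). This is our reading of the slide, not a published counting rule:

```python
# Aggregate size as a weighted sum of driver counts, ASSUMING the Delphi
# relative-effort values act as per-item weights normalized to
# Requirements = 1. Only the Interfaces and Algorithms weights come from
# the table above.
WEIGHTS = {"requirements": 1.0, "interfaces": 5.57, "algorithms": 6.48}

def aggregate_size(counts):
    """Weighted requirement-equivalents across size drivers."""
    return sum(WEIGHTS[driver] * n for driver, n in counts.items())

print(aggregate_size({"requirements": 100, "interfaces": 8, "algorithms": 3}))
# 100*1.0 + 8*5.57 + 3*6.48 = 164.0 requirement-equivalents
```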
18
Delphi Round 1 Highlights (cont.)
Range of sensitivity for cost drivers (application factors)
[Bar chart: EMR by application factor, ranging from 1.13 to 2.81 across COTS, Architecture understanding, Legacy transition, Platform difficulty, Requirements understanding, Business process reengineering, and Level of service requirements. Level of service requirements (2.81), Requirements understanding (2.43), and Architecture understanding (2.24) rank highest; see slide 20.]
19
Delphi Round 1 Highlights (cont.)
Range of sensitivity for cost drivers (team factors)
[Bar chart: EMR by team factor, ranging from 1.25 to 2.46 across Tool support, Multisite coordination, Process maturity, Formality of deliverables, Stakeholder communities, Personnel capability, Stakeholder cohesion, and Personnel experience. Personnel capability (2.46) ranks highest; see next slide.]
20
Four Most Sensitive Cost Drivers
Cost Driver                     Suggested EMR   Delphi EMR Mean   Delphi EMR Std. Dev.
Architecture understanding           1.66             2.24                0.83
Requirements understanding           1.73             2.43                0.70
Personnel capability                 2.15             2.46                0.66
Level of service requirements        2.50             2.81                0.67
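EMR here is the ratio of a driver's highest to lowest effort multiplier. Assuming a five-level rating scale spaced geometrically around Nominal = 1.0 (a common convention in COCOMO-family models, not stated on the slide), the EMR alone fixes the whole scale:

```python
# Build a geometric rating scale for one cost driver from its EMR,
# ASSUMING five levels (Very Low .. Very High) centered on Nominal = 1.0.
def rating_scale(emr, levels=5):
    step = emr ** (1.0 / (levels - 1))   # ratio between adjacent levels
    mid = (levels - 1) / 2.0
    return [round(step ** (i - mid), 3) for i in range(levels)]

# Requirements understanding, Delphi mean EMR = 2.43:
print(rating_scale(2.43))   # [0.641, 0.801, 1.0, 1.249, 1.559]
```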
21
4 Size Drivers
  1. Number of System Requirements
  2. Number of Major Interfaces
  3. Number of Operational Scenarios
  4. Number of Unique Algorithms

Reclassified as cost drivers: Number of Technical Performance Measures, Number of Modes of Operation, Number of Different Platforms.
22
Size Driver Definitions (1 of 4)
  • Number of System Requirements
  • The number of requirements taken from the system specification. A requirement is a statement of capability or attribute containing a normative verb such as "shall" or "will". It may be functional or system service-oriented in nature, depending on the methodology used for specification. System requirements can typically be quantified by counting the number of applicable "shalls" or "wills" in the system or marketing specification.
  • Note: Use this driver as the basis of comparison for the rest of the drivers.
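Given the "count the shalls and wills" rule, a rough first-pass count can be automated; a minimal sketch (it does not judge which statements are "applicable"):

```python
import re

def count_requirements(spec_text):
    """Rough count of normative statements by their 'shall'/'will' verbs."""
    return len(re.findall(r"\b(?:shall|will)\b", spec_text, re.IGNORECASE))

spec = "The system shall log all faults. The operator will be notified."
print(count_requirements(spec))  # 2
```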

23
Size Driver Definitions (2 of 4)
  • Number of Major Interfaces
  • The number of shared major physical and logical boundaries between system components or functions (internal interfaces) and those external to the system (external interfaces). These interfaces can typically be quantified by counting the interfaces identified in the system's context diagram and/or the significant interfaces in applicable Interface Control Documents.

24
Size Driver Definitions (3 of 4)
  • Number of Operational Scenarios
  • The number of operational scenarios that a system is specified to satisfy. Such threads typically result in end-to-end test scenarios developed to validate that the system satisfies its requirements. The number of scenarios can typically be quantified by counting the number of end-to-end tests used to validate system functionality and performance. They can also be calculated by counting the number of high-level use cases developed as part of the operational architecture.

Number of Modes of Operation (to be merged with Operational Scenarios): The number of defined modes of operation for a system. For example, in a radar system the operational modes could be air-to-air, air-to-ground, weather, targeting, etc. The number of modes is quantified by counting the number of operational modes specified in the Operational Requirements Document.
[Presenter's notes: counting rules need to be refined; operational scenarios can be derived from system modes.]
25
Size Driver Definitions (4 of 4)
  • Number of Unique Algorithms
  • The number of newly defined or significantly altered functions that require unique mathematical algorithms to be derived in order to achieve the system performance requirements.
  • Note: Examples could include a complex aircraft-tracking algorithm, such as a Kalman filter, derived using existing experience as the basis for the all-aspect search function. Another example could be a brand-new discrimination algorithm derived for a friend-or-foe identification function in space-based applications. The number can be quantified by counting the number of unique algorithms needed to support each of the mathematical functions specified in the system specification or mode description document (for sensor-based systems).

26
Cost Drivers
Application Factors (9)
  1. Requirements understanding
  2. Architecture complexity
  3. Level of service requirements
  4. Migration complexity
  5. COTS assessment complexity
  6. Platform difficulty
  7. Required business process reengineering
  8. Technology maturity (new)
  9. Physical system/information system tradeoff analysis complexity (new)

27
Cost Driver Definitions (1,2 of 5)
  • Requirements understanding
  • The level of understanding of the system requirements by all stakeholders, including the systems, software, and hardware teams, customers, team members, users, etc.

Architecture complexity: The relative difficulty of determining and managing the system architecture in terms of IP platforms, standards, components (COTS/GOTS/NDI/new), connectors (protocols), and constraints. This includes systems analysis, tradeoff analysis, modeling, simulation, case studies, etc.
28
Cost Driver Definitions (3,4,5 of 5)
Level of service requirements: The difficulty and criticality of satisfying the Key Performance Parameters (KPPs); for example security, safety, response time, the "ilities", etc.

  • Migration complexity (formerly Legacy transition complexity)
  • The complexity of migrating the system from previous system components, databases, workflows, etc., due to new technology introductions, planned upgrades, increased performance, business process reengineering, etc.

Technology maturity: The relative readiness for operational use of the key technologies.
29
Cost Drivers (cont.)
Team Factors (8)
  1. Number and diversity of stakeholder communities
  2. Stakeholder team cohesion
  3. Personnel capability
  4. Personnel experience/continuity
  5. Process maturity
  6. Multisite coordination
  7. Formality of deliverables
  8. Tool support

30
Cost Driver Definitions (1,2,3 of 7)
Stakeholder team cohesion: Leadership, frequency of meetings, shared vision, approval cycles, group dynamics (self-directed teams, project engineers/managers), IPT framework, and effective team dynamics.

Personnel capability: Systems engineers' ability to perform their duties, and the quality of human capital.

  • Personnel experience/continuity
  • The applicability and consistency of the staff over the life of the project with respect to the customer, user, technology, domain, etc.

31
Cost Driver Definitions (4,5,6,7 of 7)
Process maturity: Maturity per EIA/IS 731, SE-CMM, or CMMI.

Multisite coordination: Location of stakeholders, team members, and resources (travel).

  • Formality of deliverables
  • The breadth and depth of documentation required to be formally delivered.

Tool support: Use of tools in the systems engineering environment.
32
Lessons Learned/Improvements
Lesson 1: Need to better define the scope and future of COSYSMO-IP via Con Ops.
Lesson 2: Drivers can be interpreted in different ways depending on the type of program.
Lesson 3: COSYSMO is too software-oriented.
Lesson 4: Delphi needs to take less time to fill out.
Lesson 5: Need to develop examples and rating scales.
33
LMCO Comments
"The current COSYSMO focus is too software-oriented." This is a good point. We propose to change the scope from "software-intensive systems or subsystems" to "information processing (IP) systems or subsystems." These include not just the software but also the associated IP hardware: processors, memory, networking, displays, and other human-computer interaction devices. Systems engineering of these IP systems or subsystems includes considerations of IP hardware device acquisition lead times, producibility, and logistics. Considerations of non-IP hardware acquisition, producibility, and logistics are treated as IP systems engineering cost and schedule drivers for the IOC version of COSYSMO. Perhaps we should call it COSYSMO-IP.
34
LMCO Comments (cont.)
"The COSYSMO project should begin by working out the general framework and WBS for the full life cycle of a general system." We agree that such a general framework and WBS will eventually be needed. However, we feel that progress toward it can be most expeditiously advanced by first working on definitions of and data for a key element of the general problem. If another group would like to concurrently work out the counterpart definitions and data considerations for the general system engineering framework, WBS, and estimation model, we will be happy to collaborate with them.
35
Points of Contact
  • Dr. Barry Boehm, boehm@sunset.usc.edu, (213) 740-8163
  • Ricardo Valerdi, rvalerdi@sunset.usc.edu, (213) 440-4378
  • Donald Reifer, dreifer@earthlink.net, (310) 530-4493
  • Websites:
  • http://valerdi.com/cosysmo
  • http://sunset.usc.edu

36
Backup slides
37
COCOMO II Suite
  • COPROMO
  • COQUALMO
  • COPSEMO
  • COCOMO II
  • CORADMO
  • COCOTS
  • COSYSMO-IP
For more information visit http://sunset.usc.edu