COSYSMO Working Group Meeting: Industry Calibration Results

Author: Rico; Company: USC; Created: 2/22/2005; Slides: 47
Transcript and Presenter's Notes


1
COSYSMO Working Group Meeting: Industry Calibration Results
  • Ricardo ("two months from the finish line") Valerdi
  • USC Center for Software Engineering
  • The Aerospace Corporation

2
Morning Agenda
  • 7:30 Continental Breakfast (in front of Salvatori Hall)
  • 8:30 Introductions (All)
  • 9:00 Brief overview of COSYSMO (Ricardo)
  • 9:15 Calibration results (Ricardo)
  • 9:45 Break
  • 10:15 Size driver counting rules exercise (All)
  • 11:15 Mini-Delphi for EIA 632 activity distributions (All)
  • 12:00 Lunch (in front of Salvatori Hall)

3
Afternoon Agenda
  • 1:00 Joint meeting with COSOSIMO workshop (JoAnn Lane)
  • 2:00 COSYSMO Risk/Confidence Estimation Prototype (John Gaffney)
  • 2:45 Break
  • 3:15 Open issues
    • Local calibrations
    • Lies, damned lies, and statistical outliers
    • Future plans for COSYSMO 2.0 (including ties to SoS work)
  • 4:30 Action items for next meeting (July 2005 in Keystone, CO)
  • 5:00 Adjourn

4
7-Step Modeling Methodology
  1. Analyze existing literature
  2. Perform behavioral analysis
  3. Identify relative significance
  4. Perform expert-judgment Delphi assessment
  5. Gather project data  <-- WE ARE HERE
  6. Determine Bayesian a-posteriori update
  7. Gather more data; refine model
5
COSYSMO Operational Concept
  • Size Drivers: Requirements, Interfaces, Scenarios, Algorithms (plus volatility factors)
  • Effort Multipliers:
    • Application factors (8 factors)
    • Team factors (6 factors)
  • Size drivers and effort multipliers feed COSYSMO, which (with calibration) produces an effort estimate
  • WBS guided by EIA/ANSI 632
6
COSYSMO Cost Drivers
  • Application Factors
    • Requirements understanding
    • Architecture understanding
    • Level of service requirements
    • Migration complexity
    • Technology maturity
    • Documentation match to life cycle needs
    • # and diversity of installations/platforms
    • # of recursive levels in the design
  • Team Factors
    • Stakeholder team cohesion
    • Personnel/team capability
    • Personnel experience/continuity
    • Process maturity
    • Multisite coordination
    • Tool support

7
COSYSMO 1.0 Calibration Data Set
  • Collected 35 data points
  • From 6 companies, 13 business units
  • No single company had > 30% influence

8
COSYSMO Data Sources
  • Raytheon: Intelligence & Information Systems (Garland, TX)
  • Northrop Grumman: Mission Systems (Redondo Beach, CA)
  • Lockheed Martin: Transportation Security Solutions (Rockville, MD); Integrated Systems & Solutions (Valley Forge, PA); Systems Integration (Owego, NY); Aeronautics (Marietta, GA); Maritime Systems & Sensors (Manassas, VA)
  • General Dynamics: Maritime Digital Systems/AIS (Pittsfield, MA); Surveillance & Reconnaissance Systems/AIS (Bloomington, MN)
  • BAE Systems: National Security Solutions/ISS (San Diego, CA); Information & Electronic Warfare Systems (Nashua, NH)
  • SAIC: Army Transformation (Orlando, FL); Integrated Data Solutions & Analysis (McLean, VA)
9
Data Champions
  • Gary Thomas, Raytheon
  • Steven Wong, Northrop Grumman
  • Garry Roedler, LMCO
  • Paul Frenz, General Dynamics
  • Sheri Molineaux, General Dynamics
  • Fran Marzotto, General Dynamics
  • John Rieff, Raytheon
  • Jim Cain, BAE Systems
  • Merrill Palmer, BAE Systems
  • Bill Dobbs, BAE Systems
  • Donovan Dockery, BAE Systems
  • Mark Brennan, BAE Systems
  • Ali Nikolai, SAIC

10
Meta Properties of Data Set
  • Almost half of the data received was from Military/Defense programs
  • 55% was from Information Processing systems and 32% was from C4ISR
11
Meta Properties of Data Set
  • Two-thirds of the projects were software-intensive
  • First 4 phases of the SE life cycle were adequately covered
12
Industry Calibration Factor
  • Calculation is based on the aforementioned data (n = 35)
  • This calibration factor must be adjusted for each organization
  • Evidence of diseconomies of scale (partially captured in size driver weights)
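The calibration constant can be recovered from project data by a log-space fit. A minimal sketch, assuming the general COSYSMO form Effort = A * Size^E * (product of effort multipliers); the function name and sample numbers are illustrative, not the actual calibration data set:

```python
import math

def calibrate_A(projects, E=1.0):
    """Estimate the multiplicative calibration constant A in
    effort = A * size**E * em (em = product of effort multipliers).
    With the exponent E held fixed, least squares on log(effort)
    reduces to averaging the log-space residuals."""
    residuals = [
        math.log(p["effort"]) - E * math.log(p["size"]) - math.log(p["em"])
        for p in projects
    ]
    return math.exp(sum(residuals) / len(residuals))

# Sanity check on noise-free synthetic projects generated from a known A;
# E > 1 reflects the diseconomies of scale noted in the slide above.
true_A, E = 38.55, 1.06
sample = [{"size": s, "em": 1.0, "effort": true_A * s**E} for s in (82, 500, 17763)]
A_hat = calibrate_A(sample, E=E)  # recovers ~38.55
```

Averaging in log space is also why one outlier project cannot dominate the constant the way it would in raw hours.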
13
Size Driver Influence on Functional Size
  • N = 35
  • # of Scenarios and # of Requirements accounted for 83% of functional size
  • # of Interfaces and # of Algorithms proved to be less significant
14
Parameter Transformation
15
Size vs. Effort
  • 35 projects; R-squared = 0.55
  • Range of SE_HRS: min 881, max 1,377,458
  • Range of SIZE: min 82, max 17,763
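A fit like this can be reproduced in form (not in the actual numbers) by ordinary least squares of log effort on log size; the data below are synthetic and purely illustrative:

```python
import numpy as np

def log_log_r2(size, effort):
    """R^2 of a straight-line fit of log(effort) vs. log(size)."""
    x, y = np.log(size), np.log(effort)
    slope, intercept = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (slope * x + intercept)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(0)
size = rng.uniform(82, 17763, 35)                       # 35 synthetic projects
effort = 40 * size**1.06 * rng.lognormal(0.0, 0.8, 35)  # noisy power law
r2 = log_log_r2(size, effort)
```

With lognormal noise of this magnitude the R² lands well below 1, which is the same qualitative picture as the 0.55 reported for the real 35-project set.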
16
Intra-Size Driver Correlation

         REQ    INTF   ALG    OPSC
  REQ    1.0
  INTF   0.63   1.0
  ALG    0.48   0.64   1.0
  OPSC   0.59   0.32   0.05   1.0

  • REQ & INTF are highly correlated (0.63)
  • ALG & INTF are highly correlated (0.64)

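A matrix like the one above is a plain Pearson correlation over the per-project driver counts; the counts below are made up for illustration (they are not the workshop's data):

```python
import numpy as np

# Rows = projects, columns = REQ, INTF, ALG, OPSC (illustrative counts).
drivers = np.array([
    [120.0,  14.0,  6.0,  4.0],
    [900.0,  40.0, 22.0,  9.0],
    [300.0,  25.0,  3.0, 12.0],
    [1500.0, 60.0, 35.0, 20.0],
    [82.0,    8.0,  1.0,  2.0],
])

# Pearson correlations between the size drivers; rowvar=False treats
# each column (driver) as a variable and each row as an observation.
corr = np.corrcoef(drivers, rowvar=False)
```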
17
A Day in the Life
  • Common problems
  • Requirements reported at "sky level" rather than "sea level"
    • Test: if # REQS < # OPSC, then investigate
    • Often too high; requires some decomposition
  • Interfaces reported at "underwater level" rather than "sea level"
    • Test: if the INTF source is at pin or wire level, then investigate
    • Often too low; requires investigation of physical or logical I/F

We will revisit these issues later
18
A Day in the Life (part 2)
  • Common problems (cont.)
  • Algorithms not reported
    • The only size driver omitted: skipped by 14 projects spanning 4 companies
    • Still a controversial driver; divergent support
  • Operational Scenarios not reported
    • Only happened thrice (scope of effort reported was very small in all cases)
    • Fixable; involved going back to V&V documentation to extract at least one OPSC

We will revisit these issues later
19
The Case for Algorithms
  • N = 21
  • Reasons to keep ALG in the model
    • Accounts for 16% of the total size in the 21 projects that reported ALG
    • It is described in the INCOSE SE Handbook as a crucial part of SE
  • Reasons to drop ALG from the model
    • Accounts for only 9% of total SIZE contribution
    • Omitted by 14 projects across 4 companies
    • Highly correlated with INTF (0.64)
    • Has a relatively small correlation with Size (0.53, compared to REQ 0.91, INTF 0.69, and OPSC 0.81)

20
Cost Drivers
  • Original set consisted of > 25 cost drivers
  • Reduced down to 8 application and 6 team factors
  • See correlation handout
  • Regression coefficient improved from 0.55 to 0.64 with the introduction of cost drivers
  • Some may be candidates for elimination or aggregation

21
Cost Drivers: Application Factor Distribution (RQMT, ARCH, LSVC, MIGR)
22
Cost Drivers: Application Factor Distribution (TMAT, DOCU, INST, RECU)
23
Cost Drivers: Team Factor Distribution (TEAM, PCAP, PEXP, PROC)
24
Cost Drivers: Team Factor Distribution (SITE, TOOL)
25
Top 10 Intra-Driver Correlations
  • Size drivers correlated to cost drivers
    • 0.39: # of Interfaces & # of Recursive Levels in the Design
    • -0.40: # of Interfaces & Multisite Coordination
    • 0.48: # of Operational Scenarios & # of Recursive Levels in Design
  • Cost drivers correlated to cost drivers
    • 0.47: Requirements Und. & Architecture Und.
    • -0.42: Requirements Und. & Documentation
    • 0.39: Requirements Und. & Stakeholder Team Cohesion
    • 0.43: Requirements Und. & Multisite Coordination
    • 0.39: Level of Service Reqs. & Documentation
    • 0.50: Level of Service Reqs. & Personnel Capability
    • 0.49: Documentation & # of Recursive Levels in Design

26
Candidate Parameters for Elimination
  • Size Drivers
    • # of Algorithms
  • Cost Drivers (application factors)
    • Requirements Understanding
    • Level of Service Requirements
    • # of Recursive Levels in the Design
    • Documentation
    • # of Installations/Platforms
    • Personnel Capability
    • Tool Support

Motivation for eliminating parameters is the high ratio of parameters (18) to data points (35) and the need for degrees of freedom. By comparison, COCOMO II has 23 parameters and over 200 data points. Some candidates are flagged due to high correlation, others due to regression insignificance.
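The degrees-of-freedom concern can be made concrete with adjusted R², which charges the fit for every estimated parameter. The numbers below are illustrative arithmetic, not results from the presentation:

```python
def adjusted_r2(r2, n, p):
    """Adjusted R^2 for a regression with n observations and p
    estimated parameters (excluding the intercept)."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

# With only 35 data points, a raw R^2 of 0.64 shrinks sharply once
# all 18 parameters are penalized; trimming parameters recovers
# degrees of freedom even at the same raw fit.
full_model = adjusted_r2(0.64, n=35, p=18)  # ~0.235
lean_model = adjusted_r2(0.64, n=35, p=10)  # ~0.49
```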
27
The Case for # of Recursive Levels in the Design
  • Reasons to keep RECU in the model
    • Captures emergent properties of systems
    • Originally thought of as independent from other size and cost drivers
  • Reasons to drop RECU from the model
    • Highly correlated to:
      • Size (0.44)
      • # of Operational Scenarios (0.48)
      • # of Interfaces (0.39)
      • Documentation (0.49)

28
Size Driver Counting Rules
  • Are there any suggested improvements?
  • Requirements
    • Need to add guidance with respect to:
      • system vs. system-engineered vs. subsystem requirements
      • decomposed vs. derived requirements
    • Current guidance includes: Requirements document, System Specification, RVTM, Product Specification, internal functional requirements document, tool output such as DOORS, QFD.

29
Counting Rules: Requirements
  • Number of System Requirements
  • This driver represents the number of requirements for the system-of-interest at a specific level of design. The quantity of requirements includes those related to the effort involved in system engineering the system interfaces, system-specific algorithms, and operational scenarios. Requirements may be functional, performance, feature, or service-oriented in nature depending on the methodology used for specification. They may also be defined by the customer or contractor. Each requirement may have effort associated with it, such as V&V, functional decomposition, functional allocation, etc. System requirements can typically be quantified by counting the number of applicable shalls/wills/shoulds/mays in the system or marketing specification. Note: some work is involved in decomposing requirements so that they may be counted at the appropriate system-of-interest.

How can we prevent the requirements count from being reported too high?
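The "count the shalls/wills/shoulds/mays" rule above lends itself to a first-pass automated count. The regex and sample specification text are an illustrative sketch, and real counts still need the decomposition judgment discussed earlier:

```python
import re

# Imperative keywords named by the counting rule, matched as whole words.
IMPERATIVES = re.compile(r"\b(shall|will|should|may)\b", re.IGNORECASE)

def count_requirements(spec_text):
    """First-pass proxy for the number of system requirements:
    count the imperative keywords in a specification."""
    return len(IMPERATIVES.findall(spec_text))

spec = (
    "The system shall track all surface targets. "
    "The operator should be alerted within 2 seconds. "
    "The system may archive raw sensor data."
)
n_reqs = count_requirements(spec)  # 3
```

A count like this is exactly the kind that comes in "too high" when shalls are written at sky level, which is why the rule still calls for decomposition to the right system-of-interest.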
30
Counting Rules: Interfaces
  • Number of System Interfaces
  • This driver represents the number of shared physical and logical boundaries between system components or functions (internal interfaces) and those external to the system (external interfaces). These interfaces typically can be quantified by counting the number of external and internal system interfaces among ISO/IEC 15288-defined system elements.
  • Examples would be very useful
  • Current guidance includes: Interface Control Document, System Architecture diagram, system block diagram from the system specification, specification tree.

How can we prevent the interface count from being reported too low?
31
Counting Rules: Algorithms
  • Number of System-Specific Algorithms
  • This driver represents the number of newly defined or significantly altered functions that require unique mathematical algorithms to be derived in order to achieve the system performance requirements. As an example, this could include a complex aircraft-tracking algorithm, like a Kalman filter, being derived using existing experience as the basis for the all-aspect search function. Another example could be a brand-new discrimination algorithm being derived to identify the friend-or-foe function in space-based applications. The number can be quantified by counting the number of unique algorithms needed to realize the requirements specified in the system specification or mode description document.
  • Current guidance includes: System Specification, Mode Description Document, Configuration Baseline, historical database, functional block diagram, risk analysis.

Are we missing anything?
32
Counting Rules: Operational Scenarios
  • Number of Operational Scenarios
  • This driver represents the number of operational scenarios that a system must satisfy. Such scenarios include both the nominal stimulus-response thread plus all of the off-nominal threads resulting from bad or missing data, unavailable processes, network connections, or other exception-handling cases. The number of scenarios can typically be quantified by counting the number of system test thread packages or unique end-to-end tests used to validate the system functionality and performance, or by counting the number of use cases, including off-nominal extensions, developed as part of the operational architecture.
  • Current guidance includes: Ops Con / Con Ops, System Architecture Document, IV&V/test plans, engagement/mission/campaign models.

How can we encourage Operational Scenario reporting?
33
Effort Profiling Mini-Delphi
  • Step 4 of the 7-step methodology
  • Two main goals:
    • Develop a typical distribution profile for systems engineering across 4 of the 6 life cycle stages (i.e., how is SE distributed over time?)
    • Develop a typical distribution profile for systems engineering across 5 effort categories (i.e., how is SE distributed by activity category?)

34
COCOMO II Effort Distribution
  • MBASE/RUP phases and activities
  • Source: Software Cost Estimation with COCOMO II (Boehm et al., 2000)
35
Our Goal for COSYSMO
  • ISO/IEC 15288 life cycle stages: Conceptualize; Develop; Operational Test & Evaluation; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle
  • EIA 632 process categories: Acquisition & Supply; Technical Management; System Design; Product Realization; Technical Evaluation
36
Mini-Delphi Part 1
  • Goal: Develop a distribution profile for 4 of the 6 life cycle phases
  • 5x6 matrix of EIA 632 processes vs. ISO 15288 life cycle phases
  • 33 EIA 632 requirements (for reference)
37
Previous Results Are Informative

Percent allocation of each EIA/ANSI 632 clause (requirement numbers in parentheses) across life cycle phases; the last three columns are the ISO/IEC 15288 phases Operations, Maintenance/Support, and Retirement:

  EIA/ANSI 632 clause (reqs)               Pre-Sys  SysDef  SubsysDes  DetDes  IT&E  Ops  Maint  Ret
  Product Supply - 4.1.1 (1)                  40      30       20        10      0    0     0     0
  Product Acquisition - 4.1.2 (2)             40      30       20        10      0    0     0     0
  Supplier Performance - 4.1.2 (3)            30      30       20        20      0    0     0     0
  Technical Management - 4.2 (4-13)           15      20       20        20     25    0     0     0
  Requirements Definition - 4.3.1 (14-16)     35      30       20        10      5    0     0     0
  Solution Definition - 4.3.2 (17-19)         25      35       30         5      5    0     0     0
  Implementation - 4.4 (20)                    5      10       25        40     20    0     0     0
  Transition to Use - 4.4 (21)                 5      10       25        30     30    0     0     0
  Systems Analysis - 4.5.1 (22-24)            25      40       25         5      5    0     0     0
  Requirements Validation - 4.5.2 (25-29)     10      35       30        15     10    0     0     0
  System Verification - 4.5.3 (30-32)         10      25       20        20     25    0     0     0
  End Products Validation - 4.5.4.1 (33)      20      25       20        15     20    0     0     0

The rows group into the five EIA/ANSI 632 process categories: Acquisition & Supply (4.1), Technical Management (4.2), System Design (4.3), Product Realization (4.4), and Technical Evaluation (4.5).
38
Breadth and Depth of Key SE Standards
  • Source: Draft Report, ISO Study Group, May 2, 2000
39
5 Fundamental Processes for Engineering a System
  • Source: EIA/ANSI 632, Processes for Engineering a System (1999)
40
33 Requirements for Engineering a System
  • Source: EIA/ANSI 632, Processes for Engineering a System (1999)
41
Mini-Delphi Part 2
  • Goal: Develop a typical distribution profile for systems engineering across 5 effort categories
  • 5 EIA 632 fundamental processes
  • 33 EIA 632 requirements (for reference)
42
Preliminary Results
  • 4-person Delphi done last week at GSAW

  EIA 632 Fundamental Process   Average (%)   Standard Deviation
  Acquisition & Supply               5.0            0
  Technical Management              13.75           2.5
  System Design                     26.25           9.4
  Product Realization               22.5            6.4
  Technical Evaluation              32.5           15
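The averages and standard deviations follow directly from the four experts' raw allocations. The reconstruction below is illustrative: the individual responses were not published, and only the first two rows are chosen so the computed statistics match the reported 5 +/- 0 and 13.75 +/- 2.5:

```python
from statistics import mean, stdev

# Each expert allocates 100% of SE effort across the five EIA 632
# fundamental processes (list entries = the four Delphi participants).
rounds = {
    "Acquisition & Supply": [5, 5, 5, 5],
    "Technical Management": [15, 15, 10, 15],
    "System Design":        [35, 20, 30, 20],
    "Product Realization":  [15, 30, 25, 20],
    "Technical Evaluation": [30, 30, 30, 40],
}

# Per-process mean and sample standard deviation, as in the table above.
summary = {proc: (mean(v), stdev(v)) for proc, v in rounds.items()}
```

In a Delphi round the standard deviation column is the interesting one: Technical Evaluation's large spread (15) signals where the next round of discussion should focus.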
43
COSYSMO Invasion
  • In chronological order

  Developer                        Implementation      Availability
  Gary Thomas (Raytheon)           myCOSYSMO v1.22     Prototype at www.valerdi.com/cosysmo
  Ricardo Valerdi (USC)            AcademicCOSYSMO     August 2005
  John Gaffney (Lockheed Martin)   Risk add-on         Prototype developed, not yet integrated
  Dan Liggett (Costar)             commercialCOSYSMO   TBD
44
COSYSMO Risk Estimation Add-on
  • Justification
    • USAF (Teets) and Navy acquisition chief (Young) require "High Confidence Estimates"
    • COSYSMO currently provides a single-point solution
    • Elaboration of the sizing confidence level in myCOSYSMO

45
Final Items
  • Open issues
    • Local calibrations
    • Lies, damned lies, and statistical outliers
    • Future plans for COSYSMO 2.0 (including ties to SoS work)
  • Action items for next meeting (July 2005 in Keystone, CO)
    • Combine Delphi R3 results and perform Bayesian approximation
  • Dissertation defense: May 9

46
  • Ricardo Valerdi
  • rvalerdi@sunset.usc.edu
  • Websites
    • http://sunset.usc.edu
    • http://valerdi.com/cosysmo