1
Software Efforts at the NRO Cost Group
  • 21st International Forum on COCOMO and Software
    Cost Modeling
  • November 8, 2006

2
Purpose
  • Explain how the NCG uses USC's code counter data
  • Introduce the NCG's Software Database
  • Provide insight into Difference Application
    Results and Software Trends

3
Background
  • NCG maps the USC output files to a CSCI/CSC and
    Work Breakdown Structure (WBS)
  • Mapping is most meaningful when mapped to the
    lowest functional level possible within the WBS
  • Mapping is very labor intensive if done through
    Excel spreadsheets or other manual methods

4
CSCI/CSC DP/DPAP
CSCI/CSC DP/DPCC
CSCI/CSC DP/DPCU
5
Software Database
  • NCG created a software database which automates
    the mapping process
  • The software database is the primary tool for
    storing ALL NCG software-related data. The
    database will provide
  • Low level functional breakout
  • Traceability to past programs
  • Historical representation of development process
  • The database will help us better understand
    trends for
  • Code Counts
  • Staffing Profiles
  • Discrepancy Reports (DRs)
  • Schedules

6
Database Functionality
  • The database allows for the import of
  • Code counter output files for any language
  • Difference Application output files
  • CSCI/CSC listing
  • Staffing data
  • DR data
  • Cost and Hours
  • Programmatic/Technical data
  • The database allows the mapping of output-file
    folder paths to a WBS and CSCI/CSC
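
The folder-path mapping the database automates could be sketched as below; the WBS numbers and file name are invented for illustration, while the CSCI/CSC paths follow the DP/DPAP, DP/DPCC, and DP/DPCU examples from the earlier slide.

```python
# Hypothetical sketch of mapping code-counter output folder paths to a
# WBS and CSCI/CSC. The WBS identifiers here are made up; the CSCI/CSC
# paths mirror the slide's examples.
MAPPING = {
    "DP/DPAP": ("WBS 1.1", "DP", "DPAP"),
    "DP/DPCC": ("WBS 1.2", "DP", "DPCC"),
    "DP/DPCU": ("WBS 1.3", "DP", "DPCU"),
}

def map_output_path(path):
    """Return (wbs, csci, csc) for an output file path, or None."""
    for prefix, ids in MAPPING.items():
        if path.startswith(prefix + "/"):
            return ids
    return None  # unmapped: flag for analyst review

print(map_output_path("DP/DPAP/parser_count.txt"))  # ('WBS 1.1', 'DP', 'DPAP')
```

Paths that match no known prefix return None rather than being silently dropped, which is the kind of bookkeeping a spreadsheet makes labor intensive.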

7
  • Walkthrough of the Software Database's Key
    Functionalities

8-12 (No Transcript)
13
Unmap
14-32 (No Transcript)
33
Insight into Difference Application Results and
Software Trends
34
Introduction
  • NCG collects datasheet information regarding
    program Complexity Attributes
  • These can be described as Program Development
    Environment data, e.g., the number of years of
    programmer experience
  • We also collect USC output files for LOC and
    Reuse trends
  • We collect Staffing Profiles, broken out by CSCI
  • We collect Discrepancy Reports (DRs), broken out
    by Priority and ranked in some manner
  • Are there any useful trends if we analyze all the
    data collectively?

35
CSCI 1 Example
  • Below is a summary of CSCI 1
  • Code reaches a stable point
  • Heritage code written in C and remainder written
    in C
  • New code looks proportional to Staffing!
  • Increase in staff to fix DRs
36
CSCI 2 Example
  • Below is a summary of CSCI 2

  • Heritage code written in C and remainder written
    in C
  • Keep an eye on the peak staffing levels
37
CSCI 3 Example
  • Below is a summary of CSCI 3

  • Heritage code written in C and remainder written
    in C
  • Peak staffing trend?
38
CSCI 4 Example
  • Below is a summary of CSCI 4

  • Low Modified code trend continues
  • Heritage code written in C and remainder written
    in C
39
CSCI 5 Example
  • Below is a summary of CSCI 5

  • Any guess on why this looks different from the
    previous trends?
40
Reuse
  • This CSCI shows the usefulness of breaking out
    New and Deleted code from the Total code counts
  • What assessment can be made of this development?
  • Unstable requirements?
  • Re-writing same code?
  • Code cleanup occurring after each New code
    delivery?

  • Code looks stable at this point, but in reality
    it was dynamic!
  • Total counts show change, but without Diff, you
    can't see why
  • Deleted code size is proportional to previous
    New code size
  • Notice that there is nearly 5,000 New SLOC and
    5,000 Deleted SLOC. Doesn't look STABLE!
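
The point these callouts make can be sketched in a few lines of Python, using the slide's approximate figures: the net total looks flat while the code is actually churning.

```python
# Sketch: Total counts alone hide churn that the Diff results expose.
# The 5,000-line figures are the slide's approximate numbers.
new_sloc = 5_000      # New SLOC between baselines
deleted_sloc = 5_000  # Deleted SLOC between baselines

net_change = new_sloc - deleted_sloc  # what Total counts show
churn = new_sloc + deleted_sloc       # what Diff reveals

print(net_change, churn)  # 0 10000 -> "stable" totals, dynamic code
```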
41
More on Reuse
  • USC Diff results can provide better insight
    into how much Reuse came from heritage programs
  • Example
  • Program B uses Program A software as a starting
    point
  • Program A metrics
  • 50,841 Total Logical Code Counts
  • Program B is completed and returns the following
    metrics
  • 1,937,167 Total Logical Code Counts

42
More on Reuse (cont.)
  • Run Diff counter on the initial baseline (in
    this case, Program A) and the final baseline
    (Program B)
  • Diff results show
  • New Logical code: 1,918,011
  • Deleted Logical code: 31,685
  • Modified Logical code: 5,701
  • Unmodified Logical code: 13,455
  • Compute Reuse from Program A
  • Unmodified (at Program B completion) / Total (at
    Program B start)
  • 13,455 / 50,841 ≈ 26%
  • 26% of Program A was DIRECT reuse into Program B
  • This is not 26% of Program B!
  • This is one way to simplify the Reuse problem
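
The slide's reuse arithmetic, worked as a small check:

```python
# The slide's reuse calculation (Program A as starting point for Program B).
program_a_total = 50_841   # Program A logical SLOC at Program B start
unmodified_in_b = 13_455   # Unmodified logical SLOC at Program B completion

direct_reuse = 100 * unmodified_in_b / program_a_total
print(f"{direct_reuse:.0f}% of Program A was direct reuse")  # 26%
```

Note the denominator: dividing by Program B's 1,937,167 total instead would give a very different (and misleading) number, which is the slide's warning.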

43
Operators / Tokens
  • Here are 8 types of operators that could be
    counted
  • Logical
  • &&, ||, !
  • Trigonometric
  • Cos(), Sin(), Tan(), Cot(), Csc(), Sec()
  • Log
  • Log(), Ln()
  • Preprocessor
  • #define, Sizeof()
  • Math
  • +, -, *, /, sqrt()
  • Assignment
  • Pointers
  • Conditional
  • if, else, switch, case, ?
  • As well as nesting of loops
  • Level 1 loop
  • Level 2 loop
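
A naive operator counter in the spirit of this slide could be sketched as follows; the category regexes are illustrative, not USC's actual token tables, and they ignore corner cases such as strings, comments, and compound operators.

```python
import re

# Hypothetical operator counter. Each category maps to a list of regex
# patterns; counts are summed per category over the source text.
CATEGORIES = {
    "logical":     [r"&&", r"\|\|"],
    "math":        [r"\+", r"-", r"\bsqrt\s*\("],
    "conditional": [r"\bif\b", r"\belse\b", r"\bswitch\b", r"\bcase\b"],
}

def count_operators(source):
    return {name: sum(len(re.findall(p, source)) for p in patterns)
            for name, patterns in CATEGORIES.items()}

src = "if (a && b) { c = a + sqrt(b); } else { c = a - b; }"
print(count_operators(src))  # {'logical': 1, 'math': 3, 'conditional': 2}
```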

44
Complexity Density Ratios
  • Here are snapshots at various baselines of a CSCI
  • If we only looked at the final numbers, we would
    assume nearly 450,000 Logical lines of code
  • What do you make of this?

  • A growth of 371,868 Logical SLOC. What
    productive developers!
45
Complexity Density Ratios (cont)
  • Looking at the Complexity Density Ratio for
    different operators, we can understand more about
    the development

Code looks different
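
One way to sketch a Complexity Density Ratio is operators counted per logical SLOC, so two baselines can be compared as "signatures"; the counts below are invented for illustration.

```python
# Hypothetical density computation: operator count per logical SLOC.
def density_ratio(operator_count, logical_sloc):
    return operator_count / logical_sloc

early_baseline = density_ratio(1_200, 8_000)     # 0.15 operators per SLOC
final_baseline = density_ratio(27_000, 450_000)  # 0.06 operators per SLOC

# A shift in the ratio between baselines suggests the added code
# "looks different" from the code that was there before.
print(early_baseline, final_baseline)
```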
46
Complexity Density Ratios (cont)
  • Looking at the SW as an entity composed of many
    different elements, the Complexity Density Ratio
    allows us to see the makeup of the SW
  • This can be characterized as a signature of the
    SW development!
  • In the previous example, the program received SW
    from another project (not directly associated
    with the current project)
  • The Complexity Density Ratio validates that the
    SW is different for the ongoing development
  • This should be taken into account when trying to
    come up with any productivities
  • ANALYST BEWARE!
  • DON'T BLINDLY USE DATA

47
Summary
  • The NCG continues to standardize our code
    counting efforts
  • Essential for normalizing our data across
    multiple programs and multiple contractors
  • The NCG works closely with USC to develop a
    complete USC Code Counting Tool Suite
  • Addressing necessities such as new ways of
    looking at reuse, complexities, trends, etc.
  • The NCG has invested extensive resources to use
    the USC code counter files and parse the USC
    output files
  • Our goal is to establish consistency across the
    Intelligence Community
  • Primarily involves our industry contractors