Title: Software Independent Verification and Validation (IVV): An Agency Overview
1 Software Independent Verification and Validation (IVV): An Agency Overview
- Kenneth A. Costello
- IVV Program Lead Engineer
GSFC Systems Engineering Seminar Series, 12 Sep 2006
2 Agenda
- A Quick IVV Facility/Program History
- The Software Crisis
- IVV/NASA IVV
- Forming an IVV Project
- IVV Relationships
- Closing
3 Setting the Stage: A History
- 05/88 Space Shuttle Program implements IVV
- 04/94 Space Station Program implements IVV through the Facility
- 1996 Facility Omnibus contract enables IVV across all NASA Projects
- 04/96 Facility transitioned to Ames Research Center
- 06/99 Senior Management Council IVV mandate for all NASA software
- 07/00 Facility transitioned to Goddard Space Flight Center
- 08/01 NPD 8730.4, Software IVV Policy
- 05/03 NASA Executive Council makes IVV an Agency OSMA Program
- 10/03 IVV funding changed to Corporate G&A
- 08/05 NPD 2820.1, Software Policy
[Timeline chart: number of IVV/IA projects per year, 1988-2004, with the Facility's focus shifting from research to IVV]
4 Setting the Stage: An Agency Requirement
- NPD 8730.4 SW IVV Policy
- Cancelled on 08/30/05
- Current Requirements
- NPD 2820.1C Software Policy
- NPR 7150.2 Software Engineering Requirements
- NASA-STD-8739.8 Software Assurance
5 NPD 2820.1C Software Policy
- NASA policy regarding software activities for each project is to accomplish the following:
- (5) Projects shall ensure software providers allow access to software and associated artifacts to enable insight/oversight by software engineering and software assurance, which includes Independent Verification and Validation (IVV) and NASA's Safety and Mission Assurance organizations.
- c. Use the NASA IVV Facility as the sole provider of IVV services when software created by or for NASA is selected for IVV by the NASA Chief Safety and Mission Assurance Officer.
- Responsibilities
- c. The NASA Chief Safety and Mission Assurance Officer shall
- (1)
- (6) Oversee the functional management of the NASA IVV Program and assure the performance of all IVV processes, services, and activities.
- (7) Establish and manage processes for the selection of software to which to apply IVV.
- (8) Charter the IVV Board of Directors (IBD), which makes prioritized recommendations for allocating IVV services to projects based on the annual Software Inventory (maintained by the Chief Engineer) and the Office of Safety and Mission Assurance (OSMA) defined process.
- (9) Select and maintain the list of software projects to which IVV is to be applied.
- (10)
- d. The IVV Program Manager shall 1) establish and manage the Agency's software IVV services and procedures; 2) establish, maintain, and report on the results of IVV services and findings; and 3) support NASA's program for improving software assurance and other trusted verifications (e.g., independent assessments, peer reviews, and research). The IVV Facility shall determine and document the services provided by the Facility on projects selected for IVV by the NASA Chief Safety and Mission Assurance Officer.
6 NPR 7150.2 Software Engineering Requirements
- Section 5.1.1.1 states required content for SW Development Plans:
- "The Software Development or Mgmt Plan shall contain (SWE-102):
- a. Project organizational structure showing authority and responsibility of each organizational unit, including external organizations (i.e., Safety and Mission Assurance, Independent Verification and Validation (IVV), Independent Technical Authority (ITA), NASA Engineering and Safety Center (NESC))."
- Additionally, section 5.1.5, which addresses SW Assurance, states: "The SW Assurance Plan details the procedures, reviews, and audits required to accomplish software assurance. The project office should coordinate, document, and gain concurrence with the Office of Safety and Mission Assurance as to the extent and responsibilities of the assurance and safety of the project. This will be documented in the project plans and reflected in the assurance process."
- Section 5.1.5.1 states: "The SW Assurance Plan(s) shall be written per NASA-STD-8739.8, NASA SW Assurance Standard. (SWE-106)"
7 NASA-STD-8739.8 Software Assurance
- The Standard states the following:
- Section 6.1.4: When IVV has been selected for a project, the provider shall coordinate with IVV personnel to share data and information.
- Section 7.5.3: When the IVV function is required, the provider shall provide all required information to NASA IVV Facility personnel. (This requirement includes specifying, on the contracts and subcontracts, IVV's access to system and software products and personnel.)
8 A Software Crisis
Independent Verification and Validation: The NASA Approach
9 Growing Software Importance
- Fundamental concern
- First NASA robotic mission with actual software launched in 1969 (Mariner 6)
- Software size has grown over time
- 128 words of assembly, equivalent to about 30 lines of C code
- MER has about 600,000 lines of equivalent C code
- More functionality is being placed within software and software-constructed devices (Programmable Logic Devices)
- With increased processing power and memory, more tasks are running concurrently
- Control software is increasing in complexity and size
- Software is used to monitor and react to hardware faults
10 Software is still hard to get right
- The Carnegie Mellon Software Engineering Institute reports (1) that at least 42-50 percent of software defects originate in the requirements phase.
- The Defense Acquisition University Program Manager Magazine reports (2), in a Department of Defense study, that over 50 percent of all software errors originate in the requirements phase.

(1) Carnegie Mellon Software Engineering Institute, "The Business Case for Requirements Engineering," RE 2003, 12 September 2003
(2) Defense Acquisition University Program Manager Magazine, Nov-Dec 1999, "Curing the Software Requirements and Cost Estimating Blues"
11 Fixing errors early can conserve resources
- Early error detection and correction are vital to development success
- The cost to correct software errors multiplies during the software development life cycle
- Early error detection and correction reduce cost and save time
- IVV assurance is vital to mission success
- Independent evaluation of critical software is value added
- Agency goal
[Chart: average relative cost of finding errors late in the life cycle, from "Software Engineering Economics" by Barry Boehm]
12 Overview of Defects Found by IVV Teams
13 Independent Verification and Validation
Independent Verification and Validation: The NASA Approach
14 What is Verification and Validation?
- Simply put, assuring that a software system meets the user's needs
- Verifying that the software is accurate and representative of its specification
- Validating that the software will do what the user really wants it to do
15 What is up with that I?
- I = Independent
- Financially: Funded from Corporate G&A for Agency-identified high-priority Projects
- Customer Project may also fund the effort
- Technically: The IVV program defines scope and tasks, tailored by an IVV criticality assessment
- Uses a predefined work breakdown structure
- Managerially: Functional management supplied by OSMA
- Project management supplied from the IVV program
16 So what is IVV?
- An engineering discipline employing rigorous methods for evaluating the correctness and quality of the software product throughout the software life cycle from a system-level viewpoint.
- The NASA Software IVV approach covers not only expected operating conditions but the full spectrum of the system and its interfaces in the face of unexpected operating conditions or inputs.
17 So what else is IVV?
- Testing at the end of the life cycle?
- No
- IVV is testing, but it is whole-life-cycle testing
- The IVV team tests artifacts ranging from system and software requirements to source code and test results
- Each task in the IVV WBS is designed to test a development artifact or process
18 What are the objectives of IVV?
- Find defects within the system, with a focus on software and its interactions with the system
- Assess whether or not the system is usable in an operational environment, again with a focus on the software within the system
- Identify any latent risks associated with the software
19 What is the goal of IVV?
Establish confidence that the software is fit for its purpose within the context of the system
- Note that the software may not be free from defects
- Rarely the case and difficult to prove
- The software must be good enough for its intended use
- As described by the requirements
- Correct requirements
- The type of use will determine the level of confidence that is needed
- Consequence of software defect/failure
20 Are there any other benefits to IVV?
- Primary purpose is to provide confidence to OSMA, however...
- Development projects receive all findings
- The good, the bad, the ugly
- Allows the PM to have an unbiased view of the software development effort
- Provides a knowledge resource for software developers
- In-phase IVV work provides early error detection and may save the project money in error correction
21 IVV is process- as well as product-oriented
22 IVV Increases Project Awareness
23 IVV Interfaces and Reporting
- Formal and informal interfaces with developers
- The formal interface is with an IVV Program project manager
- Informal interfaces exist between the IVV analysts and the developers
- Helps to get identified problems and issues into the appropriate hands quickly
- Results of the effort are thoroughly documented
- Issues identified to the developers in a timely manner
- Status reports to Project Management
- Monthly/Quarterly reviews to GPMCs/Directorates/HQs
- Project close-out report
- All inputs and outputs archived
- Final report delivered to the project for its own internal records
- Lessons learned documented throughout
24 NASA IVV
25 Agency Generic IVV Scoping and Costing Flow
26 The IVV Life Cycle
- An IVV Project follows a life cycle similar to most Projects
- Formulation
- Execution
- Close-out
27 Formulation Phase
- The Formulation phase is used to plan and scope the work to be performed
- Usually starts prior to the System Requirements Review (SRR) with initial planning and contact with the Project
- The planning and scoping process is generally executed between SRR and the Preliminary Design Review (PDR)
- A criticality analysis is developed as the foundation for the IVV effort on the project
- The effort addresses all of the software on a Project
- The process generates a tailored approach based on the results of the assessment
28 Execution Phase
- The majority of the IVV effort is performed
- Documented in an IVV Plan (IVVP) that is an output of the Formulation work
- The IVVP is provided to the Project for review and applicable concurrence
- The approach is taken from the WBS and tailored based on the results of the Formulation work
- The Execution phase generally ends around or shortly after launch
- In some cases, work may extend beyond launch when software is still being developed (MER)
29 IVV WBS for NASA Missions
- The purposes of the IVV Work Breakdown Structure are to:
- Provide a consistent approach to IVV across the Agency
- Provide a consistent and comprehensive basis for collection and reporting of metrics
- Help Projects anticipate and understand what IVV will do
- The IVV WBS was developed using industry standards and IVV history on NASA missions as reference
- IEEE Std. 1012-2004, IEEE Standard for Software Verification and Validation
- IEEE/EIA 12207.0-1996, Standard for Information Technology-Software Life Cycle Processes
- WBS Tasks for NASA Missions
- Task selection is based on an algorithm using software development risk
- Risk is generated based on various Project characteristics (size, complexity, reuse, risk, etc.) as part of the IVV planning and criticality analysis tasks
- The full WBS can be found at http://ims.ivv.nasa.gov/isodocs/IVV_09-1.pdf
30 IVV Activities Fit within the Project Schedule
[Schedule diagram: Project milestones (System Requirements Review, Preliminary Design Review, Critical Design Review, S/W FQT, System Test, Mission Readiness Review, Launch, System Retirement) shown against IVV milestones (Initial IVVP Signed, Baseline IVVP Signed, IVV Provides CoFR, IVV Final Report) and IVV WBS phases (1.0 Phase Independent Support, 2.0 Concept Phase, 3.0 Requirements Phase, 4.0 Design Phase, 5.0 Implementation Phase, 6.0 Test Phase, 7.0 Operations and Maintenance Phase)]
- IVV provides support and reports for Project milestones
- Technical Analysis Reports document major phases
- The IVVP is updated to match changes in the Project
- Designed to mesh with the Project schedule and provide timely inputs to mitigate risk
- Dialog between the IVV Facility and the Project begins before SRR
Note: numbers correspond to the IVV WBS
31 Close Out Phase
- The Close Out phase concludes the IVV effort
- All of the work performed is summarized in a final technical report
- Additionally, Lessons Learned are captured and either documented separately or incorporated into the final technical report
- In some cases, the IVV Team is retained to provide mission support during critical phases of the Project, which may occur after Close Out of the primary effort
32 The IVV Life Cycle Flow
[Flow diagram: IVV in phase with development, from System Requirements and Software Planning through Software Requirements, Design, Implementation, Validation Testing (with the Simulator/Environment/Hardware), and Maintenance, with verification activities at each step]
- Concept Phase: Focused activity at the earliest point. System requirements and the software role are important; issues are introduced at the lowest level.
- Verification: Covers all levels of testing. Ensures that the system meets the needs of the mission.
- Later life cycle activity is also important; issues are still introduced at the lowest level, focused more on individual components.
- IVV support continues over the initial operational phase and beyond, based on the mission profile.
33 IVV Testing Philosophy
[Diagram: levels of testing, from component-based testing (Unit Test of CSCs/CSCIs) through S/W Integration, S/W Functional Qualification Testing, Acceptance Testing, and System Integration and Test]
- Most testing is designed to show the software works within the envelope of the mission ("Test what you fly, fly what you test")
- The IVV approach is to focus more on off-nominal and unexpected situations in the software
- The higher the level of confidence needed, the deeper the analysis
- The guiding goal is not necessarily to perform additional testing
- The goal is to improve the Project's test planning and execution
- In some cases, IVV may independently test highly critical software
34 Forming an IVV Project
Independent Verification and Validation: The NASA Approach
35 IVV Project Requirements Background
- A critical first step is to develop the requirements for the IVV project
- A set of engineering/management tasks that are determined through a criticality analysis process
- Previously accomplished individually by different NASA contractors using different processes
- This sometimes led to confusion with the NASA development projects, as there was little consistency
- There was also a mixture of terminology used that was sometimes in conflict with other NASA terminology and industry-standard terminology
- There was also a perception among some parts of NASA that the IVV contractors were determining their own work
36 Software Integrity Level Assessment Process
- To help mitigate or eliminate some of these issues, the IVV Program undertook an initiative to develop a new process
- Examined the best of current criticality analysis processes from industry and academia
- The primary objective of the process is to develop the requirements for an IVV project
37 SILAP Goals
- Scalable: Reasonably applicable from the mission level down to the function level
- Risk-Based Ranking: A combination of Consequence (impact if the software component fails) and Error Potential (likelihood an error exists)
- Minimal Complexity: Relatively simple, such that it can be executed across a broad range of experience levels
- Minimal Impact: Minimize the level of participation from the project we are assessing
- Objective Criteria: Minimize the use of engineering judgment and maximize the use of measurable criteria
- Disjoint Tasking: Produce tasking that is different for each software integrity level
- Applicable: Applicable throughout the life cycle
- Understandable: The process and reasons for the results can be completely described and should make sense to a general engineer/project manager
38 Software Integrity Level (SIL)
- Software Integrity Levels
- Want to define, for a software component, the required level of integrity in terms of its role in the system
- Understand how the component fits within the system
- Understand what is required of that component to be able to maintain the functionality of the system
39 Software Integrity Level Definition
- Definition of Software Integrity Level:
- A range of values that represent software complexity, criticality, risk, safety level, security level, desired performance, reliability, or other project-unique characteristics that define the importance of the software to the user and acquirer
- The characteristics used to determine the software integrity level vary depending on the intended application and use of the system
- A software component can be associated with risk because:
- a failure (or defect) can lead to a threat, or
- its functionality includes mitigation of consequences of initiating events in the system's environment that can lead to a threat
- Developed using not only software but also system-level integrity as a basis (ISO/IEC 15026, 6)
40 Risk: A Common Denominator
- Previously, development projects (IVV stakeholders) could not easily link risk with the scoring that was performed
- A prime requirement for this new process is that it clearly define the system risk and link it to the software
- The process was built around two project factors, the combination of which defines some level of system risk linked to the software
- The factors are Consequence and Error Potential
41 Consequence vs. Error Potential
- Consequence is a measure of the system-level impact of an error in a software component
- Generally, take the worst-case error (at the software component level) that has a reasonable or credible fault/failure scenario
- Then consider the system architecture and try to understand how that software fault/failure scenario may affect the system
- Error Potential is a measure of the probability that the developer may insert an error into the software component
- An error is a defect in the human thought process
- A fault is a concrete manifestation of errors within the software
- A failure is a departure of the system behavior from the requirements
- With these definitions in mind, the approach is not to assess faults or failures, but to assess errors
- Scoring
42 Consequence
- Consequence consists of the following items:
- Human Safety: a measure of the impact that a failure of this component would have on human life
- Asset Safety: a measure of the impact that a failure would have on hardware
- Performance: a measure of the impact that a failure would have on a mission being able to meet its goals
43 Error Potential
- Error Potential consists of the following items:
- Developer Characteristics
- Experience: a measure of the system developer's experience in developing similar systems
- Organization: a measure of the complexity of the organization developing the system (distance and number of organizations involved tend to increase the probability of errors being introduced into the system)
- Software/System Characteristics
- Complexity: a measure of the complexity of the software being developed
- Degree of Innovation: a measure of the level of innovation needed in order to develop this system/software
- System Size: a measurement of the size of the system in terms of the software (i.e., Source Lines of Code)
44 Error Potential (2)
- Development Process Characteristics
- Formality of the Process: a measure of the maturity of the developer's processes
- Re-use Approach: a measure of the level of re-use for the system/software
- Artifact Maturity: a measure of the current state of the development documentation in relation to the state of the overall development project (e.g., the project is past critical design review but the requirements documents are still full of TBDs and incompletes)
45 Determining the Scores
- Using the criteria, each software component is assessed and a score generated
- The scores are then processed through an algorithm to create a final score for Consequence and for Error Potential
- The algorithm takes into account a weight for each of the characteristics

Consequence Factors (Weight):
- Human Safety: 0.0
- Asset Safety: 35.0
- Performance: 65.0

Error Potential Factors (Factor Weight, with Sub-Factor Weights):
- Developer: 57.9
  - Experience: 82.8
  - Development Organization: 17.2
- Development Process: 24.9
  - Formality of Process: 53.2
  - Re-use Approach: 22.6
  - Artifact Maturity: 24.2
- System/Software Characteristic: 17.2
  - Complexity: 54.7
  - Degree of Innovation: 35.1
  - Size of System: 10.2

Note that the Human Safety score carries no weight. Rather, it is treated in a special manner, as shown on the next slide.
46 Calculating Consequence
- The following algorithm is used to determine the final Consequence score:

If a component has no human safety impact, then Human Safety (hs) = 0; else score the Human Safety (hs) 1-5 using the criteria.
Score the Asset Safety (as) 1-5 using the criteria.
Score the Performance (pf) 1-5 using the criteria.
If hs > (0.35*as + 0.65*pf), then Final score = hs; else Final score = 0.35*as + 0.65*pf.

The first step defines the Human Safety score (hs). The last step is important, as it places emphasis on human safety by using it as an overriding score whenever it is larger than the sum of the weighted asset safety and performance scores (see the sketch below).
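To make the override step concrete, here is a minimal Python sketch of the Consequence calculation described above, using the 0.35/0.65 weights from the previous slide. The function name and argument layout are illustrative assumptions, not part of the official SILAP tooling.

```python
def consequence_score(hs: int, asset: int, perf: int) -> float:
    """Combine SILAP Consequence sub-scores.

    hs    -- Human Safety score (0 if no human safety impact, else 1-5)
    asset -- Asset Safety score (1-5)
    perf  -- Performance score (1-5)
    """
    weighted = 0.35 * asset + 0.65 * perf  # weighted Asset Safety + Performance
    # Human Safety overrides the weighted score when it is larger, so a
    # safety impact is never diluted by low asset/performance scores.
    return hs if hs > weighted else weighted


# Illustrative use: a modest human safety impact (3) dominates low
# asset/performance scores; with no safety impact the weighted sum is used.
print(consequence_score(hs=3, asset=1, perf=2))  # -> 3
print(consequence_score(hs=0, asset=4, perf=5))  # -> 0.35*4 + 0.65*5 = 4.65
```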
47 Calculating Error Potential
- The algorithm for the Error Potential calculation has no special provisions
- It is simply a sum of the weighted scores:

Error Potential = W_developer * (Σ wi*vi) + W_process * (Σ wi*vi) + W_system/software * (Σ wi*vi)

- The first three terms (W_developer, W_process, W_system/software) are the high-level factor weights
- Each sum runs over that factor's sub-factor attributes, which have:
- Values (vi), generated during the assessment
- Weights (wi), pre-defined

Note that all scores are rounded to the next whole integer.
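For illustration, a short Python sketch of this weighted sum, using the factor and sub-factor weights from the "Determining the Scores" slide expressed as fractions of 1.0, and interpreting "rounded to the next whole integer" as a ceiling. The dictionary layout and names are assumptions made for the sketch, not the official SILAP implementation.

```python
import math

# Factor weights and sub-factor weights from the SILAP weight table,
# expressed as fractions of 1.0 (e.g., 57.9 -> 0.579).
EP_WEIGHTS = {
    "developer": (0.579, {"experience": 0.828,
                          "development_organization": 0.172}),
    "development_process": (0.249, {"formality_of_process": 0.532,
                                    "reuse_approach": 0.226,
                                    "artifact_maturity": 0.242}),
    "system_software": (0.172, {"complexity": 0.547,
                                "degree_of_innovation": 0.351,
                                "size_of_system": 0.102}),
}


def error_potential(values: dict) -> int:
    """Weighted sum of sub-factor values (each scored 1-5), rounded up
    to the next whole integer."""
    total = 0.0
    for factor_weight, sub_weights in EP_WEIGHTS.values():
        total += factor_weight * sum(w * values[name]
                                     for name, w in sub_weights.items())
    return math.ceil(total)


# Illustrative scoring of one software component (values are made up).
scores = {"experience": 2, "development_organization": 4,
          "formality_of_process": 3, "reuse_approach": 5, "artifact_maturity": 2,
          "complexity": 4, "degree_of_innovation": 3, "size_of_system": 2}
print(error_potential(scores))  # -> 3 for these illustrative values
```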
48 Developing Tasking
- A tasking set is based on each individual score:
- Tasking associated with a given Consequence score
- Tasking associated with a given Error Potential score
- One set of tasks per component
- The tasks are not exclusive to a given score
- This results in a matrix of software components and scores that provides the starting set of requirements for IVV on that project (a selection sketch in code follows below)
- The current matrix of scores and tasks is provided on the next slides
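A minimal Python sketch of how a component's tasking set could be derived from such a matrix: the union of the tasks invoked at its Consequence score and those invoked at its Error Potential score. The matrix fragment below is purely illustrative; the actual score-to-task mapping is shown on the following slides.

```python
# Illustrative fragment of the tasking matrix: for each task, the
# Consequence scores and Error Potential scores at which it is invoked.
# (Score assignments here are placeholders, not the real matrix.)
TASKING_MATRIX = {
    "1.6 Criticality Analysis":                     ({1, 2, 3, 4, 5}, {1, 2, 3, 4, 5}),
    "3.2 Software Requirements Evaluation":          ({3, 4, 5},       {4, 5}),
    "5.2 Source Code and Documentation Evaluation":  ({4, 5},          {4, 5}),
    "6.3 Simulation Analysis":                       ({5},             set()),
}


def tasks_for_component(consequence: int, error_potential: int) -> set:
    """Union of tasks triggered by either score: tasks are not exclusive
    to a given score, and each component gets one combined set."""
    return {task for task, (cons_scores, ep_scores) in TASKING_MATRIX.items()
            if consequence in cons_scores or error_potential in ep_scores}


print(sorted(tasks_for_component(consequence=4, error_potential=3)))
```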
49 IVV Tasking Matrix
Factor Scores: Consequence 1 2 3 4 5 | Error Potential 1 2 3 4 5
1.0 Phase Independent Support
1.1 Management and Planning of Independent Verification and Validation X X X X X X X X X X
1.2 Issue and Risk Tracking X X X X X X X X
1.3 Final Report Generation X X X X X X X X
1.4 IVV Tool Support X X X X X X X X
1.5 Management and Technical Review Support X X X X X X X X X X
1.6 Criticality Analysis X X X X X X X X X X
1.7 Identify Process Improvement Opportunities in the Conduct of IVV X X X X X X X X
Items with a caret (^) next to them are only invoked when human safety is involved.
50 IVV Tasking Matrix (2)
Factor Scores: Consequence 1 2 3 4 5 | Error Potential 1 2 3 4 5
2.0 Concept Phase
2.1 Reuse Analysis X X X
2.2 Software Architecture Assessment X X X
2.3 System Requirements Review X X X X X
2.4 Concept Document Evaluation X X X
2.5 Software/User Requirements Allocation Analysis X X X
2.6 Traceability Analysis X X X
3.0 Requirements Phase
3.1 Traceability Analysis - Requirements X X X X X X
3.2 Software Requirements Evaluation X X X X X
3.3 Interface Analysis - Requirements X X X X X
3.4 System Test Plan Analysis X X X
3.5 Acceptance Test Plan Analysis X
3.6 Timing and Sizing Analysis X X
51 IVV Tasking Matrix (3)
Factor Scores: Consequence 1 2 3 4 5 | Error Potential 1 2 3 4 5
4.0 Design Phase
4.1 Traceability Analysis - Design X X X X
4.2 Software Design Evaluation X X X X
4.3 Interface Analysis - Design X X X
4.4 Software FQT Plan Analysis X X X X
4.5 Software Integration Test Plan Analysis X X
4.6 Database Analysis X X X
4.7 Component Test Plan Analysis X
52 IVV Tasking Matrix (4)
Factor Scores: Consequence 1 2 3 4 5 | Error Potential 1 2 3 4 5
5.0 Implementation Phase
5.1 Traceability Analysis - Code X X X X X
5.2 Source Code and Documentation Evaluation X X X X X
5.3 Interface Analysis - Code X X X X X
5.4 System Test Case Analysis X X
5.5 Software FQT Case Analysis X X
5.6 Software Integration Test Case Analysis X
5.7 Acceptance Test Case Analysis X
5.8 Software Integration Test Procedure Analysis X
5.9 Software Integration Test Results Analysis X X
5.10 Component Test Case Analysis X
5.11 System Test Procedure Analysis X
5.12 Software FQT Procedure Analysis X
53 IVV Tasking Matrix (5)
Factor Scores: Consequence 1 2 3 4 5 | Error Potential 1 2 3 4 5
6.0 Test Phase
6.1 Traceability Analysis - Test X X X X X
6.2 Regression Test Analysis X X
6.3 Simulation Analysis X
6.4 System Test Results Analysis X X
6.5 Software FQT Results Analysis X X
7.0 Operations and Maintenance Phase
7.1 Operating Procedure Evaluation X
7.2 Anomaly Evaluation X
7.3 Migration Assessment X
7.4 Retirement Assessment X
54 IVV Relationships
55 IVV Facility Relationship to HQ
- IVV reports annual performance to, and receives its approved budget from, the IBD (chaired by OSMA)
- AA/OSMA delegates the Program to the GSFC Center Director
- The IVV Facility Director is the Program Manager
- The Facility works with the OSMA IVV Liaison to coordinate IBD budget inputs and performance reporting
- OSMA works with the IBD to identify and prioritize Projects annually
56 IVV/Center/Project Relationships
- IVV-Project Relationship
- IVV still reports issues to the Project first and treats the Project as the primary customer for technical findings and risks
- As a Code Q Program, IVV will keep Center SMA personnel informed of IVV technical issues so that SMA has a complete mission assurance picture
- IVV-Center Relationship
- The Center Liaison facilitates the startup of IVV on new Projects
- The Center Liaison and IVV Facility Leads facilitate technical issue resolution
- The Center Liaison promotes consistent approaches to IVV on Projects and promotes awareness of IVV Center-wide
- SMA, Projects, and IVV provide technical status and issues to the GPMC
- IVV reports to the GSFC PMC as a Program Office
57 Closing
Independent Verification and Validation: The NASA Approach
- Software IVV, as practiced by the NASA Software IVV Facility, is a well-defined, proven systems engineering discipline designed to reduce the risk in major software developments
58 Points of Contact
- Bill Jackson
- Acting Director
- 304-367-8202
- Bill.L.Jackson@nasa.gov
- Ken Costello
- Lead Engineer
- 304-367-8343
- Kenneth.A.Costello@nasa.gov