Title: The Software Development Life Cycle: An Overview
1 The Software Development Life Cycle: An Overview
- Presented by Maxwell Drew and Dan Kaiser
- Southwest State University
- Computer Science Program
2 Last Time
- Brief review of the testing process
- Dynamic Testing Methods
- Static Testing Methods
- Deployment in MSF
- Deployment in RUP
3 Session 8: Security and Evaluation
- General Systems Engineering Concepts
- Information Systems Security Engineering Process
- Relation of ISSE Process to other Processes
- Product, Process, and Resource Evaluation
- Course Evaluations
4 Information Systems Security Engineering
- General Systems Engineering Concepts
- Information Systems Security Engineering Process
- Relation of ISSE Process to other Processes
5 Systems Engineering Process
6 Discover Needs
- Mission/Business Description
- Policy Consideration
- Mission Needs Statement (MNS)
- Concept of Operations (CONOPS)
7 Define System Functionality
- Objectives - MoE (Measures of Effectiveness)
- System Context/Environment
- Requirements - RTM (Requirements Traceability Matrix; sketched below)
- Functional Analysis
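An RTM is essentially a table that traces each requirement forward to the design elements that satisfy it and to the tests that verify it. A minimal Python sketch follows; the field names and the two sample requirements are illustrative assumptions, not part of any standard.

    # Minimal requirements traceability matrix (RTM) sketch.
    # Field names and the two sample requirements are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class RtmEntry:
        req_id: str                                        # requirement identifier
        text: str                                          # requirement statement
        allocated_to: list = field(default_factory=list)   # design elements / CIs
        verified_by: list = field(default_factory=list)    # tests or analyses

    rtm = [
        RtmEntry("SR-001", "Classified records shall be encrypted at rest",
                 allocated_to=["CI-Crypto"], verified_by=["TEST-ENC-01"]),
        RtmEntry("SR-002", "Only authorized users may update financial records",
                 allocated_to=["CI-AccessControl"], verified_by=[]),
    ]

    # Basic traceability check: every requirement should trace forward to at
    # least one design element and one verification activity.
    untraced = [e.req_id for e in rtm if not (e.allocated_to and e.verified_by)]
    print("Untraced requirements:", untraced or "none")    # -> ['SR-002']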
8 Define System
- Functional Allocation - CM
- Preliminary Design - Baseline Configuration
- Detailed Design - CI (Configuration Item)
9 Implement System
10 Assess Effectiveness
- Interoperability
- Availability
- Training
- Human/Machine Interface
- Cost
11 ISSE Activities
- Describing information protection needs
- Generating information protection requirements based on needs early in the systems engineering process
- Satisfying the requirements at an acceptable level of information protection risk
- Building a functional information protection architecture based on requirements
- Allocating information protection functions to a physical and logical architecture
- Designing the system to implement the information protection architecture
- Balancing information protection risk management and other ISSE considerations within the overall system context of cost, schedule, and operational suitability and effectiveness
12 ISSE Activities (continued)
- Participating in trade-off studies with other information protection and system engineering disciplines
- Integrating the ISSE process with the systems engineering and acquisition processes
- Testing the system to verify information protection design and validate information protection requirements
- Supporting the customers after deployment and tailoring the overall process to their needs
13 Discover Information Protection Needs
14 Layered Requirements Hierarchy
15 Mission Information Protection Needs
- What kind of information records are being viewed, updated, deleted, initiated, or processed (classified, financial, proprietary, personal, private, etc.)?
- Who or what is authorized to view, update, delete, initiate, or process information records?
- How do authorized users use the information to perform their duties?
- What tools (paper, hardware, software, firmware, and procedures) are authorized users using to perform their duties?
- How important is it to know with certainty that a particular individual sent or received a message or file?
16 Threats to Information Management
- Types of Information
- Legitimate users and uses of information
- Threat agent considerations
  - Capability
  - Intent
  - Willingness
  - Motivation
  - Damage to mission
17 Information Protection Policy Considerations
- Why protection is needed
- What protection is needed
- How protection is achieved (not considered at this stage)
18 Information Protection Policy Issues
- The resources/assets the organization has determined are critical or need protection
- The roles and responsibilities of individuals that will need to interface with those assets (as part of their operational mission needs definition)
- The appropriate ways (authorizations) in which authorized individuals may use those assets (security requirements)
19 Define Information Protection System
- Information Protection Objectives - MoE
- System Context/Environment
- Information Protection Requirements - RTM
- Functional Analysis
20 Information Protection Objectives Should Explain
- The mission objectives supported by the information protection objective
- The mission-related threat driving the information protection objective
- The consequences of not implementing the objective
- Information protection guidance or policy supporting the objective
21 Design Information Protection System
- Functional Allocation
- Preliminary Information Protection Design
- Detailed Information Protection Design
22 Preliminary Information Protection Design Activities
- Reviewing and refining Discover Needs and Define System activities' work products, especially definition of the CI-level and interface specifications
- Surveying existing solutions for a match to CI-level requirements
- Examining rationales for proposed solutions at the PDR (Preliminary Design Review) level of abstraction
- Verifying that CI specifications meet higher-level information protection requirements
- Supporting the certification and accreditation processes
- Supporting information protection operations development and life-cycle management decisions
- Participating in the system engineering process
23 Detailed Information Protection Design Activities
- Reviewing and refining previous Preliminary Design work products
- Supporting system- and CI-level design by providing input on feasible information protection solutions and/or review of detailed design materials
- Examining technical rationales for CDR-level (Critical Design Review) solutions
- Supporting, generating, and verifying information protection test and evaluation requirements and procedures
- Tracking and applying information protection assurance mechanisms
- Verifying that CI designs meet higher-level information protection requirements
- Completing most inputs to the life-cycle security support approach, including providing information protection inputs to training and emergency training materials
- Reviewing and updating information protection risk and threat projections, as well as any changes to the requirements set
- Supporting the certification and accreditation processes
- Participating in the system engineering process
24 Implement Information Protection System
25 Implement Information Protection System - General Activities
- Updates to the system information protection threat assessment, as projected, to the system's operational existence
- Verification of system information protection requirements and constraints against implemented information protection solutions, and associated system verification and validation mechanisms and findings
- Tracking of, or participation in, application of information protection assurance mechanisms related to system implementation and testing practices
26 Implement Information Protection System - General Activities (cont.)
- Further inputs to and review of evolving system operational procedure and life-cycle support plans, including, for example, Communications Security (COMSEC) key distribution or releasability control issues within logistics support, and information protection-relevant elements within system operational and maintenance training materials
- A formal information protection assessment in preparation for the Security Verification Review
- Inputs to Certification and Accreditation (C&A) process activities as required
- Participation in the collective, multidisciplinary examination of all system issues
27 Build Information Protection System
- Physical Integrity
- Have the components used in production been properly safeguarded against tampering?
- Personnel Integrity
- Are the people assigned to construct or assemble the system knowledgeable in proper assembly procedures, and are they cleared to the level necessary to ensure system trustworthiness?
28 Test Information Protection System Activities
- Reviewing and refining Design Information Protection System work products
- Verifying system- and CI-level information protection requirements and constraints against implemented solutions and associated system verification and validation mechanisms and findings
- Tracking and applying information protection assurance mechanisms related to system implementation and testing practices
- Providing inputs to and review of the evolving life-cycle security support plans, including logistics, maintenance, and training
- Continuing risk management activities
- Supporting the certification and accreditation processes
- Participating in the systems engineering process
29 Assess Effectiveness
- Interoperability
- Does the system protect information correctly across external interfaces?
- Availability
- Is the system available to users to protect information and information assets?
- Training
- What degree of instruction is required for users to be qualified to operate and maintain the information protection system?
- Human/Machine Interface
- Does the human/machine interface contribute to users making mistakes or compromising information protection mechanisms?
- Cost
- Is it financially feasible to construct and/or maintain the information protection system?
30 Relation to Other Processes
- System Acquisition Process
- Risk Management Process
- DITSCAP (DoD Information Technology Security Certification and Accreditation Process)
- Common Criteria International Standard
31 ISSE and System Acquisition Process Flows
32 Risk Management Process
33 Risk Decision Flow
34 Risk Plane
35 DITSCAP Flow
36 Security Concepts Relationships
37 Protection Profile
38 Evaluation Concepts Relationships
39 Use of Evaluation Results
40 Questions?
41 Evaluation
- General Techniques
- Evaluating the Product
- Evaluating the Process
- Evaluating Resources
42 Categories of Evaluation
- Feature analysis
- rate and rank attributes
- Survey
- document relationships
- Case study
- sample from variables
- Formal experiment
- sample over variables
43 Example Feature Analysis
Feature                   Importance   Tool 1      Tool 2       Tool 3
                                       (t-OO-l)    (ObjecTool)  (EasyDesign)
Good user interface       3            4           5            4
Object-oriented design    5            5           5            5
Consistency checking      3            5           3            1
Use cases                 2            5           4            4
Runs on Unix              5            4           4            5
Score                                  82          77           73
Table 12.1. Design tool ratings
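The score row in Table 12.1 is simply the importance-weighted sum of each tool's feature ratings. A minimal Python sketch of that calculation, using the values as reconstructed in the table above:

    # Weighted feature-analysis scoring, as in Table 12.1.
    importance = {
        "Good user interface": 3,
        "Object-oriented design": 5,
        "Consistency checking": 3,
        "Use cases": 2,
        "Runs on Unix": 5,
    }
    ratings = {  # feature -> (t-OO-l, ObjecTool, EasyDesign)
        "Good user interface": (4, 5, 4),
        "Object-oriented design": (5, 5, 5),
        "Consistency checking": (5, 3, 1),
        "Use cases": (5, 4, 4),
        "Runs on Unix": (4, 4, 5),
    }

    tools = ("t-OO-l", "ObjecTool", "EasyDesign")
    for i, tool in enumerate(tools):
        score = sum(importance[f] * ratings[f][i] for f in importance)
        print(tool, score)   # t-OO-l 82, ObjecTool 77, EasyDesign 73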
44 Case Study Types
- Sister projects
- each is typical and has similar values for the independent variables
- Baseline
- compare single project to organizational norm
- Random selection
- partition single project into parts
45 Formal Experiment
- Controls variables
- Uses methods to reduce bias and eliminate confounding factors (e.g., random assignment, sketched below)
- Often replicated
- Instances are representative
- sample over the variables (whereas a case study samples from the variables)
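One standard device for reducing bias and confounding is to assign subjects to treatments at random, so that uncontrolled factors are spread evenly across groups. A minimal sketch; the subject names and the two treatments are invented placeholders:

    # Random assignment of subjects to treatment groups, a basic
    # bias-reduction step in a formal experiment. Names are placeholders.
    import random

    subjects = [f"team-{i}" for i in range(1, 13)]
    treatments = ("design tool A", "design tool B")

    random.shuffle(subjects)              # break any systematic ordering
    half = len(subjects) // 2
    groups = {treatments[0]: subjects[:half],
              treatments[1]: subjects[half:]}

    for treatment, members in groups.items():
        print(treatment, members)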
46 Evaluation Steps
- Setting the hypothesis
- the tentative supposition that we think explains the behavior we want to explore
- Maintaining control over variables
- decide what affects our hypothesis
- Making the investigation meaningful
- determine the degree to which results can be generalized
47 Common Evaluation Pitfalls
Pitfall                 Description
1. Confounding          Another factor is causing the effect.
2. Cause or effect?     The factor could be a result, not a cause, of the treatment.
3. Chance               There is always a small possibility that your result happened by chance.
4. Homogeneity          You can find no link because all subjects had the same level of the factor.
5. Misclassification    You can find no link because you cannot accurately classify each subject's level of the factor.
6. Bias                 Selection procedures or administration of the study inadvertently bias the result.
7. Too short            The short-term effects are different from the long-term ones.
8. Wrong amount         The factor would have had an effect, but not in the amount used in the study.
9. Wrong situation      The factor has the desired effect, but not in the situation studied.
Table 12.2. Common pitfalls in evaluation. Adapted with permission from (Liebman 1994)
48 Assessment vs. Prediction
- An assessment system examines an existing entity by characterizing it numerically
- A prediction system predicts a characteristic of a future entity; it involves a model with associated prediction procedures (the two kinds are contrasted in the sketch below)
- deterministic prediction (we always get the same output for a given input)
- stochastic prediction (output varies probabilistically)
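The two kinds of prediction are easiest to contrast side by side: a deterministic system returns the same value for a given input, while a stochastic one returns a value that varies probabilistically around the model's estimate. A minimal sketch, using an invented linear effort model purely for illustration:

    # Deterministic vs. stochastic prediction, using an invented effort
    # model (effort = 2.5 * size in KLOC); the coefficients are not real.
    import random

    def predict_deterministic(size_kloc: float) -> float:
        """Same input always yields the same predicted effort."""
        return 2.5 * size_kloc

    def predict_stochastic(size_kloc: float) -> float:
        """Prediction varies probabilistically around the model estimate."""
        return random.gauss(2.5 * size_kloc, 0.4 * size_kloc)

    print(predict_deterministic(10.0))    # always 25.0
    print(predict_stochastic(10.0))       # varies from call to call, e.g. 23.7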
49 Product Quality Models
- Boehm's Model
- ISO 9126 Model
50 Boehm's Model
51 ISO 9126 Model
52 Targeting
Table 12.5. Quantitative targets for managing US defense projects (NetFocus 1995)
53 Software Reuse
- Producer reuse
- creating components for someone else to use
- Consumer reuse
- using components developed for some other product
- Black-box reuse
- using a component without modification
- Clear- or white-box reuse
- modifying a component before reusing it (the two styles are illustrated in the sketch below)
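The black-box/white-box distinction shows up directly in code: black-box reuse calls an existing component through its interface without touching it, while clear- or white-box reuse modifies the component's internals before reuse. A minimal sketch with an invented ReportFormatter component:

    # Black-box vs. clear-/white-box reuse of an invented ReportFormatter.
    from datetime import date

    class ReportFormatter:                     # the component being reused
        def format(self, title: str, body: str) -> str:
            return f"== {title} ==\n{body}"

    # Black-box reuse: use the component as-is, through its interface.
    print(ReportFormatter().format("Status", "All tests passing."))

    # Clear-/white-box reuse: modify (here, override) part of the component
    # before reusing it in the new product.
    class TimestampedFormatter(ReportFormatter):
        def format(self, title: str, body: str) -> str:
            return super().format(f"{title} ({date.today()})", body)

    print(TimestampedFormatter().format("Status", "All tests passing."))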
54 Process Evaluation
- Postmortem Analysis
- a post-implementation assessment of all aspects of the project
- Process Maturity Models
- development has built-in feedback and control mechanisms to spur improvement
55 Postmortem Analysis
- Design and promulgate a project survey to collect relevant data
- Collect objective project information
- Conduct a debriefing meeting
- Conduct a project history day
- Publish the results, focusing on lessons learned
56 Table 12.9. When post-implementation evaluation is done
Time period                     Percentage of respondents (of 92 organizations)
Just before delivery            27.8
At delivery                     4.2
One month after delivery        22.2
Two months after delivery       6.9
Three months after delivery     18.1
Four months after delivery      1.4
Five months after delivery      1.4
Six months after delivery       13.9
Twelve months after delivery    4.2
57 Capability Maturity Model (CMM)
58 Table 12.10. Required questions for level 1 of process maturity model
Question number   Question
1.1.3     Does the Software Quality Assurance function have a management reporting channel separate from the software development project management?
1.1.6     Is there a software configuration control function for each project that involves software development?
2.1.3     Is a formal process used in the management review of each software development prior to making contractual commitments?
2.1.14    Is a formal procedure used to make estimates of software size?
2.1.15    Is a formal procedure used to produce software development schedules?
2.1.16    Are formal procedures applied to estimating software development cost?
2.2.2     Are profiles of software size maintained for each software configuration item over time?
2.2.4     Are statistics on software code and test errors gathered?
2.4.1     Does senior management have a mechanism for the regular review of the status of software development projects?
2.4.7     Do software development first-line managers sign off on their schedule and cost estimates?
2.4.9     Is a mechanism used for controlling changes to the software requirements?
2.4.17    Is a mechanism used for controlling changes to the code?
59 Table 12.11. Key process areas in the CMM (Paulk et al. 1993)
60 Evaluating Resources
- People Capability Maturity Model (P-CMM)
- goal is to improve the workforce
61 Table 12.13. People capability maturity model (Curtis, Hefley, and Miller 1995)
Level 5 - Optimizing
- Focus: continuous knowledge and skills improvement
- Key practices: continuous workforce innovation; coaching; personal competency development
Level 4 - Managed
- Focus: effectiveness measured and managed; high-performance teams developed
- Key practices: organizational performance alignment; organizational competency management; team-based practices; team building; mentoring
Level 3 - Defined
- Focus: competency-based workforce practices
- Key practices: participatory culture; competency-based practices; career development; competency development; workforce planning; knowledge and skills analysis
Level 2 - Repeatable
- Focus: management takes responsibility for managing its people
- Key practices: compensation; training; performance management; staffing; communication; work environment
Level 1 - Initial
62 Questions?
63 Course Evaluations