Title: CS 521
1. CS 521 - Software Engineering Analysis
2. Course Topics
- Measurement
- Basic Concepts
- Measurement in SE
- Empirical Strategies
- Surveys
- Case Studies
- Experiments
- Empiricisms in SE
- Experiment Planning
- Hypothesis Formulation
- Variable Selection
- Subjects
- Design
- Instrumentation
- Validity
- Threats to Validity
- Evaluation
- Cyber-Physical Systems
- Definition
- Specification
- Design
- Analysis
- Testing
- Security and Trustworthiness
3. Cyber-physical system (CPS)
- A system featuring a tight combination of, and coordination between, the system's computational and physical elements.
- Today, a precursor generation of cyber-physical systems can be found in areas as diverse as aerospace, automotive, chemical processes, civil infrastructure, energy, healthcare, manufacturing, transportation, entertainment, and consumer appliances.
- This generation is often referred to as embedded systems. In embedded systems the emphasis tends to be more on the computational elements, and less on an intense link between the computational and physical elements.
- Unlike more traditional embedded systems, a full-fledged CPS is typically designed as a network of interacting elements instead of as standalone devices. The expectation is that in the coming years ongoing advances in science and engineering will improve the link between computational and physical elements, dramatically increasing the adaptability, autonomy, efficiency, functionality, reliability, safety, and usability of cyber-physical systems.
4. Recent developments
- Infrastructure getting modernized
- Ratio of advanced to regular meters: 4.7% (FERC, 2008)
- Island of Malta becomes a smart grid island
- Enemalta and Water Services Corp. to conduct remote monitoring of 250,000 smart meters
- 400,000 population
- 90M expense
- Network to be completed by 2012
- Remote monitoring, meter reading, and real-time management of the network
- Real-time monitoring and smart meters -> time-of-day pricing (a small pricing sketch follows this slide)
- Xcel Energy Boulder as first Smart Grid City in the U.S.
- First fully integrated smart grid in the U.S.
- PG&E rolling out several million smart meters in Northern California
- Alliander - Amsterdam green grid city project
- Several hundred households
- Target completion 2012
- Total 1B investment
- Estimated cost 410/household over 15 years for installation of the smart grid
- Expected emissions reduction of 40% by 2025
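The time-of-day pricing bullet above is, at bottom, simple arithmetic: the same kWh cost different amounts depending on when they are consumed. The following is a minimal sketch only; the flat rate, the time-of-day rate schedule, and the 24-hour usage profile are all invented for illustration and are not from the slides.

```python
# Hedged illustration: compare a flat tariff with a hypothetical time-of-day tariff.
# All rates and the hourly usage profile are assumptions made for this example.

FLAT_RATE = 0.15  # $/kWh, assumed

def tod_rate(hour: int) -> float:
    """Hypothetical time-of-day tariff: cheap off-peak, expensive evening peak."""
    if 16 <= hour < 21:   # peak window
        return 0.30
    if 7 <= hour < 16:    # shoulder
        return 0.15
    return 0.08           # off-peak (night)

# Assumed household usage in kWh for each hour of one day (index = hour).
usage = [0.4] * 7 + [0.8] * 9 + [1.5] * 5 + [0.6] * 3

flat_cost = sum(usage) * FLAT_RATE
tod_cost = sum(kwh * tod_rate(h) for h, kwh in enumerate(usage))
print(f"flat: ${flat_cost:.2f}   time-of-day: ${tod_cost:.2f}")
```

With these made-up numbers, shifting consumption out of the 4-9 pm window is what lowers the time-of-day bill; that shift is exactly the behavior smart meters plus pricing signals are meant to induce.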
5. The utility industry is undergoing rapid change - Google PowerMeter
- Source: SmartGridNews.com
- Google is announcing Google PowerMeter, which
will ultimately become an open platform for home
energy information. - PowerMeter is currently in internal beta testing.
About four dozen Google employees have home
energy monitors to record their power usage (as
proxies for the smart meters of the future). A
Home Energy gadget on their iGoogle home pages
shows them how much energy they are using. The
gadget tracks historical data and forecasts
future trends (similar to the displays available
for some of Google's finance applications).
- Giving customers information so they can act and help with demand response
- The PowerMeter Platform
- Underneath the PowerMeter gadget is an open
systems platform that Google equates to Google
Maps, the highly successful geospatial system
that has become the foundation for thousands of
applications. - Although the company uses the Maps comparison,
PowerMeter may actually have more in common with
Google Android and Google Health. Android is a
platform for building mobile phone applications.
It deals not just with data, but also with
hardware. In a similar fashion, Google PowerMeter
will ultimately need to interface with smart
meters, thermostats and other devices.
- Intelligent software with real-time
information pushes consumption away from high
peak load areas
6. Wikipedia: Smart Grid
- A smart grid delivers electricity from suppliers
to consumers using digital technology to save
energy, reduce cost and increase reliability. - Such a modernized electricity network is being
promoted by many governments as a way of
addressing energy independence or global warming
issues. - For example, if smart grid technologies made the
United States grid 5% more efficient, it would equate to eliminating the fuel and greenhouse gas emissions from 53 million cars.
- United States Congress to pass legislation that
included doubling alternative energy production
in the next three years and building a new
electricity "smart grid". - Alternative fuel sources would require a smart
and flexible grid
7. Open protocols and standards are the way to go
- "(F)Â OPEN PROTOCOLS AND STANDARDS. The
Secretary shall require as a condition of
receiving funding under this subsection that
demonstration projects utilize open protocols and
standards (including Internet-based protocols and
standards) if available and appropriate." (P.30,
Section 405 A-F). - Government plays an important role
8. So where are we headed with the Smart Grid?
- Long term - decades
- What is the model of the Smart Grid?
- Bringing it alive?
- Knows its status - sensors
- Makes smart decisions - intelligent, decentralized decision making
- Fixes/modifies/evolves itself like a living organism - control
- Changes its topology
- This is the queen of infrastructures
- Short and medium term
- Is flexible - quick repair
- Can take in new energy sources such as wind / solar / renewables?
- Can we connect to PHEVs?
- Do we have any smart grids today? YES!
9. Organize Thought Leadership Forums on the Smart Grid of the Future
- Next Forum, June 18, 2009
- Current technical limitations of the 100-year-old
electric grid infrastructure in the United States - Various visions of the Smart Grid from DOE,
National Labs, how it relates to technologies
available today - Technologies adopted by successful
implementations of Smart Grid across the US and
abroad - Open-systems wireless/comm interface software and
standards-based approach
- Advanced wireless, RFID and RF-sensor technologies and their convergence with the grid
- California Energy Commission, Defense Energy Support Center, Electric Power Group, EPRI, Global Quality Corp., Hughes Network Systems, ISGEC Group, ISMB (Istituto Superiore Mario Boella), LanTech, Inc., Motorola, Inc., San Diego Gas & Electric, Sempra Energy/The Gas Company, Southern California Edison Company, Southern Contracting Company, TechnoCom Corporation, Universal Devices, Inc., University of South Carolina, Utility Consulting Group
10. Previous forum, March 18, 2009
- Electric Power Group
- Qualcomm Ventures
- Capgemini
- Los Angeles Dept. of Water & Power
- BC Hydro
- Lawrence Berkeley National Laboratory
- Sempra Energy/The Gas Company
- NERC Cyber Security CIP Program
- Siemens
- Oracle Corporation
- Next forum, November 2009
11. RESEARCH
- Variable and uncertain sources
- Solar
- Wind
- Variable sinks
- Appliances
- Spatial and Temporal sources
- and sinks
- PHEVs / hybrids
- Demand Response
- Plug and Play
- Open Architecture
- An intelligent network making decisions
12. Infrastructure upgrade: challenge but opportunity
- Electric grid set up about 100 years ago
- 157,000 miles of high-voltage electric transmission lines
- Since 1990, demand has increased 25%
- Construction of power plants has decreased 30%
- Recent history:
- Wikipedia: The California energy crisis was
characterized by a combination of extremely high
prices and rolling blackouts. Price instability
and spikes lasted from May 2000 to September
2001. Due to price controls, utility companies
were paying more for electricity than they were
allowed to charge customers, forcing the
bankruptcy of Pacific Gas and Electric and the
public bail out of Southern California Edison.
This led to a shortage in energy and therefore,
blackouts. Rolling blackouts began in June 2000
and recurred several times in the following 12
months. - 2003 rolling blackout (Cleveland isolation, 55M
people affected) - Opportunity to support changing demands of the
customer via a flexible infrastructure
13. Demand response
- Demand Response definition (LBL): DR is a set of time-dependent activities that reduce or shift electricity use to improve electricity grid reliability, manage electricity costs, and encourage load shifting or shedding when the grid is near its capacity or electricity prices are high. (A minimal sketch of this idea follows this slide.)
- LBL Demand Response 2004 Test
- FERC: 8 percent of energy consumers in the US have a demand response program
- Potential demand response from all U.S. programs: 41,000 MW, or 5.8% of peak demand
- An increase of 3,400 MW over the 2006 estimate
- Largest demand response resource contributions from the Mid-Atlantic, Midwestern and Southeastern regions
- Ontario Smart Grid Forum: "...providing transparent electricity prices to consumers together with time-of-use rates can lead to consumption reductions that range from five to fifteen per cent."
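As promised above, here is a minimal sketch of the demand-response idea in the LBL definition: shed or defer loads when the price signal is high or the grid is near capacity. The thresholds, device names, and price feed are hypothetical, not part of any system described in these slides.

```python
# Minimal demand-response sketch: decide which deferrable loads to shed or shift
# for the current interval. All thresholds and load names are invented.

PRICE_THRESHOLD = 0.25      # $/kWh, assumed trigger price
GRID_LOAD_THRESHOLD = 0.95  # fraction of capacity, assumed

deferrable_loads = {"ev_charger": 6.6, "water_heater": 4.5, "pool_pump": 1.1}  # kW

def demand_response(price: float, grid_load: float, loads: dict) -> list:
    """Return the deferrable loads to shed, largest first, when DR is triggered."""
    if price >= PRICE_THRESHOLD or grid_load >= GRID_LOAD_THRESHOLD:
        return sorted(loads, key=loads.get, reverse=True)
    return []

print(demand_response(price=0.32, grid_load=0.91, loads=deferrable_loads))
# -> ['ev_charger', 'water_heater', 'pool_pump']
```

A real controller would also shift the deferred consumption to a cheaper interval rather than simply dropping it, which is the "load shifting" half of the definition.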
14. A new grid over the next 25-50 years: Data network vs. power network - is there a parallel?
15. Challenges and research opportunities
- Lack of a clear definition of what the Smart Grid will or should look like
- Lack of clear articulation from leaders to citizens on the benefits and reason for investment
- Lack of agreed interfaces between devices, networks, appliances, meters, and infrastructure (need for open interfaces)
- Lack of acceptance of the problem that vendors' systems sometimes do not talk to each other even when standard interfaces are developed
- Economic justification at the unit level (home, office, factory) is challenging
- How does one pay for the investment?
- Who pays?
- How does the utility charge for it? Utilities are highly regulated
- How does the community discount it? Concern about certain vendors getting an additional advantage
- Incremental rate adjustments would be necessary
- All parties do not share the same vision of the Smart Grid
- Evolution versus revolution - conflict in approaches
- Are there appropriate incentives from government?
- Regulatory challenges - utilities are regulated
- Infrastructure not ready today to turn on the switch
- In the Smart Grid of the Future, what becomes of utilities (only a pipe? Or do they have content? What is the meaning of content in the Smart Grid of the Future)?
16. Where does Wireless Technology Come in?
- Does not require large amounts of fixed
infrastructure - New generations of technology can easily replace
older generations without having to remove cables - Next generation of appliances can be done easily
- Infrastructure itself can be upgraded frequently (e.g., 1G -> 2G -> 3G -> 4G)
- Benefits of wireless: variability in performance and resource requirements
- Long range / short range
- Low bandwidth / high bandwidth
- Network delays are constantly decreasing
- Much lower investment to start getting benefits
of Smart Grid
17. The Wireless Internet of Artifacts 2.0: Edge, middle, core
- Edgeware - the edge of the network generates sensor data from increasingly powerful sensing devices
- Variable data rates depending on the application
- E.g., temperature-sensing RFIDs on power lines
- Location (GPS or RTLS) on field equipment
- Sensors are talking to decision-making software which in turn is routing energy in various directions, much like a router forwarding data packets to the right destination (a sketch of this three-layer flow follows this slide)
- Middleware
- Determines what to do with the sensor data, adds intelligence, and then executes it
- Gets high-level controls from the next layer and executes on them
- Centralware
- Makes decisions on what needs to get done
- Central repository of information
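To make the Edgeware / Middleware / Centralware split above concrete, here is a small sketch of one reading flowing through the three layers. The function names, the temperature threshold, and the "reroute" action are invented for illustration; they are not part of WINSmartGrid or ReWINS as described in these slides.

```python
# Hedged sketch of the three-layer flow: the edge filters raw samples, the core
# supplies policy, the middle combines the two and issues a control action.
# All names and numbers are assumptions for the example.

def edgeware(raw_samples: list) -> float:
    """Edge node: reduce a burst of raw sensor readings to one filtered value."""
    return sum(raw_samples) / len(raw_samples)

def centralware_policy() -> dict:
    """Core: central repository decides what needs to get done (here, a limit)."""
    return {"max_line_temp_c": 80.0}

def middleware(filtered_value: float, policy: dict) -> str:
    """Middle layer: apply core policy to edge data and execute a control action."""
    if filtered_value > policy["max_line_temp_c"]:
        return "reroute_power"   # analogous to a router forwarding packets elsewhere
    return "no_action"

readings = [78.2, 81.5, 83.0, 80.9]   # e.g., a temperature-sensing RFID on a power line
print(middleware(edgeware(readings), centralware_policy()))   # -> 'reroute_power'
```

Which layer does what (and how much intelligence sits at the edge versus the core) is exactly the open question the next slide raises.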
18. The Wireless Internet of Artifacts 2.0
- Filtration
- Where does filtration happen - Edge / Middle / Core?
- Where do the rules for filtration come from - the Core?
- The edge node is smart and knows at some level what to do
- How does one distinguish between the Edge, Middle and Core nodes? Why three levels?
- Aggregation
- Two sensor streams (S1 and S2) need to be combined into one (e.g., power sensor status in combination with temperature and motion status can be used to create a single boolean); at what level should the streams be discarded and only the boolean propagated further? (See the sketch at the end of this slide.)
- Messaging
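The aggregation question above can be shown in a few lines. This is only a sketch: the field names, the 75 °C threshold, and the "attention needed" semantics are assumptions, not anything specified in the slides.

```python
# Hedged sketch: combine a power-status stream (S1) with a temperature/motion
# stream (S2) into a single boolean at the edge, so only the boolean propagates.

def aggregate(power_ok: bool, temperature_c: float, motion: bool) -> bool:
    """Collapse several raw statuses into one 'attention needed' flag."""
    overheating = temperature_c > 75.0        # assumed limit
    return (not power_ok) or (overheating and motion)

s1 = [True, True, False]                            # power sensor status
s2 = [(62.0, False), (78.5, True), (70.0, False)]   # (temperature_c, motion)

flags = [aggregate(p, t, m) for p, (t, m) in zip(s1, s2)]
print(flags)   # -> [False, True, True]; only these booleans leave the edge node
```

Where this function runs (edge, middle, or core) determines how much raw data crosses the network, which is the trade-off the filtration bullets are pointing at.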
19. Cyber security in the Smart Grid
- Cyber and Physical Security is important for the
Smart Grid
- Security of wireless devices is a bigger challenge than for wired devices
- Devices operating on standard wireless interfaces would require standardized security protocols
- Existing protocols such as 802.11i, WEP, WPA, public key/private key, etc. require systematic investigation; eventually security will scale out similar to the net, i.e., mixed/heterogeneous
- The definition and meaning of security for an
appliance needs to be researched - Physical security would involve adding
motion/video/infrared sensors which would be
integrated into the architecture of the system
20. Source: CNET - Grid gets hacked
- Spies from other countries have hacked into the
United States' electricity grid, leaving traces
of their activity and raising concerns over the
security of the U.S. energy infrastructure to
cyberattacks. - The Wall Street Journal on Wednesday published a
report saying that spies sought ways to navigate
and control the power grid as well as the water
and sewage infrastructure. It's part of a rising
number of intrusions, the article said, quoting
former and current national security officials. - There have long been concerns over securing the
power grid and other infrastructure. Those
security issues are mounting as utilities use
more Internet-based communications and software
to control the grid through smart-grid
technology. - A report by security firm IOActive last
month warned that people with $500 worth of
equipment and the right training could manipulate
smart meters with embedded communications in
people's homes to potentially disrupt operation
of the grid.
- WIRELESS and Security
- Business case for wireless relies on scalability, upgradeability, and cost
- What is the model of security on the Smart Grid? Just like on the net, there will not be a single source of attack, and so there will not be a single source of security
- What do you protect and where? You will have to make decisions based on cost-benefit analysis; e.g., in the home the security requirements are different from the enterprise (denial of service, protocol hacking, firewalls, encryption)
- Benefits: mobile phones today are secure, so wireless on the Grid can be made secure
21. Reconfigurable Wireless Interface for Networking of Sensors (ReWINS) Architecture
- Hardware design of intelligent sensor and wireless interface
- [Fig. 1: Architecture of the intelligent sensor interface]
- Multiple protocols
- Variable payloads - depending on the level of
intelligence required by the smart appliance
- Existing devices - works with existing devices and is open for scaling up
- Multiple sensors: temperature, humidity, motion, shock, acceleration, gyroscopic, chemical
- Embedded demand response intelligence within a low-power Atmel processor
- Accepts time-of-day pricing
- Framework for open AMI - connects with thermostats, meters, appliances, and HANs
22. WINSmartGrid - Reconfigurable Wireless Interface for Networking of Sensors (ReWINS)
23. WINSmartGrid Technology
- Low Power technology
- Open architecture
- Standards-based hardware adapted to fit the
problem resulting in lower overall cost - Wireless infrastructure for monitoring
- Wireless infrastructure for control
- Two-way communication
- Service architecture with layers - Edgeware,
Middleware and Centralware - Over the air download for real-time
reconfigurability with wireless - Plug-and-Play approach to network installation
- Reconfigurability - The capability of the
technology to be reconfigurable allows OTA (over
the air) upgrade of the firmware to be able to
handle different appliances, applications,
sensors, controllers, thermostats, smart meters,
PHEVs.
24. WINSmartGrid Architecture
- Wireless protocol issues for in-home, in-office and in-factory
- ZigBee / 6LoWPAN / HomePlug
- WiFi
- Bluetooth
- RuBee
- EPC / RFID
- Protocols for in-field
- Transmission infrastructure: CDMA, GPRS, LTE, WiMAX, broadband over power lines
- Tracking and sensing technology for meters
- Active versus Passive
- UHF/LF/HF/433 MHz
- Data layer architecture issues
- Bandwidth requirement
- Power constraints
- Security Requirements
- Database requirements
25. Characteristics of WINSmartGrid
- Low Power technology
- Standards-based adapted to fit the problem
resulting in lower overall cost - Wireless infrastructure for monitoring
- Wireless infrastructure for control
- Service architecture with three layers -
Edgeware, Middleware and Centralware - Open architecture for easy integration
- Plug-and-Play approach to architecture
- Reconfigurability - The capability of the
technology to be reconfigurable allows OTA (over
the air) upgrade of the firmware to be able to
handle different devices, applications, sensors,
controllers, thermostats, etc.
26. Smart Grid in WINSmartHome
- Three layers
- Research issues
- What is the in-home architecture?
- How does the 3 layer model work?
- Which wireless comm protocol will actually work?
- Are current wireless protocols adequate?
- How can security be done and how important is
security?
27. WINSmartGrid UI
- Simplicity for consumer use
- Remote access and control
- Open systems and tools for integration
Energy Manager
28. The Smart Grid Research Center - In Progress
- Work force (physical, social)
- Grid (physical)
- Technology (cyber and physical)
- Partnership: Academia, Utilities, Government Labs, Regulators, Industry
- Demo on UCLA Micro Grid
- March 2010
- Develop demand response capability with UCLA WINSmartGrid
- Objectives
- to determine how demand response is accepted in micro grids
- to determine what the demand response reduction will be
- to determine what the wireless and mobile communications infrastructure will look like for a scalable micro grid
- to connect various smart appliances and devices on campus with the objective of studying how a heterogeneous wireless infrastructure performs when scaled up
- open systems to allow vendors to create plug-and-play sensor-enabled appliances
- PHEV effect on location-centric and ...
29. Research Thoughts: Wireless Internet for the Smart Grid
- Long term (25-year vision) versus short term (5 years)
- Europe is ahead of us
- PHEVs will eventually play a very important role
- Major research issues
- Software architecture
- Integration of sensor interface with demand
response and building energy infrastructure - Smart Home Architecture
- Control loop in heterogeneous systems
- Plug and play
- Model of cyber with infrastructure
- Security needs to be solved before utilities will
start to use wireless on a wide scale
30. The Wireless Internet of Artifacts Version 2.0
- Heterogeneous wireless grid with mobile/roaming artifacts (objects, ICT devices, people)
- Constantly in communication with the infrastructure
- Control decisions made at the edge of the network (via Edgeware), in the middle (via Middleware), or at the core (via Centralware)?
- How are workload and intelligence distributed between these layers?
- Messaging engine becomes key to transmit control data
- Sitting on these networks are layers of I.P.
- What does this protocol look like? Is the current I.P. protocol good enough? Should high-media content (such as sending video over a HAN) adopt a different network approach from the rest of the network, which only sends periodic sensor data? Is video input a sensor?
- Allow rich content to move rapidly
- Have intelligence
- location-specific media compression, analysis and
representation - Time-specific DRM
- Context specific commerce models
31. The Wireless Internet of Artifacts
- Infrastructure
- With advances in technologies such as EV-DO, WiMAX, ZigBee, UWB, RuBee
- Each wireless internet link will provide SLAs that the data owner can purchase (Google open model); a small link-selection sketch follows this slide
- Resources within a wireless network SLA would include variables such as
- Bandwidth
- Power utilization (sensor data that needs to be sent infrequently between two nodes would opt for low-power networks such as ZigBee)
- Wireless networks that are remote would utilize
energy harnessing (green circuits) to offer
lower-cost transmission - Designing, managing, controlling, using, and
benefiting from a new genre of wireless internet
of artifacts provides for interesting
opportunities in the future.
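The SLA variables listed above (bandwidth, power, cost) lend themselves to a simple selection rule: pick the cheapest link that satisfies the data owner's constraints. The networks, data rates, power figures, and costs below are invented for illustration.

```python
# Hedged sketch of SLA-driven link selection. Every number here is an assumption.

networks = [
    # (name,      kbps,  tx power mW, relative cost)
    ("zigbee",      250,      1,       1),
    ("wifi",      54000,    500,       3),
    ("cellular",  10000,   1000,       5),
]

def choose_link(required_kbps: float, power_budget_mw: float):
    """Return the cheapest network meeting the bandwidth and power constraints."""
    feasible = [n for n in networks if n[1] >= required_kbps and n[2] <= power_budget_mw]
    return min(feasible, key=lambda n: n[3], default=None)

# Infrequent sensor data between two nodes: tiny bandwidth need, tight power budget.
print(choose_link(required_kbps=10, power_budget_mw=5))        # -> ('zigbee', 250, 1, 1)
# Video over the home network: needs bandwidth, power is less constrained.
print(choose_link(required_kbps=20000, power_budget_mw=2000))  # -> ('wifi', 54000, 500, 3)
```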
32. Measurement
- "What is not measurable, make measurable" - Galileo Galilei (1564-1642)
- Suggests that one of the aims of science is to find ways to measure attributes of things we are interested in.
- Measurement lies at the heart of many systems
that govern our lives.
33. Measurement
- Measurement: the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules.
- Entity: an object or an event in the real world
- Attribute: a feature or property of an entity
34. Measurement
- The definition is far from clear-cut
- Height of a person, but what about IQ, or the quality of a wine?
- Measuring instrument?
- Margin for error with the best instruments
- What scale is appropriate?
- We can say "Joe is twice as tall as Fred", but why not "yesterday was twice as hot"?
- We can take the average grade for a quiz, but what about the mean of the jersey numbers of the Seahawks? (See the sketch after this list.)
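The quiz-grades versus jersey-numbers contrast is the classic measurement-scales point: some statistics are only meaningful on some scales. A minimal sketch, with invented data values:

```python
# The mean is meaningful for quiz grades (a ratio scale: differences and ratios
# make sense) but not for jersey numbers (a nominal scale: the numbers are labels).
from statistics import mean

quiz_grades = [72, 85, 90, 64, 88]     # ratio scale
jersey_numbers = [3, 12, 24, 56, 89]   # nominal scale: identifiers, not quantities

print(mean(quiz_grades))     # 79.8 -- a sensible summary of class performance
print(mean(jersey_numbers))  # 36.8 -- computable, but says nothing about the team
```

The scale of an attribute therefore constrains which calculations (mean, median, ratios) can legitimately be applied to its measurements.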
35. Measurement
- Measurement is a direct quantification
- Calculation is indirect: we take measurements and combine them into a quantified item that reflects some attribute we are trying to understand (the overall score in a decathlon)
- In SE we often want to combine measurements to understand the big picture when discussing a project (see the sketch below)
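One familiar software-engineering instance of such an indirect (calculated) measure is defect density, which combines two direct measurements. A minimal sketch with invented figures:

```python
# Indirect measurement: defect density is calculated from two direct measurements
# (defects found, size in KLOC), much as a decathlon score combines ten events.

def defect_density(defects_found: int, size_kloc: float) -> float:
    """Combine two direct measurements into one quality indicator."""
    return defects_found / size_kloc

print(defect_density(defects_found=42, size_kloc=12.5))   # 3.36 defects per KLOC
```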
36. Measurement in SE
- Software Engineering describes the collection of
techniques that apply an engineering approach to
construction and support of software products. - Activities include
- Managing
- Costing
- Planning
- Modeling
- Analyzing
- Specifying
- Designing
- Implementing
- Testing
- Maintaining
- Continually striving to improve process and
product
37. Measurement
- Electrical, Mechanical, and Civil Engineering rely on measurement: measuring variables, changes in behavior, and causes and effects. EE uses instruments to measure voltage, current, and resistance to design circuits
- Measurement is often considered a luxury in SE
38. Measurement in SE
- Fail to set measurable targets, thus cannot tell
if we met our goals - User friendly
- Reliable
- Fail to understand component costs
- Cost of design versus cost of coding, testing
- Do not predict quality
- Will our product fail?
- We allow anecdotal evidence to convince us to try
yet another revolutionary technology
39. Measurements in SE
- Measurements made infrequently, inconsistently,
and incompletely. - Can they be repeated?
40. Measurements in SE
- Managers
- What does each process cost?
- How productive is the staff?
- How good is the code being developed?
- Will the user be satisfied with the products?
- How can we improve?
41. Measurements in SE
- Engineers
- Are the requirements testable?
- Have we found all the faults?
- Have we met our product or process goals?
- What will happen in the future?
42. Scope of SE Metrics
- Cost and effort estimation (see the sketch after this list)
- Productivity Measure
- Data Collection
- Quality Models and Measurement
- Reliability Models
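To make the "cost and effort estimation" bullet concrete, here is one classic model of that kind: the basic COCOMO equations for an "organic" project (Boehm, 1981). The model and the 32 KLOC figure are only an illustration of the sort of calculation these metrics support; they are not taken from these slides.

```python
# Basic COCOMO (organic mode): effort and schedule estimated from size in KLOC.
# Constants 2.4/1.05 and 2.5/0.38 are Boehm's published organic-mode values.

def cocomo_organic(kloc: float):
    effort_pm = 2.4 * kloc ** 1.05          # effort in person-months
    schedule_m = 2.5 * effort_pm ** 0.38    # development time in calendar months
    return effort_pm, schedule_m

effort, months = cocomo_organic(32.0)       # hypothetical 32 KLOC project
print(f"{effort:.0f} person-months over about {months:.0f} months")
```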
43. Exercise
- 1. Explain the role of measurement in determining the best players in your favorite sport.
- 2. How do you begin to measure the quality of a software product?
44. Software Engineering Technology Infusion at NASA
- (1) understand the difference between technology
transfer (the adoption of a new method by large
segments of an industry) as an industry-wide
phenomenon and the adoption of a new technology
by an individual organization (called technology
infusion), and
- (2) determine whether software engineering technology transfer differs from other engineering disciplines. While
there is great interest today in developing
technology transfer models for industry, it is
the technology infusion process that actually
causes changes in the current state of the
practice.
45. Tech Transfer Problem
- One reason why there is so much interest in the
diffusion of innovations is because getting a new
idea adopted, even when it has obvious
advantages, is often very difficult. There is a
wide gap in many fields, between what is known
and what is actually put into use. Many
innovations require a lengthy period, often of
some years, from the time when they become
available to the time when they are widely
adopted.
- Problem: how to speed up the rate of diffusion of an innovation
46. Changes
- Process improvement involves changes
- Minor: replacing one compiler or editor with another
- Major: changes that affect the entire development process (e.g., using Cleanroom software
development and eliminating much of the unit
testing phase).
47. Product Adoption
48. Product Adoption
- The first few customers are the "oddballs" or eccentrics of society, who adopt a new product.
- Following them are the "opinion leaders", who then give their approval to the product. Society then follows these opinion leaders, and product growth follows rapidly.
- During the mature stage, as the market saturates, growth levels off, giving the characteristic S-curve (a logistic sketch of this curve follows this slide).
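The S-curve described above is commonly modelled with a logistic function. The sketch below is illustrative only; the market size, growth rate, and midpoint are invented.

```python
# Logistic sketch of cumulative adoption: slow start (eccentrics), rapid growth
# (after the opinion leaders approve), then saturation. Parameters are assumed.
import math

def adopters(t_years: float, market: float = 100_000,
             rate: float = 0.9, midpoint: float = 8.0) -> float:
    return market / (1.0 + math.exp(-rate * (t_years - midpoint)))

for t in (0, 4, 8, 12, 16):
    print(f"year {t:2d}: ~{adopters(t):,.0f} adopters")
```

Plotting cumulative adopters against time for such a function reproduces the characteristic S shape.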
49. Technology Transfer
- Gatekeepers. Technology transfer follows a
similar process. One member of an organization,
often called the gatekeeper, monitors
technological developments, and chooses those
that seem appropriate for inclusion in an
organization hence opens the gate to the new
technology. - Because this role is often informal, it may fall
naturally to the most creative and technically
astute individual in an organization. Since the
gatekeeper is aware of technical developments
outside of the organization, others in the group
often look towards this person for guidance. This
person is often known by the name "guru" or similar-sounding monikers.
50. Models for Tech Transfer
- People mover model. In this approach, there is
personal contact between the developer and the
user of a technology. Typically there is some
facilitator within the infusing organization that
knows about the new technology and wishes to
import it into the new organization (i.e., the
gatekeeper). - This method was found to be the most prevalent
and effective of all technology transfer methods. - 1. Spontaneous gatekeeper role assumed by
organization member. - 2. Assigned gatekeeper role imposed by management
on some organization member. - 3. Umbrella gatekeeper role assumed by another
organization to impose new technology on others.
51. Tech Transfer Models
- Communication model. In this approach, the new
technology has appeared in print and, as with the people mover model, some facilitator discovers
the technology and wishes to infuse it into the
new organization. The print mechanism may be
internal documentation, conference reports or
journal publications.
52. Tech Transfer Models
- 3. On-the-shelf model. In this approach, which is relatively rare, the new technology must be packaged so that non-experts can discover it and learn enough about it to begin the infusion process. It
requires sufficient documentation so that others
can easily pick it up and use it.
53. Tech Transfer
- Vendor model. This last method requires an
organization to turn over the task to a vendor to
sell them a new technology. It effectively turns
the vendor into the agent of the People mover,
Communication or On-the-shelf model.
54. Tech Transfer Models
- Rule model. This method uses an outside
organization to impose a new technology on the
development organization, which then infuses it
into its own development process. - There are many examples within the government
sector of this last technology transfer model.
The mandating of the Ada language by the Department of Defense's Ada Joint Program Office for system development,
- the use of the Software Engineering Institute's Capability Maturity Model to evaluate developers' qualifications for a Department of Defense
contract, - the similar process of using international
standard ISO 9000 in Europe, and the use of
Federal Information Processing Standards (FIPS)
by the National Institute of Standards and
Technology (NIST) are all examples of technology
transfer imposed by an outside agency.
55. Tech Transfer
- Advocates. Fowler and Levine at the Software
Engineering Institute have been investigating
technology transition and have identified an
extension to the gatekeeper model [6]. In their model, technology transition is a push-pull process:
- Producer -> Advocate -> Receptor -> Consumer
56. Tech Transfer
- The producer of the technology needs an advocate
to export the technology outside of the
development organization, while the consumer
organization must have receptors agreeable to
importing the technology. - In many instances, however, both the advocates
and receptors are part of the consumer
organization, and in practice, this reduces to a
model very much like the gatekeeper.
57. Maturation
- 1. The original concept for the technology appears
as a published paper or initial prototype
implementation. - 2. The implementation of the technology involves
the further development of the concept by the
originator or successor organization until a
stable useful version is created. - 3. In the understanding stage, other
organizations experiment, tailor, expand, modify,
and try to use the technology. - 4. In the later transition stage, use of the
technology is further modified and expands across
the industry. - 5. The final maturation stage is reached when 70
of the industry uses the technology.
58. Maturation
- In 1985, Redwine and Riddle [11] published the first comprehensive study of software engineering technology transfer.
- Maturation: what was the length of time required for a new concept to move from being a laboratory curiosity to general acceptance by industry?
- In their study, they looked at 17 software
development technologies from the 1960s through
the early 1980s (e.g., UNIX, spreadsheets,
object-oriented design, etc.) - Technologies, once developed, required an
average of 7.5 years to become widely available
59. Case Studies
- NASA plays the role of consumer organization
trying to adopt new technologies. - These technologies were studied by the Software
Engineering Laboratory (SEL) at Goddard Space
Flight Center. - The SEL was organized in 1976 to study flight
dynamics software, and since that time it has had
a significant impact on software development
activities within the Flight Dynamics Branch
(e.g., measurement, resource estimation, testing,
process improvement) - As a brief overview of SEL operations, the SEL
has collected and archived data on over 125
software development projects. The data are also
used to build typical project profiles against
which ongoing projects can be compared and
evaluated. The SEL provides managers in this
environment with tools for monitoring and
assessing project status. - Typically there are 6 to 10 projects
simultaneously in progress in the flight dynamics
environment. Each project is considered an
experiment within the SEL, and the goal is to
extract detailed information to understand the
process better and to provide guidance to future
projects. - Projects range in size from approximately 10K
lines of source code to 300K to 500K at the high
end. - Projects involve from 6 to 15 programmers and
typically take from 12 to 24 months to complete.
All software was originally written in FORTRAN,
but Ada was introduced in the mid-1980s (see
below), and there is now an increase in C and C++ programming.
60. Case Study
- Use of Ada
- Ada is a language that was developed by the U.S.
Department of Defense from 1976 until 1983 as a
common language on which to build complex
embedded applications. It is a general-purpose programming language adaptable to any computing environment.
61. Case Study - Ada
- Use of Ada on flight dynamics projects was first
considered in 1985. - Because of Department of Defense interest in the
language and because of NASA Johnson Space Center's decision to use Ada for Space Station
software, the SEL desired to look at its
applicability for other NASA applications. - The initial stimulus for this activity, then,
could be a mixture of the communication model
(i.e., papers were written about Ada),
on-the-shelf model (i.e., Ada products were being
sold) and to some extent, the rule model (i.e.,
since Johnson Space Center adopted Ada, there was
some pressure to do the same elsewhere within
NASA).
62. Case Study - Ada
- To truly evaluate the appropriateness of Ada
within the SEL environment, a parallel
development of an Ada (GRODY) and FORTRAN (GROSS)
simulator was undertaken. - GROSS, as the operational product, had higher
priority and was developed on time. GRODY, as an
experiment to learn Ada, had a much longer
development cycle. In addition, since GRODY was
known by all to be an experiment, the development
team was not as careful in its design - However, the experiences of the GRODY team with
the typical set of requirements NASA used for
such products led to a greater interest in
applying object oriented technology instead as a
model for future NASA requirements and design
specifications. - Although the development of this simulator
continued until early 1988, by early 1987 it was
decided that the initial project was sufficiently
successful to continue the investigation of Ada
on other flight dynamics problems. - Elapsed time since start of Ada activity was 30
months.
63. Case Study - Ada
- Transition phase of technology transfer. Because
of the poor performance on the GRODY simulator
and the problems with developing Ada
requirements, the SEL undertook a second Ada
pilot project (GOADA) as an experiment. - Sufficient confidence in Ada by this time to make
GOADA an operational product.
- In 1990, Ada became the language of choice for
simulators in the Flight Dynamics Division.
Transition time was another 30 months.
64. Conclusions
- Infusion mechanisms do not address software
engineering technologies well. - Quantitative data is crucial for understanding
software development processes - Technology infusion is not free.