Title: Earth Science Applications IBPD Metrics, Feb 3, 2004
1 Earth Science Applications: IBPD Metrics, Feb 3, 2004
2 NASA's Budget Planning Process
[Flow diagram: NASA planning and budget documents]
- NASA Strategic Plan: long-range plan, produced at least every 3 years
- NASA Integrated Budget Performance Document (Vols. 1-3): five-year plan, produced annually; forms the Congressional Budget Submission
- NASA Performance Accountability Report: report on the prior year, produced annually
- Agency-level plans: ISTP, Integrated Space Plan, Real Property Strategic Plan, Strategic Human Capital Plan, etc.
- Enterprise level (Budget Levels 1-2): Enterprise Strategies, Enterprise Performance Report, Enterprise Budget Performance Plans; updated every 1-3 years
- Program/project level (Budget Level 3): Center Implementation Plans, Program Plans, Project Plans (Development, Operation, Research, Tech. Adv. Conc.); updated every 1-3 years
3 Earth Science Enterprise Communications Strategy
4 IBPD Performance Metrics
FY05 IBPD Performance Measures (Abridged)
Outcome 1.2.1: Through 2012, benchmark the assimilation of observations provided from 20 of the 80 sensors on NASA Earth observation satellites.
- 5ESA1 (Crosscutting Solutions): Verify/validate at least two commercial remote sensing sources/products for Earth science research (through JACIE CRS).
- 5ESA2 (National Applications): Benchmark measurable enhancements to at least 2 national decision support systems using NASA results, specifically in the areas of Public Health, Disaster Management, and Air Quality.
- 5ESA3 (Crosscutting Solutions): Expand the DEVELOP human capital development program to increase capacity to a level of 100 program graduates per year, and perform significant student-led activities using NASA research.
- 5ESA4 (Crosscutting Solutions): Benchmark solutions from at least 5 projects that were selected in the FY03 REASoN program.
Outcome 1.2.2: By 2012, benchmark the assimilation of 5 specific types of predictions resulting from the Earth System Modeling Framework of 22 NASA Earth system science models.
- 5ESA5 (Crosscutting Solutions): Use the DEVELOP program to rapid-prototype a project using models from GISS, GFDL, NCEP, SPoRT, and the JPL Earth Science laboratories.
- 5ESA6 (Crosscutting Solutions): Benchmark solutions associated with at least 5 decision support systems that assimilate predictions from Earth system science models.
Outcome 1.2.3: By 2012, benchmark the assimilation of observations and predictions resulting from NASA Earth Science research in 12 decision support systems serving national priorities and missions of federal agencies.
- 5ESA7 (National Applications): Benchmark enhancements to at least 2 national decision support systems using NASA results, specifically Public Health, Disaster Management, and Air Quality.
- 5ESA8 (Crosscutting Solutions): Verify and validate benchmark solutions for at least 5 decision support systems associated with FY03 REASoN.
Outcome 3.1.1: By 2012, in partnership with DHS, DoD, and the State Dept., deliver 15 observation sets and 5 model predictions for climate change, weather prediction, and natural hazards to 5 national and 5 global organizations to evaluate 5 scenarios.
- 5ESA9: Benchmark the use of predictions from 2 NASA Earth system science models for use in national priorities, such as support for the CCSP, CCTP, and NWS.
- 5ESA10: Benchmark the use of Earth science research results in 2 scenario assessment tools.
5 Earth Science Applications Program Approach
The Earth Science Applications Program approach is to assimilate Earth observations and predictions of Earth system processes into decision support tools for reliable and sustained use to serve society. The approach is based on core systems engineering principles, including the functional steps of evaluation, verification & validation, and benchmarking of system components into integrated system solutions.
Reference: Earth Science Enterprise Applications Plan (Draft), Chapter 3, Paragraph 1
6 Applications Plan Process Definitions
- Evaluation: This phase provides an initial match of user-defined requirements relative to Earth science products. Identify decision support tools associated with an application area; examine the partner's plans for its decision support tool; assess the requirements, potential value, and technical feasibility of current/future Earth science results in the partner's operational tools; assess the impacts of, or commitments to, NASA ESE from engaging with the partner in the activity; and compare the project to the application portfolio. With the partner, NASA decides whether to pursue further collaboration and engage in a project.
- Verification & Validation: This phase measures the performance characteristics of Earth science products (as outputs) against the input requirements for a decision support system. Verification determines how well the actual performance of a given observation or prediction product meets the user-defined requirements within a specified tolerance. Validation determines whether the performance of the algorithms (or logic) using the data achieved the intended goals. As appropriate, NASA and partners develop prototype products to address requirements and introduce the Earth science products into the decision support tools. (A minimal sketch of these checks follows the reference below.)
- Benchmark Improvements: This phase is a rigorous process that compares the performance of a decision support tool using Earth science research results as inputs against a standard or reference scenario, to document the value of the Earth science products in the tool. Where partners have existing metrics to evaluate their tools and decisions, NASA utilizes those metrics as benchmarks. To support adoption by the partner, this phase includes robust documentation of procedures and guidelines describing the steps to access and utilize the Earth science products.
Reference: Earth Science Enterprise Applications Plan (Draft), Chapter 3, Paragraph 1
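As a hedged illustration of the verification and validation definitions above (not part of the Applications Plan itself), the sketch below checks a product's measured performance against a user-defined requirement and tolerance, and validates algorithm outputs against a goal expressed as an RMSE threshold. Every name, value, and threshold is hypothetical.

```python
# Hypothetical sketch of the verification & validation checks described
# above. Names and numbers are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Requirement:
    name: str           # e.g., a user-defined accuracy requirement
    target: float       # required value
    tolerance: float    # acceptable deviation from the target

def verify(measured: float, req: Requirement) -> bool:
    """Verification: does the actual product performance meet the
    user-defined requirement within the specified tolerance?"""
    return abs(measured - req.target) <= req.tolerance

def validate(predictions: list[float], observations: list[float],
             max_rmse: float) -> bool:
    """Validation: did the algorithm's outputs achieve the intended
    goal (here, an RMSE threshold against reference observations)?"""
    n = len(predictions)
    rmse = (sum((p - o) ** 2 for p, o in zip(predictions, observations)) / n) ** 0.5
    return rmse <= max_rmse

if __name__ == "__main__":
    req = Requirement("retrieval accuracy", target=0.95, tolerance=0.03)
    print(verify(measured=0.93, req=req))                   # True: within tolerance
    print(validate([1.1, 2.0, 2.9], [1.0, 2.0, 3.0], 0.2))  # True: RMSE ~0.08
```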
7 General Definitions
- Decision Support System (DSS): a computer-based information-processing system for scenario optimization through multi-parametric analysis. A DSS utilizes a knowledge base of information with a problem-solving strategy that may routinely assimilate measurements and/or model predictions in support of the decision-making process. The DSS provides an interface to facilitate human inputs and to convey outputs. Outputs from a DSS would typically be used for making decisions at the local level, and outputs from multiple DSSs may be used in establishing policy. (A minimal structural sketch follows this list.)
- Decision Support Tools: a suite of solutions owned by NASA partners that are used in a variety of problem domains for decision and policy making. These solutions could include assessments, decision support systems, decision support calendars, etc.
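Purely to make the DSS definition above concrete (this structure is not specified anywhere in the deck), here is a minimal sketch of a knowledge base plus a problem-solving strategy that assimilates measurements and model predictions and conveys an output. All class, parameter, and threshold names are invented.

```python
# Hypothetical minimal DSS skeleton matching the definition above:
# a knowledge base + a problem-solving strategy that assimilates
# measurements and/or model predictions and conveys an output.
from typing import Callable, Dict, List

class DecisionSupportSystem:
    def __init__(self, strategy: Callable[[Dict[str, List[float]]], str]):
        self.knowledge_base: Dict[str, List[float]] = {}  # parameter -> values
        self.strategy = strategy                          # problem-solving strategy

    def assimilate(self, parameter: str, values: List[float]) -> None:
        """Routinely fold new measurements or model predictions
        into the knowledge base."""
        self.knowledge_base.setdefault(parameter, []).extend(values)

    def recommend(self) -> str:
        """Convey an output to support the decision-making process."""
        return self.strategy(self.knowledge_base)

# Example multi-parametric strategy: flag a hazard when two
# parameters exceed (invented) thresholds at the same time.
def air_quality_strategy(kb: Dict[str, List[float]]) -> str:
    ozone = max(kb.get("ozone_ppb", [0.0]))
    pm25 = max(kb.get("pm25_ugm3", [0.0]))
    return "ALERT" if ozone > 70 and pm25 > 35 else "OK"

dss = DecisionSupportSystem(air_quality_strategy)
dss.assimilate("ozone_ppb", [62.0, 74.5])   # e.g., satellite-derived measurements
dss.assimilate("pm25_ugm3", [28.0, 41.2])   # e.g., model predictions
print(dss.recommend())                      # ALERT
```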
8 Initial Systems Engineering Process
[Flow diagram: initial systems engineering process]
- Evaluation: Define Requirements & Specifications → DSS Selection → Investigate Alternative NASA Inputs
- V&V: Design & Implement → Verify & Validate
- Benchmarking: Benchmark (Baseline & Assess Performance) → Enhanced DSS
- "Refine" feedback loops connect each step back to its predecessors
- Use of systems engineering principles leads to scalable, systemic, and sustainable solutions and processes, which in turn contribute to the success of the mission, goals, and objectives of each National Application.
- The Evaluation phase involves understanding the requirements for, and technical feasibility of, Earth science and remote sensing tools and methods for addressing DSS needs.
- The Verification and Validation (V&V) phase includes measuring the performance characteristics of data, information, technologies, and/or methods, and assessing the ability of these tools to meet the requirements of the DSS.
- In the Benchmarking phase, the adoption of NASA inputs within an operational DSS and the resulting impacts and outcomes are documented, as the pipeline sketch below illustrates.
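The following is a hedged sketch of the Evaluation → V&V → Benchmarking flow shown on this slide, with the diagram's "Refine" loops modeled as bounded retries. Every function, dataset name, and threshold is invented for illustration; the slide itself specifies only the phases.

```python
# Hypothetical sketch of the systems engineering flow on this slide:
# Evaluation -> Verify & Validate -> Benchmark, with "Refine" loops
# modeled as bounded retries. All names and thresholds are invented.

def evaluate(candidates: list, dss_needs: set) -> list:
    """Evaluation: keep only NASA inputs that match DSS requirements."""
    return [c for c in candidates if c in dss_needs]

def verify_and_validate(inputs: list, measure: dict, threshold: float) -> list:
    """V&V: keep inputs whose measured performance meets the threshold."""
    return [i for i in inputs if measure.get(i, 0.0) >= threshold]

def benchmark(baseline_score: float, enhanced_score: float) -> float:
    """Benchmarking: document the impact of NASA inputs on the DSS
    relative to the baseline (reference) scenario."""
    return enhanced_score - baseline_score

def run_pipeline(max_refinements: int = 3) -> None:
    needs = {"precip_forecast", "aerosol_obs"}
    candidates = ["precip_forecast", "aerosol_obs", "ocean_color"]
    perf = {"precip_forecast": 0.9, "aerosol_obs": 0.7}

    inputs = evaluate(candidates, needs)
    accepted = []
    for attempt in range(max_refinements):            # "Refine" loop
        accepted = verify_and_validate(inputs, perf, threshold=0.8)
        if accepted:
            break
        perf = {k: v + 0.1 for k, v in perf.items()}  # refine the products
    improvement = benchmark(baseline_score=0.62, enhanced_score=0.71)
    print(f"accepted={accepted}, documented improvement={improvement:.2f}")

run_pipeline()
```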
9 Decision Support System Evaluation Status