Transcript and Presenter's Notes

Title: Climate and Weather Research at NASA Goddard


1
Climate and Weather Research at NASA Goddard
  • 9 September 2009

Phil Webster, Goddard Space Flight
Center, Phil.Webster@NASA.gov
2
NASA Mission Structure
  • To implement NASA's Mission, NASA Headquarters
    is organized into four Mission Directorates.
  • Aeronautics: Pioneers and proves new flight
    technologies that improve our ability to explore
    and that have practical applications on Earth.
  • Exploration Systems: Creates new capabilities and
    spacecraft for affordable, sustainable human and
    robotic exploration.
  • Science: Explores the Earth, Moon, Mars, and
    beyond; charts the best route of discovery; and
    reaps the benefits of Earth and space exploration
    for society.
  • Space Operations: Provides critical enabling
    technologies for much of the rest of NASA through
    the space shuttle, the International Space
    Station, and flight support.

3
Science Mission Directorate
4
Earth Science Division Overview
  • Overarching Goal: to advance Earth System
    science, including climate studies, through
    spaceborne data acquisition, research and
    analysis, and predictive modeling
  • Six major activities:
  • Building and operating Earth observing satellite
    missions, many with international and interagency
    partners
  • Making high-quality data products available to
    the broad science community
  • Conducting and sponsoring cutting-edge research
    in 6 thematic focus areas
  • Field campaigns to complement satellite
    measurements
  • Modeling
  • Analyses of non-NASA mission data
  • Applied Science
  • Developing technologies to improve Earth
    observation capabilities
  • Education and Public Outreach

5
Earth Science Division Focus Areas
6
Modeling, Analysis and Prediction (MAP) Program
  • Seeks an understanding of the Earth as a
    complete, dynamic system
  • Emphasis on climate and weather
  • Key questions include:
  • How is the Earth system changing?
  • What are the forcing mechanisms driving observed
    changes?
  • How does the Earth system respond to natural and
    human-induced changes?
  • What are the consequences of Earth system change
    to society?
  • What further changes can be anticipated, and what
    can be done to improve our ability to predict
    such changes through improved remote sensing,
    data assimilation, and modeling?
  • The MAP program supports observation-driven
    modeling that integrates the research activities
    in NASA's Earth Science Program

7
NASA's Climate and Weather Modeling
  • Spans timescales from weather to short-term
    climate prediction to long-term climate change
  • Spans weather, climate, atmospheric composition,
    water and energy cycles, and the carbon cycle
  • Unique in bringing models and observations
    together through assimilation and simulation
  • Products to support NASA instrument teams and
    atmospheric chemistry community
  • Contributes to international assessments
    (WMO/UNEP, IPCC); contributions to IPCC/AR5 are
    establishing a new paradigm of data delivery for
    the NASA modeling community in partnership with
    NCCS, PCMDI, and LLNL
  • Contributes to WWRP and WCRP

8
Tomorrow's Science
  • New missions: increased sensing of the Earth's
    climate system as recommended by decadal studies;
    more data and more types of data
  • Advanced assimilation: use more data to produce
    better model initialization
  • Higher resolution: better representation of
    atmospheric processes to improve prediction
  • Greater complexity: understanding and
    predicting/projecting future climate
  • Coupled ocean-atmosphere-land models, including
    the full carbon cycle
  • Increased collaboration: sharing of models, model
    output, and simulation and observational data sets

9
High-Resolution Climate Simulations with GEOS-5
Cubed-Sphere Model
Bill Putman, Max Suarez, NASA Goddard Space
Flight Center Shian-Jiann Lin, NOAA Geophysical
Fluid Dynamics Laboratory
Low cloud features from 3.5-km GEOS-5 Cubed
Sphere for 2 January 2009 (left), compared to the
GOES-14 first full-disk visible image on 27 July
2009 (center) and 27-km (roughly ¼ degree) GEOS-5
Cubed Sphere for 2 January 2009 (right).
  • GMAO, GISS, NCCS, and SIVO staff are refining
    techniques for Intel Nehalem (e.g., concurrent
    serial I/O paths).
  • SIVO's Bill Putman's 3.5-km (non-hydrostatic)
    simulations with the GEOS-5 Cubed Sphere
    Finite-Volume Dynamical Core, on approximately
    4,000 Nehalem cores of the NASA Center for
    Computational Sciences (NCCS) Discover
    supercomputer, yielded promising results,
    including cloud features not seen in
    lower-resolution runs (see the grid-spacing
    sketch at the end of this slide).
  • Exploring techniques for more efficient memory
    use and parallel I/O, e.g., evaluating the effects
    of replacing Intel MPI with MVAPICH.

NCCS Discover Scalable Unit 5's Nehalem
architecture and larger core count enable
researchers to exploit methods for higher-resolution
models, advancing NASA Science Mission
Directorate science objectives.
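The quoted resolutions map onto cubed-sphere face dimensions in a simple way. Below is a minimal sketch (not from the presentation) that estimates the nominal grid spacing of a cubed-sphere grid from its per-face dimension N, assuming the common convention that nominal spacing is one quarter of the equatorial circumference divided by N.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def cubed_sphere_spacing_km(n_per_face: int) -> float:
    """Nominal grid spacing for a cubed-sphere grid with n_per_face cells
    along each face edge (assumes spacing = quarter circumference / N)."""
    quarter_circumference = 2.0 * math.pi * EARTH_RADIUS_KM / 4.0
    return quarter_circumference / n_per_face

def total_columns(n_per_face: int) -> int:
    """A cubed-sphere grid has 6 faces of N x N cells."""
    return 6 * n_per_face * n_per_face

for n in (360, 720, 1440, 2880):
    print(f"c{n}: ~{cubed_sphere_spacing_km(n):.1f} km spacing, "
          f"{total_columns(n):,} columns")

# Under this assumption, c360 comes out near 28 km (roughly 1/4 degree) and
# c2880 near 3.5 km, consistent with the resolutions quoted on this slide.
```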
10
GEOS-5 Impact of Resolution: Kármán Vortex
Streets
(Panels at 28 km, 14 km, 7 km, and 3.5 km grid spacing.)
11
MERRA Project: Modern Era Retrospective-analysis
for Research and Applications
Michael Bosilovich, Global Modeling and
Assimilation Office, NASA Goddard Space Flight
Center
  • GMAO's 30-year reanalysis of the satellite era
    (1979 to present) using GEOS-5
  • Largest assimilation data set available today
  • The focus of MERRA is the hydrological cycle and
    climate variability
  • Today's observing system: 1.6 million observations
    per 6-hour snapshot, close to 90% of which are
    from satellites
  • Public record supporting broad range of
    scientific research
  • Climate Data Assimilation System efforts will
    continue
  • Single largest compute project at the NCCS
  • Products are accessed online at the GES DISC:
    http://disc.sgi.gsfc.nasa.gov/MDISC (see the
    access sketch below)
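As a concrete illustration of pulling a MERRA-style product for analysis, here is a minimal, hypothetical sketch using the netCDF4 Python library to open a file served via OPeNDAP and read one variable. The URL and variable names below are placeholders, not actual MDISC paths.

```python
from netCDF4 import Dataset  # requires a netCDF4 build with OPeNDAP support
import numpy as np

# Placeholder OPeNDAP endpoint; substitute an actual granule URL obtained
# from the GES DISC MDISC pages.
URL = "https://example.gov/opendap/hypothetical_merra_granule.nc"

with Dataset(URL) as ds:             # works for local files or OPeNDAP URLs
    t2m = ds.variables["T2M"][:]     # e.g., 2-meter air temperature
    lat = ds.variables["lat"][:]
    lon = ds.variables["lon"][:]

# Simple sanity check: mean value of the first time slice.
print("grid shape:", t2m.shape, "mean:", float(np.mean(t2m[0])))
```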

12
High-Resolution Modeling of Aerosol Impacts on
the Asian Monsoon Water Cycle
William Lau, Kyu-Myong Kim, Jainn J. Shi, Toshi
Matsui, and Wei-Kuo Tao, NASA Goddard Space
Flight Center
  • Objectives include 1) clarifying the interactions
    between aerosols (dust and black carbon) and the
    Indian monsoon water cycle and how they may
    modulate regional climatic impacts of global
    warming, and 2) testing feedback hypotheses using
    high-resolution models as well as satellite and
    in situ observations.
  • The team runs the regional-scale, cloud-resolving
    Weather Research and Forecasting (WRF) Model at
    very high resolution (less than 10-km horizontal
    grid spacing) with 31 vertical layers. To mitigate
    the large computational demands of over 200,000
    grid cells, the team uses a triple-nested grid
    with resolutions of 27, 9, and 3 km (the nesting
    arithmetic is sketched at the end of this slide).
  • For the aerosol-monsoon studies, a radiation
    module within WRF links to the Goddard Chemistry
    Aerosol Radiation and Transport (GOCART) aerosol
    module.
  • Using the Discover supercomputer at the NASA
    Center for Computational Sciences (NCCS), the
    team conducted a model integration for May 1 to
    July 1 in both 2005 and 2006.
  • Among other results, studies documented the
    elevated-heat-pump hypothesis, highlighting the
    role of the Himalayas and Tibetan Plateau in
    trapping aerosols over the Indo-Gangetic Plain,
    and showed preliminary evidence of aerosol
    impacts on monsoon variability.

Rainfall distributions from Weather Research and
Forecasting (WRF) Model simulations at
9-kilometer resolution (top row) and from
Tropical Rainfall Measurement Mission (TRMM)
satellite estimates (bottom row). Units are in
millimeters per day. Both WRF and TRMM show heavy
rain (red) over the Bay of Bengal and the western
coast.
By using 256 Intel Xeon processors on Discover,
the WRF Model can finish a 1-day integration in
less than 3 hours.
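To make the triple-nest setup concrete, here is a small sketch (not part of the study) of the nesting arithmetic: each nest refines its parent by a factor of 3 (27 to 9 to 3 km), and a common WRF rule of thumb ties the model time step to roughly 6 seconds per kilometer of grid spacing. The per-nest domain sizes are illustrative placeholders, not the team's actual configuration.

```python
# Nest spacings from the slide: outer 27 km, middle 9 km, inner 3 km.
spacings_km = [27.0, 9.0, 3.0]

# Each child refines its parent by a grid ratio of 3.
ratios = [spacings_km[i] / spacings_km[i + 1] for i in range(len(spacings_km) - 1)]
assert all(abs(r - 3.0) < 1e-9 for r in ratios)

# Rule-of-thumb time step (~6 s per km of grid spacing); the actual runs
# may use different values.
for dx in spacings_km:
    print(f"dx = {dx:4.0f} km  ->  dt ~ {6 * dx:.0f} s")

# Hypothetical horizontal dimensions per nest, chosen only so the total
# lands in the "over 200,000 grid cells" range quoted on the slide.
dims = {27.0: (160, 160), 9.0: (250, 250), 3.0: (350, 350)}
total = sum(nx * ny for nx, ny in dims.values())
print("total horizontal cells (hypothetical dims):", total)
```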
13
Observing System Experiments: Evaluating and
Enhancing the Impact of Satellite Observations
Oreste Reale and William Lau, NASA Goddard Space
Flight Center
  • An Observing System Experiment (OSE) assesses the
    impact of an observational instrument by
    producing two or more data assimilation runs, one
    of which (the Control run) omits data from the
    instrument under study. From the resulting
    analyses, the team initializes corresponding
    forecasts and evaluates them against operational
    analyses.
  • The team runs the NASA GEOS-5 data assimilation
    system at a resolution of 1/2 degree longitude
    and latitude, with 72 vertical levels, and the
    GEOS-5 forecasting system at a resolution of 1/2
    or 1/4 degree.
  • The team uses high-end computers at the NASA
    Center for Computational Sciences (NCCS) and the
    NASA Advanced Supercomputing (NAS) facility. The
    mass storage allows continual analysis of model
    results with diagnostic tools.
  • This research has demonstrated the impact of
    quality-controlled Atmospheric Infrared Sounder
    (AIRS) observations under partly cloudy
    conditions. In modeling tropical cyclogenetic
    processes, the team found that using AIRS data
    leads to better-defined tropical storms and
    improved GEOS-5 track forecasts (a toy
    track-error comparison follows this list).
    Depicted in the figure is a set of experiments
    centering on April-May 2008, during which
    Tropical Cyclone Nargis hit Myanmar.
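Since the headline metric here is track forecast quality, the following toy sketch (not the team's actual diagnostic) computes great-circle track errors for two hypothetical forecast runs, an AIRS run and a Control run, against an observed track; all positions are made up for illustration.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine distance between two lat/lon points in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Hypothetical storm-center positions (lat, lon) at successive forecast times.
observed = [(12.0, 88.0), (13.0, 89.5), (14.5, 91.0)]
airs_run = [(12.2, 88.1), (13.3, 89.8), (14.9, 91.3)]
control  = [(11.0, 87.0), (11.5, 88.0), (12.0, 89.0)]

for name, track in (("AIRS", airs_run), ("Control", control)):
    errs = [great_circle_km(o[0], o[1], f[0], f[1]) for o, f in zip(observed, track)]
    print(f"{name:8s} mean track error: {sum(errs) / len(errs):6.1f} km")
```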

Impact of the Atmospheric Infrared Sounder
(AIRS) on the 1/2-degree Goddard Earth Observing
System Model, Version 5 (GEOS-5) forecast for
Tropical Cyclone Nargis. Upper left: Differences
(AIRS minus Control) in 6-hour forecasts of 200
hPa temperature (°C, shaded) and sea-level
pressure (hPa, solid line). Lower left: The
6-hour sea-level pressure forecast from the AIRS
run shows a well-defined low close to the
observed storm track (green solid line). Lower
right: The corresponding 108-hour forecast for 2
May 2008 (landfall time) compares very well with
the observed track. Upper right: The 6-hour
sea-level pressure forecast from the Control run
shows no detectable cyclone.
NASA computational resources hosted 70 month-long
assimilation experiments and corresponding 5-day
forecasts.
14
GEOS-5 Support of NASA Field Campaigns: TC4,
ARCTAS, TIGERZ
Michele Rienecker, Peter Colarco, Arlindo da
Silva, Max Suarez, Ricardo Todling, Larry Takacs,
Gi-Kong Kim, and Eric Nielsen, NASA Goddard Space
Flight Center
  • A Goddard Space Flight Center team supports NASA
    field campaigns with real-time products and
    forecasts from the Global Modeling and
    Assimilation Office's GEOS-5 model to aid in
    flight planning and post-mission data
    interpretation.
  • Recent supported campaigns include the TC4
    (Tropical Composition, Cloud and Climate
    Coupling), ARCTAS (Arctic Research of the
    Composition of the Troposphere from Aircraft and
    Satellites), and TIGERZ missions.
  • The most-often-used GEOS-5 configuration was a
    2/3-degree longitude by 1/2-degree latitude grid
    with 72 vertical levels. Data assimilation
    analyses were conducted every 6 hours.
  • The NASA Center for Computational Sciences (NCCS)
    hosted the GEOS-5 model runs on its high-end
    computers and provided a multi-faceted data
    delivery system through its Data Portal.
  • The mission support was successful, with GEOS-5
    products delivered on time for most of the
    mission duration due to the NCCS ensuring timely
    execution of job streams and supporting the Data
    Portal.
  • One example of mission success was a June 29,
    2008 DC-8 flight's sampling of the Siberian fire
    plume transported to the region in the
    mid-troposphere, as predicted by GEOS-5.

This image shows 500-hectopascal (hPa)
temperatures (shading) and heights (contours)
during NASA's ARCTAS (Arctic Research of the
Composition of the Troposphere from Aircraft and
Satellites) mission. An analysis from the GEOS-5
model is shown with 24- and 48-hour forecasts and
validating analyses. These fields, with the
accompanying atmospheric chemistry fields, were
used to help plan a DC-8 flight on June 29, 2008.
The GEOS-5 systems were run on 128 processors of
the NCCS Explore high-end computer, with a
continuous job stream allowing timely delivery of
products to inform flight planning.
15
NASA HPC
  • NCCS at Goddard Space Flight Center
  • Focused on Climate and Weather Research in the
    Earth Science Division of the Science Mission
    Directorate
  • Support code development
  • Environment for running models in production mode
  • Capacity computing for large, complex models
  • Analysis and visualization environments
  • NAS at Ames Research Center
  • Supports all Mission Directorates
  • For Earth Science: capability runs for test and
    validation of next-generation models

16
NCCS Data Centric Climate Simulation Environment
User Services
  • Help Desk
  • Account/Allocation support
  • Computational science support
  • User teleconferences
  • Training and tutorials

Analysis and Visualization
  • Interactive analysis environment
  • Software tools for image display
  • Easy access to data archive
  • Specialized visualization support

Data Storage and Management
  • Global file system enables data access for the
    full range of modeling and analysis activities

Data Transfer
  • Internal high-speed interconnects for HPC
    components
  • High bandwidth to NCCS for GSFC users
  • Multi-gigabit network supports on-demand data
    transfers

HPC Compute
  • Large-scale HPC computing
  • Comprehensive toolsets for job scheduling and
    monitoring

Data Archival and Stewardship
  • Large-capacity storage
  • Tools to manage and protect data
  • Data migration support

Joint effort with SIVO
17
Data Centric Architecture: Highlights of Current
Activities
High Performance Computing: Building toward
petascale computational resources to support
advanced modeling applications
Analysis and Visualization: Terascale environment
with tools to support interactive analytical
activities
Nehalem Cluster Upgrades
Dali Interactive Data Analysis
Data Storage and Management: Petabyte online
storage plus technology-independent software
interfaces to provide data access to all NCCS
services
Data Archiving and Stewardship: Petabyte mass
storage facility to support project data storage,
access, and distribution, plus access to data sets
in other locations
Data Management System
Data Portal / Earth System Grid
Data Sharing and Publication: Web-based
environments to support collaboration, public
access, and visualization
18
Interactive Data Analysis and Visualization
Platform - Dali
  • Interactive Data Analysis Systems
  • Direct login for users
  • Fast access to all file systems
  • Supports custom and 3rd party applications
  • Visibility and easy access to post data to the
    data portal
  • Interactive display of analysis results
  • In-line and Interactive visualization
  • Synchronize analysis with model execution
  • Access to intermediate data as they are being
    generated
  • Generate images for display back to the users'
    workstations
  • Capture and store images during execution for
    later analysis
  • Develop Client/Server Capabilities
  • Extend analytic functions to the users'
    workstations
  • Data reduction (subsetting, field/variable/temporal
    extractions, averaging, etc.) and manipulation
    (time series, display, etc.) functions (a minimal
    subsetting sketch appears at the end of this slide)

(Diagram: Analysis and Visualization nodes, 16 cores
and 256 GB each, with direct GPFS I/O connections at
3 GB/sec per node.)
  • Dali Analytics Platform
  • 1.2 TF Peak, 128 cores, 2 TB main memory

- 8 nodes, 2.4 GHz Dunnington (Quad Core)
- 16 cores/node with 256 GB memory/node
- 3 GB/s I/O bandwidth to GPFS filesystem
- Software: CDAT, ParaView, GrADS, Matlab, IDL,
  python, FORTRAN, C, Quads, LATS4D
Currently configured as (8) 16-core nodes with
256 GB RAM/node, with the flexibility to support
up to (2) 64-core nodes with 1 TB RAM/node.
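As an example of the kind of data-reduction function listed above, here is a minimal sketch, assuming a hypothetical NetCDF file with time/lat/lon dimensions, that subsets a region and computes a time mean and an area-mean time series with xarray (one of many tools that could run on a platform like Dali; the file and variable names are placeholders).

```python
import xarray as xr

# Hypothetical model-output file; variable and coordinate names are placeholders.
ds = xr.open_dataset("hypothetical_geos5_output.nc")

# Spatial subsetting: keep a lat/lon box over South Asia.
box = ds["T2M"].sel(lat=slice(5, 35), lon=slice(60, 100))

# Temporal extraction and averaging.
time_mean = box.mean(dim="time")              # 2-D map of the period mean
area_series = box.mean(dim=("lat", "lon"))    # area-mean time series

time_mean.to_netcdf("t2m_time_mean_subset.nc")
print(area_series.values[:5])                 # first few values of the series
```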
19
Data Management System
  • Improving access to shared observational and
    simulation data through the creation of a data
    grid
  • Adopting an iRODS grid-centric paradigm
  • iRODS mediates between vastly different
    communities
  • The world of operational data management
  • The world of operational scientific practice
  • Challenges
  • Creating a catalog of NCCS policies to be mapped
    into iRODS rules
  • Creating an architecture for work flows to be
    mapped into iRODS microservices
  • Defining metadata and the required mappings
  • Capturing and publishing metadata (a minimal
    client-side sketch follows this list)
  • Doing all of this without disrupting operations!
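As one small illustration of the metadata-capture challenge, the sketch below uses the python-irodsclient package (an assumption; the presentation does not specify client tooling) to attach descriptive metadata to a data object already registered in an iRODS zone. Host, zone, credentials, and paths are placeholders.

```python
from irods.session import iRODSSession  # pip install python-irodsclient

# Placeholder connection details for a hypothetical iRODS zone.
with iRODSSession(host="irods.example.gov", port=1247,
                  user="modeler", password="********",
                  zone="exampleZone") as session:
    # A data object assumed to be already ingested into the grid.
    obj = session.data_objects.get("/exampleZone/home/modeler/run42/output.nc")

    # Attach searchable metadata (attribute, value, optional units).
    obj.metadata.add("experiment", "hypothetical_run42")
    obj.metadata.add("model", "GEOS-5")
    obj.metadata.add("resolution", "0.5", "degrees")

    # List what is now attached to the object.
    for avu in obj.metadata.items():
        print(avu.name, avu.value, avu.units)
```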

20
Data Portal and Earth System Grid
  • Web-based environments to support collaboration,
    public access, and visualization
  • Interfaces to the Earth System Grid (ESG) and
    PCMDI for sharing IPCC model data
  • Connectivity to observational data, Goddard DISC,
    and other scientific data sets
  • Direct connection back to NCCS data storage and
    archive for prompt publication, minimizing data
    movement and multiple copies of data
  • Sufficient compute capability for data analysis

(Diagram labels: NASA, ESG, PCMDI, Other; Data
Portal; Local Disk, NFS, iRODS, GPFS MC.)
HP c7000 BladeSystem (128 cores, 1.2 TF, 120 TB of
disk)
21
Nehalem Cluster Upgrades
  • Additional IBM iDataPlex scalable compute unit
    added into the Discover cluster in FY09
  • Additional 512 nodes (46 TFLOPS; see the peak
    arithmetic after this list)
  • 4,096 cores of 2.8 GHz quad-core Nehalem
  • 24 GB RAM per node (12 TB RAM)
  • Infiniband DDR interconnect
  • An additional 4K core Nehalem scalable unit to be
    integrated later this calendar year
  • Performance
  • 2x speedup of some major NCCS applications
  • 3x to 4x improvement in memory to processor
    bandwidth
  • Dedicated I/O nodes to the GPFS file system
    provide much higher throughput
  • Discover Cluster
  • 110 TF Peak, 10,752 cores, 22.8 TB main memory,
    Infiniband interconnect
  • Base Unit
  • - 128 nodes 3.2 GHz Xeon Dempsey (Dual Core)
  • SCU1 and SCU2
  • - 512 nodes 2.6 GHz Xeon Woodcrest (Dual Core)
  • SCU3 and SCU4
  • - 512 nodes 2.5 GHz Xeon Harpertown (Quad Core)
  • SCU5
  • 512 nodes 2.8 GHz Xeon Nehalem (Quad Core)
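The 46 TFLOPS figure for the new scalable unit follows from straightforward peak arithmetic, sketched below under the assumption of 4 double-precision floating-point operations per core per clock (the figure commonly quoted for Nehalem's SSE units).

```python
nodes = 512
cores_per_node = 8          # 4,096 cores across 512 nodes
clock_hz = 2.8e9
flops_per_cycle = 4         # assumed: 4 double-precision FLOPs/core/cycle (SSE)

peak_tflops = nodes * cores_per_node * clock_hz * flops_per_cycle / 1e12
print(f"{peak_tflops:.1f} TFLOPS peak")   # ~45.9, i.e., the quoted 46 TFLOPS

ram_tb = nodes * 24 / 1024  # 24 GB of RAM per node
print(f"{ram_tb:.0f} TB total RAM")       # 12 TB, matching the slide
```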

22
Where we're going
  • NASA is aggressively moving forward to deploy
    satellite missions supporting the Decadal
    Survey.
  • NCCS is moving forward to support the climate and
    weather research that will extract the scientific
    value from this exciting new data!
  • (The Decadal Survey is the January 15, 2007,
    report entitled "Earth Science and Applications
    from Space: National Imperatives for the Next
    Decade and Beyond.")

23
  • Thank you

24
NCCS Architecture
(Architecture diagram, existing and planned FY10
components: NCCS LAN (1 GbE and 10 GbE); Login,
Data Portal, Analysis, Viz, Data Gateways, and
Data Management nodes; Discover compute (existing
65 TF, FY09 upgrade 45 TF, FY10 upgrade 45 TF);
direct-connect GPFS nodes and GPFS I/O nodes; GPFS
disk subsystems (1.3 PB, increasing by 1.8 PB in
FY10); archive with 300 TB disk and 8 PB tape; and
internal services including management, license,
GPFS management, and PBS servers.)