Title: Tsengdar Lee
1. Project Columbia Applications
- Tsengdar Lee
- NASA Science Mission Directorate
2. System of Systems Framework
3. Turning Observations into Knowledge Products
4. Orbit Transfer
Pareto-Optimal Trajectories
Trade-off between flight time and propellant
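The flight-time vs. propellant trade-off is a classic multi-objective problem: a transfer is Pareto-optimal when no other candidate is faster without also burning more propellant. A minimal sketch of extracting that front, with invented candidate trajectories (the numbers are illustrative, not mission data):

```python
# Minimal sketch: extract the Pareto-optimal set from candidate
# orbit transfers, trading flight time against propellant mass.
# The candidate list below is invented for illustration.

def pareto_front(candidates):
    """Return candidates not dominated in (flight_time, propellant).

    A trajectory is dominated if another one is no worse in both
    objectives and strictly better in at least one.
    """
    front = []
    for i, (t_i, m_i) in enumerate(candidates):
        dominated = any(
            (t_j <= t_i and m_j <= m_i) and (t_j < t_i or m_j < m_i)
            for j, (t_j, m_j) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append((t_i, m_i))
    return sorted(front)

# (flight time [days], propellant mass [kg]) for hypothetical transfers
trajectories = [(120, 450), (150, 380), (150, 500), (200, 300), (90, 700)]
print(pareto_front(trajectories))
# -> [(90, 700), (120, 450), (150, 380), (200, 300)]
```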
5. High-Fidelity Unsteady Simulation of Shuttle Flowliner
[Figure: unsteady flowliner simulation at U = 44.8 ft/sec, pump speed 15,761 rpm; labels mark the upstream and downstream liners and back flow in/out of the cavity]
Strong backflow causing high-frequency pressure oscillations.
The damaging frequency on the flowliner due to LH2 pump back flow has been quantified in developing the flight rationale for the flowliner.
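A minimal sketch of the kind of spectral analysis used to quantify a dominant oscillation frequency, assuming a sampled pressure trace; the signal, tone, and noise levels below are synthetic stand-ins, not the actual CFD or flight data:

```python
# Minimal sketch: locate the dominant frequency in an unsteady
# pressure signal via FFT. All numbers are illustrative.
import numpy as np

fs = 100_000.0                     # sample rate [Hz]; illustrative
t = np.arange(0.0, 0.1, 1.0 / fs)  # 0.1 s pressure trace

# Synthetic stand-in for a pressure signal at a flowliner slot:
# one strong 3 kHz tone (invented) plus broadband noise.
p = np.sin(2.0 * np.pi * 3000.0 * t) + 0.2 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(p))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
print(f"dominant frequency: {freqs[spectrum.argmax()]:.0f} Hz")
```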
6. Fastest Production Supercomputer (and everybody knows it)
7. Wall Street Journal, 10/29/2004, p. A5
- NASA is now home to Columbia, the fastest
supercomputer in the known universe. And we
salute NASA and SGI for this bold achievement.
Built with 10,240 Intel Itanium 2 processors,
Columbia performs at the mind-boggling rate of 42
trillion floating-point calculations per second.
We can only imagine what NASA will accomplish
with computing power that is truly astronomical.
- Intel
8. Components
- Front End
  - 128p Altix 3700 (RTF)
- Networking
  - 10GigE Switch, 32-port
  - 10GigE Cards (1 per 512p)
  - InfiniBand Switch (288-port)
  - InfiniBand Cards (6 per 512p)
  - Altix 3900 2048 Numalink Kits
- Compute Node
  - Altix 3700: 12 x 512p
  - Altix 3900: 8 x 512p
- Storage Area Network
  - Brocade Switch, 2 x 128-port
  - Storage (440 TB)
[Diagram: InfiniBand and 10GigE fabrics interconnect twelve Altix 3700 512p and eight Altix 3900 512p compute nodes; two 128-port FC switches connect them to eight 35 TB SATA and eight 20 TB Fibre Channel storage units (440 TB total)]
9. Major Characteristics
- 10,240 processors: Intel Itanium 2, 1.5 GHz, 6 MB cache
- 20 SGI Altix 512p nodes; single system image (SSI) per node using NUMAlink 4
- 1 TB RAM per 512p node (20 TB total)
- One 2048p supernode
- Linux OS
- PBS scheduler (a job-submission sketch follows this list)
- 440 TB of Fibre Channel RAID
- Three levels of LAN fabric: GigE, 10GigE, and InfiniBand
- 5 PB of tape storage
- 10 Gigabit WAN connectivity at the backplane
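For illustration, a minimal sketch of submitting work through a PBS scheduler such as Columbia's; the job name, CPU count, walltime, and executable are hypothetical placeholders, not Columbia's actual queue policy:

```python
# Minimal sketch: write a PBS batch script and submit it with qsub.
# Resource values and the a.out executable are placeholders.
import subprocess

job_script = """#!/bin/sh
#PBS -N cfd_run
#PBS -l ncpus=64
#PBS -l walltime=02:00:00
cd $PBS_O_WORKDIR
./a.out > run.log 2>&1
"""

with open("job.pbs", "w") as f:
    f.write(job_script)

# qsub prints the assigned job identifier on success.
subprocess.run(["qsub", "job.pbs"], check=True)
```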
10. Project Columbia Supports Science Mission Modeling and Analysis Research
- Project Columbia dedicated 10/26/04
- World's fourth-fastest computer with 51.8 teraflops throughput
- 10,240 processors
- 52% allocated to SMD
- Earth-Sun modeling has been the prime usage of the system
- Demand exceeds supply
11. Mission Support Backbone
[Diagram: core lambda services with OC-48 links interconnect regional exchange points (Bay, DC, Midwest, South, East, Central CIEFs) serving NASA centers: ARC, GRC, GSFC, DFRC, LRC, JPL, HQ, MSFC, JSC, KSC, SSC, MAF, WSC, WSTF]
12. Interactive Visual Supercomputing
13. Ideal Architecture Vision: Data-Centric, Multi-Tiered
[Diagram: multi-tiered compute environment (capability systems, capacity systems, next-generation platforms) behind a common front end, with a visualization environment, shared high-speed disk (GB/s), and hierarchical storage management; a high-speed research network provides high-speed access to other sites]
14. Establishing a Modeling Environment
15. MAP06 Project Structure
- MAP06 PI: Bill Lapenta; Project Manager: Mike Seablom
- Hurricane Forecast Center(s): best-effort research investigation of multi-model ensemble track and intensity forecasts
- GMAO: GEOS-5 DAS global analysis and forecasts
- SIVO: compute resources, visualization, real-time satellite images, project web page, product distribution
- NAMMA SOP3: field campaign, aerosol impacts on cyclogenesis / W. African Monsoon
- SPoRT: WRF regional forecasts
16. MAP06 GMAO Participation
- Science Goals
  - Evaluate GEOS-5 DAS in terms of hurricane prediction skill
  - Evaluate impact of NASA satellite data, particularly AIRS and MODIS-derived winds and TRMM rainfall, possibly MODIS hi-res SST, on hurricane prediction skill
  - Evaluate the merit of 1/4° vs. 1/2° GEOS-5 analysis forecasts, instrument team products, and other science goals to guide evolution of GEOS products and justify resource requirements
  - Evaluate role of aerosols in tropical cyclogenesis off West Africa
  - Undertake preliminary investigation of the impact of air-sea feedbacks on hurricane evolution/prediction (GEOS-5 CGCM)
- Operational Goals
  - Support NAMMA Field Campaign in SOP3 (Sept 06) with real-time forecasts of weather and aerosol distributions
  - Conduct 1/4° GEOS-5 forecasts in NRT to compare with the 1/2° production system
  - Contribute NRT meteorology as input (IC and BC) to WRF forecasts at SPoRT
  - Contribute NRT hurricane forecasts for consideration in multi-model ensembles (best effort, low priority)
17. MAP06 GMAO Participation (cont.)
- System
  - Initial: 1/2° GEOS-5 DAS, 1/4° forecasts
  - Sept 06: 1/4° GEOS-5 DAS, 1/4° forecasts
  - DAS on NAS/Explore; GEOS-5 AGCM forecasts on N-G platform
  - 1 x 5-day forecast per day (12Z); 2x during NAMMA SOP-3
  - NCEP-standard AIRS, enhanced AIRS (channel and pixel selection), TRMM rain retrievals
- Requirements
  - Transition model and scripts to MAP06 N-G platform
  - Include relocator in parallel ops (different from MERRA)
  - Historical runs: 2004 (GEOS-5 validation) and 2005 (reprocessing selected events for forecasts)
  - Transparent interface to products (across lambda rail network)
  - 1/2 degree DAS, regular parallel ops, on Explore
  - 1/4 degree forecasts and historical runs on N-G machine
  - 1/2 degree DAS sensitivity runs on Explore and Columbia
  - 1/4 degree DAS on Columbia
18. MAP06 GEOS-5 DAS Timeline Summary
[Timeline: July - October 2006]
- GEOS-4 operations (to mid-Nov)
- GEOS-5 parallel ops (not frozen): 0.5° resolution; MAP06 starting 1 July 2006
- GEOS-5 development and validation
- GEOS-5 ops: 0.5° res., Oct 1, 2006
- GEOS-5 Validation Aura Spin-Up: 0.5° resolution, period Jan 1, 2004 - Sep 30, 2006
- MAP06: 0.25° res., continuation
- NAMMA: 0.25° res., Sep. 2006
- GEOS-5 MERRA Runs: 0.5° resolution, three streams (1979 - present)
19. MAP06 Data Flows
[Diagram: NCCS handles GMAO data processing and long-term storage; upon production, public GMAO data (GEOS-4 CERES, GEOS-5 forward, MERRA) and derived model products flow through the SIVO Modeling Environment (ME) interface to DISC GMAO data archive and distribution; GMAO data and derived model products are exposed via web services for validation, scientific investigations, and QA by modelers, researchers, system developers, instrument teams, scientists, and other users]
20. Vision: A Virtual Integrated Data System (storage location is transparent to the user)
- Data management has advanced beyond simply acquiring a desired data set and then searching for science.
- Data management is now about surfing datasets, acquiring specific information-filled data granules, then going back to surf some more.
21. An Integrated Approach to Data Serving: MERRA as a Prototype
- NCCS
  - The MERRA complete archive
  - Leverages NCCS, NAS, and other compute clusters
  - Tape- and disk-based
  - Long-term archive
  - GOLDS provides a prototype for future NCCS data services
- GES DISC
  - GOLDS: external data access services provider
  - Active archive
  - Users: instrument teams, public
  - Leverages:
    - new/evolved DISC capabilities: all on-line archive served by S4PA, Mirador interface
    - existing data services: Giovanni data and model intercomparisons, S4PM-DME, on-the-fly subsetting
    - other DISC data servers: OGC, OPeNDAP
- SIVO
  - Portal for MAP ME services
  - Application of GOLDS to education and outreach
  - Leverages:
    - existing web mapping services
    - existing NEO outreach web services
    - interface to other data servers: OGC (provides access to NOMADS), OPeNDAP (see the access sketch after this list)
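As referenced above, a minimal sketch of pulling a subset of a model field over OPeNDAP, assuming a DAP-enabled netCDF4 build; the URL and variable name are hypothetical placeholders, not an actual GES DISC endpoint:

```python
# Minimal sketch: read a slab of a model field over OPeNDAP without
# downloading the full granule. URL and variable are hypothetical.
from netCDF4 import Dataset

url = "http://example.gsfc.nasa.gov/opendap/MERRA/slv_2d.nc"  # placeholder
ds = Dataset(url)  # netCDF4 must be built with OPeNDAP (DAP) support

# Server-side subsetting: only the requested slab crosses the network.
t2m = ds.variables["T2M"][0, 100:120, 200:240]  # hypothetical variable
print(t2m.shape)
ds.close()
```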
22. Project FastPath
- The Scientific Computing Portfolio is committed to supporting the Applied Science Program
- Many options
RTG