HPC in 2029: Will The March to ZettaFLOPS Succeed?

Transcript and Presenter's Notes
1
HPC in 2029: Will The March to ZettaFLOPS Succeed?
  • William Gropp, www.cs.uiuc.edu/wgropp

2
Extrapolation is Risky
  • 1989: T - 20 years
  • Intel introduces the 486DX
  • Eugene Brooks writes "Attack of the Killer Micros"
  • 4 years before the TOP500 list
  • Top systems at about 2 GF peak
  • 1999: T - 10 years
  • NVIDIA introduces the GPU (GeForce 256)
  • Programming GPUs still a challenge
  • Top system: ASCI Red, 9,632 cores, 3.2 TF peak
  • MPI is 7 years old

3
HPC Today
  • High(est)-end systems
  • 1 PF (10^15 ops/s) achieved on a few peak-friendly applications
  • Much worry about scalability and how we're going to get to an ExaFLOPS
  • Systems are all oversubscribed
  • DOE INCITE awarded almost 900M processor-hours in 2009; many requests were turned away
  • NSF PRAC awards for Blue Waters similarly competitive
  • Widespread use of clusters, many with accelerators; cloud computing services
  • Laptops (far) more powerful than the supercomputers I used as a graduate student
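To make the oversubscription concrete, the 900M processor-hour figure cited above can be turned into a machine size with back-of-the-envelope division (a sketch; only the 900M number comes from the slide):

```python
# Rough arithmetic on the DOE INCITE 2009 allocation cited above.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

incite_hours = 900e6  # ~900M processor-hours awarded in 2009

# Number of processors that would have to run flat-out, all year,
# just to deliver the awarded hours (ignoring the rejected requests):
processors_busy = incite_hours / HOURS_PER_YEAR
print(f"{processors_busy:,.0f} processors busy year-round")  # ≈ 102,740
```

In other words, the awards alone kept the equivalent of a six-figure processor count saturated for the entire year, before counting the proposals that were turned away.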

4
NSF's Strategy for High-end Computing
[Chart: Science and Engineering Capability (logarithmic scale) vs. fiscal year, FY07-FY11]
  • Track 1 System: UIUC/NCSA (1 PF sustained)
  • Track 2 Systems: TACC (500 TF peak), UT/ORNL (1 PF peak), Track 2d: PSC (?)
  • Track 3 Systems: Leading University HPC Centers (10-100 TF)
5
HPC in 2011
  • Sustained-PF systems
  • NSF Track 1: Blue Waters at Illinois
  • Sequoia, a Blue Gene/Q at LLNL
  • Undoubtedly others
  • Still programmed with MPI and MPI+other (e.g., MPI+OpenMP)
  • But in many cases using toolkits, libraries, and other approaches
  • And not so bad: applications will be able to run when the system is turned on
  • Replacing MPI will require some compromise, e.g., domain-specific approaches (higher-level but less general)
  • Still can't compile single-threaded code to reliably get good performance; see the work on autotuners. Lesson: there's a limit to what can be automated. Pretending that there's an automatic solution will stand in the way of a real solution
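The autotuner point can be made concrete with a minimal sketch: instead of trusting a compiler to pick loop parameters, an autotuner times several code variants on the actual machine and keeps the fastest. Everything below (the blocked matrix multiply, the candidate tile sizes) is illustrative, not taken from the slides:

```python
import time

def matmul_blocked(A, B, n, bs):
    """Blocked (tiled) n x n matrix multiply; bs is the tile size."""
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, bs):
        for kk in range(0, n, bs):
            for jj in range(0, n, bs):
                for i in range(ii, min(ii + bs, n)):
                    for k in range(kk, min(kk + bs, n)):
                        a = A[i][k]
                        for j in range(jj, min(jj + bs, n)):
                            C[i][j] += a * B[k][j]
    return C

def autotune(n=64, candidates=(4, 8, 16, 32, 64)):
    """Empirically pick the fastest tile size -- the essence of autotuning."""
    A = [[1.0] * n for _ in range(n)]
    B = [[1.0] * n for _ in range(n)]
    best_bs, best_t = None, float("inf")
    for bs in candidates:
        t0 = time.perf_counter()
        matmul_blocked(A, B, n, bs)
        elapsed = time.perf_counter() - t0
        if elapsed < best_t:
            best_bs, best_t = bs, elapsed
    return best_bs

print("best tile size on this machine:", autotune())
```

The winning tile size depends on the cache hierarchy of the machine the search runs on, which is exactly why no single compile-time choice is reliably good, and why production autotuners (for dense linear algebra, FFTs, stencils) search rather than predict.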

6
HPC in 2019
  • Exascale (10^18 ops/s) systems arrive
  • Issues include power, concurrency, fault resilience, memory capacity
  • Likely features:
  • Memory per core (or functional unit) smaller than on today's systems
  • 10^8-10^9 threads
  • Heterogeneous processing elements
  • Software will be different
  • You can use MPI, but its constraints will get in your way
  • Likely a combination of tools, with domain-specific solutions and some automated code generation
  • Algorithms need to change/evolve:
  • Extreme scalability, reduced memory
  • Managed locality
  • Participate in fault tolerance
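The 10^8-10^9 thread estimate follows from round-number arithmetic; the ~1 GHz clock and 2 flops/thread/cycle figures below are illustrative assumptions, not from the slides:

```python
# How much concurrency does an exaflop machine need?
target_flops = 1e18          # exascale: 10^18 ops/s
clock_hz = 1e9               # assume ~1 GHz clocks
flops_per_thread_cycle = 2   # assume, e.g., one fused multiply-add per cycle

threads_needed = target_flops / (clock_hz * flops_per_thread_cycle)
print(f"{threads_needed:.0e} threads")  # 5e+08, squarely in the 1e8-1e9 range
```

Since clock rates stopped scaling in the mid-2000s, essentially all of the thousand-fold jump from petascale has to come from more concurrency, not faster threads.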

7
HPC in 2029
  • Will we even have ZettaFLOPS (10^21 ops/s)?
  • Unlikely (but not impossible) in a single (even highly parallel) system
  • Power (again): you need an extra 1000-fold improvement in results/Joule
  • Concurrency: 10^11-10^12 threads (!)
  • See the ZettaFLOPS workshops: www.zettaflops.org
  • Will require new device technology
  • Will the high end have reached a limit after Exascale systems?
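The 1000-fold results/Joule claim is fixed-power arithmetic: holding the facility power budget constant while throughput rises 1000x forces the energy per operation down by the same factor. The ~20 MW budget below is an assumed round number, not from the slides:

```python
# Energy per operation allowed under a fixed facility power budget.
power_w = 20e6  # assume a ~20 MW facility budget

for name, ops_per_s in [("exaflops", 1e18), ("zettaflops", 1e21)]:
    joules_per_op = power_w / ops_per_s
    print(f"{name}: {joules_per_op * 1e12:.3g} pJ/op")

# exaflops:   20 pJ/op
# zettaflops: 0.02 pJ/op -- the 1000x results/Joule gap
```

20 femtojoules per operation is far below what CMOS logic plus DRAM access can deliver, which is why the slide concludes that zettascale in a single system will require new device technology rather than incremental scaling.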

8
The HPC Pyramid in 1993
  • TeraFLOP-class center supercomputers
  • Mid-range parallel processors and networked workstations
  • High-performance workstations
9
The HPC Pyramid in 2029 (?)
  • Center Exascale supercomputers
  • Single-cabinet petascale systems (or "attack of the killer GPU" successors)
  • Laptops, phones, wristwatches, eyeglasses
10
Blue Waters Project: Petascale Allocation Awards
  • Computational Chemistry at the Petascale
  • Monica Lamm, Mark Gordon, Theresa Windus, Masha
    Sosonkina, Brett Bode, Iowa State University
  • Testing Hypotheses about Climate Prediction at
    Unprecedented Resolutions on the Blue Waters
    System
  • David Randall, Ross Heikes, Colorado State University; William Large, Richard Loft, John Dennis, Mariana Vertenstein, National Center for Atmospheric Research; Cristiana Stan, James Kinter, Institute for Global Environment and Society; Benjamin Kirtman, University of Miami
  • Petascale Research in Earthquake System Science
    on Blue Waters
  • Thomas Jordan, Jacobo Bielak, University of
    Southern California
  • Breakthrough Petascale Quantum Monte Carlo
    Calculations
  • Shiwei Zhang, College of William and Mary
  • Electronic Properties of Strongly Correlated
    Systems Using Petascale Computing
  • Sergey Savrasov, University of California, Davis; Kristjan Haule, Gabriel Kotliar, Rutgers University

11
Blue Waters Project: Petascale Allocation Awards
  • Understanding Tornadoes and Their Parent Supercells Through Ultra-High Resolution Simulation/Analysis
  • Robert Wilhelmson, Brian Jewett, Matthew Gilmore,
    University of Illinois at Urbana-Champaign
  • Petascale Simulation of Turbulent Stellar
    Hydrodynamics
  • Paul Woodward, Pen-Chung Yew, University of
    Minnesota, Twin Cities
  • Petascale Simulations of Complex Biological
    Behavior in Fluctuating Environments
  • Ilias Tagkopoulos, University of California,
    Davis
  • Computational Relativity and Gravitation at
    Petascale Simulating and Visualizing
    Astrophysically Realistic Compact Binaries
  • Manuela Campanelli, Carlos Lousto, Hans-Peter Bischof, Joshua Faber, Yosef Zlochower, Rochester Institute of Technology
  • Enabling Science at the Petascale From Binary
    Systems and Stellar Core Collapse to Gamma-Ray
    Bursts
  • Eric Schnetter, Gabrielle Allen, Mayank Tyagi,
    Peter Diener, Christian Ott, Louisiana State
    University

12
Blue Waters Project: Petascale Allocation Awards
  • Petascale Computations for Complex Turbulent
    Flows
  • Pui-Kuen Yeung, James Riley, Robert Moser,
    Amitava Majumdar, Georgia Institute of Technology
  • Computational Microscope
  • Klaus Schulten, Laxmikant Kale, University of
    Illinois at Urbana-Champaign
  • Simulation of Contagion on Very Large Social
    Networks with Blue Waters
  • Keith Bisset, Xizhou Feng, Virginia Polytechnic
    Institute and State University
  • Formation of the First Galaxies Predictions for
    the Next Generation of Observatories
  • Brian O'Shea, Michigan State University; Michael Norman, University of California at San Diego
  • Super Instruction Architecture for Petascale
    Computing
  • Rodney Bartlett, Erik Deumens, Beverly Sanders, University of Florida; Ponnuswamy Sadayappan, Ohio State University
  • Peta-Cosmology Galaxy Formation and Virtual
    Astronomy
  • Kentaro Nagamine, University of Nevada at Las Vegas; Jeremiah Ostriker, Princeton University; Renyue Cen, Greg Bryan