1
Computational Challenges For Large-Scale
Astrophysics Calculations
  • Bruce Fryxell
  • University of Chicago
  • ACAT 2000
  • FNAL - Oct. 16, 2000

2
Major Issues
  • Code development
  • Reduced time to solution
  • Scalability
  • Portability
  • Modularity
  • Maintainability
  • Ease of use
  • Dealing with the data produced
  • I/O
  • Retrieving the data
  • Storing the data
  • Analysis and visualization of the data
  • Code verification and validation

3
Code Development Issues
  • Enormous range of length and time scales
    (frequently more than 10 orders of magnitude)
  • Very large Reynolds numbers (> 10⁹)
  • Generation of unresolvable small-scale structures
  • Turbulence modeling
  • Adaptive mesh refinement (AMR)
  • More efficient use of grid points
  • Improvement in time to solution is highly problem
    dependent
  • Increases code complexity
  • Increases communication costs on parallel
    computers
  • Irregular and unpredictable communication
    patterns
  • Efficient implementation on parallel computers
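How aggressively AMR refines is governed by the refinement criterion. As a minimal sketch (not the FLASH/PARAMESH implementation; the threshold and noise floor are illustrative assumptions), a normalized second-difference indicator in the spirit of the Löhner estimator flags cells where curvature is large relative to slope:

    import numpy as np

    def refine_flags(u, threshold=0.1, eps=1e-2):
        """Flag 1D cells for refinement using a normalized
        second-difference (Lohner-style) error indicator."""
        d2 = np.abs(u[2:] - 2.0 * u[1:-1] + u[:-2])            # curvature
        norm = (np.abs(u[2:] - u[1:-1]) + np.abs(u[1:-1] - u[:-2])
                + eps * np.abs(u[1:-1]))                       # slope + noise floor
        flags = np.zeros(u.shape, dtype=bool)
        flags[1:-1] = d2 / np.maximum(norm, 1e-30) > threshold
        return flags

    # Smooth background with one steep front: only cells near the front
    # are flagged, which is where AMR concentrates its grid points.
    x = np.linspace(0.0, 1.0, 256)
    u = np.tanh((x - 0.5) / 0.01)
    print(np.where(refine_flags(u))[0])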

4
Code Development Issues
  • Front tracking
  • Eliminates unphysical mixing at fluid interfaces
  • Increases code complexity
  • Load balancing issues
  • Efficient implementation on parallel computers
    can be very difficult
  • Sub-grid modeling
  • Reliable sub-grid models are available only for
    relatively simple flows
  • Optimal sub-grid model is problem dependent

5
Code Development Issues
  • Large dynamic range of flow velocities
  • Highly subsonic to hypersonic velocities in the
    same simulation
  • Accurate simulation of subsonic flows requires
    incompressible code
  • Pseudo-spectral codes
  • Spectral element codes
  • Compressible codes for high Mach number usually
    use shock capturing methods
  • Finite difference
  • Finite volume
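As the simplest member of the shock-capturing family (a sketch only; production schemes add higher-order reconstruction and Riemann solvers), a first-order upwind finite-volume update keeps a discontinuity monotone at the price of numerical diffusion:

    import numpy as np

    def upwind_step(u, c, dx, dt):
        """One finite-volume step for du/dt + c du/dx = 0 (c > 0) with
        first-order upwind fluxes and periodic boundaries."""
        flux = c * u                              # upwind interface flux
        return u - (dt / dx) * (flux - np.roll(flux, 1))

    nx, c = 200, 1.0
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)   # square pulse
    dx = x[1] - x[0]
    dt = 0.5 * dx / c                             # CFL number 0.5
    for _ in range(200):
        u = upwind_step(u, c, dx, dt)             # pulse advects, stays monotone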

6
Code Development Issues
  • Issues of compatibility between compressible and
    incompressible codes
  • Different data structures
  • Data exchange between codes
  • Incorporation under a single code framework will
    require new, innovative code architectures
  • Do they give the same answers for intermediate
    Mach numbers?
  • Implicit / explicit hybrid methods
  • Very difficult to program
  • Global communications
  • Load balancing

7
Code Development Issues
  • Physical processes can be highly localized and
    non-uniform
  • Localized physics (e.g. reaction networks) can
    dominate computation time
  • Load balancing
  • Physical processes can be non-local
  • Self-gravity (Poisson equation)
  • Radiation transport
  • Global communication
  • Need efficient elliptic solvers on adaptive mesh
    grid structure
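For orientation, the uniform-grid baseline is easy: with periodic boundaries, the Poisson equation for self-gravity, ∇²φ = 4πGρ, is solved directly with FFTs (a sketch in code units, G = 1 assumed). The slide's point is that no such shortcut exists on an adaptive mesh, where multigrid or tree-based solvers are required:

    import numpy as np

    def poisson_gravity_fft(rho, dx, G=1.0):
        """Solve nabla^2 phi = 4 pi G rho on a uniform periodic 3D grid."""
        n = rho.shape[0]
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
        kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
        k2 = kx**2 + ky**2 + kz**2
        k2[0, 0, 0] = 1.0                         # avoid divide-by-zero at k = 0
        rho_hat = np.fft.fftn(rho - rho.mean())   # periodic box: zero-mean source
        phi_hat = -4.0 * np.pi * G * rho_hat / k2
        phi_hat[0, 0, 0] = 0.0                    # fix the arbitrary potential offset
        return np.real(np.fft.ifftn(phi_hat))

    rho = np.random.default_rng(0).random((32, 32, 32))
    phi = poisson_gravity_fft(rho, dx=1.0 / 32)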

8
Code Development Issues
  • Rapidly changing computing environment
  • New computer architectures
  • Code development time can be comparable to the
    lifetime of a computer, hence the need for portability
  • Beowulf clusters
  • Require latency tolerant algorithms
  • Clusters of SMPs
  • Fast communication within a box
  • Slow communication between boxes
  • Hybrid programming model
  • MPI + threads required in some cases
  • Unstable hardware and software
  • Frequent changes in operating systems, compilers,
    etc.
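A minimal sketch of the hybrid model (illustrative only; FLASH itself is Fortran + MPI, and mpi4py is assumed here): MPI ranks map to boxes, a thread pool does the work inside each box over shared memory, and only the final reduction crosses the slow inter-box network.

    # Run with e.g.: mpiexec -n 4 python hybrid.py
    from concurrent.futures import ThreadPoolExecutor

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each rank owns a slab of the global domain (MPI between boxes).
    local = np.random.default_rng(rank).random(1_000_000)

    # Threads work inside the box; NumPy releases the GIL during the sums.
    with ThreadPoolExecutor(max_workers=4) as pool:
        local_sum = sum(pool.map(np.sum, np.array_split(local, 4)))

    # One global reduction crosses the slow network between boxes.
    total = comm.allreduce(local_sum, op=MPI.SUM)
    if rank == 0:
        print(f"global sum: {total:.6e}")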

9
Scaling of FLASH Code
[figure: parallel scaling results, October 1999]
10
Scaling of FLASH Code
[figure: parallel scaling results, May 2000]
11
Data Handling Issues
  • Typical calculation
  • 1000³ grid, 25 variables per grid point
  • 4 bytes per variable
  • Size of single output file: ~0.1 TB
  • For 1000 output files: ~100 TB per calculation
  • I/O time at 100 MB/s: > 10 days
  • Data retrieval time: > 1 year
  • 2D slices through a 3D data set will no longer be
    possible on a typical desktop workstation
  • 3D visualization requires a large parallel
    computer
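The slide's numbers check out with simple arithmetic (decimal units assumed):

    cells     = 1000**3            # 1000^3 grid points
    variables = 25                 # variables per grid point
    nbytes    = 4                  # single precision

    file_tb = cells * variables * nbytes / 1e12
    print(file_tb)                 # 0.1 TB per output file

    run_tb = 1000 * file_tb
    print(run_tb)                  # 100 TB per calculation

    io_seconds = run_tb * 1e12 / 100e6        # at 100 MB/s
    print(io_seconds / 86400)      # ~11.6 days of pure I/O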

12
Code Verification & Validation
  • In most cases, direct validation of astrophysics
    codes is impossible
  • Astronomical objects are too far away: they appear
    as point sources in the sky
  • Even if we could resolve the objects, we could
    only see their surfaces; there is no way to determine
    what is happening inside
  • For exploding stars, we can attempt to match
    observations of light curves and spectra obtained
    from telescopes and satellite detectors
  • Is the solution unique?

13
Other Approaches to V&V
  • Reproduce answers to standard test problems
  • Most test calculations are performed in one or
    two dimensions
  • Most test calculations check only the simplest
    aspects of the code
  • Convergence tests
  • Codes can converge to the wrong answer
  • Each length and time scale in the problem can
    correspond to a different converged solution
  • For real astrophysics problems, it is usually
    impossible to achieve sufficient resolution to
    obtain a converged solution
  • Comparison between different codes
  • If solutions differ, which is correct?
  • Democracy is not a suitable answer
  • Comparison with laboratory experiments
  • Diagnostic resolution is usually insufficient to
    check details
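A convergence test in practice: from solutions on three grids refined by a constant factor r, the observed order is p = log(‖u_c − u_m‖ / ‖u_m − u_f‖) / log r. A sketch with synthetic data (and note the caveat above: this measures self-consistency, not correctness, since a code can converge to the wrong answer):

    import numpy as np

    def observed_order(u_coarse, u_medium, u_fine, r=2.0):
        """Observed convergence order from three solutions at grid
        spacings h, h/r, h/r^2, sampled at common points."""
        e_cm = np.linalg.norm(u_coarse - u_medium)
        e_mf = np.linalg.norm(u_medium - u_fine)
        return np.log(e_cm / e_mf) / np.log(r)

    # Synthetic data: u(h) = u_exact + C h^2 mimics a 2nd-order scheme.
    u_exact = np.sin(np.linspace(0.0, np.pi, 64))
    sols = [u_exact + 0.1 * h**2 for h in (0.4, 0.2, 0.1)]
    print(observed_order(*sols))   # -> 2.0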

14
Code Verification & Validation
  • Sample validation problem: interface instabilities
  • Astrophysical flame fronts are frequently
    unstable to the growth of Rayleigh-Taylor modes
  • Flame speed depends on the surface area of the
    interface
  • Codes must be able to reproduce the correct
    growth rate of the instability
  • Codes must also be able to predict the amount of
    small-scale structure which develops along the
    interface due to secondary Kelvin-Helmholtz
    instabilities
  • We are attempting to model both single-mode and
    multi-mode Rayleigh-Taylor instabilities and
    compare the results to both theory and experiment

15
Single-Mode Calculations
Initial Conditions
Sinusoidal perturbation in vertical velocity
Amplitude: 2.5% of local sound speed
Perturbation concentrated in region near interface
Horizontal velocity chosen to give divergence-free velocity field at t = 0
[figure: perturbation geometry; domain 0 < x, y < 0.25, 0 < z < 1.5]
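A 2D sketch of this construction (parameter values illustrative, assuming unit sound speed and an interface at z = 0.75, the midpoint of the domain): take w = A sin(kx) f(z) with a Gaussian envelope f concentrating the perturbation near the interface, then choose u analytically so that du/dx + dw/dz = 0 at t = 0.

    import numpy as np

    A, k, z0, sigma = 0.025, 2.0 * np.pi / 0.25, 0.75, 0.05

    def f(z):                         # envelope near the interface
        return np.exp(-((z - z0) / sigma) ** 2)

    def fp(z):                        # analytic derivative of f
        return -2.0 * (z - z0) / sigma**2 * f(z)

    def w(x, z):                      # vertical velocity perturbation
        return A * np.sin(k * x) * f(z)

    def u(x, z):                      # horizontal velocity chosen so that
        return (A / k) * np.cos(k * x) * fp(z)   # du/dx + dw/dz = 0 exactly

    # Check the divergence numerically on a grid.
    x = np.linspace(0.0, 0.25, 128)
    z = np.linspace(0.0, 1.5, 256)
    X, Z = np.meshgrid(x, z, indexing="ij")
    div = np.gradient(u(X, Z), x, axis=0) + np.gradient(w(X, Z), z, axis=1)
    print(np.abs(div).max())          # small: finite-difference error only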
16
Single-Mode Calculations
[figure]
17
Multi-mode Calculations
  • Alpha-group established by Guy Dimonte in 1998
  • Want to determine the growth rate of an unstable
    layer subject to a multi-mode perturbation
  • Predicted scaling law (Youngs 1984):
    h = α g A t²
  • A = (ρ₂ - ρ₁) / (ρ₂ + ρ₁) (Atwood number)
  • g = gravitational acceleration
  • Want to determine the value of α using a wide
    variety of codes and experiments
  • Finite difference / Finite Volume
  • ALE
  • Front Tracking
  • Pseudo-spectral / Spectral elements
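Given the scaling law h = α g A t², α follows from a one-parameter least-squares fit of the measured mixing-layer height against g A t². A sketch with synthetic data (the α = 0.05 input and noise level are assumptions; A and g values are from slide 18):

    import numpy as np

    A, g = 0.5, 2.0                   # Atwood number and gravity (slide 18)

    def fit_alpha(t, h):
        """Least-squares fit of h = alpha * g * A * t**2 for alpha."""
        x = g * A * t**2
        return np.dot(x, h) / np.dot(x, x)

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 50)
    h = 0.05 * g * A * t**2 * (1.0 + 0.05 * rng.standard_normal(t.size))
    print(fit_alpha(t, h))            # ~0.05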

18
Multi-mode Calculations
Initial Conditions
A = 0.5 (at interface)
g = 2.0
P = 500 (at interface)
Constant entropy on each side of interface
Hydrostatic equilibrium
Domain: 0 < x < 10, 0 < y < 20
Interface at y = 10.625
19
Multi-mode Calculations
Initial Perturbation
Multimode perturbation in location of interface
Power concentrated in modes 32 to 64
For 128 x 256 grid, perturbed wavelengths range from 2 to 4 grid points
For 256 x 512 grid, perturbed wavelengths range from 4 to 8 grid points
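A sketch of generating such a perturbation (random mode amplitudes and phases; the rms displacement is an assumption, since the slide does not give one):

    import numpy as np

    def multimode_interface(x, L, lo=32, hi=64, amp=1e-3, seed=1):
        """Interface displacement with power confined to modes lo..hi,
        normalized to rms amplitude `amp`."""
        rng = np.random.default_rng(seed)
        eta = np.zeros_like(x)
        for m in range(lo, hi + 1):
            a = rng.standard_normal()              # random amplitude
            phi = rng.uniform(0.0, 2.0 * np.pi)    # random phase
            eta += a * np.cos(2.0 * np.pi * m * x / L + phi)
        return amp * eta / np.sqrt(np.mean(eta**2))

    L = 10.0                          # domain width from slide 18
    x = np.linspace(0.0, L, 256, endpoint=False)
    eta = multimode_interface(x, L)
    # Mode m has wavelength 256/m cells on this grid: modes 32..64 span
    # 8 down to 4 grid points, matching the slide's 256-grid statement.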
20
Multi-mode Calculations
[figure: density at time 10, 128 x 128 x 256 grid]
21
Multi-mode Calculations
[figure: density at time 10, 128 x 128 x 256 grid]
22
Alpha-Group Calculations (3D)
[figure]
23
Results
  • Finite difference / finite volume codes give α ≈ 0.03
  • Initial conditions are poorly resolved
  • Unphysical amount of molecular mixing due to
    numerical dissipation
  • α increases with grid resolution
  • Amplitude scales as t instead of t²
  • Front tracking codes give α ≈ 0.05 to 0.07
  • Experiments give α ≈ 0.04 to 0.06
  • Initial conditions hard to determine and control
  • t² scaling difficult to reproduce in simulations
    due to limits on resolution and box size
    due to limits on resolution and box size