1
The Grand Challenge of Space Weather Prediction
  • Gábor Tóth
  • Center for Space Environment Modeling
  • University of Michigan

2
Collaborators
  • Tamas Gombosi, Kenneth Powell
  • Ward Manchester, Ilia Roussev
  • Darren De Zeeuw, Igor Sokolov
  • Aaron Ridley, Kenneth Hansen
  • Richard Wolf, Stanislav Sazykin (Rice
    University)
  • József Kóta (Univ. of Arizona)

Grants
DoD MURI and NASA CT Projects
3
Outline of Talk
  • What is Space Weather and Why Predict It?
  • Parallel MHD Code BATSRUS
  • Space Weather Modeling Framework (SWMF)
  • Some Results
  • Concluding Remarks

4
What Space Weather Means
Conditions on the Sun and in the solar wind,
magnetosphere, ionosphere, and thermosphere that
can influence the performance and reliability of
space-borne and ground-based technological systems
and can endanger human life or health.
Space physics that affects us.
5
Solar Activity...
6
Affects Earth: The Aurorae
7
Other Effects of Space Weather
8
MHD Code BATSRUS
  • Block Adaptive Tree Solar-wind Roe Upwind Scheme
  • Conservative finite-volume discretization
  • Shock-capturing Total Variation Diminishing
    schemes
  • Parallel block-adaptive grid (Cartesian and
    generalized)
  • Explicit and implicit time stepping
  • Classical and semi-relativistic MHD equations
  • Multi-species chemistry
  • Splitting the magnetic field into B0 + B1
  • Various methods to control the divergence of B

9
MHD Equations in Conservative vs.
Non-Conservative Form
  • Conservative form is required for correct jump
    conditions across shock waves.
  • Energy conservation provides proper amount of
    Joule heating for reconnection even in ideal
    MHD.
  • Non-conservative pressure equation is preferred
    for maintaining positivity.
  • Hybrid scheme: use the pressure equation where
    possible.

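For reference, the two forms in standard ideal-MHD notation (with the
vacuum permeability set to 1; these are the standard equations, not
transcribed from the slides). The conservative energy equation

\[
\frac{\partial e}{\partial t}
+ \nabla\cdot\Big[\Big(e + p + \tfrac{1}{2}B^2\Big)\mathbf{u}
- \mathbf{B}\,(\mathbf{u}\cdot\mathbf{B})\Big] = 0,
\qquad
e = \frac{p}{\gamma-1} + \frac{1}{2}\rho u^2 + \frac{1}{2}B^2
\]

gives correct jump conditions, while the non-conservative pressure equation

\[
\frac{\partial p}{\partial t} + \mathbf{u}\cdot\nabla p
+ \gamma\, p\, \nabla\cdot\mathbf{u} = 0
\]

keeps the pressure positive away from shocks.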
10
Conservative Finite-Volume Method
11
Limited Reconstruction TVD
  • Finite volume data stored at cell centers, fluxes
    computed at interfaces between cells. Need an
    interpolation scheme to give accurate values at
    the two sides of the interface.
  • Reconstruction process can introduce new extrema.
    Need to limit the slopes so that reconstructed
    values are bounded by cell-center values.
  • Limited reconstruction results in a total
    variation diminishing (TVD) scheme.
  • First order near discontinuities and second order
    in smooth regions.

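As an illustration of the limiting step, a minimal Python sketch of
piecewise-linear reconstruction with the classic minmod limiter (one
possible limiter; the slides do not specify which one BATSRUS uses):

  import numpy as np

  def minmod(a, b):
      # Zero where the one-sided slopes disagree in sign,
      # otherwise the slope of smaller magnitude.
      return np.where(a * b > 0.0,
                      np.where(np.abs(a) < np.abs(b), a, b), 0.0)

  def reconstruct(u):
      # u: cell-center values. The limited slope per interior cell keeps
      # the face values bounded by the neighboring cell-center values.
      slope = minmod(u[1:-1] - u[:-2], u[2:] - u[1:-1])
      u_left_face = u[1:-1] - 0.5 * slope   # value at the cell's left face
      u_right_face = u[1:-1] + 0.5 * slope  # value at the cell's right face
      return u_left_face, u_right_face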
12
Splitting the Magnetic Field
  • The magnetic field has huge gradients near the
    Sun and Earth
  • Large truncation errors.
  • Pressure calculated from total energy can become
    negative.
  • Difficult to maintain boundary conditions.
  • Solution: split the magnetic field as B = B0 + B1,
    where B0 is a divergence- and curl-free
    analytic function.
  • Gradients in B1 are small.
  • Total energy contains B1 only.
  • Boundary condition for B1 is simple.

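A small Python sketch of the splitting idea, assuming a point dipole
for the analytic B0 (names and layout are illustrative, not BATSRUS
code):

  import numpy as np

  def dipole_b0(x, y, z, m=(0.0, 0.0, -1.0)):
      # Point dipole B0 = (3 r (m.r) - m r^2) / r^5:
      # analytic, divergence-free and curl-free for r > 0.
      r2 = x*x + y*y + z*z
      mdotr = m[0]*x + m[1]*y + m[2]*z
      return tuple((3.0*c*mdotr - mc*r2) / r2**2.5
                   for c, mc in zip((x, y, z), m))

  # Only the smooth deviation B1 is stored and updated numerically;
  # the full field at a cell is recovered as
  #   B = dipole_b0(x, y, z) + B1[i, j, k]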
13
Vastly Disparate Scales
  • Spatial
  • Resolution needed at Earth: 1/4 RE
  • Resolution needed at the Sun: 1/32 RS
  • Sun-Earth distance: 1 AU
  • 1 AU ≈ 215 RS ≈ 23,456 RE
  • Temporal
  • CME needs 3 days to arrive at Earth.
  • Time step is limited to a fraction of a second
    in some regions.

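These numbers are mutually consistent: with R_S ≈ 6.96×10^5 km and
R_E ≈ 6378 km,

\[
1\,\mathrm{AU} = 1.496\times 10^{8}\,\mathrm{km}
\approx 215\, R_S \approx 23{,}456\, R_E .
\]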
14
Adaptive Block Structure
Blocks communicate with neighbors through ghost
cells
Each block is N×N×N cells
15
The Octree Data Structure
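A toy Python sketch of an octree node as it might represent the block
hierarchy (all names hypothetical; the real tree also stores neighbor
and processor information):

  from itertools import product

  class OctreeNode:
      # A node is either a leaf owning an N x N x N block of cells,
      # or an interior node with exactly 8 children.
      def __init__(self, level, center, size):
          self.level = level      # refinement level
          self.center = center    # (x, y, z) of the block center
          self.size = size        # edge length of the block
          self.children = None    # 8 OctreeNode objects when refined
          self.data = None        # cell data for a leaf block

      def refine(self):
          # Split into 2 x 2 x 2 half-size children; each child center
          # is offset by a quarter of the parent size in each direction.
          self.children = [
              OctreeNode(self.level + 1,
                         tuple(c + 0.25 * self.size * s
                               for c, s in zip(self.center, signs)),
                         self.size / 2)
              for signs in product((-1, 1), repeat=3)]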
16
Parallel Distribution of the Blocks
17
Optimized Load Balancing
18
Parallel Performance
19
Why Explicit Time-Stepping May Not Be Good Enough
  • Explicit schemes have the time step limited by the
    CFL condition: Δt < Δx / (fastest wave speed).
  • High Alfvén speeds and/or small cells may lead to
    smaller time steps than required for accuracy.
  • The problem is particularly acute near planets
    with strong magnetic fields.
  • Implicit schemes do not have Δt limited by CFL.

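In code form the CFL limit amounts to the following (a sketch; the
0.8 safety factor is an assumed typical value):

  import numpy as np

  def cfl_time_step(dx, u, c_fast, cfl=0.8):
      # dt <= cfl * dx / (|u| + c_fast), minimized over all cells;
      # dx, u and c_fast are arrays of cell sizes and wave speeds.
      return cfl * np.min(dx / (np.abs(u) + c_fast))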
20
Building a Parallel Implicit Solver
  • BDF2 second-order implicit time-stepping scheme
    requires solution of a large nonlinear system of
    equations at each time step.
  • Newton linearization allows the nonlinear system
    to be solved by an iterative process in which
    large linear systems are solved.
  • Krylov solvers (GMRES, BiCGSTAB) with
    preconditioning are robust and efficient for
    solving large linear systems.
  • Schwarz preconditioning allows the process to be
    done in parallel
  • Each adaptive block preconditions using local
    data only
  • MBILU preconditioner

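A minimal matrix-free Newton-Krylov sketch in Python using SciPy's
GMRES (illustrative only: no BDF2 terms and no Schwarz/MBILU
preconditioning, just the core idea that the Jacobian is never
formed):

  import numpy as np
  from scipy.sparse.linalg import LinearOperator, gmres

  def newton_krylov(residual, u, n_newton=10, eps=1e-7, tol=1e-10):
      # Outer Newton loop for F(u) = 0.
      for _ in range(n_newton):
          r = residual(u)
          if np.linalg.norm(r) < tol:
              break
          # Matrix-free Jacobian action: J v ~ (F(u + eps v) - F(u)) / eps
          jv = lambda v, u=u, r=r: (residual(u + eps * v) - r) / eps
          jac = LinearOperator((u.size, u.size), matvec=jv)
          du, info = gmres(jac, -r)  # inner Krylov solve of the Newton system
          u = u + du
      return u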
21
Timing Results
  • Halem: 192-CPU Compaq ES-45
  • Chapman: 256-CPU SGI 3800
  • Lomax: 256-CPU Compaq ES-45
  • Grendel: 118-CPU PC Cluster (1.6 GHz AMD)

22
Getting the Best of Both Worlds - Partial Implicit
  • Fully implicit scheme has no CFL limit, but each
    iteration is expensive (memory and CPU)
  • Fully explicit is inexpensive for one iteration,
    but the CFL limit may mean a very small Δt
  • Set the optimal Δt limited by the accuracy requirement
  • Solve blocks with unrestrictive CFL explicitly
  • Solve blocks with restrictive CFL implicitly
  • Load balance explicit and implicit blocks
    separately

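A sketch of the block split in Python (dt_cfl is a hypothetical
per-block attribute holding the block's largest stable explicit step):

  def split_blocks(blocks, dt):
      # dt is chosen from accuracy requirements alone. Blocks that can
      # take dt explicitly are advanced explicitly; the rest implicitly.
      explicit, implicit = [], []
      for block in blocks:
          (explicit if block.dt_cfl >= dt else implicit).append(block)
      return explicit, implicit

The two lists would then be load balanced separately, as the last
bullet notes.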
23
Comparison of Explicit, Implicit and Partial
Implicit
24
Timing Results for Space Weather on Compaq
25
Controlling the Divergence of B
  • Projection Scheme (Brackbill and Barnes)
  • Solve a Poisson equation to remove div B after
    each time step.
  • Expensive on a block adaptive parallel grid.
  • 8-Wave Scheme (Powell and Roe)
  • Modify MHD equations for non-zero divergence so
    it is advected.
  • Simple and robust but div B is not small.
    Non-conservative terms.
  • Diffusive Control (Dedner et al.)
  • Add terms that diffuse the divergence of the
    field.
  • Simple but it may diffuse the solution too.
  • Conservative Constrained Transport (Balsara, Dai,
    Ryu, Tóth)
  • Use staggered grid for the magnetic field to
    conserve div B
  • Exact but complicated. Does not allow local time
    stepping.

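To make the projection idea concrete, a small Python sketch on a
square periodic 2D grid, where the Poisson equation is solvable with
FFTs (real codes must solve it on the block-adaptive parallel grid,
which is what makes the scheme expensive there):

  import numpy as np

  def project_divergence_free(bx, by, dx):
      # Solve laplacian(phi) = div B spectrally, then subtract grad phi;
      # the corrected field is divergence-free on the periodic grid.
      n = bx.shape[0]
      k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
      kx, ky = np.meshgrid(k, k, indexing='ij')
      div_hat = 1j * kx * np.fft.fft2(bx) + 1j * ky * np.fft.fft2(by)
      k2 = kx**2 + ky**2
      k2[0, 0] = 1.0                    # avoid 0/0 for the mean mode
      phi_hat = -div_hat / k2           # -k^2 phi_hat = div_hat
      bx = bx - np.real(np.fft.ifft2(1j * kx * phi_hat))
      by = by - np.real(np.fft.ifft2(1j * ky * phi_hat))
      return bx, by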
26
Effect of Div B Control Scheme
27
From Codes To Framework
  • The Sun-Earth system consists of many different
    interconnecting domains that are independently
    modeled.
  • Each physics domain model is a separate
    application, which has its own optimal
    mathematical and numerical representation.
  • Our goal is to integrate models into a flexible
    software framework.
  • The framework incorporates physics models with
    minimal changes.
  • The framework can be extended with new
    components.
  • The performance of a well-designed framework can
    match or surpass monolithic codes or ad hoc
    couplings of models.

28
Why Frameworks?
  • Difficult to create complex science codes,
    integrating
  • multiple codes
  • interfaces
  • runtime behavior
  • Difficult to modify or extend large systems
  • adding new physics modules
  • updating codes
  • Difficult to utilize complete systems
  • what I/O and parameters are needed
  • how to submit runs to multiple sites
  • Framework: a reusable system design.
  • Component: a packaging of executable software
    with a well-defined interface.
  • Coupling components does not mean the science is
    correct.

29
Physics Domains, IDs, and Models
  • Solar Corona (SC): BATSRUS
  • Eruptive Event Generator (EE): BATSRUS
  • Inner Heliosphere (IH): BATSRUS
  • Solar Energetic Particles (SP): Kóta's SEP model
  • Global Magnetosphere (GM): BATSRUS
  • Inner Magnetosphere (IM): Rice Convection Model
  • Ionosphere Electrodynamics (IE): Ridley's potential
    solver
  • Upper Atmosphere (UA): General Ionosphere
    Thermosphere Model (GITM)

30
Space Weather Modeling Framework
31
The SWMF Architecture
32
Parallel Layout and Execution
LAYOUT.in for 20 PE-s (SC/IH, GM, IM/IE):

  COMPONENTMAP
  ID  ROOT  LAST  STRIDE
  SC     0     9       1
  IH     0     9       1
  GM    10    17       1
  IE    18    19       1
  IM    19    19       1
  END

33
Parallel Field Line Tracing
  • Stream line and field line tracing is a common
    problem in space physics. Two examples:
  • Coupling inner and global magnetosphere models
  • Coupling solar energetic particle model with MHD
  • Tracing a line is an inherently serial procedure
  • Tracing many lines can be parallelized, but
  • The vector field may be distributed over many PE-s
  • Collecting the vector field onto one PE may be too
    slow and requires a lot of memory

34
Coupling Inner and Global Magnetosphere Models
(Figure: pressure.)
The inner magnetosphere model needs the field line
volumes and the average pressure and density along
field lines connected to the 2D grid on the
ionosphere. The global magnetosphere model needs the
pressure correction along the closed field lines.
35
Interpolated Tracing Algorithm
1. Trace lines inside blocks starting from faces.
2. Interpolate and communicate the mapping.
3. Repeat step 2 until the mapping is obtained for
   all faces.
4. Trace lines inside blocks starting from cell
   centers.
5. Interpolate the mapping to cell centers.
36
Parallel Algorithm without Interpolation
(Figure: field lines distributed across PE 1 - PE 4.)
1. Find the next local field line.
2. If there is a local field line:
   2a. Integrate it in the local domain.
   2b. If not done, send it to another PE.
3. Go to 1 unless it is time to receive.
4. Receive lines from other PE-s.
5. If a line was received, go to 2a.
6. Go to 1 unless all lines are finished.
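A runnable toy version of this loop in Python, with round-robin
iteration standing in for concurrent PEs and user-supplied
integrate/owner functions (hypothetical stand-ins for the real
distributed implementation):

  from collections import deque

  def trace_all(lines, owner_of, integrate, n_pe):
      # inbox[p] holds the lines waiting to be traced by PE p.
      inbox = [deque() for _ in range(n_pe)]
      for line in lines:
          inbox[owner_of(line)].append(line)
      done = []
      while any(inbox):
          for pe in range(n_pe):
              while inbox[pe]:
                  line = inbox[pe].popleft()
                  if integrate(line, pe):          # 2a: trace in local domain
                      done.append(line)            # line closed or left the grid
                  else:
                      inbox[owner_of(line)].append(line)  # 2b: hand off
      return done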
37
Interpolated versus No Interpolation
38
Modeling a Coronal Mass Ejection
  • Set B0 to a magnetogram-based potential field.
  • Obtain MHD steady state solution.
  • Use source terms to model solar wind acceleration
    and heating so that steady solution matches
    observed solar wind parameters.
  • Perturb this initial state with a flux rope.
  • Follow CME propagation.
  • Let the CME hit the magnetosphere of the Earth.

39
Initial Steady State in the Corona
  • Solar surface is colored with the radial magnetic
    field.
  • Field lines are colored with the velocity.
  • Flux rope is shown with white field lines.

40
Close-up of the Added Flux Rope
41
Two Hours After Eruption in the Solar Corona
42
65 Hours After Eruption in the Inner Heliosphere
43
Sun to Earth CME Simulation
  • In the Solar Corona and Heliosphere the resolution
    ranges from 1/32 RS to 4 RS
  • The SC/IH grid contains between 4 and 14 million
    cells
  • In the Global Magnetosphere the resolution ranges
    from 1/8 RE to 8 RE

44
The Zoom Movie
45
What Happens at Earth - More Detail
46
More Detail at Earth
(Figure panels: pressure and magnetic field before and
after the shock; density and magnetic field at shock
arrival time; south-turning and north-turning BZ.)
47
Ionosphere Electrodynamics
  • Before the shock hits.
  • After the shock, currents and the resulting
    electric potential increase.
  • Region-2 currents develop.
  • Although region-1 currents are strong, the
    potential decreases due to the shielding effect.

48
Upper Atmosphere
Before shock arrival
  • The Hall conductance is calculated by the Upper
    Atmosphere component and it is used by the
    Ionosphere Electrodynamics.
  • After the shock hits the conductance increases in
    the polar regions due to the electron
    precipitation.
  • Note that the conductance caused by solar
    illumination at low latitudes does not change
    significantly.

After shock arrival
49
Performance of the SWMF
51
2003 Halloween Storm Simulation with GM, IM and
IE Components
  • The magnetosphere during the solar storm
    associated with an X17 solar eruption.
  • Using satellite data for solar wind parameters
  • Solar wind speed: 1800 km/s
  • Time: October 29, 07:30 UT
  • Shown are the last closed field lines shaded
    with the thermal pressure.
  • The cut planes are shaded with the values of the
    electric current density.

52
GM, IM, IE Run vs. Observations
53
Concluding Remarks
  • The Space Weather Modeling Framework (SWMF)
    uses state-of-the-art methods to
    achieve flexible and efficient coupling and
    execution of the physics models.
  • Missing pieces for space weather prediction:
  • Better models for solar wind heating and
    acceleration
  • Better understanding of CME initiation
  • More observational data to constrain the model
  • Even faster computers and improved algorithms.