Title: Gábor Tóth
1. The Grand Challenge of Space Weather Prediction
- Gábor Tóth
- Center for Space Environment Modeling
- University of Michigan
2. Collaborators
- Tamas Gombosi, Kenneth Powell
- Ward Manchester, Ilia Roussev
- Darren De Zeeuw, Igor Sokolov
- Aaron Ridley, Kenneth Hansen
- Richard Wolf, Stanislav Sazykin (Rice University)
- József Kóta (Univ. of Arizona)
Grants: DoD MURI and NASA CT Projects
3. Outline of Talk
- What Is Space Weather and Why Predict It?
- Parallel MHD Code BATSRUS
- Space Weather Modeling Framework (SWMF)
- Some Results
- Concluding Remarks
4. What Space Weather Means
Conditions on the Sun and in the solar wind, magnetosphere, ionosphere, and thermosphere that can influence the performance and reliability of space-borne and ground-based technological systems and can endanger human life or health.
Space physics that affects us.
5. Solar Activity...
6. ...Affects Earth: The Aurorae
7. Other Effects of Space Weather
8. MHD Code BATSRUS
- Block Adaptive Tree Solar-wind Roe Upwind Scheme
- Conservative finite-volume discretization
- Shock-capturing Total Variation Diminishing schemes
- Parallel block-adaptive grid (Cartesian and generalized)
- Explicit and implicit time stepping
- Classical and semi-relativistic MHD equations
- Multi-species chemistry
- Splitting the magnetic field into B0 + B1
- Various methods to control the divergence of B
9. MHD Equations in Conservative vs. Non-Conservative Form
- The conservative form is required for correct jump conditions across shock waves.
- Energy conservation provides the proper amount of Joule heating for reconnection even in ideal MHD.
- The non-conservative pressure equation is preferred for maintaining positivity.
- The hybrid scheme uses the pressure equation where possible.
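For orientation, the two forms being contrasted can be written as follows for ideal MHD in standard textbook form (units with the magnetic permeability absorbed into B); this is shown as a reminder, not as the exact discretization used in BATSRUS. The total energy density obeys a conservation law, while the pressure equation is non-conservative:

$$
\frac{\partial \mathcal{E}}{\partial t}
+ \nabla\cdot\Big[\big(\mathcal{E}+p+\tfrac{1}{2}B^2\big)\,\mathbf{u}
- \mathbf{B}\,(\mathbf{u}\cdot\mathbf{B})\Big] = 0,
\qquad
\mathcal{E} = \frac{p}{\gamma-1} + \frac{\rho u^2}{2} + \frac{B^2}{2},
$$
$$
\frac{\partial p}{\partial t} + \mathbf{u}\cdot\nabla p + \gamma\,p\,\nabla\cdot\mathbf{u} = 0 .
$$

Discretizing the first form conserves total energy across shocks and supplies the Joule heating in reconnection regions; the second form avoids computing the pressure as a small difference of large energy terms, which is why it is more robust for positivity.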
10. Conservative Finite-Volume Method
11. Limited Reconstruction: TVD
- Finite-volume data are stored at cell centers, and fluxes are computed at the interfaces between cells. An interpolation scheme is needed to give accurate values on the two sides of each interface.
- The reconstruction process can introduce new extrema. The slopes must be limited so that the reconstructed values are bounded by the cell-center values.
- Limited reconstruction results in a total variation diminishing (TVD) scheme.
- First order near discontinuities and second order in smooth regions.
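A minimal one-dimensional sketch of slope-limited reconstruction, using the minmod limiter as one representative choice; the variable and function names are illustrative, and the limiters actually used in BATSRUS may differ.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: zero if the two slope estimates disagree in sign,
    otherwise the one with the smaller magnitude."""
    return np.where(a * b <= 0.0, 0.0,
                    np.where(np.abs(a) < np.abs(b), a, b))

def limited_face_values(u):
    """For each interior cell i, return the reconstructed values at its
    left face (i-1/2) and right face (i+1/2) using a limited slope."""
    slope = minmod(u[1:-1] - u[:-2], u[2:] - u[1:-1])
    u_left_face  = u[1:-1] - 0.5 * slope
    u_right_face = u[1:-1] + 0.5 * slope
    return u_left_face, u_right_face

# Example: near the step the limited slope is zero, so no new extrema appear
u = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
print(limited_face_values(u))
```

Because the limited slope vanishes at local extrema, the reconstructed interface values stay bounded by the neighboring cell-center values, which is what makes the resulting scheme TVD.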
12. Splitting the Magnetic Field
- The magnetic field has huge gradients near the Sun and Earth:
- Large truncation errors.
- Pressure calculated from the total energy can become negative.
- Difficult to maintain boundary conditions.
- Solution: split the magnetic field as B = B0 + B1, where B0 is a divergence- and curl-free analytic function.
- Gradients in B1 are small.
- The total energy contains B1 only.
- The boundary condition for B1 is simple.
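Written out, the splitting is

$$
\mathbf{B} = \mathbf{B}_0 + \mathbf{B}_1,
\qquad \nabla\cdot\mathbf{B}_0 = 0,
\qquad \nabla\times\mathbf{B}_0 = 0,
$$

so that, for a time-independent B0 (the simplest case, shown here for illustration), only the perturbation is evolved by the induction equation:

$$
\frac{\partial \mathbf{B}_1}{\partial t} = \nabla\times\big[\mathbf{u}\times(\mathbf{B}_0+\mathbf{B}_1)\big].
$$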
13. Vastly Disparate Scales
- Spatial:
- Resolution needed at Earth: 1/4 RE
- Resolution needed at Sun: 1/32 RS
- Sun-Earth distance: 1 AU
- 1 AU ≈ 215 RS ≈ 23,456 RE
- Temporal:
- A CME needs about 3 days to arrive at Earth.
- The time step is limited to a fraction of a second in some regions.
14. Adaptive Block Structure
Blocks communicate with neighbors through ghost cells.
Each block is N×N×N cells.
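A minimal sketch of how one such block might be represented; the block size, ghost-cell depth, and the eight state variables are illustrative placeholders, not the actual BATSRUS data structures.

```python
import numpy as np
from dataclasses import dataclass

N_CELL = 8    # cells per block in each direction (illustrative)
N_GHOST = 2   # ghost-cell layers exchanged with neighboring blocks

@dataclass
class Block:
    """One adaptive block: N x N x N physical cells plus ghost layers."""
    level: int      # refinement level in the octree
    corner: tuple   # physical coordinates of the block's corner
    dx: float       # cell size (halves with each refinement level)

    def __post_init__(self):
        n = N_CELL + 2 * N_GHOST
        # 8 flow variables per cell (e.g. density, momentum, B1, energy) -- illustrative
        self.state = np.zeros((8, n, n, n))

    def interior(self):
        """View of the physical cells, excluding the ghost layers."""
        s = slice(N_GHOST, N_GHOST + N_CELL)
        return self.state[:, s, s, s]

# Before each flux computation the ghost layers are filled from neighboring
# blocks (or prolonged/restricted across a refinement jump).
root = Block(level=0, corner=(0.0, 0.0, 0.0), dx=1.0)
print(root.interior().shape)   # (8, 8, 8, 8)
```

Keeping every block the same logical size is what makes load balancing simple: blocks can be handed to any processor without regard to their refinement level.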
15. The Octree Data Structure
16. Parallel Distribution of the Blocks
17. Optimized Load Balancing
18. Parallel Performance
19. Why Explicit Time-Stepping May Not Be Good Enough
- Explicit schemes have the time step limited by the CFL condition: Δt < Δx / (fastest wave speed).
- High Alfvén speeds and/or small cells may lead to smaller time steps than required for accuracy.
- The problem is particularly acute near planets with strong magnetic fields.
- Implicit schemes do not have Δt limited by the CFL condition.
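In ideal MHD the limiting signal is the fast magnetosonic speed, so the explicit stability limit takes the familiar form (a standard estimate, written in units with the magnetic permeability absorbed into B):

$$
\Delta t \;\le\; C\,\min_{\text{cells}}\frac{\Delta x}{|u| + c_{\rm fast}},
\qquad
c_{\rm fast}^2 \;\le\; a^2 + v_A^2 \;=\; \frac{\gamma p}{\rho} + \frac{B^2}{\rho},
\qquad C \lesssim 1 .
$$

Near a strongly magnetized planet B is large and the density is low, so the Alfvén speed (and hence the fast speed) is enormous while Δx is also small, which drives Δt far below what accuracy alone would require.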
20. Building a Parallel Implicit Solver
- The BDF2 second-order implicit time-stepping scheme requires the solution of a large nonlinear system of equations at each time step.
- Newton linearization allows the nonlinear system to be solved by an iterative process in which large linear systems are solved.
- Krylov solvers (GMRES, BiCGSTAB) with preconditioning are robust and efficient for solving large linear systems.
- Schwarz preconditioning allows the process to be done in parallel:
- Each adaptive block preconditions using local data only.
- MBILU preconditioner.
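A very reduced sketch of the matrix-free Newton-Krylov idea: Jacobian-vector products are approximated by finite differences, and each linear system is handed to GMRES. The toy 1D nonlinear problem, the function names, and the use of SciPy's gmres are illustrative stand-ins; the production solver works on the full MHD state and applies the Schwarz/MBILU preconditioner block by block, which is not shown here.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def residual(u):
    """Nonlinear residual F(u) = 0 to be solved at an implicit step.
    A toy 1D nonlinear diffusion-reaction problem stands in for MHD."""
    r = np.empty_like(u)
    r[0], r[-1] = u[0], u[-1]                       # fixed boundary values
    r[1:-1] = u[2:] - 2*u[1:-1] + u[:-2] - np.exp(u[1:-1])
    return r

def newton_krylov(u, n_newton=10, tol=1e-8):
    for _ in range(n_newton):
        F = residual(u)
        if np.linalg.norm(F) < tol:
            break
        eps = 1e-7
        # Matrix-free Jacobian-vector product: J v ~ (F(u + eps v) - F(u)) / eps
        def jv(v):
            return (residual(u + eps * v) - F) / eps
        J = LinearOperator((u.size, u.size), matvec=jv)
        du, info = gmres(J, -F)                     # Krylov solve of J du = -F
        u = u + du
    return u

u0 = np.zeros(50)
print(np.linalg.norm(residual(newton_krylov(u0))))  # small residual after Newton
```

The point of the matrix-free formulation is that the Jacobian never has to be stored; only residual evaluations and a per-block preconditioner are needed, which fits the adaptive block decomposition naturally.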
21. Timing Results
- Halem: 192-CPU Compaq ES-45
- Chapman: 256-CPU SGI 3800
- Lomax: 256-CPU Compaq ES-45
- Grendel: 118-CPU PC cluster (1.6 GHz AMD)
22. Getting the Best of Both Worlds: Partial Implicit
- The fully implicit scheme has no CFL limit, but each iteration is expensive (memory and CPU).
- The fully explicit scheme is inexpensive for one iteration, but the CFL limit may mean a very small Δt.
- Set the optimal Δt limited by the accuracy requirement.
- Solve blocks with an unrestrictive CFL limit explicitly.
- Solve blocks with a restrictive CFL limit implicitly.
- Load balance the explicit and implicit blocks separately (a schematic of the block selection follows below).
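A schematic of the block selection; the attribute names, the CFL number, and the accuracy-based time step are illustrative, and the real criterion and load balancing live inside BATSRUS.

```python
from collections import namedtuple

Blk = namedtuple("Blk", "dx max_speed")   # illustrative stand-in for a block

def partition_blocks(blocks, dt_accuracy, cfl=0.8):
    """Split blocks into explicit and implicit sets for one time step."""
    explicit, implicit = [], []
    for blk in blocks:
        dt_stable = cfl * blk.dx / blk.max_speed   # local explicit CFL limit
        if dt_stable >= dt_accuracy:
            explicit.append(blk)   # explicit update is stable at the chosen dt
        else:
            implicit.append(blk)   # needs the (more expensive) implicit update
    # The two sets are then load balanced separately, since an implicit block
    # costs far more per step than an explicit one.
    return explicit, implicit

# A slow-wave block stays explicit; a high-Alfven-speed block goes implicit.
exp_blocks, imp_blocks = partition_blocks(
    [Blk(dx=0.25, max_speed=10.0), Blk(dx=0.25, max_speed=4000.0)],
    dt_accuracy=0.01)
print(len(exp_blocks), len(imp_blocks))   # 1 1
```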
23. Comparison of Explicit, Implicit and Partial Implicit
24. Timing Results for Space Weather on Compaq
25. Controlling the Divergence of B
- Projection Scheme (Brackbill and Barnes)
- Solve a Poisson equation to remove div B after each time step (see the equations after this list).
- Expensive on a block-adaptive parallel grid.
- 8-Wave Scheme (Powell and Roe)
- Modify the MHD equations for non-zero divergence so that it is advected.
- Simple and robust, but div B is not small. Non-conservative terms.
- Diffusive Control (Dedner et al.)
- Add terms that diffuse the divergence of the field.
- Simple, but it may diffuse the solution too.
- Conservative Constrained Transport (Balsara, Dai, Ryu, Tóth)
- Use a staggered grid for the magnetic field to conserve div B.
- Exact but complicated. Does not allow local time stepping.
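For reference, the projection step removes the divergence left after an update B* by solving a Poisson equation (standard Brackbill-Barnes form):

$$
\nabla^2\phi = \nabla\cdot\mathbf{B}^{*},
\qquad
\mathbf{B} = \mathbf{B}^{*} - \nabla\phi,
$$

which restores ∇·B = 0 to the accuracy of the Poisson solve; this global elliptic solve is what makes the scheme expensive on a block-adaptive parallel grid.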
26. Effect of the Div B Control Scheme
27. From Codes to Framework
- The Sun-Earth system consists of many different interconnected domains that are modeled independently.
- Each physics-domain model is a separate application, which has its own optimal mathematical and numerical representation.
- Our goal is to integrate the models into a flexible software framework.
- The framework incorporates physics models with minimal changes.
- The framework can be extended with new components.
- The performance of a well-designed framework can surpass that of monolithic codes or ad hoc couplings of models.
28. Why Frameworks?
- Difficult to create complex science codes, integrating
- multiple codes
- interfaces
- runtime behavior
- Difficult to modify or extend large systems
- adding new physics modules
- updating codes
- Difficult to utilize complete systems
- what I/O and parameters are needed
- how to submit to multiple sites
- Framework: a reusable system design.
- Component: a packaging of executable software with a well-defined interface.
- Coupling components does not mean the science is correct.
29. Physics Domains, IDs, and Models
- Solar Corona (SC): BATSRUS
- Eruptive Event Generator (EE): BATSRUS
- Inner Heliosphere (IH): BATSRUS
- Solar Energetic Particles (SP): Kóta's SEP model
- Global Magnetosphere (GM): BATSRUS
- Inner Magnetosphere (IM): Rice Convection Model
- Ionosphere Electrodynamics (IE): Ridley's potential solver
- Upper Atmosphere (UA): General Ionosphere Thermosphere Model (GITM)
30. Space Weather Modeling Framework
31. The SWMF Architecture
32. Parallel Layout and Execution
LAYOUT.in for 20 PEs (SC/IH, GM, IM/IE); the columns are ID, ROOT, LAST, STRIDE:
COMPONENTMAP
SC 0 9 1
IH 0 9 1
GM 10 17 1
IE 18 19 1
IM 19 19 1
END
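For orientation, a tiny sketch of how such a component map can be read (entirely illustrative; the SWMF has its own reader): each component runs on processors ROOT, ROOT+STRIDE, ..., LAST, so in this layout SC and IH share PEs 0-9, GM uses 10-17, IE uses 18-19, and IM runs only on PE 19.

```python
def expand_component_map(lines):
    """Expand 'ID ROOT LAST STRIDE' rows into {component: list of PE ranks}."""
    layout = {}
    for line in lines:
        comp, root, last, stride = line.split()
        layout[comp] = list(range(int(root), int(last) + 1, int(stride)))
    return layout

rows = ["SC 0 9 1", "IH 0 9 1", "GM 10 17 1", "IE 18 19 1", "IM 19 19 1"]
print(expand_component_map(rows))
# SC and IH overlap on PEs 0-9; IM shares PE 19 with IE.
```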
33. Parallel Field Line Tracing
- Stream line and field line tracing is a common problem in space physics. Two examples:
- Coupling the inner and global magnetosphere models
- Coupling the solar energetic particle model with MHD
- Tracing a single line is an inherently serial procedure.
- Tracing many lines can be parallelized, but
- the vector field may be distributed over many PEs, and
- collecting the vector field onto one PE may be too slow and requires a lot of memory.
34. Coupling Inner and Global Magnetosphere Models
The inner magnetosphere model needs the field line volumes and the average pressure and density along the field lines connected to its 2D grid on the ionosphere. The global magnetosphere model needs the pressure correction along the closed field lines.
35. Interpolated Tracing Algorithm
1. Trace lines inside blocks, starting from faces.
2. Interpolate and communicate the mapping.
3. Repeat step 2 until the mapping is obtained for all faces.
4. Trace lines inside blocks, starting from cell centers.
5. Interpolate the mapping to cell centers.
36. Parallel Algorithm without Interpolation
1. Find the next local field line.
2. If there is a local field line, then:
2a. Integrate it in the local domain.
2b. If not done, send it to the other PE.
3. Go to 1 unless it is time to receive.
4. Receive lines from other PEs.
5. If a line was received, go to 2a.
6. Go to 1 unless all lines are finished.
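A much-simplified sketch of the same idea in serial Python, with each "PE" owning one slab of a 1D domain and lines handed from owner to owner via queues. The 1D setup, names, and straight-line "tracing" are illustrative only; the real implementation exchanges MPI messages and integrates through 3D MHD fields.

```python
from collections import deque

N_PE = 4                                  # pretend-processors, one unit-length slab each

def owner(x):
    """Which PE owns position x (None once x leaves the whole domain)."""
    pe = int(x)
    return pe if 0 <= pe < N_PE else None

def trace_in_local_domain(x, pe, step=0.1):
    """Advance a 'field line' (here simply motion in +x) until it exits PE's slab."""
    while owner(x) == pe:
        x += step                          # stand-in for one integration step
    return x

# Each PE keeps a queue of lines it still has to trace; "sending a line to
# another PE" becomes appending it to that PE's queue.
queues = [deque([pe + 0.5]) for pe in range(N_PE)]   # one seed point per PE
finished = []

while any(queues):                         # emulates steps 1-6 above
    for pe in range(N_PE):
        while queues[pe]:
            x = trace_in_local_domain(queues[pe].popleft(), pe)
            next_pe = owner(x)
            if next_pe is None:
                finished.append(x)         # line has left the whole domain
            else:
                queues[next_pe].append(x)  # hand the line to its new owner

print(sorted(finished))                    # every line traced to the outer boundary
```

The benefit over the interpolated algorithm is that no mapping is approximated; the cost is the extra communication each time a line crosses a processor boundary.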
37. Interpolated versus No Interpolation
38. Modeling a Coronal Mass Ejection
- Set B0 to a magnetogram-based potential field.
- Obtain an MHD steady-state solution.
- Use source terms to model solar wind acceleration and heating so that the steady solution matches observed solar wind parameters.
- Perturb this initial state with a flux rope.
- Follow the CME propagation.
- Let the CME hit the magnetosphere of the Earth.
39. Initial Steady State in the Corona
- The solar surface is colored with the radial magnetic field.
- Field lines are colored with the velocity.
- The flux rope is shown with white field lines.
40. Close-up of the Added Flux Rope
41. Two Hours After Eruption in the Solar Corona
42. 65 Hours After Eruption in the Inner Heliosphere
43. Sun-to-Earth CME Simulation
- In the Solar Corona and Inner Heliosphere the resolution ranges from 1/32 RS to 4 RS.
- The SC/IH grid contains between 4 and 14 million cells.
- In the Global Magnetosphere the resolution ranges from 1/8 RE to 8 RE.
44. The Zoom Movie
45. What Happens at Earth: More Detail
46. More Detail at Earth
Panels: pressure and magnetic field before and after the shock; density and magnetic field at the shock arrival time; south-turning and north-turning BZ.
47. Ionosphere Electrodynamics
- Before the shock hits.
- After the shock, the currents and the resulting electric potential increase.
- Region-2 currents develop.
- Although the region-1 currents are strong, the potential decreases due to the shielding effect.
48. Upper Atmosphere
Panels: before and after shock arrival.
- The Hall conductance is calculated by the Upper Atmosphere component and is used by the Ionosphere Electrodynamics component.
- After the shock hits, the conductance increases in the polar regions due to electron precipitation.
- Note that the conductance caused by solar illumination at low latitudes does not change significantly.
49. Performance of the SWMF
51. 2003 Halloween Storm Simulation with GM, IM and IE Components
- The magnetosphere during the solar storm associated with an X17 solar eruption.
- Using satellite data for the solar wind parameters.
- Solar wind speed: 1800 km/s.
- Time: October 29, 07:30 UT.
- Shown are the last closed field lines, shaded with the thermal pressure.
- The cut planes are shaded with the values of the electric current density.
52. GM, IM, IE Run vs. Observations
53. Concluding Remarks
- The Space Weather Modeling Framework (SWMF) uses state-of-the-art methods to achieve flexible and efficient coupling and execution of the physics models.
- Missing pieces for space weather prediction:
- Better models for solar wind heating and acceleration
- Better understanding of CME initiation
- More observational data to constrain the models
- Even faster computers and improved algorithms.