1
CS 290H: Preconditioning Iterative Methods
  • John R. Gilbert (gilbert@cs.ucsb.edu)
  • http://www.cs.ucsb.edu/~gilbert/cs290

2
The Landscape of Sparse Ax=b Solvers
[Figure: chart of the landscape of sparse Ax=b solvers, from direct to iterative methods]
3
Conjugate gradient iteration
x_0 = 0,  r_0 = b,  d_0 = r_0
for k = 1, 2, 3, . . .
    α_k = (r_{k-1}^T r_{k-1}) / (d_{k-1}^T A d_{k-1})    (step length)
    x_k = x_{k-1} + α_k d_{k-1}                           (approximate solution)
    r_k = r_{k-1} - α_k A d_{k-1}                         (residual)
    β_k = (r_k^T r_k) / (r_{k-1}^T r_{k-1})               (improvement)
    d_k = r_k + β_k d_{k-1}                               (search direction)
  • One matrix-vector multiplication per iteration
  • Two vector dot products per iteration
  • Four n-vectors of working storage
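A minimal NumPy sketch of this iteration (not from the slides; the function name conjugate_gradient, the tolerance tol, and the 1-D Poisson test matrix are illustrative assumptions):

import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    """Plain CG for symmetric positive definite A, following the slide's recurrences."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)              # x_0 = 0
    r = b.copy()                 # r_0 = b
    d = r.copy()                 # d_0 = r_0
    rr = r @ r
    for k in range(1, max_iter + 1):
        Ad = A @ d               # the one matrix-vector product per iteration
        alpha = rr / (d @ Ad)    # step length
        x = x + alpha * d        # approximate solution
        r = r - alpha * Ad       # residual
        rr_new = r @ r           # second dot product of the iteration
        if np.sqrt(rr_new) < tol:
            break
        beta = rr_new / rr       # improvement this step
        d = r + beta * d         # new search direction
        rr = rr_new
    return x

# Example: a small SPD system (1-D Poisson matrix as an illustrative test case).
n = 100
A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))

Note how the loop body reads off the slide directly: one product A d per iteration, two dot products, and only the four working vectors x, r, d, Ad.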

4
Conjugate gradient: Convergence
  • In exact arithmetic, CG converges in n steps (completely unrealistic!!)
  • Accuracy after k steps of CG is related to a polynomial approximation problem:
  • consider polynomials of degree k that are equal to 1 at 0;
  • how small can such a polynomial be at all the eigenvalues of A?
  • Thus, eigenvalues close together are good.
  • Condition number κ(A) = ||A||_2 ||A^-1||_2 = λmax(A) / λmin(A)
  • Residual is reduced by a constant factor by O(κ^1/2(A)) iterations of CG.
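The √κ in the last bullet comes from the classical CG error bound (a standard result, not shown on the slide): measuring the error in the A-norm,

    ||x_k - x*||_A  ≤  2 · ( (√κ(A) - 1) / (√κ(A) + 1) )^k · ||x_0 - x*||_A

so each step shrinks the error by roughly a factor 1 - 2/√κ(A), and achieving any fixed reduction factor costs O(κ^1/2(A)) iterations.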

5
Preconditioners
  • Suppose you had a matrix B such that
  • (1) the condition number κ(B^-1 A) is small
  • (2) By = z is easy to solve
  • Then you could solve (B^-1 A)x = B^-1 b instead of Ax = b
  • B = A is great for (1), not for (2)
  • B = I is great for (2), not for (1)
  • Domain-specific approximations sometimes work
  • B = diagonal of A sometimes works (see the sketch after this list)
  • Or, bring back Gaussian elimination. . .
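A small NumPy sketch (not from the slides) of the "B = diagonal of A" case. The badly scaled test matrix and the use of D^-1/2 A D^-1/2, which has the same eigenvalues as B^-1 A but stays symmetric, are illustrative assumptions:

import numpy as np

# Badly scaled SPD test matrix: a 1-D Laplacian conjugated by a diagonal
# scaling that varies over three orders of magnitude (illustrative choice).
n = 200
T = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
s = np.logspace(0, 3, n)
A = T * np.outer(s, s)            # A = S T S with S = diag(s); still SPD

# Jacobi preconditioner B = diag(A).  B^-1 A is similar to D^-1/2 A D^-1/2,
# so comparing 2-norm condition numbers of the symmetric forms is enough.
d = np.diag(A)
A_prec = A / np.sqrt(np.outer(d, d))

print("kappa(A)      = %.2e" % np.linalg.cond(A))
print("kappa(B^-1 A) = %.2e" % np.linalg.cond(A_prec))

For this matrix the diagonal scaling removes essentially all of the artificial ill-conditioning, leaving only the conditioning of the underlying Laplacian.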

6
Preconditioned conjugate gradient iteration
x_0 = 0,  r_0 = b,  d_0 = B^-1 r_0,  y_0 = B^-1 r_0
for k = 1, 2, 3, . . .
    α_k = (y_{k-1}^T r_{k-1}) / (d_{k-1}^T A d_{k-1})    (step length)
    x_k = x_{k-1} + α_k d_{k-1}                           (approximate solution)
    r_k = r_{k-1} - α_k A d_{k-1}                         (residual)
    y_k = B^-1 r_k                                        (preconditioning solve)
    β_k = (y_k^T r_k) / (y_{k-1}^T r_{k-1})               (improvement)
    d_k = y_k + β_k d_{k-1}                               (search direction)
  • One matrix-vector multiplication per iteration
  • One solve with preconditioner per iteration
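A NumPy sketch of this preconditioned iteration (not from the slides; the names preconditioned_cg and solve_B, and the scaled test problem, are illustrative assumptions). The preconditioner is passed as a callable that applies B^-1:

import numpy as np

def preconditioned_cg(A, b, solve_B, tol=1e-8, max_iter=None):
    """PCG following the recurrences above; solve_B(z) returns y with B y = z."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)              # x_0 = 0
    r = b.copy()                 # r_0 = b
    y = solve_B(r)               # y_0 = B^-1 r_0
    d = y.copy()                 # d_0 = B^-1 r_0
    yr = y @ r
    for k in range(1, max_iter + 1):
        Ad = A @ d               # one matrix-vector product per iteration
        alpha = yr / (d @ Ad)    # step length
        x = x + alpha * d        # approximate solution
        r = r - alpha * Ad       # residual
        if np.linalg.norm(r) < tol:
            break
        y = solve_B(r)           # one preconditioner solve per iteration
        yr_new = y @ r
        beta = yr_new / yr       # improvement
        d = y + beta * d         # search direction
        yr = yr_new
    return x

# Jacobi preconditioner (B = diag(A)) on a badly scaled SPD test matrix:
# applying B^-1 is just elementwise division by the diagonal.
n = 200
T = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
s = np.logspace(0, 3, n)
A = T * np.outer(s, s)
b = np.ones(n)
diag_A = np.diag(A)
x = preconditioned_cg(A, b, lambda z: z / diag_A)
print(np.linalg.norm(A @ x - b))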

7
Incomplete Cholesky factorization (IC, ILU)
  • Compute factors of A by Gaussian elimination,
    but ignore fill
  • Preconditioner B = R^T R ≈ A, not formed explicitly
  • Compute B^-1 z by triangular solves (in time nnz(A))
  • Total storage is O(nnz(A)), static data structure
  • Either symmetric (IC) or nonsymmetric (ILU)
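SciPy does not ship an incomplete Cholesky, so the sketch below (an assumption, not the course's code) stands in scipy.sparse.linalg.spilu, a drop-tolerance incomplete LU, as the preconditioner B for the 2-D Poisson model problem; a true IC factor would be the symmetric choice for CG, but this usually behaves well in practice. The grid size k and the drop_tol/fill_factor values are illustrative:

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spilu, LinearOperator, cg

# 2-D Poisson model problem on a k-by-k grid (sparse, SPD).
k = 50
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(k, k))
A = sp.kronsum(T, T, format='csc')
b = np.ones(A.shape[0])

# Incomplete factorization as a preconditioner: spilu drops small fill entries,
# so storage stays close to nnz(A); ilu.solve applies B^-1 via two triangular solves.
ilu = spilu(A, drop_tol=1e-3, fill_factor=2)
M = LinearOperator(A.shape, ilu.solve)

x, info = cg(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))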

8
Complexity of linear solvers
[Table: time to solve the model problem (Poisson's equation) on a regular mesh]
9
CS 290H Lecture 1: Class outline
  • Approximately 3 homeworks
  • Final project: implementation experiment, survey paper, or real application
  • Assigned readings from online resources
  • Read the Shewchuk paper: sections 1-5, 7, and 8; skim sections 6 and 9.