Title: Conjugate Gradient Method
1. Conjugate Gradient Method
2. Overview
- The most popular iterative method for solving large, sparse systems of linear equations Ax = b, where A is known, square, symmetric, and positive-definite
- If A is dense, the best choice is factoring A and using back substitution
3. Introduction
- Quadratic form: f(x) = (1/2) x^T A x - b^T x + c
- If A is symmetric and positive-definite, f(x) is minimized by the solution to Ax = b
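A runnable sketch of this claim, using a small hand-picked SPD system (the values of A, b, and c below are illustrative, not from the slides): the minimizer of the quadratic form coincides with the solution of Ax = b.

```python
# Check numerically that the minimizer of
#   f(x) = 1/2 x^T A x - b^T x + c
# is the solution of Ax = b, for a 2x2 SPD example.

A = [[3.0, 2.0],
     [2.0, 6.0]]          # symmetric, positive-definite
b = [2.0, -8.0]
c = 0.0

def f(x):
    """Quadratic form f(x) = 1/2 x^T A x - b^T x + c."""
    Ax = [A[i][0] * x[0] + A[i][1] * x[1] for i in range(2)]
    return 0.5 * (x[0] * Ax[0] + x[1] * Ax[1]) - (b[0] * x[0] + b[1] * x[1]) + c

# Exact solution of Ax = b via 2x2 Cramer's rule.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
x_star = [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
          (b[1] * A[0][0] - b[0] * A[1][0]) / det]   # -> [2.0, -2.0]

# f is strictly larger at every perturbed point than at x_star.
for dx in ([0.1, 0.0], [0.0, 0.1], [-0.1, 0.1]):
    assert f([x_star[0] + dx[0], x_star[1] + dx[1]]) > f(x_star)
```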
4. Method of Steepest Descent
- Choose the direction in which f decreases the most: the direction opposite to f'(x(i))
- Error: e(i) = x(i) - x
- Residual: r(i) = b - Ax(i) = -f'(x(i))
- The residual is the direction of steepest descent
5. Line Search
- Choose alpha(i) to minimize f along the line x(i) + alpha r(i): alpha(i) = r(i)^T r(i) / r(i)^T A r(i)
6. Summary
- Simplification: instead of recomputing r(i+1) = b - Ax(i+1), use the recurrence r(i+1) = r(i) - alpha(i) A r(i), reusing the product Ar(i) already needed for alpha(i)
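The summarized iteration can be sketched as runnable code; the 2x2 SPD system below is a made-up example, not from the slides:

```python
# Steepest descent with exact line search:
#   r(i)   = b - A x(i)
#   alpha  = r^T r / r^T A r
#   x(i+1) = x(i) + alpha r(i)
#   r(i+1) = r(i) - alpha A r(i)   <- the simplification: the product
#                                      A r is computed once per iteration

def matvec(A, v):
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def steepest_descent(A, b, x, tol=1e-10, max_iter=1000):
    r = [bi - axi for bi, axi in zip(b, matvec(A, x))]   # r = b - Ax
    for _ in range(max_iter):
        if dot(r, r) < tol * tol:
            break
        Ar = matvec(A, r)
        alpha = dot(r, r) / dot(r, Ar)                   # exact line search
        x = [xi + alpha * ri for xi, ri in zip(x, r)]
        r = [ri - alpha * ari for ri, ari in zip(r, Ar)] # residual recurrence
    return x

A = [[3.0, 2.0], [2.0, 6.0]]   # symmetric, positive-definite
b = [2.0, -8.0]
x = steepest_descent(A, b, [0.0, 0.0])   # converges toward [2, -2]
```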
7. Convergence Analysis
- Two cases where one-step convergence is possible:
- e(i) is an eigenvector of A
- All eigenvalues of A are the same
8. Convergence Analysis (cont.)
- In general, convergence depends on the condition number of A and on the slope of e(0) (relative to the coordinate system of eigenvectors) at the starting point
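A small experiment illustrating the condition-number dependence: steepest descent on two diagonal SPD systems, kappa = 2 versus kappa = 50, from the same starting point. The matrices are hypothetical examples, not from the slides.

```python
# Count steepest-descent iterations on a well-conditioned and an
# ill-conditioned diagonal SPD system; the latter needs far more.

def mv(A, v):
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def sd_iters(A, b, x, tol=1e-8, max_iter=10000):
    """Run steepest descent; return the number of iterations used."""
    r = [bi - axi for bi, axi in zip(b, mv(A, x))]
    for k in range(max_iter):
        if dot(r, r) < tol * tol:
            return k
        Ar = mv(A, r)
        alpha = dot(r, r) / dot(r, Ar)
        x = [xi + alpha * ri for xi, ri in zip(x, r)]
        r = [ri - alpha * ari for ri, ari in zip(r, Ar)]
    return max_iter

fast = sd_iters([[1.0, 0.0], [0.0, 2.0]],  [1.0, 1.0], [0.0, 0.0])   # kappa = 2
slow = sd_iters([[1.0, 0.0], [0.0, 50.0]], [1.0, 1.0], [0.0, 0.0])   # kappa = 50
assert fast < slow   # ill-conditioning slows convergence
```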
9. Skip: Sec. 5 (Eigenvectors)
10. Method of Conjugate Directions
- Idea: pick a set of n orthogonal directions. In each direction, take exactly one step; after n steps, we'll be done
- The algorithm should look like x(i+1) = x(i) + alpha(i) d(i), with alpha(i) chosen so that e(i+1) is orthogonal to d(i)
- But we don't know e(i)!
11. Conjugate Directions
- Instead, make the search directions d(i) A-orthogonal: d(i)^T A d(j) = 0 for i != j
- Find the minimum along d(i): alpha(i) = d(i)^T r(i) / d(i)^T A d(i)
12. Conjugate Directions (cont.)
13. Proof
- Can this really solve for x in n steps, now that we've changed "orthogonal" to "A-orthogonal"?
- Intuition: each step of conjugate directions eliminates one A-orthogonal component of the error
14. Represent the Error in the Basis of d(i)
- Initial error: e(0) = sum_j delta_j d(j)
- After n iterations, e(n) = 0
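The two bullets above can be made precise; a hedged sketch of the standard argument, with delta_j as on the slide:

```latex
% Expand the initial error in the A-orthogonal basis d_{(0)},\dots,d_{(n-1)}:
e_{(0)} = \sum_{j=0}^{n-1} \delta_j \, d_{(j)},
\qquad
\delta_j = \frac{d_{(j)}^{\mathsf{T}} A\, e_{(0)}}{d_{(j)}^{\mathsf{T}} A\, d_{(j)}}
\quad \text{(apply } d_{(j)}^{\mathsf{T}} A \text{ to both sides and use A-orthogonality)}.
% Since d_{(i)}^{\mathsf{T}} A\, e_{(i)} = d_{(i)}^{\mathsf{T}} A\, e_{(0)},
% step i has \alpha_{(i)} = -\delta_i, so it cancels exactly the d_{(i)}
% component of the error; after n steps, e_{(n)} = 0.
```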
15. To Find A-orthogonal Directions
- Conjugate Gram-Schmidt process: d(i) = u(i) + sum_{k<i} beta_ik d(k), with beta_ik = -u(i)^T A d(k) / d(k)^T A d(k)
- Difficulty: we need to keep all the old search directions to construct the new one!
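A sketch of conjugate Gram-Schmidt as described above; the matrix A and the input vectors u are made-up example values. Note how the inner loop touches every previous direction, which is exactly the storage difficulty the slide mentions.

```python
# Conjugate (A-orthogonal) Gram-Schmidt:
#   d(i) = u(i) + sum_{k<i} beta_ik d(k),
#   beta_ik = -u(i)^T A d(k) / d(k)^T A d(k)

def mv(A, v):
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def conjugate_gram_schmidt(A, U):
    D = []
    for u in U:
        d = list(u)
        for dk in D:                      # needs ALL previous directions
            Adk = mv(A, dk)
            beta = -dot(u, Adk) / dot(dk, Adk)
            d = [di + beta * dki for di, dki in zip(d, dk)]
        D.append(d)
    return D

A = [[3.0, 2.0], [2.0, 6.0]]              # symmetric, positive-definite
D = conjugate_gram_schmidt(A, [[1.0, 0.0], [0.0, 1.0]])
# d(0)^T A d(1) is (numerically) zero: the directions are A-orthogonal.
```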
16. Skip: Optimality of Conjugate Directions
- Conclusion: at every step it finds the best solution within the bounds of where it's been allowed to explore
- Very important: this region is a Krylov subspace
17. Conjugate Gradient
- Search directions are constructed by conjugation of the residuals: set u(i) = r(i)
- Reasons: the residuals are independent and make good search directions
19. Method of Conjugate Gradient
- The iterates explore a Krylov subspace: span{r(0), A r(0), A^2 r(0), ...}
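Putting the pieces together, the standard CG recurrences can be sketched as below (the 2x2 SPD system is the same made-up example used earlier, not from the slides). Because u(i) = r(i), only one previous direction is needed, which removes the storage difficulty of plain conjugate Gram-Schmidt; in exact arithmetic an n x n SPD system is solved in at most n steps.

```python
# Conjugate gradient method for SPD systems.

def mv(A, v):
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def conjugate_gradient(A, b, x=None, tol=1e-12):
    n = len(b)
    x = [0.0] * n if x is None else list(x)
    r = [bi - axi for bi, axi in zip(b, mv(A, x))]   # r(0) = b - Ax(0)
    d = list(r)                                      # d(0) = r(0)
    rr = dot(r, r)
    for _ in range(n):                               # at most n steps
        if rr < tol:
            break
        Ad = mv(A, d)
        alpha = rr / dot(d, Ad)                      # line search along d
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        rr_new = dot(r, r)
        beta = rr_new / rr                           # standard CG beta
        d = [ri + beta * di for ri, di in zip(r, d)] # next search direction
        rr = rr_new
    return x

x = conjugate_gradient([[3.0, 2.0], [2.0, 6.0]], [2.0, -8.0])
# x -> [2.0, -2.0] in (at most) 2 steps
```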
21. Other Topics
- Preconditioning
- CG on the Normal Equations
- Nonlinear CG