Conjugate Gradient Method - PowerPoint PPT Presentation

About This Presentation

Title: Conjugate Gradient Method

Description: Difficulty: need to keep all the old search directions to construct the new one! ... Reasons: residuals are independent, good search directions, ... – PowerPoint PPT presentation

Slides: 22
Provided by: jyunmi

Transcript and Presenter's Notes
1
Conjugate Gradient Method
  • Ref: Shewchuk, "An Introduction to the Conjugate
    Gradient Method Without the Agonizing Pain"

2
Overview
  • Most popular iterative method for solving large
    systems of sparse linear equations Ax = b
  • A is known, square, symmetric, positive-definite
  • If A is dense, the best choice is factoring A and
    using back substitution

3
Introduction
  • Quadratic form f(x) = (1/2) xTAx - bTx + c
  • If A is symmetric and positive-definite, f(x) is
    minimized by the solution to Ax = b
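The claim above is easy to verify numerically. A minimal NumPy sketch (the 2x2 system is an illustrative SPD example; function names are my own) checking that the gradient of the quadratic form, f'(x) = Ax - b, vanishes exactly at the solution of Ax = b:

```python
import numpy as np

# A small symmetric positive-definite system, chosen for illustration
A = np.array([[3.0, 2.0],
              [2.0, 6.0]])
b = np.array([2.0, -8.0])

def f(x, c=0.0):
    """Quadratic form f(x) = 1/2 x^T A x - b^T x + c."""
    return 0.5 * x @ A @ x - b @ x + c

def grad_f(x):
    """Gradient f'(x) = Ax - b (valid because A is symmetric)."""
    return A @ x - b

x_star = np.linalg.solve(A, b)          # exact solution of Ax = b
print(np.allclose(grad_f(x_star), 0.0)) # gradient vanishes at the minimizer
print(f(x_star) < f(x_star + np.array([0.1, -0.2])))  # nearby f is larger
```

Since A is positive-definite, the stationary point is the unique minimum, so minimizing f and solving Ax = b are the same problem.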

4
Method of Steepest Descent
  • Choose the direction in which f decreases
    fastest: the direction opposite to f'(x(i))

Error: e(i) = x(i) - x
Residual: r(i) = b - Ax(i) = -f'(x(i))
Residual as the direction of steepest descent
5
Line Search
  • Choose alpha(i) to minimize f along the residual
    direction: alpha(i) = (r(i)T r(i)) / (r(i)T A r(i))
6
Summary
Simplification: replace the matrix-vector product Ax
with Ar, updating the residual recursively as
r(i+1) = r(i) - alpha(i) A r(i)
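The full steepest-descent loop, with the simplification above, needs only one matrix-vector product (Ar) per iteration. A sketch in NumPy (matrix, tolerance, and function name are my choices, not from the slides):

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    """Method of steepest descent for SPD A, using the recursive
    residual update r <- r - alpha * A r (one mat-vec per step)."""
    x = x0.astype(float).copy()
    r = b - A @ x                   # residual = steepest-descent direction
    for i in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r                  # the single matrix-vector product
        alpha = (r @ r) / (r @ Ar)  # exact line search along r
        x = x + alpha * r
        r = r - alpha * Ar          # simplification: avoids recomputing b - Ax
    return x, i

A = np.array([[3.0, 2.0], [2.0, 6.0]])
b = np.array([2.0, -8.0])
x, iters = steepest_descent(A, b, np.zeros(2))
print(np.allclose(A @ x, b))  # True
```

One caveat the recursive update brings: floating-point error accumulates in r, so long runs typically recompute r = b - Ax every so often.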
7
Convergence Analysis
  • Two cases where one-step convergence is possible:
  • e(i) is an eigenvector of A
  • All eigenvalues of A are the same
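The second case is easy to demonstrate: if all eigenvalues equal lambda, then A = lambda * I, the line search gives alpha = 1/lambda, and a single step lands exactly on the solution. A small check (numbers are my own):

```python
import numpy as np

lam = 4.0
A = lam * np.eye(3)           # all eigenvalues equal: A = lambda * I
b = np.array([1.0, -2.0, 3.0])
x0 = np.array([5.0, 5.0, 5.0])

r0 = b - A @ x0
alpha = (r0 @ r0) / (r0 @ (A @ r0))   # equals 1/lambda for this A
x1 = x0 + alpha * r0                   # one steepest-descent step

print(np.allclose(x1, np.linalg.solve(A, b)))  # True: exact in one step
```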

8
Convergence Analysis (cont)
  • In general, convergence depends on the condition
    number of A and on the slope of e(0) (relative to
    the coordinate system of eigenvectors) at the
    starting point

9
Skip Sec. 5: Eigenvectors
10
Method of Conjugate Directions
  • Idea: Pick a set of n orthogonal directions. In
    each direction, take exactly one step. After n
    steps, we'll be done

Algorithm should look like:
But we don't know e(i)!
11
Conjugate Directions
  • Make the search directions d(i) A-orthogonal

Find the minimum along d(i)
12
Conjugate Directions
  • To get the step size:
    alpha(i) = (d(i)T r(i)) / (d(i)T A d(i))
  • Algorithm Summary

13
Proof
  • Can this really solve for x in n steps, now that
    we've changed "orthogonal" to "A-orthogonal"?

Intuition: Each step of conjugate directions
eliminates one A-orthogonal component of the error
14
Represent the error in the basis of the d(j):
Initial error: e(0) = sum_j delta(j) d(j)
After n iterations, e(n) = 0
15
To find A-orthogonal directions
  • Conjugate Gram-Schmidt process

Difficulty: need to keep all the old search
directions to construct the new one!
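The conjugate Gram-Schmidt process can be sketched directly; the inner loop over every previous direction is exactly the stated difficulty (O(n) stored vectors, O(n^2) work per new direction). A minimal NumPy sketch, with my own variable names:

```python
import numpy as np

def conjugate_gram_schmidt(A, U):
    """A-orthogonalize the columns of U:
    d_i = u_i - sum_{k<i} beta_ik d_k, with
    beta_ik = (u_i^T A d_k) / (d_k^T A d_k)."""
    n = U.shape[1]
    D = np.zeros_like(U, dtype=float)
    for i in range(n):
        d = U[:, i].astype(float)
        for k in range(i):   # the difficulty: loop over ALL old directions
            beta = (U[:, i] @ (A @ D[:, k])) / (D[:, k] @ (A @ D[:, k]))
            d = d - beta * D[:, k]
        D[:, i] = d
    return D

A = np.array([[3.0, 2.0], [2.0, 6.0]])
D = conjugate_gram_schmidt(A, np.eye(2))     # coordinate axes as the u_i
print(abs(D[:, 0] @ (A @ D[:, 1])) < 1e-12)  # True: d_0, d_1 A-orthogonal
```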
16
Skip: Optimality of Conjugate Directions
  • Conclusion: it finds at every step the best
    solution within the bounds of where it's been
    allowed to explore
  • Very important: the Krylov subspace

17
Conjugate Gradient
  • Search directions are constructed by conjugation
    of the residuals
  • Setting u(i) = r(i)
  • Reasons: residuals are independent, good search
    directions, ...
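With u(i) = r(i), the Gram-Schmidt sum collapses to a single term, so only the latest direction is kept. A sketch of the resulting CG iteration in NumPy, using the standard coefficient formulas (variable names and the test system are my choices):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Conjugate gradient for SPD A. Conjugating the residuals
    leaves one surviving Gram-Schmidt coefficient (beta), so only
    a single old direction d is ever stored."""
    x = x0.astype(float).copy()
    r = b - A @ x
    d = r.copy()                          # first direction = first residual
    if max_iter is None:
        max_iter = len(b)
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)        # step size along d
        x = x + alpha * d
        r_new = r - alpha * Ad            # recursive residual update
        beta = (r_new @ r_new) / (r @ r)  # the one surviving coefficient
        d = r_new + beta * d
        r = r_new
    return x

A = np.array([[3.0, 2.0], [2.0, 6.0]])
b = np.array([2.0, -8.0])
x = conjugate_gradient(A, b, np.zeros(2))
print(np.allclose(A @ x, b))  # True: exact (to rounding) in n = 2 steps
```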

18
(No Transcript)
19
Method of Conjugate Gradient
Krylov subspace
20
(No Transcript)
21
Other Topics
  • Preconditioning
  • CG on Normal Equations
  • Nonlinear CG
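As an illustration of the first topic, the simplest preconditioner is Jacobi (diagonal) scaling, M = diag(A): CG is run on the better-conditioned system M^{-1}Ax = M^{-1}b. A minimal sketch assuming that setup (this is not from the slides; only the z = M^{-1}r lines differ from plain CG):

```python
import numpy as np

def pcg_jacobi(A, b, x0, tol=1e-10, max_iter=None):
    """Preconditioned CG with the Jacobi preconditioner M = diag(A).
    The preconditioned residual z = M^{-1} r replaces r in the
    direction updates and in the alpha/beta coefficients."""
    Minv = 1.0 / np.diag(A)        # M^{-1}, applied elementwise
    x = x0.astype(float).copy()
    r = b - A @ x
    z = Minv * r
    d = z.copy()
    if max_iter is None:
        max_iter = len(b)
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ z) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        z_new = Minv * r_new
        beta = (r_new @ z_new) / (r @ z)
        d = z_new + beta * d
        r, z = r_new, z_new
    return x

A = np.array([[3.0, 2.0], [2.0, 6.0]])
b = np.array([2.0, -8.0])
print(np.allclose(A @ pcg_jacobi(A, b, np.zeros(2)), b))  # True
```

Jacobi scaling helps most when A's diagonal entries vary widely in magnitude; stronger preconditioners (e.g. incomplete Cholesky) follow the same template with a different M.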