MA5233 Lecture 6 Krylov Subspaces and Conjugate Gradients
1
MA5233 Lecture 6 Krylov Subspaces and Conjugate Gradients
  • Wayne M. Lawton
  • Department of Mathematics
  • National University of Singapore
  • 2 Science Drive 2
  • Singapore 117543

Email: matwml_at_nus.edu.sg  Web: http://www.math.nus/matwml  Tel: (65) 6874-2749
2
EUCLIDEAN SPACES
Definition A Euclidean structure on a vector space V is a function ⟨·,·⟩ : V × V → R that satisfies, for all u, v, w ∈ V and a, b ∈ R:
Bilinear: ⟨au + bv, w⟩ = a⟨u, w⟩ + b⟨v, w⟩ and ⟨u, av + bw⟩ = a⟨u, v⟩ + b⟨u, w⟩
Symmetric: ⟨u, v⟩ = ⟨v, u⟩
Positive Definite: ⟨v, v⟩ ≥ 0, and ⟨v, v⟩ = 0 iff v = 0

Definition The norm is ||v|| = √⟨v, v⟩.
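The three axioms can be checked numerically for the standard dot product on R^d; a minimal sketch in Python with NumPy (the helper names `inner` and `norm` are mine, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(u, v):
    """Standard Euclidean structure on R^d: <u, v> = u^T v."""
    return float(u @ v)

def norm(v):
    """The norm induced by the Euclidean structure."""
    return inner(v, v) ** 0.5

u, v, w = rng.standard_normal((3, 4))
a, b = 2.0, -3.0

# Bilinear in the first argument (symmetry then gives the second argument).
assert np.isclose(inner(a * u + b * v, w), a * inner(u, w) + b * inner(v, w))
# Symmetric.
assert np.isclose(inner(u, v), inner(v, u))
# Positive definite: <v, v> > 0 for v != 0, and <0, 0> = 0.
assert inner(v, v) > 0 and inner(np.zeros(4), np.zeros(4)) == 0.0
```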
3
EXAMPLES
Example 1. (Standard) V = R^d with ⟨u, v⟩ = u^T v.
Example 2. ⟨u, v⟩ = u^T A v, where the matrix A is positive definite and symmetric.
Example 3. ⟨f, g⟩ = ∫_a^b p(x) f(x) g(x) dx, where p is positive except at possibly a finite number of points, hence p is nonnegative.
Example 4. V and ⟨·,·⟩ are obtained by completing the Euclidean space in Example 3. Then V is a Real Hilbert Space (a Complete Real Euclidean Space).
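Example 2 can be illustrated numerically: build a symmetric positive definite A and confirm that ⟨u, v⟩ = u^T A v is symmetric and positive on random vectors. A sketch, where the construction A = M^T M + I is my choice, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)  # symmetric positive definite by construction

def inner_A(u, v):
    """Example 2: <u, v> = u^T A v for a symmetric positive definite A."""
    return float(u @ A @ v)

u, v = rng.standard_normal((2, 5))
assert np.isclose(inner_A(u, v), inner_A(v, u))  # symmetric
assert inner_A(u, u) > 0                         # positive on u != 0
```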
4
ORTHONORMAL BASES
Definition {v_1, ..., v_d} is an orthonormal basis for V if ⟨v_i, v_j⟩ = δ_ij.
Example {u_1, ..., u_d} ⊂ R^d is an orthonormal basis for the standard Euclidean space iff the matrix U = [u_1 ... u_d] satisfies U^T U = I, where U^T is the transpose matrix defined by (U^T)_ij = U_ji and I is the identity matrix defined by I_ij = δ_ij. Such a matrix is called orthogonal and satisfies U^{-1} = U^T and U U^T = I.
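The matrix characterization can be checked directly: the columns of U are an orthonormal basis iff U^T U = I, and an orthogonal U also satisfies U U^T = I and U^{-1} = U^T. A sketch that manufactures an orthogonal U via a QR factorization (my choice of construction):

```python
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # square Q is orthogonal

I = np.eye(4)
assert np.allclose(U.T @ U, I)             # columns are orthonormal
assert np.allclose(U @ U.T, I)             # rows too: U is orthogonal
assert np.allclose(np.linalg.inv(U), U.T)  # and U^{-1} = U^T
```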
5
GRAM-SCHMIDT PROCESS
Given a basis
for a Euclidean space V
there exists a unique upper triangular matrix
with positive numbers on its diagonal such that
are orthogonal (and therefore are a basis for V).
Proof We apply the Gram-Schmidt Process
For j 2 to d
5
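The proof's construction can be written out directly; a sketch of classical Gram-Schmidt for the standard inner product, returning both the orthonormal vectors and the upper triangular R with positive diagonal (the function name is mine):

```python
import numpy as np

def gram_schmidt(V):
    """Columns of V are a basis; return U with orthonormal columns and upper
    triangular R with positive diagonal such that V = U @ R."""
    d = V.shape[1]
    U = np.zeros_like(V, dtype=float)
    R = np.zeros((d, d))
    for j in range(d):
        w = V[:, j].copy()
        for i in range(j):
            R[i, j] = U[:, i] @ V[:, j]   # <v_j, u_i>
            w -= R[i, j] * U[:, i]
        R[j, j] = np.linalg.norm(w)       # positive for a basis
        U[:, j] = w / R[j, j]
    return U, R

rng = np.random.default_rng(3)
V = rng.standard_normal((5, 5))
U, R = gram_schmidt(V)
assert np.allclose(U.T @ U, np.eye(5))  # orthonormal
assert np.allclose(U @ R, V)            # v_j = sum_{i<=j} R_ij u_i
assert np.all(np.diag(R) > 0)           # positive diagonal
assert np.allclose(R, np.triu(R))       # upper triangular
```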
6
QR FACTORIZATION
Given a basis a_1, ..., a_n (the columns of a matrix A ∈ R^{m×n}) for a subspace of R^m, Gram-Schmidt yields an upper triangular matrix R with positive numbers on its diagonal such that the vectors q_1, ..., q_n defined by a_j = Σ_{i≤j} R_ij q_i are orthonormal; therefore, since R is upper triangular (and invertible), we obtain a factorization A = QR that has important applications to least-squares problems (section 5.3) and to computing eigenvalues and eigenvectors (section 5.5).
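The least-squares application can be sketched with NumPy's built-in QR (note that `np.linalg.qr` does not enforce the positive-diagonal sign convention, which does not affect the use below): minimize ||Ax − b||_2 by solving the triangular system R x = Q^T b.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))  # tall, full-rank: a least-squares setup
b = rng.standard_normal(8)

Q, R = np.linalg.qr(A)           # reduced QR: A = Q R with Q^T Q = I
# Least squares via QR: R x = Q^T b (R is 3x3 upper triangular, invertible).
x = np.linalg.solve(R, Q.T @ b)

assert np.allclose(A, Q @ R)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```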
7
PARTIAL HESSENBERG FACTORIZATION
Definition A (not necessarily square) matrix H is upper Hessenberg if H_ij = 0 whenever i > j + 1.
We consider a matrix A ∈ R^{m×m} and integer n < m and orthonormal vectors q_1, ..., q_{n+1} such that A Q_n = Q_{n+1} H_n, where Q_n = [q_1 ... q_n] and H_n ∈ R^{(n+1)×n} is upper Hessenberg,
or, equivalently, A q_j = Σ_{i=1}^{j+1} H_ij q_i for j = 1, ..., n.
8
KRYLOV SPACES AND ARNOLDI ITERATION
If the Krylov space K_n = span{b, Ab, A^2 b, ..., A^{n-1} b} has dimension n, then an orthonormal basis q_1, ..., q_n can be computed by GS using the Arnoldi Iteration, based on the equation A q_j = Σ_{i=1}^{j+1} H_ij q_i. (Recall that span{q_1, ..., q_j} = span{b, Ab, ..., A^{j-1} b}.)
q_1 = b / ||b||
For j = 2 to n:
  v = A q_{j-1}
  H_{i,j-1} = ⟨q_i, v⟩, i = 1, ..., j−1
  w = v − Σ_{i=1}^{j-1} H_{i,j-1} q_i
  H_{j,j-1} = ||w||
  q_j = w / H_{j,j-1}
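The iteration above can be sketched in NumPy, verifying the partial Hessenberg relation A Q_n = Q_{n+1} H_n on a random matrix (function name and test setup are mine):

```python
import numpy as np

def arnoldi(A, b, n):
    """Return Q (m x (n+1)) with orthonormal columns spanning the Krylov
    spaces of (A, b), and upper Hessenberg H ((n+1) x n) with
    A @ Q[:, :n] = Q @ H. Assumes dim K_{n+1} = n + 1 (no breakdown)."""
    m = A.shape[0]
    Q = np.zeros((m, n + 1))
    H = np.zeros((n + 1, n))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(n):
        v = A @ Q[:, j]
        for i in range(j + 1):              # GS against q_1, ..., q_{j+1}
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 6))
b = rng.standard_normal(6)
Q, H = arnoldi(A, b, 4)
assert np.allclose(Q.T @ Q, np.eye(5))   # orthonormal basis
assert np.allclose(A @ Q[:, :4], Q @ H)  # A Q_n = Q_{n+1} H_n
```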
9
COMPLETE HESSENBERG FACTORIZATION
Possibly using more than one Krylov subspace we can construct an orthonormal basis q_1, ..., q_m for R^m such that A = Q H Q^T, where Q = [q_1 ... q_m] is orthogonal and H is upper Hessenberg.
We observe that the number of Krylov subspaces equals 1 + the number of zeros on the diagonal beneath the main diagonal of H.
10
TRI-DIAGONAL MATRIX
Theorem A^T = A iff H^T = H.
Proof. H = Q^T A Q, therefore H^T = Q^T A^T Q.
Corollary If A^T = A then H is tridiagonal (a symmetric upper Hessenberg matrix is tridiagonal).
11
LANCZOS ITERATION
Theorem If A^T = A then H is tridiagonal, with diagonal entries α_j = H_jj and subdiagonal entries β_j = H_{j+1,j}, and an orthonormal basis for K_n can be computed by GS using the Lanczos Iteration:
β_0 = 0, q_0 = 0, q_1 = b / ||b||
For j = 1 to n−1:
  v = A q_j
  α_j = q_j^T v
  v = v − β_{j-1} q_{j-1} − α_j q_j
  β_j = ||v||
  q_{j+1} = v / β_j
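A sketch of the Lanczos iteration for a symmetric A (names and test setup mine), checking that Q^T A Q is tridiagonal with the computed α on the diagonal:

```python
import numpy as np

def lanczos(A, b, n):
    """Lanczos iteration for symmetric A: return Q (m x n) with orthonormal
    columns and the tridiagonal coefficients alpha (diagonal), beta (subdiagonal)."""
    m = A.shape[0]
    Q = np.zeros((m, n))
    alpha = np.zeros(n)
    beta = np.zeros(n)                 # beta[j] = H[j+1, j]
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(n - 1):
        v = A @ Q[:, j]
        alpha[j] = Q[:, j] @ v
        v -= alpha[j] * Q[:, j]        # three-term recurrence: only two
        if j > 0:                      # projections are needed
            v -= beta[j - 1] * Q[:, j - 1]
        beta[j] = np.linalg.norm(v)
        Q[:, j + 1] = v / beta[j]
    alpha[n - 1] = Q[:, n - 1] @ A @ Q[:, n - 1]
    return Q, alpha, beta

rng = np.random.default_rng(6)
M = rng.standard_normal((7, 7))
A = (M + M.T) / 2                      # symmetric test matrix
b = rng.standard_normal(7)
Q, alpha, beta = lanczos(A, b, 5)
T = Q.T @ A @ Q                        # should be tridiagonal
assert np.allclose(Q.T @ Q, np.eye(5), atol=1e-6)
assert np.allclose(np.diag(T), alpha, atol=1e-6)
off = T - np.diag(np.diag(T)) - np.diag(np.diag(T, 1), 1) - np.diag(np.diag(T, -1), -1)
assert np.allclose(off, 0, atol=1e-6)  # no entries beyond the three diagonals
```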
12
CONJUGATE GRADIENT ITERATION
The iteration that Hestenes and Stiefel made famous solves Ax = b under the assumption that A is symmetric and positive definite.
x_0 = 0, r_0 = b, p_0 = r_0
For j = 1 to n−1:
  α_j = (r_{j-1}^T r_{j-1}) / (p_{j-1}^T A p_{j-1})
  x_j = x_{j-1} + α_j p_{j-1}
  r_j = r_{j-1} − α_j A p_{j-1}
  β_j = (r_j^T r_j) / (r_{j-1}^T r_{j-1})
  p_j = r_j + β_j p_{j-1}
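The update formulas translate line for line into code; a sketch (the function name and the SPD construction A = M^T M + I are mine) that verifies CG reaches the solution of Ax = b within n steps in exact arithmetic:

```python
import numpy as np

def conjugate_gradient(A, b, n_iter):
    """Hestenes-Stiefel iteration as on the slide; A symmetric positive definite."""
    x = np.zeros_like(b)
    r = b.copy()                 # r_0 = b - A x_0 = b
    p = r.copy()                 # p_0 = r_0
    for _ in range(n_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x

rng = np.random.default_rng(7)
M = rng.standard_normal((6, 6))
A = M.T @ M + np.eye(6)          # symmetric positive definite
b = rng.standard_normal(6)
x = conjugate_gradient(A, b, 6)  # converges in at most n = 6 steps
assert np.allclose(A @ x, b)
```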
13
CONJUGATE GRADIENT ITERATION
Theorem 1. The following sets all equal the Krylov space K_j:
span{x_1, ..., x_j} = span{p_0, ..., p_{j-1}} = span{r_0, ..., r_{j-1}} = span{b, Ab, ..., A^{j-1} b} = K_j,
and the residuals are orthogonal, r_j^T r_i = 0 for i < j, and the search directions are A-conjugate, p_j^T A p_i = 0 for i < j.
Proof By induction on j: if j < n−1 then x_{j+1} = x_j + α_{j+1} p_j, r_{j+1} = r_j − α_{j+1} A p_j and p_{j+1} = r_{j+1} + β_{j+1} p_j all lie in K_{j+1}, since x_j, r_j, p_j ∈ K_j; if j < n−1 then the choice of α_{j+1} gives r_{j+1}^T r_j = 0 and the choice of β_{j+1} gives p_{j+1}^T A p_j = 0, and the remaining relations r_{j+1}^T r_i = 0 and p_{j+1}^T A p_i = 0 for i < j follow from the inductive hypothesis.
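Both orthogonality properties can be verified numerically by recording every r_j and p_j during a CG run; a self-contained sketch (the SPD construction and tolerances are mine):

```python
import numpy as np

# Run CG while recording residuals r_j and search directions p_j, then check
# Theorem 1: r_j^T r_i = 0 and p_j^T A p_i = 0 for i < j.
rng = np.random.default_rng(8)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)          # symmetric positive definite
b = rng.standard_normal(5)

x = np.zeros(5)
r, p = b.copy(), b.copy()
rs, ps = [r.copy()], [p.copy()]
for _ in range(4):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new
    rs.append(r.copy())
    ps.append(p.copy())

R_mat = np.array(rs)
G = R_mat @ R_mat.T              # Gram matrix of residuals: diagonal
assert np.allclose(G - np.diag(np.diag(G)), 0, atol=1e-6)
P = np.array(ps)
C = P @ A @ P.T                  # A-Gram matrix of directions: diagonal
assert np.allclose(C - np.diag(np.diag(C)), 0, atol=1e-6)
```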
14
CONJUGATE GRADIENT ITERATION
Theorem 2. If A is symmetric and positive definite, then if the CG algorithm to solve Ax = b has not converged, that is r_{j-1} ≠ 0, then x_j minimizes the error in the A-norm, ||e||_A = √(e^T A e) with e = x_* − x and x_* = A^{-1} b, over all x ∈ K_j, and convergence is monotonic: ||e_j||_A ≤ ||e_{j-1}||_A.
Proof If x ∈ K_j and Δ = x_j − x, then e = e_j + Δ and ||e||_A^2 = ||e_j||_A^2 + 2 e_j^T A Δ + ||Δ||_A^2; since e_j^T A Δ = r_j^T Δ = 0 by Theorem 1, therefore ||e||_A ≥ ||e_j||_A. Monotonicity follows since K_{j-1} ⊆ K_j.
Theorem 3. If κ = ||A||_2 ||A^{-1}||_2 is the condition number of A subordinate to the 2-norm, then ||e_n||_A ≤ 2 ((√κ − 1)/(√κ + 1))^n ||e_0||_A.
Proof See the handouts.
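Both theorems can be observed numerically by tracking the A-norm of the error e_j = x_* − x_j along a CG run; a sketch (the SPD construction and tolerances are my choices):

```python
import numpy as np

rng = np.random.default_rng(9)
M = rng.standard_normal((8, 8))
A = M.T @ M + np.eye(8)              # symmetric positive definite
b = rng.standard_normal(8)
x_star = np.linalg.solve(A, b)       # reference solution for the error

def a_norm(e):
    return float(e @ A @ e) ** 0.5

x = np.zeros(8)
r, p = b.copy(), b.copy()
errs = [a_norm(x_star - x)]
for _ in range(8):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new
    errs.append(a_norm(x_star - x))

# Theorem 2: convergence is monotonic in the A-norm.
assert all(errs[j + 1] <= errs[j] + 1e-10 for j in range(8))

# Theorem 3: ||e_n||_A <= 2 ((sqrt(k)-1)/(sqrt(k)+1))^n ||e_0||_A, k = cond_2(A).
kappa = np.linalg.cond(A, 2)
rho = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)
assert all(errs[n] <= 2 * rho ** n * errs[0] + 1e-10 for n in range(9))
```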