Title: Lecture 14 Eigenanalysis
1. Lecture 14 - Eigen-analysis
2. Lecture Goals
- QR Factorization
- Householder
- Hessenberg Method
3. QR Factorization
The technique can be used to find the eigenvalues of A: successive iterations built on Householder transformations produce a matrix similar to A with the eigenvalues on the diagonal.
4. QR Factorization
Another form of factorization:
A = QR
It produces an orthogonal matrix Q and a right (upper) triangular matrix R.
Orthogonal matrix - the inverse is the transpose: Q^-1 = Q^T.
5. QR Factorization
Why do we care? We can use Q and R to find eigenvalues:
1. Get Q and R (A = QR)
2. Let A = RQ
3. The diagonal elements of A are eigenvalue approximations
4. Iterate until converged
Note: the QR eigenvalue method gives all eigenvalues simultaneously, not just the dominant one.
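The four steps above can be sketched in Python (the course's own codes are MATLAB; `qr_factor` here is a minimal classical Gram-Schmidt factorization written for illustration, not the course's `QR_factor`):

```python
def qr_factor(A):
    """QR factorization by classical Gram-Schmidt (for illustration only)."""
    n = len(A)
    Q = [[0.0] * n for _ in range(n)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = [A[i][j] for i in range(n)]              # j-th column of A
        for i in range(j):
            R[i][j] = sum(Q[k][i] * A[k][j] for k in range(n))
            v = [v[k] - R[i][j] * Q[k][i] for k in range(n)]
        R[j][j] = sum(t * t for t in v) ** 0.5
        for k in range(n):
            Q[k][j] = v[k] / R[j][j]
    return Q, R

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def qr_eigenvalues(A, iterations=100):
    """Steps 1-4: factor A = QR, set A = RQ, repeat; read off the diagonal."""
    for _ in range(iterations):
        Q, R = qr_factor(A)
        A = matmul(R, Q)          # RQ = Q^T (QR) Q is similar to A
    return [A[i][i] for i in range(len(A))]

# Matrix from the MATLAB example later in the lecture;
# its true eigenvalues are 1 - sqrt(2), 1 + sqrt(2), and 3.
A = [[1.0, 2.0, -1.0], [2.0, 2.0, -1.0], [2.0, -1.0, 2.0]]
eigs = sorted(qr_eigenvalues(A))
```

Because each iterate RQ is similar to A, the diagonal settles onto all three eigenvalues at once.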
6. QR Eigenvalue Method
In practice, QR factorization of an arbitrary matrix requires a number of steps. First transform A into Hessenberg form.
Hessenberg matrix - upper triangular plus the first sub-diagonal.
Special properties of the Hessenberg matrix make it easier to find Q, R, and the eigenvalues.
7. QR Factorization
- Construction of the QR factorization
8. QR Factorization
- Use Householder reflections and Givens rotations to reduce certain elements of a vector to zero.
- Use QR factorizations that preserve the eigenvalues.
- The eigenvalues of the transformed matrix are much easier to obtain.
9. Jordan Canonical Form
- Any square matrix is orthogonally similar to a triangular matrix with the eigenvalues on the diagonal (strictly, this statement is the Schur decomposition, with unitary similarity over the complex numbers).
10. Similarity Transformation
- A transformation of the matrix A of the form H^-1 A H is known as a similarity transformation.
- A real matrix Q is orthogonal if Q^T Q = I.
- If Q is orthogonal, then A and Q^-1 A Q are said to be orthogonally similar.
- The eigenvalues are preserved under a similarity transformation.
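A small numeric illustration of the last bullet (a hypothetical 2x2 example, not from the slides): a similarity transform H^-1 A H leaves the characteristic polynomial unchanged, and for a 2x2 matrix the eigenvalues are determined by the trace and determinant, so it suffices to check those:

```python
def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [0.0, 3.0]]       # eigenvalues 2 and 3
H = [[1.0, 1.0], [0.0, 1.0]]       # an invertible (shear) matrix
Hinv = [[1.0, -1.0], [0.0, 1.0]]   # its exact inverse

B = matmul2(Hinv, matmul2(A, H))   # similarity transformation H^-1 A H

trace_A, trace_B = A[0][0] + A[1][1], B[0][0] + B[1][1]
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]
det_B = B[0][0] * B[1][1] - B[0][1] * B[1][0]
# trace and determinant agree, so the eigenvalues (2 and 3) are preserved
```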
11. Upper Triangular Matrix
- The diagonal elements R_ii of the upper triangular matrix R are the eigenvalues.
12. Householder Reflector
- A Householder reflector is a matrix of the form Q = I - 2vv^T / (v^T v)
- It is straightforward to verify that Q is symmetric and orthogonal.
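A quick check of both properties in Python, assuming the standard reflector form Q = I - 2vv^T/(v^T v) (the slide's own formula was an image; this is the usual definition):

```python
def householder(v):
    """Householder reflector Q = I - 2 v v^T / (v^T v)."""
    n = len(v)
    vv = sum(x * x for x in v)
    return [[(1.0 if i == j else 0.0) - 2.0 * v[i] * v[j] / vv
             for j in range(n)] for i in range(n)]

v = [1.0, 2.0, 2.0]                # any nonzero vector
Q = householder(v)

# Symmetric: Q[i][j] == Q[j][i].  Orthogonal: Q^T Q = Q Q = I,
# since a reflection applied twice is the identity.
QQ = [[sum(Q[i][k] * Q[k][j] for k in range(3)) for j in range(3)]
      for i in range(3)]
```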
13. Householder Matrix
- The Householder matrix reduces z_(k+1), ..., z_n to zero.
- To achieve the above operation, v must be a linear combination of x and e_k.
14. Householder Transformation
15. Householder Matrix
- Corollary (kth Householder matrix): Let A be an n x n matrix and x any vector. If k is an integer with 1 <= k <= n-1, we can construct a vector w(k) and a matrix H(k) = I - 2w(k)w(k)^T so that H(k)x zeroes the entries of x below position k.
16. Householder Matrix
- Define the value a by a = -sign(x_k) g, where g = (x_k^2 + ... + x_n^2)^(1/2)
- The vector w is found by normalizing v = x - a e_k
- The choice a = -sign(x_k) g reduces round-off error: v_k = x_k + sign(x_k) g involves no cancellation
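A sketch of the construction for k = 1 (the exact formulas on the slide were images, so this follows the standard construction described above; the vector x is a made-up example):

```python
def householder(v):
    """Householder reflector Q = I - 2 v v^T / (v^T v)."""
    n = len(v)
    vv = sum(x * x for x in v)
    return [[(1.0 if i == j else 0.0) - 2.0 * v[i] * v[j] / vv
             for j in range(n)] for i in range(n)]

x = [3.0, 1.0, 5.0, 1.0]
g = sum(t * t for t in x) ** 0.5          # g = ||x|| = 6
alpha = -g if x[0] >= 0 else g            # alpha = -sign(x_1) g
v = x[:]
v[0] -= alpha                             # v_1 = x_1 + sign(x_1) g: no cancellation
H = householder(v)

y = [sum(H[i][j] * x[j] for j in range(4)) for i in range(4)]
# H x = (alpha, 0, 0, 0): every entry below the first is reduced to zero
```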
17. Householder Matrices
18. Example: Householder Matrix
19. Example: Householder Matrix
20. Basic QR Factorization
- A = QR
- Q is orthogonal, Q^T Q = I
- R is upper triangular
- QR factorization using Householder matrices:
- Q = H(1)H(2)...H(n-1)
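The whole factorization can be sketched in pure Python (an illustration, not the course code): each H(k) zeroes the subdiagonal of column k of R, and Q accumulates the product H(1)H(2)...H(n-1).

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def householder_qr(A):
    """A = QR with Q = H(1) H(2) ... H(n-1), each H(k) a Householder matrix."""
    n = len(A)
    R = [row[:] for row in A]
    Q = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n - 1):
        x = [R[i][k] for i in range(k, n)]       # column k, rows k..n-1
        g = sum(t * t for t in x) ** 0.5
        if g == 0.0:
            continue
        v = x[:]
        v[0] += g if x[0] >= 0 else -g           # sign choice avoids cancellation
        vv = sum(t * t for t in v)
        H = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
        for i in range(k, n):
            for j in range(k, n):
                H[i][j] -= 2.0 * v[i - k] * v[j - k] / vv
        R = matmul(H, R)                         # zeroes below R[k][k]
        Q = matmul(Q, H)                         # accumulate Q = H(1)...H(n-1)
    return Q, R

A = [[1.0, 2.0, -1.0], [2.0, 2.0, -1.0], [2.0, -1.0, 2.0]]
Q, R = householder_qr(A)
# Q R reproduces A, and R is upper triangular
```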
21. Example: QR Factorization
22. QR Factorization
QR = A
- The similarity transformation B = Q^T A Q preserves the eigenvalues.
23. Finding Eigenvalues Using QR Factorization
- Generate a sequence of matrices A(m) that are orthogonally similar to A
- Use Householder transformations H^-1 A H
- The iterates converge to an upper triangular matrix with the eigenvalues on the diagonal
Find all eigenvalues simultaneously!
24. QR Eigenvalue Method
- QR factorization: A = QR
- Similarity transformation: A(new) = RQ
25. Example: QR Eigenvalue
26. Example: QR Eigenvalue
27. MATLAB Example

A = [1 2 -1; 2 2 -1; 2 -1 2]

[Q,R] = QR_factor(A)        % QR factorization
Q =
   -0.3333   -0.5788   -0.7442
   -0.6667   -0.4134    0.6202
   -0.6667    0.7029   -0.2481
R =
   -3.0000   -1.3333   -0.3333
    0.0000   -2.6874    2.3980
    0.0000    0.0000   -0.3721

e = QR_eig(A,6)             % six QR eigenvalue iterations
A =
    2.1111    2.0535    1.4884
    0.1929    2.7966   -2.2615
    0.2481   -0.2615    0.0923
A =
    2.4634    1.8104   -1.3865
   -0.0310    3.0527    1.7694
    0.0616   -0.1047   -0.5161
A =
    2.4056    1.8691    1.3930
    0.0056    2.9892   -1.9203
    0.0099   -0.0191   -0.3948
A =
    2.4157    1.8579   -1.3937
   -0.0010    3.0021    1.8930
    0.0017   -0.0038   -0.4178
A =
    2.4140    1.8600    1.3933
    0.0002    2.9996   -1.8982
    0.0003   -0.0007   -0.4136
A =
    2.4143    1.8596   -1.3934
    0.0000    3.0001    1.8972
    0.0001   -0.0001   -0.4143
e =
    2.4143    3.0001   -0.4143
28. Improved QR Method
- Use a similarity transformation to form an upper Hessenberg matrix (an upper triangular matrix plus one nonzero band below the diagonal).
- It is more efficient to form the Hessenberg matrix without explicitly forming the Householder matrices (not given in the textbook).
function A = Hessenberg(A)
[n,n] = size(A);
for k = 1:n-2
    H = Householder(A(:,k), k+1);
    A = H*A*H;
end
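The same reduction can be sketched in pure Python (an illustration, not the course code): for each column k, a Householder reflector acting on rows and columns k+1..n zeroes the entries below the subdiagonal, and since H is symmetric orthogonal (H^-1 = H), the two-sided product H A H is a similarity transformation that preserves the eigenvalues.

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def hessenberg(A):
    """Reduce A to upper Hessenberg form by Householder similarity transforms."""
    n = len(A)
    A = [row[:] for row in A]
    for k in range(n - 2):
        x = [A[i][k] for i in range(k + 1, n)]   # entries below the subdiagonal
        g = sum(t * t for t in x) ** 0.5
        if g == 0.0:
            continue
        v = x[:]
        v[0] += g if x[0] >= 0 else -g
        vv = sum(t * t for t in v)
        H = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                H[i][j] -= 2.0 * v[i - k - 1] * v[j - k - 1] / vv
        A = matmul(H, matmul(A, H))              # H^-1 A H with H^-1 = H
    return A

B = hessenberg([[1.0, 2.0, -1.0], [2.0, 2.0, -1.0], [2.0, -1.0, 2.0]])
# B is upper Hessenberg (B[2][0] == 0) with the same trace, hence the
# same eigenvalue sum, as the original matrix
```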
29. Improved QR Method

A = [1 2 -1; 2 2 -1; 2 -1 2]

[Q,R] = QR_factor_g(A)      % QR factorization (Givens version)
Q =
    0.4472    0.5963   -0.6667
    0.8944   -0.2981    0.3333
         0   -0.7454   -0.6667
R =
    2.2361    2.6833   -1.3416
   -1.4907    1.3416   -1.7889
   -1.3333         0   -1.0000

e = QR_eig_g(A,6)           % iterates are Hessenberg matrices
A =
    2.1111   -2.4356    0.7071
   -0.3143   -0.1111   -2.0000
         0    0.0000    3.0000
A =
    2.4634    2.0523   -0.9939
   -0.0690   -0.4634   -1.8741
    0.0000    0.0000    3.0000
A =
    2.4056   -2.1327    0.9410
   -0.0114   -0.4056   -1.9012
    0.0000    0.0000    3.0000
A =
    2.4157    2.1194   -0.9500
   -0.0020   -0.4157   -1.8967
    0.0000    0.0000    3.0000
A =
    2.4140   -2.1217    0.9485
   -0.0003   -0.4140   -1.8975
    0.0000    0.0000    3.0000
A =
    2.4143    2.1213   -0.9487
   -0.0001   -0.4143   -1.8973
    0.0000    0.0000    3.0000
e =
    2.4143   -0.4143    3.0000

eig(A)                      % MATLAB built-in, for comparison
ans =
    2.4142   -0.4142    3.0000
30. Summary
- QR Factorization
- Householder matrix
- Hessenberg matrix
31. Interpolation
32. Lecture Goals
- Interpolation methods
- Lagrange Interpolation
- Newton Interpolation
- Hermite Interpolation
- Rational Function Interpolation
- Splines (Linear, Quadratic, Cubic)
- Interpolation of 2-D data
33. Interpolation Methods
Why would we be interested in interpolation methods?
- Interpolation methods are the basis for other procedures that we will deal with:
- Numerical differentiation
- Numerical integration
- Solution of ODEs (ordinary differential equations) and PDEs (partial differential equations)
34. Interpolation Methods
Why would we be interested in interpolation methods?
- These methods demonstrate some important theory about polynomials and the accuracy of numerical methods.
- The interpolation of polynomials serves as an excellent introduction to some techniques for drawing smooth curves.
35. Interpolation Methods
Interpolation uses the data to approximate a function that fits all of the data points. All of the data is used to approximate the values of the function inside the bounds of the data.
We will look at polynomial and rational-function interpolation of the data and at piecewise interpolation of the data.
36. Polynomial Interpolation Methods
- Lagrange Interpolation Polynomial - a straightforward but computationally awkward way to construct an interpolating polynomial.
- Newton Interpolation Polynomial - there is no difference between the Newton and Lagrange results; the difference between the two is the approach to obtaining the coefficients.
37. Lagrange Interpolation
This method is generally the easiest to work with. The data does not have to be equally spaced, and the method is useful for quadratic and cubic fits between points. However, it does not provide an accurate model for large sets of points.
38. Lagrange Interpolation
The function can be defined as
P(x) = c_1(x-x_2)(x-x_3)...(x-x_n) + c_2(x-x_1)(x-x_3)...(x-x_n) + ... + c_n(x-x_1)(x-x_2)...(x-x_(n-1))
39. Lagrange Interpolation
The function can be defined as above, where the coefficients are defined as
c_i = y_i / [(x_i-x_1)...(x_i-x_(i-1))(x_i-x_(i+1))...(x_i-x_n)]
40. Lagrange Interpolation
The method works for quadratic and cubic polynomials. As you add additional points, the degree of the polynomial increases: if you have n points, it will fit an (n-1)th-degree polynomial.
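A minimal Python sketch of the idea (the course's own routines are the MATLAB functions Lagrange_coef and Lagrange_eval described later; this version is an illustration): n points determine a degree-(n-1) polynomial, and evaluating the Lagrange form never requires solving for the coefficients of powers of x.

```python
def lagrange_eval(xs, ys, t):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at t."""
    total = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (t - xs[j]) / (xs[i] - xs[j])   # basis polynomial L_i(t)
        total += term
    return total

# Three points lying on f(x) = x^2: the degree-2 interpolant reproduces f exactly
xs, ys = [0.0, 1.0, 3.0], [0.0, 1.0, 9.0]
value = lagrange_eval(xs, ys, 2.5)      # 2.5^2 = 6.25
```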
41. Example of Lagrange Interpolation
What are the coefficients of the polynomial, and what is the value of P2(2.3)?
42. Example of Lagrange Interpolation
The values are evaluated:
P(x) = 9.2983(x-1.7)(x-3.0) - 19.4872(x-1.1)(x-3.0) + 8.2186(x-1.1)(x-1.7)
P(2.3) = 9.2983(2.3-1.7)(2.3-3.0) - 19.4872(2.3-1.1)(2.3-3.0) + 8.2186(2.3-1.1)(2.3-1.7) = 18.3813
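The arithmetic can be checked directly from the coefficients on the slide (Python, for illustration):

```python
def p2(x):
    # Lagrange form with the slide's coefficients 9.2983, -19.4872, 8.2186
    return (9.2983 * (x - 1.7) * (x - 3.0)
            - 19.4872 * (x - 1.1) * (x - 3.0)
            + 8.2186 * (x - 1.1) * (x - 1.7))

value = p2(2.3)   # approximately 18.3813, matching the hand computation
```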
43. Lagrange Interpolation Program
The Lagrange interpolation is broken up into two programs to evaluate the polynomial, plus a plotting routine:
- C = Lagrange_coef(x,y), which evaluates the coefficients of the Lagrange technique
- P(x) = Lagrange_eval(t,x,c), which uses the coefficients and x values to evaluate the polynomial
- Plottest(x,y), which will plot the Lagrange polynomial
44. Example of Lagrange Interpolation
What happens if we increase the number of data
points?
Coefficient for 2 is
45. Example of Lagrange Interpolation
Note that the coefficients create a P4(x) polynomial; the comparison between the two curves is shown, with the original P2(x) given. The problem with adding additional points is that it creates bulges in the graph.
46. Newton Interpolation
The Newton interpolation uses a divided
difference method. The technique allows one to
add additional points easily.
47. Newton Interpolation
For a given set of data points (x1,y1), (x2,y2), (x3,y3), ...
48. Newton Interpolation
The function can be defined as
P(x) = a_1 + a_2(x-x_1) + a_3(x-x_1)(x-x_2) + ... + a_n(x-x_1)(x-x_2)...(x-x_(n-1))
49. Newton Interpolation
The method works for quadratic and cubic polynomials. As you add additional points, the degree of the polynomial increases: if you have n points, it will fit an (n-1)th-degree polynomial. The method is set up as a table computation whose pattern makes it easy to add additional points.
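A Python sketch of the table computation (the course's MATLAB routine is Newton_coef; this pure-Python version is an illustration). The data here are the points (0,1), (1,2), (2,4), (3,8), (4,16) of the worked example that follows, which lie on f(x) = 2^x:

```python
def newton_coef(xs, ys):
    """Divided-difference coefficients a_1..a_n (top diagonal of the table)."""
    n = len(xs)
    a = ys[:]
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            a[i] = (a[i] - a[i - 1]) / (xs[i] - xs[i - j])
    return a

def newton_eval(a, xs, t):
    """Evaluate the Newton form by Horner's nested scheme."""
    result = a[-1]
    for i in range(len(a) - 2, -1, -1):
        result = result * (t - xs[i]) + a[i]
    return result

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 2.0, 4.0, 8.0, 16.0]          # f(x) = 2^x
a = newton_coef(xs, ys)                   # [1, 1, 0.5, 1/6, 1/24] as on the slides
value = newton_eval(a, xs, 2.3)           # about 4.9183
```

Adding one more data point only appends one coefficient; the existing ones are unchanged, which is the advantage over the Lagrange form.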
50. Example of Newton Interpolation
What are the coefficients of the polynomial, and what is the value of P(2.3)?
The true function of the points is f(x) = 2^x
51. Example of Newton Interpolation
52. Example of Newton Interpolation
The coefficients are the top row of the chart
53. Example of Newton Interpolation
The values are evaluated:
P(x) = 1 + (x-0) + 0.5(x-0)(x-1) + 0.1667(x-0)(x-1)(x-2) + 0.04167(x)(x-1)(x-2)(x-3)
P(2.3) = 1 + (2.3) + 0.5(2.3)(1.3) + 0.1667(2.3)(1.3)(0.3) + 0.04167(2.3)(1.3)(0.3)(-0.7) = 4.9183  (true value 4.9246)
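Checking the arithmetic with the slide's rounded coefficients (Python, for illustration):

```python
def p4(x):
    # Newton form with the rounded coefficients from the slide
    return (1.0 + x
            + 0.5 * x * (x - 1)
            + 0.1667 * x * (x - 1) * (x - 2)
            + 0.04167 * x * (x - 1) * (x - 2) * (x - 3))

value = p4(2.3)          # about 4.9183; the true value is 2**2.3, about 4.9246
```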
54. Newton Interpolation Program
The Newton interpolation is broken up into two programs to evaluate the polynomial, plus a plotting routine:
- C = Newton_coef(x,y), which evaluates the coefficients of the Newton technique
- P(x) = Newton_eval(t,x,c), which uses the coefficients and x values to evaluate the polynomial
- Plottest_new(x,y), which will plot the Newton polynomial
55. Summary
- The Lagrange and Newton interpolations are basically the same method; they differ only in how the coefficients are obtained. The polynomials depend on the entire set of data points.
56. Homework
- Check the Homework webpage