Title: Lecture 21 - Approximation Methods

Slide 1: Lecture 21 - Approximation Methods
- CVEN 302
- October 15, 2001
Slide 2: Lecture Goals
- Discrete Least Squares Approximation
  - Linear
  - Quadratic
  - Higher-Order Polynomials
  - Nonlinear
- Continuous Least Squares
- Orthogonal Polynomials
  - Gram-Schmidt / Legendre Polynomials
Slide 3: Approximation Methods
What is the difference between approximation and interpolation?
- Interpolation matches the data points exactly. For experimental data, this assumption is often not valid.
- Approximation: we want to find the curve that fits the data with the smallest error.
Slide 4: Least Squares Fit Approximations
Suppose we want to fit a data set. We would like to find the best straight line to fit the data.
Slide 5: Least Squares Fit Approximations
The problem is how to minimize the error. We could use the error defined as the simple sum of the residuals. However, positive and negative errors can cancel one another, and the fit can still be wrong.
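The slide's equation was an image and is not reproduced here; the sum-of-residuals error it refers to is presumably

```latex
E = \sum_{k=1}^{N} e_k = \sum_{k=1}^{N} \left[\, y_k - f(x_k) \,\right]
```

where positive and negative residuals cancel, so a small $E$ does not imply a good fit.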
Slide 6: Least Squares Fit Approximations
We could instead minimize the maximum error (the minimax criterion), defined as the largest absolute residual. This error minimization is awkward to carry out in practice.
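The minimax error the slide defines (equation lost in extraction) is presumably

```latex
E_{\infty} = \max_{1 \le k \le N} \left|\, y_k - f(x_k) \,\right|
```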
Slide 7: Least Squares Fit Approximations
The solution is to minimize the sum of the squares of the errors. This gives the least-squares solution, which follows from the maximum likelihood principle when the errors are normally distributed.
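The sum-of-squares error being minimized is presumably

```latex
E = \sum_{k=1}^{N} \left[\, y_k - f(x_k) \,\right]^2
```

Squaring makes every term non-negative, so residuals can no longer cancel.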
Slide 8: Least Squares Approximations
Assume a straight-line model. The error at each data point is defined as the difference between the data value and the value of the line.
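The model and error equations were images; with the intercept written as $a$ and the slope as $b$ (consistent with the coefficient slides that follow), they are presumably

```latex
y = a + b\,x, \qquad e_k = y_k - (a + b\,x_k)
```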
Slide 9: Least Squares Error
Form the sum of the squared errors, then substitute the expression for each error.
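After substituting $e_k = y_k - (a + b\,x_k)$, the quantity to be minimized is presumably

```latex
E(a, b) = \sum_{k=1}^{N} e_k^2 = \sum_{k=1}^{N} \left[\, y_k - (a + b\,x_k) \,\right]^2
```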
Slide 10: Least Squares Error
How do you minimize the error? Take the derivative of E with respect to each coefficient and set it equal to zero.
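In symbols, the minimization conditions are

```latex
\frac{\partial E}{\partial a} = 0, \qquad \frac{\partial E}{\partial b} = 0
```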
Slide 11: Least Squares Error
The first component, from the derivative with respect to a:
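The slide's equation is presumably

```latex
\frac{\partial E}{\partial a} = -2 \sum_{k=1}^{N} \left[\, y_k - (a + b\,x_k) \,\right] = 0
```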
Slide 12: Least Squares Error
The second component, from the derivative with respect to b:
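The slide's equation is presumably

```latex
\frac{\partial E}{\partial b} = -2 \sum_{k=1}^{N} x_k \left[\, y_k - (a + b\,x_k) \,\right] = 0
```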
Slide 13: Least Squares Coefficients
The equations can be rewritten as the normal equations:
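Dividing out the factor of $-2$ and collecting the sums gives the standard normal equations, presumably what the slide showed:

```latex
N a + \left( \sum_{k=1}^{N} x_k \right) b = \sum_{k=1}^{N} y_k, \qquad
\left( \sum_{k=1}^{N} x_k \right) a + \left( \sum_{k=1}^{N} x_k^2 \right) b = \sum_{k=1}^{N} x_k y_k
```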
Slide 14: Least Squares Coefficients
The equations can be rewritten in matrix form:
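In matrix form the normal equations read

```latex
\begin{bmatrix} N & \sum x_k \\ \sum x_k & \sum x_k^2 \end{bmatrix}
\begin{bmatrix} a \\ b \end{bmatrix}
=
\begin{bmatrix} \sum y_k \\ \sum x_k y_k \end{bmatrix}
```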
Slide 15: Least Squares Coefficients
The coefficients are defined as
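Solving the 2x2 system (for example, by Cramer's rule) gives the closed-form coefficients, presumably the formulas the slide showed:

```latex
a = \frac{\sum x_k^2 \sum y_k - \sum x_k \sum x_k y_k}{N \sum x_k^2 - \left( \sum x_k \right)^2}, \qquad
b = \frac{N \sum x_k y_k - \sum x_k \sum y_k}{N \sum x_k^2 - \left( \sum x_k \right)^2}
```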
Slide 16: Least Squares Example
Given the data, tabulate the sums needed for the coefficient formulas.
Slide 17: Least Squares Example
The resulting equation of the fitted line is
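The slide's data table and fitted equation were images and are not reproduced here. A minimal sketch of the computation, using hypothetical data (not the slide's actual table), is:

```python
# Linear least-squares fit y = a + b*x via the normal-equation
# formulas. The data below are hypothetical, chosen to lie exactly
# on y = 1 + 2x so the result is easy to check.

def linear_least_squares(x, y):
    """Return (a, b) for the best-fit line y = a + b*x."""
    N = len(x)
    Sx = sum(x)
    Sy = sum(y)
    Sxx = sum(xi * xi for xi in x)
    Sxy = sum(xi * yi for xi, yi in zip(x, y))
    D = N * Sxx - Sx * Sx          # common denominator
    a = (Sxx * Sy - Sx * Sxy) / D  # intercept
    b = (N * Sxy - Sx * Sy) / D    # slope
    return a, b

# Hypothetical data lying exactly on y = 1 + 2x:
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 3.0, 5.0, 7.0, 9.0]
a, b = linear_least_squares(x, y)  # recovers a = 1, b = 2
```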
Slide 18: Least Squares Error
How do you minimize the error for a quadratic fit? As before, take the derivative of the sum of squared errors with respect to each coefficient and set it equal to zero.
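Writing the quadratic as $y = a x^2 + b x + c$ (matching the coefficient names on the example slide below), the quantity minimized and the three conditions are presumably

```latex
E(a, b, c) = \sum_{k=1}^{N} \left[\, y_k - (a x_k^2 + b x_k + c) \,\right]^2, \qquad
\frac{\partial E}{\partial a} = \frac{\partial E}{\partial b} = \frac{\partial E}{\partial c} = 0
```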
Slide 19: Least Squares Coefficients for a Quadratic Fit
The equations can be written as
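The resulting 3x3 normal equations, presumably what the slide showed, are

```latex
\begin{bmatrix}
\sum x_k^4 & \sum x_k^3 & \sum x_k^2 \\
\sum x_k^3 & \sum x_k^2 & \sum x_k \\
\sum x_k^2 & \sum x_k & N
\end{bmatrix}
\begin{bmatrix} a \\ b \\ c \end{bmatrix}
=
\begin{bmatrix} \sum x_k^2 y_k \\ \sum x_k y_k \\ \sum y_k \end{bmatrix}
```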
Slide 20: Least Squares Quadratic Fit
The matrix system can be solved using Gaussian elimination, and the coefficients can then be found.
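As a sketch of this step, the following builds the 3x3 normal equations and solves them with Gaussian elimination (with partial pivoting). The data are hypothetical, chosen to lie exactly on y = x^2, not the slide's data set.

```python
# Quadratic least-squares fit y = a*x^2 + b*x + c via the normal
# equations, solved by Gaussian elimination with partial pivoting.

def gauss_solve(A, rhs):
    """Solve A x = rhs by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Work on a copy of A augmented with the right-hand side.
    M = [row[:] + [r] for row, r in zip(A, rhs)]
    for col in range(n):
        # Partial pivoting: bring the row with the largest pivot up.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def quadratic_fit(x, y):
    """Return (a, b, c) for the best-fit parabola y = a*x^2 + b*x + c."""
    N = len(x)
    S = lambda p: sum(xi ** p for xi in x)                    # sum of x^p
    T = lambda p: sum((xi ** p) * yi for xi, yi in zip(x, y)) # sum of x^p * y
    A = [[S(4), S(3), S(2)],
         [S(3), S(2), S(1)],
         [S(2), S(1), N]]
    rhs = [T(2), T(1), T(0)]
    return gauss_solve(A, rhs)

# Hypothetical data lying exactly on y = x^2:
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [4.0, 1.0, 0.0, 1.0, 4.0]
a, b, c = quadratic_fit(xs, ys)  # recovers a = 1, b = 0, c = 0
```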
Slide 21: Quadratic Least Squares Example
For comparison, the linear fit results:
Slide 22: Quadratic Least Squares Example

Slide 23: Quadratic Least Squares Example
a = 0.225, b = -1.018, c = 0.998
y = 0.225x^2 - 1.018x + 0.998
Slide 24: Polynomial Least Squares
The technique can be applied to polynomials of any order, of the form
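The general polynomial form, using the $a_0, a_1, \ldots$ coefficient names that appear on the cubic example slide, is presumably

```latex
y = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n = \sum_{j=0}^{n} a_j x^j
```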
Slide 25: Polynomial Least Squares
Solving large sets of linear equations is not a simple task. The normal equations can have the undesirable property known as ill-conditioning. As a result, round-off errors in solving for the coefficients can cause unusually large errors in the curve fit.
Slide 26: Polynomial Least Squares
How do you measure the error of higher-order polynomials?
Slide 27: Polynomial Least Squares
One measure is the variance of the fit, where n is the degree of the polynomial, N is the number of data points, and the Y_k are the data values.
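The variance formula itself was lost in extraction; the standard measure for a degree-$n$ fit, as given in common numerical-methods texts, is presumably

```latex
\sigma^2 = \frac{\sum_{k=1}^{N} \left[\, y(x_k) - Y_k \,\right]^2}{N - n - 1},
\qquad \text{where } y(x) = \sum_{j=0}^{n} a_j x^j
```

The denominator $N - n - 1$ accounts for the $n + 1$ coefficients determined from the data.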
Slide 28: Polynomial Least Squares Example
Example 2 can be fitted with a cubic equation; the coefficients are
a0 = 1.004, a1 = -1.079, a2 = 0.351, a3 = -0.069
Slide 29: Polynomial Least Squares Example
However, if we look at higher-order polynomials, such as the sixth- and seventh-order fits, the results are not all that promising.
Slide 30: Polynomial Least Squares Example
The standard deviation of the polynomial fits shows that the best fit for this data is the second-order polynomial.
Slide 31: Summary
- The linear least-squares method is straightforward for determining the coefficients of the line.
Slide 32: Summary
- The quadratic and higher-order polynomial curve fits use a similar technique and involve solving an (n+1) x (n+1) matrix.
Slide 33: Summary
- Higher-order polynomial fits require selecting the best fit for the data; one means of measuring the quality of the fit is the standard deviation of the results as a function of the degree of the polynomial.
Slide 34: Homework
- Check the Homework webpage