Chapter 10 Presentation
1. Widrow-Hoff Learning (LMS Algorithm)
2. ADALINE Network
3. Two-Input ADALINE
4. Mean Square Error
Training Set
Input
Target
Notation
Mean Square Error
5. Error Analysis
The mean square error for the ADALINE network is
a quadratic function of the weights and bias.
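The quadratic form itself was lost with the slide graphics; it can be reconstructed in the standard ADALINE notation, assuming the usual stacked variables x = [w; b] and z = [p; 1]:

```latex
F(\mathbf{x}) = E[e^2] = E[(t - \mathbf{x}^T\mathbf{z})^2]
             = E[t^2] - 2\mathbf{x}^T E[t\mathbf{z}] + \mathbf{x}^T E[\mathbf{z}\mathbf{z}^T]\,\mathbf{x}
             = c - 2\mathbf{x}^T\mathbf{h} + \mathbf{x}^T\mathbf{R}\,\mathbf{x}
```

where c = E[t²], h = E[t z] is the cross-correlation between input and target, and R = E[z zᵀ] is the input correlation matrix.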
6. Stationary Point
Hessian Matrix
The correlation matrix R must be at least
positive semidefinite. If there are any zero
eigenvalues, the performance index will either
have a weak minimum or else no stationary point;
otherwise there will be a unique global minimum
x*.
If R is positive definite:
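The stationary-point equation shown on this slide can be reconstructed from the quadratic performance index; a sketch in the chapter's notation:

```latex
\nabla F(\mathbf{x}) = -2\mathbf{h} + 2\mathbf{R}\mathbf{x} = \mathbf{0}
\quad\Rightarrow\quad
\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h},
\qquad
\nabla^2 F(\mathbf{x}) = 2\mathbf{R} \;\; \text{(Hessian matrix)}
```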
7. Approximate Steepest Descent
Approximate mean square error (one sample)
Approximate (stochastic) gradient
8. Approximate Gradient Calculation
9. LMS Algorithm
10. Multiple-Neuron Case
Matrix Form
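The matrix-form update can be sketched in code. This is a minimal illustration, not the chapter's worked example: the learning rate and the two training patterns below are assumed for demonstration.

```python
import numpy as np

# LMS update in matrix form:
#   W(k+1) = W(k) + 2*alpha*e(k)*p(k)^T,   b(k+1) = b(k) + 2*alpha*e(k)
def lms_step(W, b, p, t, alpha):
    a = W @ p + b                                  # ADALINE (linear) output
    e = t - a                                      # error vector
    return W + 2 * alpha * np.outer(e, p), b + 2 * alpha * e, e

# Train a single-output ADALINE on two assumed patterns.
W = np.zeros((1, 2))
b = np.zeros(1)
alpha = 0.1
data = [(np.array([1.0, -1.0]), np.array([-1.0])),
        (np.array([1.0,  1.0]), np.array([ 1.0]))]
for _ in range(100):
    for p, t in data:
        W, b, e = lms_step(W, b, p, t, alpha)
```

Because this data set is linearly realizable and alpha is below the stability bound, the error on both patterns decays toward zero.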
11. Analysis of Convergence
For stability, the eigenvalues of this matrix
must fall inside the unit circle.
12. Conditions for Stability
(where λi is an eigenvalue of R)
Therefore the stability condition simplifies to
0 < α < 1/λmax.
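A quick numerical check of this bound can be sketched as follows; the two input vectors are assumed for illustration (for them R works out to the identity, so the bound is α < 1):

```python
import numpy as np

# Stability bound for LMS: the eigenvalues of [I - 2*alpha*R] must lie inside
# the unit circle, which reduces to 0 < alpha < 1/lambda_max.
Z = np.array([[1.0, -1.0],
              [1.0,  1.0]])              # two equally likely input vectors (assumed)
R = (Z.T @ Z) / len(Z)                   # correlation matrix R = E[z z^T]
lam_max = np.linalg.eigvalsh(R).max()
alpha_bound = 1.0 / lam_max

alpha = 0.4                              # any alpha in (0, alpha_bound) is stable
radius = np.abs(np.linalg.eigvals(np.eye(2) - 2 * alpha * R)).max()
```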
13. Steady State Response
If the system is stable, then a steady state
condition will be reached.
The solution to this equation is x* = R⁻¹h.
This is also the strong minimum of the
performance index.
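The steady-state argument, whose equations were lost in extraction, can be reconstructed from the expected weight update; a sketch:

```latex
E[\mathbf{x}_{k+1}] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}_k] + 2\alpha\mathbf{h}
\quad\Rightarrow\quad
\mathbf{x}_{ss} = [\mathbf{I} - 2\alpha\mathbf{R}]\,\mathbf{x}_{ss} + 2\alpha\mathbf{h}
\quad\Rightarrow\quad
\mathbf{x}_{ss} = \mathbf{R}^{-1}\mathbf{h} = \mathbf{x}^*
```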
14. Example
Banana
Apple
15. Iteration One
Banana
16. Iteration Two
Apple
17. Iteration Three
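The three iterations above can be sketched in code. The prototype patterns, targets, initial weights, and learning rate below are assumed for illustration (banana → t = −1, apple → t = +1, W(0) = 0, α = 0.2), presenting banana, apple, banana in turn:

```python
import numpy as np

p_banana, t_banana = np.array([-1.0, 1.0, -1.0]), -1.0   # assumed banana pattern
p_apple,  t_apple  = np.array([ 1.0, 1.0, -1.0]),  1.0   # assumed apple pattern
alpha = 0.2
W = np.zeros(3)                                          # W(0) = [0 0 0]

for p, t in [(p_banana, t_banana), (p_apple, t_apple), (p_banana, t_banana)]:
    a = W @ p                        # ADALINE output a(k)
    e = t - a                        # error e(k) = t(k) - a(k)
    W = W + 2 * alpha * e * p        # LMS update: W(k+1) = W(k) + 2*alpha*e(k)*p(k)^T
```

With these assumed values the weights after the three iterations are W(3) = [1.104, 0.016, −0.016].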
18. Adaptive Filtering
Tapped Delay Line
Adaptive Filter
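The tapped-delay-line/ADALINE combination amounts to an FIR filter whose output at time k is a(k) = Σᵢ wᵢ·y(k−i+1). A minimal sketch, with an assumed two-tap filter and input sequence:

```python
import numpy as np

def tdl_output(w, y, k):
    """Output at time k of a tapped-delay-line filter with taps w
    (most recent input first); inputs before time 0 are taken as zero."""
    taps = [y[k - i] if k - i >= 0 else 0.0 for i in range(len(w))]
    return float(np.dot(w, taps))

y = [1.0, 2.0, 3.0]          # input sequence y(0), y(1), y(2) (assumed)
w = np.array([0.5, -0.25])   # two-tap filter weights (assumed)
a2 = tdl_output(w, y, 2)     # 0.5*y(2) - 0.25*y(1)
```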
19. Example: Noise Cancellation
20. Noise Cancellation Adaptive Filter
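The noise-cancellation setup can be sketched end to end. The signals, filter length, and learning rate below are assumed for illustration, not taken from the slides: the primary channel carries a signal s(k) plus filtered noise m(k); the adaptive filter sees the noise source v(k) and learns to predict m(k), so the error e(k) = s(k) + m(k) − a(k) approaches the clean signal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
k = np.arange(n)
v = np.sin(2 * np.pi * k / 20)               # measurable noise source (assumed sinusoid)
m = 0.8 * np.sin(2 * np.pi * k / 20 - 0.7)   # noise after the unknown path (assumed)
s = 0.1 * rng.standard_normal(n)             # underlying signal (assumed small, random)

taps, alpha = 2, 0.05
w = np.zeros(taps)
e = np.zeros(n)
for i in range(taps - 1, n):
    z = v[i - taps + 1:i + 1][::-1]          # tapped delay line: v(k), v(k-1)
    a = w @ z                                # adaptive filter output = noise estimate
    e[i] = s[i] + m[i] - a                   # error = restored signal
    w = w + 2 * alpha * e[i] * z             # LMS update

# After convergence the residual error power approaches the signal power,
# since a two-tap filter can match any amplitude and phase of the sinusoid.
```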
21. Correlation Matrix
22. Signals
23. Stationary Point
24. Performance Index
25. LMS Response
26. Echo Cancellation