Title: Pattern Recognition: Statistical and Neural
Slide 1: Nanjing University of Science and Technology
Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman, Lecture 21, Oct 28, 2005
Slide 2: Lecture 21 Topics
1. Example: Analysis of a simple Neural Network
2. Example: Synthesis of special forms of Artificial Neural Networks
3. General concepts of Training an Artificial Neural Network: supervised and unsupervised, training sets
4. Neural Network Nomenclature and Notation
5. Derivation and Description of the Backpropagation Algorithm for Feedforward Neural Networks
Slide 3: Example: Analyze the following Neural Network
[Figure: two-layer network diagram; the values -1, 0, 1, 1, 1, 0, 0, -1, 1 are its weights and thresholds.]
Slide 4: Solution: Outputs of layer 1 ANEs
Slide 5: Thus from layer 1 we have the layer-1 outputs.
The output of the layer 2 ANE is determined by thresholding its net at 0 (one value when the net is ≥ 0, the other when it is < 0).
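The layer-by-layer analysis on these slides can be sketched in code. The weights and thresholds below are illustrative stand-ins, not the values from the slide's figure; each ANE computes a unit step of its net, as in the example.

```python
import numpy as np

def step(v):
    """Unit-step nonlinearity: 1 if net >= 0, else 0."""
    return (v >= 0).astype(float)

def forward(x, layers):
    """Propagate x through a list of (W, b) layers with step activations."""
    y = x
    for W, b in layers:
        y = step(W @ y + b)
    return y

# Illustrative 2-input network: two layer-1 ANEs feeding one layer-2 ANE.
layers = [
    (np.array([[1.0, -1.0], [1.0, 1.0]]), np.array([0.0, -1.0])),  # layer 1
    (np.array([[1.0, 1.0]]), np.array([-2.0])),                    # layer 2: AND of layer-1 outputs
]
print(forward(np.array([1.0, 1.0]), layers))  # both layer-1 ANEs fire -> output 1
```

Analyzing a given network is exactly this forward pass: compute every layer-1 output, feed those into layer 2, and read off the output function region by region.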
Slide 7: Final Solution: Output Function for the Given Neural Network
Slide 8: Example: Synthesize a Neural Network
Given the following decision regions, build a neural network to perform the classification process.
Solution: Use the Hyperplane-AND-OR structure.
Slide 9: Each gk(x) specifies a hyperplane boundary.
Slide 10: Solution
Hyperplane Layer
AND Layer
OR Layer
All nonlinearities f(·) are the unit step µ(·).
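The Hyperplane-AND-OR idea can be sketched directly: each gk(x) ≥ 0 is a half-plane, the AND layer fires only when every half-plane bounding a convex region is satisfied, and the OR layer unions such regions. The region below (the unit square) and all weights are assumed for illustration, not taken from the slide's figure.

```python
import numpy as np

step = lambda v: (v >= 0).astype(float)  # unit-step nonlinearity µ(·)

# Hyperplane layer: row k encodes g_k(x) = G[k] @ x + c[k] >= 0.
# These four half-planes bound the unit square (an assumed example region).
G = np.array([[ 1.0,  0.0],   # x1 >= 0
              [-1.0,  0.0],   # x1 <= 1
              [ 0.0,  1.0],   # x2 >= 0
              [ 0.0, -1.0]])  # x2 <= 1
c = np.array([0.0, 1.0, 0.0, 1.0])

def in_region(x):
    h = step(G @ x + c)        # hyperplane layer outputs
    a = step(h.sum() - 3.5)    # AND layer: fires only if all four hyperplanes fire
    return float(step(a - 0.5))  # OR layer (trivial here: a single convex region)

print(in_region(np.array([0.5, 0.5])))  # inside the square -> 1.0
print(in_region(np.array([2.0, 0.5])))  # outside -> 0.0
```

A non-convex decision region would use several AND units, one per convex piece, with the OR unit combining them.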
Slide 11: Training a Neural Network
Without a teacher
With a teacher
Slide 13: Training Set
The xj are the training samples; dj is the class assigned to training sample xj.
Slide 14: Example of a training set
( x1 = [0, 1, 2]T, d1 = C1 ), ( x2 = [0, 1, 0]T, d2 = C1 ), ( x3 = [0, 1, 1]T, d3 = C1 ),
( x4 = [1, 0, 2]T, d4 = C2 ), ( x5 = [1, 0, 3]T, d5 = C2 ),
( x6 = [0, 0, 1]T, d6 = C3 ), ( x7 = [0, 0, 2]T, d7 = C3 ), ( x8 = [0, 0, 3]T, d8 = C3 ), ( x9 = [0, 0, 3]T, d9 = C3 ),
( x10 = [1, 1, 0]T, d10 = C4 ), ( x11 = [2, 2, 0]T, d11 = C4 ),
( x12 = [2, 2, 2]T, d12 = C5 ), ( x13 = [3, 2, 2]T, d13 = C6 )
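The same training set reads naturally as a list of (sample, class) pairs; a direct transliteration of the slide:

```python
# Training set from the slide: (sample vector, assigned class) pairs.
training_set = [
    ([0, 1, 2], "C1"), ([0, 1, 0], "C1"), ([0, 1, 1], "C1"),
    ([1, 0, 2], "C2"), ([1, 0, 3], "C2"),
    ([0, 0, 1], "C3"), ([0, 0, 2], "C3"), ([0, 0, 3], "C3"), ([0, 0, 3], "C3"),
    ([1, 1, 0], "C4"), ([2, 2, 0], "C4"),
    ([2, 2, 2], "C5"),
    ([3, 2, 2], "C6"),
]
print(len(training_set))  # 13 samples across 6 classes
```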
Slide 15: General Weight Update Algorithm
x(k) is the training sample for the kth iteration; d(k) is the class assigned to training sample x(k); y(k) is the output vector for the kth training sample.
Slide 16: Training with a Teacher (Supervised)
1. Given a set of N ordered samples with their known class assignments.
2. Randomly select all weights in the neural network.
3. For each successive sample in the total set of samples, evaluate the output.
4. Use these outputs and the input sample to update the weights.
5. Stop at some predetermined number of iterations or if the given performance measure is satisfied. If not stopped, go to step 3.
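The five supervised steps above can be sketched as a loop. Here a single perceptron-style ANE is trained on an assumed toy two-class set; the data, learning rate, and iteration cap are illustrative, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy two-class set: class +1 vs -1, linearly separable through the origin.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
d = np.array([1.0, 1.0, -1.0, -1.0])

w = rng.normal(size=2)                        # step 2: random initial weights
eta = 0.1
for it in range(100):                         # step 5: predetermined iteration cap
    errors = 0
    for x, dk in zip(X, d):                   # step 3: sweep the ordered samples
        y = 1.0 if w @ x >= 0 else -1.0       # evaluate the output
        if y != dk:                           # step 4: update weights from output + sample
            w += eta * (dk - y) * x
            errors += 1
    if errors == 0:                           # performance measure satisfied
        break

print([1.0 if w @ x >= 0 else -1.0 for x in X])  # -> [1.0, 1.0, -1.0, -1.0]
```

For separable data the perceptron convergence theorem guarantees the error count reaches zero in finitely many updates, so the loop exits via the performance check rather than the iteration cap.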
Slide 17: Training without a Teacher (Unsupervised)
1. Given a set of N ordered samples with unknown class assignments.
2. Randomly select all weights in the neural network.
3. For each successive sample in the total set of samples, evaluate the outputs.
4. Using these outputs and the inputs, update the weights.
5. If the weights do not change significantly, stop with that result; if they change, return to step 3.
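One concrete instance of this unsupervised loop is simple competitive (winner-take-all) learning. The unlabeled toy data below is assumed, and for determinism the two prototypes are seeded from one sample of each half of the data rather than fully at random (a simplification of step 2).

```python
import numpy as np

rng = np.random.default_rng(1)

# Unlabeled toy samples drawn around two assumed cluster centers.
X = np.vstack([rng.normal([0.0, 0.0], 0.1, (10, 2)),
               rng.normal([3.0, 3.0], 0.1, (10, 2))])

W = np.array([X[0], X[-1]])          # prototypes seeded from the data (simplified step 2)
eta = 0.2
for epoch in range(500):
    W_old = W.copy()
    for x in X:                      # step 3: sweep samples, evaluate outputs
        j = np.argmin(((W - x) ** 2).sum(axis=1))  # winning unit
        W[j] += eta * (x - W[j])     # step 4: move the winner toward the input
    if np.abs(W - W_old).max() < 1e-4:  # step 5: weights stopped changing
        break

print(np.round(W))                   # prototype rows settle near the two cluster centers
```

The stopping rule is exactly step 5: the loop exits once a full sweep no longer changes the weights significantly.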
Slide 18: Supervised Training of a Feedforward Neural Network
Nomenclature
Slide 19: Output vectors of layer m and layer L
[Figure: layer diagrams labeling the node numbers of layer m and of layer L.]
Slide 20: Weight Matrix for layer m
[Figure: the weight matrix connecting the N inputs of layer m to its nodes 1 through Nm.]
Slide 21: Layers, Nets, Outputs, Nonlinearities
Slide 22: Define the performance Ep for sample x(p) as the squared error for that sample.
We wish to select weights so that Ep is minimized.
Use the gradient algorithm.
Slide 23: Gradient Algorithm for Updating the Weights
[Equation: w(p) is updated in the direction of the negative gradient of Ep, evaluated at the current weights and sample x(p).]
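The gradient update can be sketched for a single sigmoid ANE. The sample, target, initial weights, and step size below are assumed for illustration; the gradient of Ep = ½(d − y)² follows the chain rule with f'(net) = y(1 − y) for the sigmoid.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

# One sigmoid ANE; illustrative sample, target, and initial weights.
x = np.array([1.0, 0.5, -0.2])
d = 1.0
w = np.array([0.1, -0.3, 0.2])
eta = 0.5

def Ep(w):
    """Single-sample performance Ep = 1/2 (d - y)^2."""
    return 0.5 * (d - sigmoid(w @ x)) ** 2

# Gradient of Ep w.r.t. w: -(d - y) * f'(net) * x, with f'(net) = y(1 - y).
y = sigmoid(w @ x)
grad = -(d - y) * y * (1.0 - y) * x
w_new = w - eta * grad            # gradient step: w(p+1) = w(p) - eta * grad Ep

print(Ep(w), ">", Ep(w_new))      # the step reduces the single-sample error
```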
Slide 24: Derivation of the weight update equation for the Last Layer (Rule 1), Backpropagation Algorithm
The partial of ym(L) with respect to wkj(L) is
Slide 25: General Rule 1 for Weight Update
Therefore
Slide 26: Derivation of the weight update equation for the Next-to-Last Layer (L−1), Backpropagation Algorithm
Slide 28: General Rule 2 for Weight Update (Layer L−1), Backpropagation Algorithm
Therefore
and the weight correction is as follows
Slide 29: where the weight correction (general Rule 2) is
[Equation: the correction to the layer L−1 weights, Δw(L−1).]
Slide 30: Backpropagation Training Algorithm for Feedforward Neural Networks
Slide 31: Input pattern sample xk
Slide 32: Calculate Outputs: First Layer
Slide 33: Calculate Outputs: Second Layer
Slide 34: Calculate Outputs: Last Layer
Slide 35: Check Performance
Single-sample error, and the error over all samples:
E_TOTAL(p) = ½ Σ_{i=0..Ns−1} ( d_x(p−i) − f( wT(p−i) x(p−i) ) )²
This can be computed recursively:
E_TOTAL(p+1) = E_TOTAL(p) + E_{p+1}(p+1) − E_{p−Ns+1}(p−Ns+1)
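The recursive update of E_TOTAL amounts to maintaining a sliding window: add the newest single-sample error and drop the one that left the window. The per-sample error values below are made up for illustration; the loop checks the recursion against the direct windowed sum.

```python
def total_error_direct(errors, p, Ns):
    """E_TOTAL(p): sum of the Ns single-sample errors ending at index p."""
    return sum(errors[p - i] for i in range(Ns))

errors = [0.9, 0.5, 0.4, 0.7, 0.2, 0.1]    # illustrative single-sample errors E_p
Ns = 3

E = total_error_direct(errors, Ns - 1, Ns)  # first full window, ending at p = Ns-1
for p in range(Ns, len(errors)):
    E = E + errors[p] - errors[p - Ns]      # recursive update: add newest, drop oldest
    assert abs(E - total_error_direct(errors, p, Ns)) < 1e-12

print(E)  # total error over the final window
```

The recursion replaces an O(Ns) re-summation per step with O(1) work, which matters when the window is long.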
Slide 36: Change Weights: Last Layer, using Rule 1
Slide 37: Change Weights: Previous Layer, using Rule 2
Slide 38: Change Weights: Previous Layer, using Modified Rule 2
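Rules 1 and 2 together give one full backpropagation step. Below is a sketch for an assumed 2-3-1 sigmoid network with illustrative data and learning rate: Rule 1 forms the output-layer deltas, Rule 2 backpropagates them through the output weights, and both layers get delta-rule corrections.

```python
import numpy as np

def f(v):                        # sigmoid nonlinearity
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(0)
x = np.array([0.5, -0.3])        # input sample (assumed)
d = np.array([1.0])              # desired output (assumed)
W1 = rng.normal(size=(3, 2))     # layer L-1 weights: 3 hidden ANEs
W2 = rng.normal(size=(1, 3))     # layer L weights: 1 output ANE
eta = 0.1

def error(W1, W2):
    """Forward pass; returns single-sample error and layer outputs."""
    y1 = f(W1 @ x)
    y2 = f(W2 @ y1)
    return 0.5 * np.sum((d - y2) ** 2), y1, y2

E0, y1, y2 = error(W1, W2)

# Rule 1 (last layer): delta_k = (d_k - y_k) f'(net_k), with f' = y(1 - y) for sigmoid.
delta2 = (d - y2) * y2 * (1 - y2)
# Rule 2 (next-to-last layer): backpropagate the deltas through W2.
delta1 = (W2.T @ delta2) * y1 * (1 - y1)

W2 = W2 + eta * np.outer(delta2, y1)   # weight correction, last layer
W1 = W1 + eta * np.outer(delta1, x)    # weight correction, layer L-1

E1, _, _ = error(W1, W2)
print(E1 < E0)  # expect the error to decrease after one step
```

Iterating this step over all samples, with the performance check of Slide 35, is the full training algorithm.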
Slide 39: Input pattern sample x(k+1)
Continue iterations until:
Slide 40: Repeat the process until performance is satisfactory or the maximum number of iterations is reached.
If performance is not satisfied at the maximum number of iterations, the algorithm stops and NO design is obtained.
If performance is satisfied, the current weights and structure provide the required design.
Slide 41: Freeze Weights to get an Acceptable Neural Net Design
Slide 42: Backpropagation Algorithm for Training Feedforward Artificial Neural Networks
Slide 43: Summary, Lecture 21
1. Example: Analysis of a simple Neural Network
2. Example: Synthesis of special forms of Artificial Neural Networks
3. General concepts of Training an Artificial Neural Network: supervised and unsupervised, and description of training sets
4. Neural Network Nomenclature and Notation
5. Derivation and Description of the Backpropagation Algorithm for Feedforward Neural Networks
Slide 44: End of Lecture 21