1
FeedForward Networks
  • Winter 2005
  • Feb 18th

2
Feedforward Networks
  • A network of generalized perceptrons
  • Supervised learning
  • Neurons in layers
  • Information flows from one layer to the next only
    (i.e. feed-forward flow of information)
  • Want to minimize our error function

3
Structure
  • 3-layer feedforward network
  • Input, hidden, and output layers
  • Weights on connections between the layers
  • Activation function for each computing element

[Figure: 3-layer network. Inputs x1, x2, ..., xn feed hidden
units Hj through weights w(k,j); hidden units feed outputs
o1, o2, ..., on through weights W(j,i)]
4
Notation
  • X = (x1, x2, ..., xn) - training pattern
  • Hj - output of hidden unit j
  • Oi - output of output unit i (i = 1..M)
  • w(k,j) - weight of link from input unit k to
    hidden unit j
  • W(j,i) - weight of link from hidden unit j to
    output unit i

5
Computation
  • Input calculation for each layer
  • Hidden layer applies weights and activation
    function g to the input values
  • Output layer applies weights and activation
    function g to the previous layer's outputs
    (a minimal code sketch follows)
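
A minimal sketch of this forward computation, assuming a sigmoid
activation for g (the slides do not fix a particular g) and using
the notation from slide 4:

```python
import numpy as np

def g(h):
    """Sigmoid activation -- one common choice for g."""
    return 1.0 / (1.0 + np.exp(-h))

def forward(x, w, W):
    """Forward pass: x is the input vector (n,), w[k, j] the
    input-to-hidden weights, W[j, i] the hidden-to-output weights."""
    H = g(x @ w)   # Hj = g(sum_k w(k,j) * xk)
    o = g(H @ W)   # Oi = g(sum_j W(j,i) * Hj)
    return H, o
```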

6
Weight Changes
  • Error function
  • Computed back-to-front (backpropagation)
  • Hidden-Output Layer
  • Input-Hidden Layer
    (reconstructed equations follow)
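
The update equations on this slide were images in the original deck;
a standard reconstruction, assuming squared error with targets ti
(not defined on the slides) and learning rate η, is:

```latex
% Error function: squared error over the output units
E = \tfrac{1}{2} \sum_i (t_i - O_i)^2

% Hidden-to-output weights
\Delta W(j,i) = \eta\, \delta_i H_j, \qquad \delta_i = g'(h_i)\,(t_i - O_i)

% Input-to-hidden weights (deltas propagated back from the output layer)
\Delta w(k,j) = \eta\, \delta_j x_k, \qquad \delta_j = g'(h_j) \sum_i W(j,i)\, \delta_i
```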

7
General Notation
  • We have M layers
  • Vi^m - output of cell i in layer m
  • input: Vi^0 = xi
  • output: Vi^M = oi, where m = 0,...,M
  • hi^m - input to cell i in layer m
  • wji^m - connection from Vj^(m-1) to Vi^m

8
Backpropagation Algorithm
  • Initialize all weights with random values
  • Select a pattern x and attach it to input layer
    (m = 0)
  • Propagate the signals through all layers

9
Backpropagation Algorithm cont.
  • Calculate the deltas of output layer
  • Calculate the deltas for inner layers
  • Adapt all connection weights
  • Go back to step 2 for the next pattern
    (a full-loop sketch follows)
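
A compact sketch of the whole loop for the 3-layer case, again
assuming a sigmoid g and squared error; the delta and update rules
match the reconstruction under slide 6:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(patterns, targets, n_hidden, eta=0.1, epochs=1000):
    """Backpropagation sketch for one hidden layer."""
    n_in, n_out = patterns.shape[1], targets.shape[1]
    # Step 1: initialize all weights with random values
    w = rng.normal(scale=0.1, size=(n_in, n_hidden))
    W = rng.normal(scale=0.1, size=(n_hidden, n_out))
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            # Steps 2-3: attach pattern x to the input layer and
            # propagate the signals through all layers
            H = 1 / (1 + np.exp(-(x @ w)))
            o = 1 / (1 + np.exp(-(H @ W)))
            # Deltas of the output layer (sigmoid: g'(h) = o(1 - o))
            delta_o = o * (1 - o) * (t - o)
            # Deltas for the inner (hidden) layer
            delta_h = H * (1 - H) * (W @ delta_o)
            # Adapt all connection weights, then continue with next pattern
            W += eta * np.outer(H, delta_o)
            w += eta * np.outer(x, delta_h)
    return w, W
```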

10
Feedforward Summary
  • Quite powerful at computing functions
  • Works well in practice
  • Computationally intensive
  • For the majority of problems solvable by
    feedforward networks, the 3-layer approach is
    sufficient (i.e. one layer of hidden computing units)

11
Self-Organizing Feature Maps
  • SOFM (or Kohonen Networks) can partition space
    into regions
  • Usually composed of a 2D or 3D lattice of m
    neurons, where the neurons are not explicitly
    connected to each other
  • The neighbourhood of each neuron is defined by an
    n-dim sphere of radius r
  • Each neuron connects to an n-dimensional input
    vector denoted by x
  • Neuron to input vector connection has a set of
    weights denoted by vector w

12
SOFMs in 1D
  • 1D array of neurons

13
SOFMs in 2D
  • Regular 2D lattice
  • Mapping between the n-dim input space V and the
    SOFM 2D space is shown

14
Algorithm Notation
  • r - neighbourhood radius
  • η - learning constant
  • used to control learning rate
  • F - neighbourhood function
  • defines the neighbourhood relation between
    neurons in a similar way to fuzzy membership
    functions (an example follows)
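
A common concrete choice for F is a Gaussian of the squared lattice
distance, which decays smoothly from 1 at the centre towards 0, much
like a fuzzy membership function (an illustrative assumption; the
slides do not fix a particular F):

```python
import numpy as np

def F(pos_a, pos_b, r):
    """Gaussian neighbourhood: ~1 near pos_a, ~0 beyond radius r."""
    d2 = np.sum((np.asarray(pos_a) - np.asarray(pos_b)) ** 2)
    return np.exp(-d2 / (2 * r ** 2))
```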

15
Kohonen Learning Algorithm
16
Algorithm Synopsis
  • In step 1, we select a training vector
  • as with the other ANN algorithms
  • In step 2, we find the neuron with weight vector
    values closest to the input vector values
  • neuron k
  • In step 3, we update the weight vectors using the
    neighbourhood update rule
  • Weight vectors of neurons in k's neighbourhood
    are updated based on their distance to k
  • Weight vectors of neurons outside of k's
    neighbourhood are not modified
  • In step 4, we usually shrink the neighbourhood
    radius and modify the learning constant
    (a sketch of one training pass follows)
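
Putting steps 1-3 together, a minimal sketch of one training pass
(assuming the Gaussian F above; `pos` holds each neuron's lattice
coordinates):

```python
import numpy as np

rng = np.random.default_rng(0)

def kohonen_step(X, W, pos, eta, r):
    """One pass over the training vectors X (num_patterns, n), updating
    neuron weight vectors W (m, n) with lattice coordinates pos (m, 2)."""
    for x in X[rng.permutation(len(X))]:
        # Step 1: select a training vector x
        # Step 2: find the neuron k closest to x in weight space
        k = np.argmin(np.sum((W - x) ** 2, axis=1))
        # Step 3: neighbourhood update rule -- neurons near k in the
        # lattice move toward x; neurons outside barely move at all
        d2 = np.sum((pos - pos[k]) ** 2, axis=1)
        Fk = np.exp(-d2 / (2 * r ** 2))      # Gaussian neighbourhood
        W += eta * Fk[:, None] * (x - W)
    return W
```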

17
Learning Problems
  • We initially use coarse granularity and, by
    modifying the neighbourhood radius and learning
    constant, gradually move towards fine granularity
    (see the schedule sketch after this list)
  • Difficult to find how to modify those parameters
  • Difficulties with higher dimensional input spaces
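
One common (though by no means unique) schedule is to decay both
parameters exponentially between passes, reusing `kohonen_step` from
the sketch above; the numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 3))                  # toy 3-dim input vectors
pos = np.array([(i, j) for i in range(10) for j in range(10)])  # 10x10 lattice
W = rng.random((len(pos), X.shape[1]))    # random initial weight vectors

r, eta = 5.0, 0.5                         # start with coarse granularity
for epoch in range(100):
    W = kohonen_step(X, W, pos, eta, r)   # from the sketch above
    r *= 0.95                             # shrink the neighbourhood radius
    eta *= 0.99                           # reduce the learning constant
```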

18
SOFM Additional Material
  • For a nice tutorial on SOFMs and some interesting
    examples, please see
  • http://www.ai-junkie.com/som1.html