Title: Pattern Recognition using Hebbian Learning and Floating-Gates
Certain pattern recognition problems have been
shown to be easily solved by artificial neural
networks, and many neural network chips have been
made and sold. Most of these are not terribly
biologically realistic.
[Figure: a three-layer feedforward network. Input layer neurons connect through a weight matrix to hidden layer neurons, which connect through a second weight matrix to output layer neurons.]
A 2-dimensional example
[Figure: a single unit with weights w1 = 0.3, w2 = 0.7 divides the (x, y) input plane (both axes running from -10 to 10) into a region of output 1 and a region of output -1.]
[Figure: a second unit with weights w1 = 0.5, w2 = 0.11 divides the same plane along a different line, again into regions of output 1 and -1.]
Putting the two together
[Figure: an output unit, labeled with the values 0.53, -0.53, and -0.25, combines the two hidden units so that it responds with 1 only in the intersection of the two half-planes.]
We respond to a smaller region of this 2-D input
space.
So in general, we can apply this type of
operation to an N-dimensional input, with the
hidden units defining hyperplanes in this input
space. The individual output units combine these
hyperplanes to carve out specific subregions of this
N-dimensional space. This is what pattern recognition is
about.
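As a concrete illustration, here is a minimal Python sketch of the 2-D example above. The hidden-unit weights (0.3, 0.7 and 0.5, 0.11) are taken from the figures; the output unit's weights and offset reuse the 0.53, -0.53, and -0.25 values, though their exact roles in the original figure are an assumption.

import numpy as np

def unit(x, w, b=0.0):
    # Threshold unit: +1 on one side of the hyperplane w.x + b = 0, -1 on the other.
    return 1.0 if np.dot(w, x) + b >= 0 else -1.0

# Hidden units: each weight vector defines a line (a hyperplane in 2-D).
w_hidden = [np.array([0.3, 0.7]), np.array([0.5, 0.11])]

# Output unit combines the hidden responses (weights and offset assumed from
# the figure values); it answers +1 only for inputs on the positive side of
# the first line and the negative side of the second, a smaller wedge of
# the 2-D input space.
w_out = np.array([0.53, -0.53])
b_out = -0.25

def network(x):
    h = np.array([unit(x, w) for w in w_hidden])
    return unit(h, w_out, b_out)

print(network(np.array([-2.0, 3.0])))  # inside the wedge: +1
print(network(np.array([2.0, -3.0])))  # outside: -1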
As you might expect, these two images live very
far apart from each other in this very high-dimensional
space, so telling them apart is an easy task.
But if we had a set of 100 faces that we wanted
to recognize, this might be harder. What happens
if the faces are rotated, shifted, or scaled?
How do I pick the weight matrices to solve these
tasks?
One way is to present inputs and adjust the
weights if the output is not what we want:
Δw_i = η (t - y) x_i
where t is the target output, y is the actual output,
x_i is the i-th input component, and η is a small
learning rate.
This is known as the perceptron learning rule.
A training set of examples with target output
values is defined and presented one by one,
adjusting the weight matrix after each
evaluation. The learning rule assigns large
weights to the components of the inputs that allow
discrimination between the classes of
inputs, e.g., many faces and many helicopters.
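A minimal sketch of that training loop in Python; the toy data, learning rate, and epoch count are assumptions for illustration.

import numpy as np

def perceptron_train(X, targets, eta=0.1, epochs=20):
    # Present the examples one by one; adjust the weights only when the
    # output is not what we want (the perceptron learning rule).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, targets):
            y = 1.0 if np.dot(w, x) >= 0 else -1.0
            w += eta * (t - y) * x  # zero change when y == t
    return w

# Toy two-class problem (+1 vs. -1); a constant 1 component folds in a bias.
X = np.array([[ 2.0,  1.0, 1.0], [ 1.0,  3.0, 1.0],
              [-1.0, -2.0, 1.0], [-3.0, -1.0, 1.0]])
targets = np.array([1.0, 1.0, -1.0, -1.0])
print(perceptron_train(X, targets))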
Face vs. Helicopter Example
Associative Memory and Energy Functions
The Hopfield Recurrent Network
The concept of an energy function of a recurrent
neural network was introduced by Hopfield (1982)
to describe the state of a network. By studying
the dynamics, it is possible to show that the
activity in the network will always decrease in
energy, evolving towards a "local minimum".
The network defines an 'energy landscape' in
which the state of the network settles. If the
network starts closer to one of the minima (the
stored patterns) than to other points in the
landscape, it will settle into that minimum and
'recall' the stored pattern.
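A minimal sketch of this in Python: the weights store two made-up patterns with the standard outer-product (Hebbian) prescription, and asynchronous updates drive the energy downhill until a stored pattern is recalled. The patterns and network size here are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Store two +/-1 patterns with the outer-product (Hebbian) prescription.
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]], dtype=float)
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0.0)  # no self-connections

def energy(s):
    # Hopfield energy: E = -1/2 * sum_ij W_ij s_i s_j
    return -0.5 * s @ W @ s

def recall(s, steps=200):
    # Asynchronous updates: each flip can only lower the energy, so the
    # state settles into a local minimum of the landscape.
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(n)
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

probe = patterns[0].copy()
probe[2] *= -1                               # start close to a stored pattern
print(energy(probe), energy(recall(probe)))  # energy decreases
print(recall(probe))                         # the stored pattern is recalled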
This view of neural processing has its merits:
it provides insight into this type of computational
structure and has spawned new fields of its own,
but it does not describe the current neurobiological
state of knowledge very well. In particular,
neurons communicate with spikes, and the
backpropagation learning rule is not a good match
to what has been found. So what do we know about
neurobiological learning?
Hebbian learning
If both cells are active, strengthen the synapse
If only the post-synaptic cell is active, weaken
the synapse
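A minimal sketch of this two-case rule in Python; the learning rates and the binary notion of "active" are assumptions for illustration.

def hebbian_update(w, pre_active, post_active, lr=0.01):
    # If both cells are active, strengthen the synapse.
    if pre_active and post_active:
        return w + lr
    # If only the post-synaptic cell is active, weaken the synapse.
    if post_active and not pre_active:
        return w - lr
    return w  # otherwise leave the weight unchanged

w = 0.5
w = hebbian_update(w, pre_active=True,  post_active=True)   # strengthen
w = hebbian_update(w, pre_active=False, post_active=True)   # weaken
print(w)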
In fact, learning at some synapses seems to be
even more specific. Temporal ordering seems to
play a role in determining the change in the
synapse.
[Figure: Δw plotted against the time between pre-synaptic and post-synaptic spikes; pairings in one temporal order strengthen the synapse, while the reverse order weakens it.]
Abbott and Blum, 1996
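A minimal sketch of such a spike-timing rule in Python, using exponential windows; the amplitudes and time constant are assumptions for illustration, not values from Abbott and Blum (1996).

import math

def stdp_dw(t_pre, t_post, a_plus=0.010, a_minus=0.012, tau=20.0):
    # Change in synaptic weight as a function of spike timing (tau in ms):
    #   pre before post (dt > 0) -> strengthen
    #   post before pre (dt < 0) -> weaken
    # with magnitude decaying exponentially in |dt|.
    dt = t_post - t_pre
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

for dt in (-40.0, -10.0, 10.0, 40.0):
    print(dt, round(stdp_dw(0.0, dt), 5))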
Chip Idea
1. Design a spiking neural network that can learn using the spike-timing rule to solve a particular temporal pattern recognition problem.
2. Design a floating-gate modification circuit that can implement the learning rule.