Title: Feedback Networks and Associative Memories
1. Feedback Networks and Associative Memories
2. Content
- Introduction
- Discrete Hopfield NNs
- Continuous Hopfield NNs
- Associative Memories
  - Hopfield Memory
  - Bidirectional Memory
3. Feedback Networks and Associative Memories
4. Feedforward/Feedback NNs
- Feedforward NNs
  - The connections between units do not form cycles.
  - Usually produce a response to an input quickly.
  - Most feedforward NNs can be trained using a wide variety of efficient algorithms.
- Feedback or recurrent NNs
  - There are cycles in the connections.
  - In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response.
  - Usually more difficult to train than feedforward NNs.
5. Supervised-Learning NNs
- Feedforward NNs
  - Perceptron
  - Adaline, Madaline
  - Backpropagation (BP)
  - ARTMAP
  - Learning Vector Quantization (LVQ)
  - Probabilistic Neural Network (PNN)
  - General Regression Neural Network (GRNN)
- Feedback or recurrent NNs
  - Brain-State-in-a-Box (BSB)
  - Fuzzy Cognitive Map (FCM)
  - Boltzmann Machine (BM)
  - Backpropagation Through Time (BPTT)
6. Unsupervised-Learning NNs
- Feedforward NNs
  - Learning Matrix (LM)
  - Sparse Distributed Associative Memory (SDM)
  - Fuzzy Associative Memory (FAM)
  - Counterpropagation (CPN)
- Feedback or recurrent NNs
  - Binary Adaptive Resonance Theory (ART1)
  - Analog Adaptive Resonance Theory (ART2, ART2a)
  - Discrete Hopfield (DH)
  - Continuous Hopfield (CH)
  - Discrete Bidirectional Associative Memory (BAM)
  - Kohonen Self-Organizing Map / Topology-Preserving Map (SOM/TPM)
7. The Hopfield NNs
- In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research.
- A fully connected, symmetrically weighted network where each node functions as both an input and an output node.
- Used for
  - Associative memories
  - Combinatorial optimization
8. Associative Memories
- An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns.
- Two types of associative memory: autoassociative and heteroassociative.
- Auto-association
  - Retrieves a previously stored pattern that most closely resembles the current pattern.
- Hetero-association
  - The retrieved pattern is, in general, different from the input pattern, not only in content but possibly also in type and format.
9. Associative Memories
Auto-association: a degraded "A" recalls the stored "A".
Hetero-association: "Niagara" recalls "Waterfall".
10. Optimization Problems
- Associate costs with energy functions in Hopfield networks
  - The cost must be in quadratic form
- A Hopfield network finds local, satisfactory solutions; it does not choose solutions from a set.
  - Local optima, not global.
11. Feedback Networks and Associative Memories
12. The Discrete Hopfield NNs
13. The Discrete Hopfield NNs
w_ij = w_ji,  w_ii = 0
14. The Discrete Hopfield NNs
w_ij = w_ji,  w_ii = 0
15. State Update Rule
- Asynchronous mode: one neuron is selected and updated at a time.
- Update rule: v_k(t+1) = 1 if net_k >= 0, and -1 otherwise, where net_k = sum_j w_kj v_j(t) + I_k.
Stable?
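The asynchronous update rule above can be sketched in NumPy (a minimal sketch; the function name and bipolar +1/-1 encoding are assumptions):

```python
import numpy as np

def async_update(W, v, I, i):
    """One asynchronous Hopfield update of neuron i.

    W: symmetric weight matrix with zero diagonal.
    v: bipolar state vector (+1/-1), modified in place.
    I: external input vector.
    """
    net = W[i] @ v + I[i]          # net input to neuron i
    v[i] = 1 if net >= 0 else -1   # threshold rule; net = 0 maps to +1
    return v
```

In asynchronous mode this function is called repeatedly with a different neuron index each time, until no update changes the state.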
16. Energy Function
Fact: E is lower bounded (resp. upper bounded). If E is also monotonically decreasing (resp. increasing), the system is stable.
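For the discrete network the energy takes the standard quadratic form (a reconstruction; the slide's formula did not survive extraction):

```latex
E(\mathbf{v}) \;=\; -\frac{1}{2}\sum_{i}\sum_{j} w_{ij}\, v_i v_j \;-\; \sum_{i} I_i v_i
```

Since asynchronous updates can only decrease E, and E is bounded below, the dynamics must settle in a stable state.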
17. The Proof
Suppose that at time t+1, the kth neuron is selected for update.
18. The Proof
Suppose that at time t+1, the kth neuron is selected for update.
19. The Proof
20. The Proof
Case analysis for the update of neuron k (net_k = sum_j w_kj v_j + I_k):

v_k(t) | net_k | v_k(t+1) | dE
   1   | >= 0  |    1     |  0
   1   |  < 0  |   -1     | < 0
  -1   | >= 0  |    1     | < 0
  -1   |  < 0  |   -1     |  0

dE <= 0 in every case, hence the network is stable.
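The proof can be checked numerically: with symmetric weights and zero diagonal, random asynchronous updates never increase E = -1/2 v^T W v - I^T v (a self-contained sketch; all variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, v, I):
    # E = -1/2 v^T W v - I^T v
    return -0.5 * v @ W @ v - I @ v

n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2          # symmetric weights: w_ij = w_ji
np.fill_diagonal(W, 0.0)   # no self-connections: w_ii = 0
I = rng.normal(size=n)
v = rng.choice([-1.0, 1.0], size=n)

energies = [energy(W, v, I)]
for _ in range(100):
    i = rng.integers(n)                  # pick one neuron at random
    net = W[i] @ v + I[i]
    v[i] = 1.0 if net >= 0 else -1.0     # asynchronous threshold update
    energies.append(energy(W, v, I))

# E is monotonically non-increasing, matching the case analysis above
assert all(b <= a + 1e-9 for a, b in zip(energies, energies[1:]))
```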
21. Feedback Networks and Associative Memories
22. The Neuron of Continuous Hopfield NNs
23. The Dynamics
C_i du_i/dt = sum_j w_ij v_j - G_i u_i + I_i,   v_i = g_i(u_i)
(G_i is the leakage conductance of neuron i; g_i is its activation function.)
24. The Continuous Hopfield NNs
25. The Continuous Hopfield NNs
Stable?
26. Equilibrium Points
- Consider the autonomous system dy/dt = f(y).
- Equilibrium points satisfy f(y*) = 0.
27. Lyapunov Theorem
Call E(y) the energy function. The system is asymptotically stable if the following holds: there exists a positive-definite function E(y) s.t. dE/dt < 0 along every trajectory away from the equilibrium.
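Written out, the standard Lyapunov conditions are (a reconstruction, stated about an equilibrium y*):

```latex
E(\mathbf{y}) > 0 \ \text{ for } \mathbf{y} \neq \mathbf{y}^{*}, \qquad
E(\mathbf{y}^{*}) = 0, \qquad
\frac{dE}{dt} < 0 \ \text{ along trajectories with } \mathbf{y} \neq \mathbf{y}^{*}.
```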
28. Lyapunov Energy Function
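The Lyapunov energy for the continuous network is, in Hopfield's 1984 formulation (a reconstruction; G_i denotes the leakage conductance and g_i the activation function, so g_i^{-1} is its inverse):

```latex
E \;=\; -\frac{1}{2}\sum_{i}\sum_{j} w_{ij}\, v_i v_j \;-\; \sum_i I_i v_i \;+\; \sum_i G_i \int_0^{v_i} g_i^{-1}(v)\,dv
```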
29. Lyapunov Energy Function
[Circuit diagram: n neurons with external currents I_1, ..., I_n; each neuron i has a capacitor C_i, a conductance g_i, internal state u_i, and output v_i; the outputs v_1, ..., v_n feed back to all neurons through the weights w_ij.]
30. Stability of Continuous Hopfield NNs
Dynamics
31. Stability of Continuous Hopfield NNs
Dynamics
g_i' > 0 (each activation function is monotonically increasing)
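Differentiating the Lyapunov energy along trajectories and substituting the dynamics gives (a reconstruction, using C_i > 0 and g_i' > 0):

```latex
\frac{dE}{dt} \;=\; \sum_i \frac{\partial E}{\partial v_i}\,\frac{dv_i}{dt}
\;=\; -\sum_i C_i\, g_i'(u_i)\left(\frac{du_i}{dt}\right)^{2} \;\le\; 0,
```

with equality only at an equilibrium, so the network converges.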
32. Stability of Continuous Hopfield NNs
Stable
33. Basins of Attraction
34. Basins of Attraction
35. Local/Global Minima
Energy Landscape
36. Feedback Networks and Associative Memories
37. Associative Memories
- Also called content-addressable memory.
- Autoassociative memory
  - Hopfield memory
- Heteroassociative memory
  - Bidirectional Associative Memory (BAM)
38. Associative Memories
Stored Patterns
Autoassociative
Heteroassociative
39. Feedback Networks and Associative Memories
- Associative Memories
  - Hopfield Memory
  - Bidirectional Memory
40. Hopfield Memory
Fully connected: 12 x 10 = 120 neurons, 14,400 (= 120^2) weights.
41. Example
42. Example
Memory Association
43. Example
How to Store Patterns?
Memory Association
44. The Storage Algorithm
Suppose the stored patterns x_1, ..., x_P are bipolar vectors of dimension n. The weight matrix is
W = sum_{i=1}^{P} x_i x_i^T - P*I   (equivalently, w_jk = sum_i x_ij x_ik for j != k, and w_jj = 0)
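The outer-product (Hebbian) storage rule can be sketched as follows (the function name is an assumption; zeroing the diagonal is equivalent to subtracting P*I for bipolar patterns):

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian outer-product storage for a Hopfield memory.

    patterns: array of shape (P, n) with bipolar (+1/-1) entries.
    Returns the symmetric weight matrix W with zero diagonal.
    """
    W = patterns.T @ patterns   # sum of outer products x_p x_p^T
    np.fill_diagonal(W, 0)      # enforce w_ii = 0 (same as subtracting P*I)
    return W
```

For mutually orthogonal patterns, each stored pattern is then a fixed point of the sign(Wx) retrieval step.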
45. The Storage Algorithm
46. Analysis
Suppose that x = x_i, one of the stored patterns.
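Substituting a stored pattern into the retrieval product separates a signal term from crosstalk (assuming the outer-product rule with zeroed diagonal and bipolar patterns, so x_i^T x_i = n):

```latex
W\mathbf{x}_i \;=\; \Big(\sum_{j=1}^{P}\mathbf{x}_j\mathbf{x}_j^{\top} - P\,I\Big)\mathbf{x}_i
\;=\; (n-P)\,\mathbf{x}_i \;+\; \sum_{j\neq i}\big(\mathbf{x}_j^{\top}\mathbf{x}_i\big)\,\mathbf{x}_j
```

When the crosstalk sum is small relative to n - P, sgn(W x_i) = x_i and the stored pattern is a fixed point.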
47. Example
48. Example
49. Example
State energies: E = 4, E = 0, and E = -4 (stable).
50. Example
State energies: E = 4, E = 0, and E = -4 (stable).
51. Problems of Hopfield Memory
- Complement Memories
- Spurious stable states
- Capacity
52. Capacity of Hopfield Memory
- The number of storable patterns w.r.t. the size of the network.
- Study methods
  - Experiments
  - Probability
  - Information theory
  - Radius of attraction (rho)
53. Capacity of Hopfield Memory
- The number of storable patterns w.r.t. the size of the network.
Hopfield (1982) demonstrated that the maximum number of patterns that can be stored in the Hopfield model of n nodes before the error in the retrieved pattern becomes severe is around 0.15n. The memory capacity of the Hopfield model can be increased, as shown by Andrecut (1972).
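A small experiment is consistent with the 0.15n figure (an illustrative sketch; `recall_ok` and the pattern counts tested are my choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

def recall_ok(n, P, trials=20):
    """Fraction of trials in which every one of P random bipolar patterns
    is a fixed point of one synchronous sign(W x) step, using the
    outer-product storage rule with zeroed diagonal."""
    ok = 0
    for _ in range(trials):
        X = rng.choice([-1, 1], size=(P, n))
        W = X.T @ X
        np.fill_diagonal(W, 0)
        if all(np.array_equal(np.sign(W @ x), x) for x in X):
            ok += 1
    return ok / trials

# Well below 0.15*n the stored patterns are reliably stable;
# well above it, retrieval breaks down.
low = recall_ok(100, 5)    # P = 0.05 n
high = recall_ok(100, 40)  # P = 0.40 n
```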
54. Radius of Attraction (0 <= rho <= 1/2)
55. Feedback Networks and Associative Memories
- Associative Memories
  - Hopfield Memory
  - Bidirectional Memory
56. Bidirectional Memory
[Two-layer architecture: an X layer and a Y layer connected by the weight matrix W = [w_ij] (n x m); the forward pass maps X to Y, the backward pass maps Y back to X.]
57. Bidirectional Memory
Stored patterns are pairs: the forward pass maps the X layer to the Y layer through W = [w_ij] (n x m); the backward pass maps Y back to X.
58. The Storage Algorithm
Stored pattern pairs (x_p, y_p), p = 1, ..., P:  W = sum_p x_p y_p^T
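The BAM storage and recall passes can be sketched as follows (the convention W = sum_p x_p y_p^T, with forward pass y = sgn(W^T x), is an assumption; function names are illustrative):

```python
import numpy as np

def bam_store(X, Y):
    """Bidirectional associative memory storage.

    X: (P, n) bipolar x-patterns; Y: (P, m) bipolar y-patterns.
    Returns W = sum_p x_p y_p^T, an (n x m) matrix."""
    return X.T @ Y

def bam_recall(W, x, steps=10):
    """Bounce between layers until the X-layer state stops changing."""
    for _ in range(steps):
        y = np.sign(W.T @ x)    # forward pass: X layer -> Y layer
        x_new = np.sign(W @ y)  # backward pass: Y layer -> X layer
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x, y
```

With orthogonal demo patterns np.sign never sees a zero net input; a production version would break ties to +1 or -1 explicitly.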
59. Analysis
Suppose x_k is one of the stored vectors. Then W^T x_k = n*y_k + sum_{p != k} (x_p^T x_k) y_p; the crosstalk term is 0 when the stored x-patterns are mutually orthogonal, so the forward pass recalls y_k exactly.
60. Energy Function