1. Neural Networks Architecture
- Baktash Babadi
- IPM, SCS
- Fall 2004
2. The Neuron Model
3. Architectures (1)
- Feed-Forward Networks
- The neurons are arranged in separate layers
- There are no connections between neurons in the same layer
- The neurons in one layer receive their inputs from the previous layer
- The neurons in one layer deliver their outputs to the next layer
- The connections are unidirectional
- (Hierarchical)
4. Architectures (2)
- Recurrent Networks
- Some connections run from a layer back to previous layers
5. Architectures (3)
- Associative networks
- There is no hierarchical arrangement
- The connections can be bidirectional
6. Why Feed-Forward? Why Recurrent/Associative?
7. An Example of Associative Networks: The Hopfield Network
- John Hopfield (1982)
- Associative Memory via artificial neural networks
- Solution for optimization problems
- Statistical mechanics
8. Neurons in the Hopfield Network
- The neurons are binary units
- They are either active (1) or passive (0)
- Alternatively, the states +1 and -1 can be used
- The network contains N neurons
- The state of the network is described as a vector of 0s and 1s
9. The Architecture of the Hopfield Network
- The network is fully interconnected
- All the neurons are connected to each other
- The connections are bidirectional and symmetric
- The setting of weights depends on the application
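As a concrete illustration (not part of the original slides; the variable names are my own), a minimal NumPy sketch of a binary state vector and a fully interconnected, symmetric weight matrix with no self-connections:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8                                   # number of neurons

# State of the network: a vector of 0s and 1s, one entry per neuron
s = rng.integers(0, 2, size=N)

# Fully interconnected, bidirectional, symmetric weights (w[i,j] == w[j,i]),
# with the diagonal zeroed so that no neuron connects to itself
w = rng.normal(size=(N, N))
w = (w + w.T) / 2
np.fill_diagonal(w, 0.0)

assert np.allclose(w, w.T)              # symmetry check
```

How the weights are actually set for an application is the subject of slide 20.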
10. Updating the Hopfield Network
- The state of the network changes at each time step. There are four updating modes:
- Serial-Random: the state of a single, randomly chosen neuron is updated at each time step
- Serial-Sequential: the state of a single neuron is updated at each time step, in a fixed sequence
- Parallel-Synchronous: all the neurons are updated at each time step, synchronously
- Parallel-Asynchronous: the neurons that are not in refractoriness are updated at the same time
11. The Updating Rule (1)
- Here we assume that updating is serial-random
- Updating continues until a stable state is reached
- Each neuron receives a weighted sum of the inputs from the other neurons
- If this input is positive the state of the neuron will be 1, otherwise 0
12. The Updating Rule (2)
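The formula on this slide did not survive extraction. The standard threshold rule, matching the verbal description on the previous slide, is

\[
s_i \leftarrow
\begin{cases}
1 & \text{if } \sum_{j \ne i} w_{ij}\, s_j > 0 \\
0 & \text{otherwise}
\end{cases}
\]

A minimal serial-random implementation (my own sketch; the function name and the stopping heuristic are illustrative):

```python
import numpy as np

def update_serial_random(s, w, max_steps=10_000, rng=None):
    """Serial-random updating: one randomly chosen neuron per time step.

    s: state vector of 0s and 1s; w: symmetric weights with zero diagonal.
    Stops early once many consecutive updates leave the state unchanged,
    a simple proxy for having reached a stable state.
    """
    rng = rng or np.random.default_rng()
    s = s.copy()
    unchanged = 0
    for _ in range(max_steps):
        i = rng.integers(len(s))
        new = 1 if w[i] @ s > 0 else 0   # weighted sum from the other neurons
        unchanged = unchanged + 1 if new == s[i] else 0
        s[i] = new
        if unchanged >= 10 * len(s):
            break
    return s
```

The other modes from slide 10 differ only in scheduling: serial-sequential fixes the visiting order, and parallel-synchronous replaces the loop body with a single vectorized update of all neurons at once.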
13. Convergence of the Hopfield Network (1)
- Does the network eventually reach a stable state (convergence)?
- To evaluate this, an energy value is associated with the network
- The system has converged when the energy is minimized
14. Convergence of the Hopfield Network (2)
- Why energy?
- An analogy with spin-glass models of ferromagnetism (the Ising model)
- The system is stable if the energy is minimized
15. Convergence of the Hopfield Network (3)
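The energy expression on this slide was also lost in extraction. The standard Hopfield energy (with thresholds at zero, matching the update rule above) is

\[
E = -\frac{1}{2} \sum_{i} \sum_{j \ne i} w_{ij}\, s_i\, s_j
\]

Each term rewards pairs of co-active neurons joined by positive weights, so low energy corresponds to states that satisfy the constraints encoded in the weights.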
16. Convergence of the Hopfield Network (4)
- The change of E with updating:
- In each case the energy either decreases or remains constant, so the system tends to stabilize.
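A short reconstruction of the standard argument behind this slide: when a single neuron i is updated from s_i to s_i', the symmetry of the weights gives

\[
\Delta E = -\,(s_i' - s_i) \sum_{j \ne i} w_{ij}\, s_j
\]

The update rule sets s_i' = 1 exactly when the sum is positive, so (s_i' - s_i) can never have the opposite sign of the sum; hence \Delta E \le 0 at every step, and since E is bounded below, the serial dynamics must settle.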
17. The Energy Function
- The energy function is similar to a multidimensional (N-dimensional) terrain
[Figure: an energy landscape over the state space, showing two local minima and one global minimum]
18. Hopfield Network as a Model for Associative Memory
- Associative memory
- Associates different features with each other:
- Karen ↔ green
- George ↔ red
- Paul ↔ blue
- Recall with partial cues
19. Neural Network Model of Associative Memory
- Neurons are arranged like a grid
20. Setting the Weights
- Each pattern can be denoted by a vector of -1s and 1s
- If the number of patterns is m, the weights follow from the patterns (the formula is reconstructed below)
- Hebbian learning
- The neurons that fire together, wire together
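The weight formula on this slide was truncated. The standard Hebbian prescription for m patterns x^mu in {-1, +1}^N, which matches the "fire together, wire together" description, is

\[
w_{ij} = \sum_{\mu=1}^{m} x_i^{\mu} x_j^{\mu}, \qquad w_{ii} = 0
\]

An end-to-end sketch (my own illustration, using ±1 states and the threshold rule from slide 12; the 1/N scaling is a common convention):

```python
import numpy as np

rng = np.random.default_rng(1)
N, m = 100, 5

# m random patterns of -1s and 1s, one per row
patterns = rng.choice([-1, 1], size=(m, N))

# Hebbian weights: sum of the patterns' outer products, no self-connections
w = patterns.T @ patterns / N
np.fill_diagonal(w, 0.0)

def recall(cue, w, max_sweeps=20):
    """Serial updating from a partial cue until no neuron changes."""
    s = cue.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if w[i] @ s > 0 else -1
            changed = changed or new != s[i]
            s[i] = new
        if not changed:
            break
    return s

# Partial cue: the first stored pattern with 30% of its bits flipped
cue = patterns[0].copy()
cue[rng.choice(N, size=30, replace=False)] *= -1

out = recall(cue, w)
print("overlap with stored pattern:", out @ patterns[0] / N)  # close to 1.0
```

With m well below the roughly 0.15 N capacity discussed on the next slide, the corrupted cue is typically pulled back to the stored pattern.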
21. Limitations of the Hopfield Associative Memory
- 1) The evoked pattern is not always the pattern most similar to the input
- 2) Some patterns are recalled more often than others
- 3) Spurious states: stable states that match none of the original patterns
- Capacity: about 0.15 N patterns
22. Hopfield Network and the Brain (1)
- In a real neuron, synapses are distributed along the dendritic tree, and their distance changes the synaptic weight
- In the Hopfield network there is no dendritic geometry
- If the synapses are distributed uniformly, the geometry is not important
23. Hopfield Network and the Brain (2)
- In the brain, Dale's principle holds and the connections are not symmetric
- A Hopfield network with asymmetric weights and Dale's principle still works properly
24. Hopfield Network and the Brain (3)
- The brain is insensitive to noise and local lesions
- The Hopfield network can tolerate noise in the input and partial loss of synapses
25. Hopfield Network and the Brain (4)
- In the brain the neurons are not binary devices; they generate continuous firing rates
- A Hopfield network with a sigmoid transfer function is even more powerful than the binary version
26. Hopfield Network and the Brain (5)
- In the brain most neurons are silent or fire at low rates, but in the Hopfield network many of the neurons are active
- In a sparse Hopfield network the capacity is even higher
27. Hopfield Network and the Brain (6)
- In the Hopfield network updating is serial, which is far from biological reality
- In a Hopfield network with parallel updating, the associative memories can still be recalled
28. Hopfield Network and the Brain (7)
- When the number of learned patterns in the Hopfield network is overloaded, the performance of the network falls abruptly for all the stored patterns
- But in the real brain an overload of memories affects only some memories, and the rest remain intact
- This abrupt failure is known as catastrophic interference
29. Hopfield Network and the Brain (8)
- In the Hopfield network the useful information appears only when the system is in a stable state
- The brain does not fall into stable states; it remains dynamic
30. Hopfield Network and the Brain (9)
- The connectivity in the brain is much sparser than in the Hopfield network
- A diluted Hopfield network still works well