Title: Unsupervised Learning
1 Unsupervised Learning
2 Learning From Examples
Inputs:  1 2 3 4 5 6
Outputs: 1 4 9 16 25 36
(learn the underlying input-output rule from the examples, here n → n²)
3 Supervised Learning
- When a set of targets of interest is provided by an external teacher, we say that the learning is supervised
- The targets usually take the form of an input-output mapping that the net should learn
4 Feed-Forward Nets
- Feed-forward nets learn under supervision
- Classification: all patterns in the training set are coupled with the correct classification, e.g. classifying handwritten digits into 10 categories (the US Post zip-code project)
- Function approximation: the values to be learned at the training points are known
- Time-series prediction, such as weather forecasts and stock values
5 Hopfield Nets
- Associative nets (Hopfield-like) store predefined memories
- During learning, the net goes over all patterns to be stored (Hebb rule)
6 Hopfield, Cont'd
- When presented with an input pattern that is similar to one of the memories, the network restores the right memory, previously stored in its weights (synapses)
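As a minimal sketch of this store-and-recall behavior (assuming ±1 units, the standard outer-product Hebb rule, and synchronous sign updates; the pattern count and network size below are illustrative, not from the slides):

import numpy as np

def train_hebb(patterns):
    # Hebb rule: sum of outer products of the stored patterns, no self-connections
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=20):
    # Iterate sign updates until the state stops changing
    s = x.astype(float).copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(3, 100))  # three random +/-1 memories
W = train_hebb(patterns)

noisy = patterns[0].copy()
flipped = rng.choice(100, size=10, replace=False)  # corrupt 10 of the 100 bits
noisy[flipped] *= -1
print(np.array_equal(recall(W, noisy), patterns[0]))  # usually True: memory restored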
7 How do we learn?
- Many times there is no teacher to tell us how to do things
- A baby that learns how to walk
- Grouping of events into a meaningful scene (making sense of the world)
- Development of ocular dominance and orientation selectivity in our visual system
8 Self-Organization
- Network Organization is fundamental to the brain
- Functional structure
- Layered structure
- Both parallel processing and serial processing
require organization of the brain
9 Self-Organizing Networks
- Discover significant patterns or features in the input data
- Discovery is done without a teacher
- Synaptic weights are changed according to local rules
- The changes affect a neuron's immediate environment, until a final configuration develops
10 Questions
- How can a useful configuration develop from self-organization?
- Can random activity produce coherent structure?
11 Answer, biologically
- There are self-organized structures in the brain
- Neuronal networks grow and evolve to be computationally efficient, both in vitro and in vivo
- Random activation of the visual system can lead to layered and structured organization
12 Answer, mathematically
- A. Turing, 1952
- Global order can arise from local interactions
- Random local interactions between neighboring neurons can coalesce into states of global order and lead to coherent spatio-temporal behavior
13 Mathematically, Cont'd
- Network organization takes place at two levels that interact with each other:
- Activity: certain activity patterns are produced by a given network in response to input signals
- Connectivity: synaptic weights are modified in response to neuronal signals in the activity patterns
- Self-organization is achieved if there is positive feedback between changes in synaptic weights and activity patterns
14 Principles of Self-Organization
- Modifications in synaptic weights tend to self-amplify
- Limitation of resources leads to competition among synapses
- Modifications in synaptic weights tend to cooperate
- Order and structure in activation patterns represent redundant information that is transformed into knowledge by the network
15 (No Transcript)
16 Redundancy
- Unsupervised learning depends on redundancy in the data
- Learning is based on finding patterns and extracting features from the data
17 Unsupervised Hebbian Learning
- The learning rule is Hebbian-like, in the spirit of Oja's rule: Δw_i = η y (x_i - y w_i)
- The change in a weight depends on the product of the neuron's output y and its input x_i, together with a term that makes the weights decrease (here -η y² w_i), keeping them bounded
18 Unsupervised Hebbian Learning, Cont'd
- Such a net converges onto a weight vector that maximizes the average of the squared output, ⟨y²⟩
- This means that the weight vector points along the first principal component of the data
- The network learns a feature of the data without any prior knowledge
- This is called feature extraction
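A runnable sketch of this convergence (Python/NumPy; the correlated 2-D Gaussian data, learning rate, and sample count are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(1)
# Correlated 2-D data: most variance lies along the direction (1, 1)
C = np.array([[3.0, 2.0], [2.0, 3.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)   # Oja's rule: Hebbian term plus decay term

# Compare with the first principal component of the covariance matrix
vals, vecs = np.linalg.eigh(np.cov(X.T))
pc1 = vecs[:, np.argmax(vals)]
print(np.abs(w @ pc1) / np.linalg.norm(w))   # close to 1: w points along PC1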
19 Visual Model
- Linsker (1986) proposed a model of self-organization in the visual system, based on unsupervised Hebbian learning
- Input is random dots (does not need to be structured)
- Layers as in the visual cortex, with feed-forward connections only (no lateral connections)
- Each neuron receives inputs from a well-defined area in the previous layer (receptive fields)
- The network developed center-surround cells in the 2nd layer of the model, and orientation-selective cells in a higher layer
- A self-organized structure evolved from (local) Hebbian updates
20 Unsupervised Competitive Learning
- In Hebbian networks, all neurons can fire at the same time
- Competitive learning means that only a single neuron from each group fires at each time step
- Output units compete with one another
- These are winner-takes-all units (grandmother cells)
21 Simple Competitive Learning
N input units, P output neurons, P × N weights w_ij (one weight vector w_i per output unit)
22 Network Activation
- The unit with the highest field h_i = Σ_j w_ij x_j fires
- i* is the winner unit
- Geometrically, the winner's weight vector w_i* is the one closest to the current input vector
- The winning unit's weight vector is updated to be even closer to the current input vector
23 Learning
- Starting with small random weights, at each step:
- a new input vector x is presented to the network
- all fields h_i are calculated to find the winner i*
- w_i* is updated to be closer to the input: Δw_i* = η (x - w_i*)
24 Result
- Each output unit moves to the center of mass of a cluster of input vectors ⇒ clustering (see the sketch below)
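A runnable sketch of the procedure from the last few slides (Python/NumPy; the three Gaussian clusters, learning rate, and decay schedule are illustrative assumptions, not details from the slides):

import numpy as np

rng = np.random.default_rng(2)
# Three 2-D Gaussian clusters as input data
centers = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
X = np.concatenate([c + rng.normal(scale=0.5, size=(200, 2)) for c in centers])
rng.shuffle(X)

P = 3                                    # number of output (WTA) units
W = rng.normal(scale=0.1, size=(P, 2))   # small random initial weights

eta = 0.1
for epoch in range(20):
    for x in X:
        # Closest weight vector wins (equivalent to the highest field h_i
        # when inputs and weights are normalized)
        winner = np.argmin(np.linalg.norm(W - x, axis=1))
        W[winner] += eta * (x - W[winner])   # move the winner toward the input
    eta *= 0.9                               # slowly decay the learning rate

print(np.round(W, 2))   # each row ends up near the center of mass of one cluster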
25 Model: Horizontal and Vertical Lines (Rumelhart and Zipser, 1985)
- Problem: identify vertical or horizontal signals
- Inputs are 6 × 6 arrays
- Intermediate layer with 8 WTA units
- Output layer with 2 WTA units
- Cannot work with one layer: two parallel lines share no pixels at all (while perpendicular lines share one), so overlap-based competition in a single layer cannot group the lines by orientation
26 Rumelhart and Zipser, Cont'd
27 Self-Organizing (Kohonen) Maps
- Competitive networks (WTA neurons)
- Output neurons are placed on a lattice, usually 2-dimensional
- Neurons become selectively tuned to various input patterns (stimuli)
- The locations of the tuned (winning) neurons become ordered in a way that creates a meaningful coordinate system for different input features ⇒ a topographic map of the input patterns is formed
28 SOMs, Cont'd
- Spatial locations of the neurons in the map are indicative of statistical features present in the inputs (stimuli) ⇒ self-organization
29 Biological Motivation
- In the brain, sensory inputs are represented by topologically ordered computational maps
- Tactile inputs
- Visual inputs (center-surround, ocular dominance, orientation selectivity)
- Acoustic inputs
30 Biological Motivation, Cont'd
- Computational maps are a basic building block of sensory information processing
- A computational map is an array of neurons representing slightly differently tuned processors (filters) that operate in parallel on sensory signals
- These neurons transform input signals into a place-coded structure
31 Kohonen Maps
- Simple case: 2-d input and a 2-d output layer
- No lateral connections
- Weight update is done for the winning neuron and its surrounding neighborhood (see the sketch below)
- The output layer is a sort of elastic net that wants to come as close as possible to the inputs
- The output map conserves the topological relationships of the inputs
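A compact sketch of the Kohonen update (Python/NumPy; the 8 × 8 lattice, Gaussian neighborhood, and decay schedules are illustrative assumptions, not details from the slides):

import numpy as np

rng = np.random.default_rng(3)
rows, cols = 8, 8
# Lattice coordinates of each output neuron, used to define the neighborhood
grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
W = rng.uniform(size=(rows * cols, 2))   # weights live in the 2-D input space

n_steps = 10000
for t in range(n_steps):
    x = rng.uniform(size=2)              # 2-D input drawn uniformly
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    # Learning rate and neighborhood radius both shrink over time
    eta = 0.5 * (0.01 / 0.5) ** (t / n_steps)
    sigma = 3.0 * (0.5 / 3.0) ** (t / n_steps)
    d2 = np.sum((grid - grid[winner]) ** 2, axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))   # Gaussian neighborhood on the lattice
    W += eta * h[:, None] * (x - W)      # winner and its neighbors move toward x

# Topology is conserved: neighboring lattice units end up with neighboring weights
print(np.round(W.reshape(rows, cols, 2)[0, :3], 2))   # first three units of row 0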
32 Feature Mapping
33 Kohonen Maps, Cont'd
- Examples of topology-conserving mappings between input and output spaces:
- Retinotopic mapping between the retina and the cortex
- Ocular dominance
- Somatosensory mapping (the homunculus)
34 (No Transcript)
35 Models
- Goodhill (1993) proposed a model for the development of retinotopy and ocular dominance, based on Kohonen maps
- Two retinas project to a single layer of cortical neurons
- Retinal inputs were modeled by random-dot patterns
- A between-eyes correlation was added to the inputs
- The result is an ocular dominance map, and a retinotopic map as well
36 (No Transcript)
37 Models, Cont'd
- Farah (1998) proposed an explanation for the spatial ordering of the homunculus, using a simple SOM
- In the womb, the fetus lies with its hands close to its face and its feet close to its genitals
- This would explain the order of the somatosensory areas in the homunculus
38 Other Models
- Semantic self-organizing maps to model language acquisition
- Kohonen feature mapping to model layered organization in the LGN
- A combination of unsupervised and supervised learning to model complex computations in the visual cortex
39 Examples of Applications
- Kohonen (1984): speech recognition, a map of the phonemes of the Finnish language
- Optical character recognition: clustering of letters of different fonts
- Angéniol et al. (1988): the travelling salesman problem (an optimization problem)
- Kohonen (1990): learning vector quantization (a pattern classification method)
- Ritter and Kohonen (1989): semantic maps
40 Summary
- Unsupervised learning is very common
- Unsupervised learning requires redundancy in the stimuli
- Self-organization is a basic property of the brain's computational structure
- SOMs are based on:
- competition (WTA units)
- cooperation
- synaptic adaptation
- SOMs conserve topological relationships between the stimuli
- Artificial SOMs have many applications in computational neuroscience
41 (No Transcript)