Title: 2L490 SOM 1
1 Self-Organizing Maps
- A major principle of organization is the topographic map, i.e. groups of adjacent neurons process information from neighboring parts of the sensory systems.
- Topographic maps can be distorted in the sense that the number of neurons involved is more related to the importance of the task performed than to the size of the region of the body surface that provides the input signals.
2 Brain Maps
- A part of the brain that contains many topographic maps is the cerebral cortex. Some of these are
  - Visual cortex
    - Various maps, such as retinotopic map
  - Somatosensory cortex
    - Somatotopic map
  - Auditory cortex
    - Tonotopic map
3 Somatotopic Map
The somatosensory cortex processes the information of the sensory neurons that lie below the skin. Note that both the skin and the somatosensory cortex can be seen as two-dimensional spaces.
4 Somatosensory Man
Picture of the male body with the body parts scaled according to the area devoted to these parts in the somatosensory cortex.
5 Unsupervised Self-Organizing Learning
- The neurons are arranged in some grid of fixed topology.
- The winning neuron is the neuron whose weight vector is nearest to the supplied input vector.
- In principle all neurons are allowed to change their weights.
- The amount of change of a neuron, however, depends on the distance (in the grid) of that neuron to the winning neuron: a larger distance implies a smaller change (see the update rule sketched below).
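As an illustration (the notation is an assumption, not taken from the slide): if i^* denotes the winning neuron for input vector x, the usual incremental update of weight vector w_j is

  w_j \leftarrow w_j + \eta(t) \, h(i^*, j) \, (x - w_j),

where \eta(t) is a decreasing learning rate and h(i^*, j) is a neighborhood function that decreases with the grid distance between i^* and j.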
6 Grid Topologies
- The following topologies are frequently used
  - One-dimensional grids
    - Line
    - Ring
  - Two-dimensional grids
    - Square grid
    - Torus
    - Hexagonal grid
- If additional knowledge of the input space is available, more sophisticated topologies can be used.
7 Neighborhoods: Box Distance
Square and hexagonal grid with neighborhoods based on box distance (grid lines are not shown).
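For a square grid with integer neuron coordinates r_i = (r_{i1}, r_{i2}) (the coordinate notation is assumed here), the box distance is the Chebyshev distance

  d_{box}(i, j) = \max(|r_{i1} - r_{j1}|, \; |r_{i2} - r_{j2}|).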
8 Manhattan or Link Distance
Distance to the central cell measured in number of links.
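With the same assumed coordinates, the Manhattan (link) distance on a square grid is

  d_{link}(i, j) = |r_{i1} - r_{j1}| + |r_{i2} - r_{j2}|,

i.e. the length of a shortest path between the two cells measured in grid links.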
9 Euclidean Distance
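With the assumed grid coordinates r_i, the Euclidean distance between neurons i and j is

  d_{euc}(i, j) = \|r_i - r_j\| = \sqrt{(r_{i1} - r_{j1})^2 + (r_{i2} - r_{j2})^2}.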
10 Topologically Correct Maps
The aim of unsupervised self-organizing learning is to construct a topologically correct map of the input space. For any two neurons i and j in the grid, let d(i, j) be their fixed distance in the grid.
A mapping is called topologically correct when neurons that are close to each other in the grid respond to inputs that are close to each other in the input space.
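One possible formalization (an assumption, not necessarily the exact condition used on the slide): writing i(x) for the winning neuron of input x, the map is topologically correct when small \|x - y\| implies small d(i(x), i(y)), and conversely neurons with small grid distance d(i, j) have weight vectors that are close in the input space.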
11 Neighborhood Functions
- The allowed weight change of neuron j when i is the winning neuron is given by the neighborhood function h(i, j). Common choices are
  - h(i, j) = 1 if d(i, j) \le b, 0 otherwise (bubble neighborhood with radius b)
  - h(i, j) = \exp(-d(i, j)^2 / (2 b^2)) (Gaussian neighborhood)
  - h(i, j) = \delta_{ij}, i.e. 1 if i = j and 0 otherwise (winner takes it all)
12 Unsupervised Self-Organizing Learning (incremental version)
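A minimal runnable sketch of the incremental procedure in Python, assuming a Gaussian neighborhood and linearly decaying parameters (the function name, initialization, and decay schedules are illustrative choices, not taken from the slides):

import numpy as np

def som_incremental(X, grid, n_steps=10000, eta0=0.5, sigma0=3.0, seed=0):
    # X    : (N, d) array of training vectors
    # grid : (K, 2) array with the fixed grid coordinates of the K neurons
    rng = np.random.default_rng(seed)
    W = X[rng.integers(0, len(X), size=len(grid))].astype(float)  # initialize weights from data
    for t in range(n_steps):
        frac = t / n_steps
        eta = eta0 * (1.0 - frac)                  # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5        # shrinking neighborhood radius
        x = X[rng.integers(0, len(X))]             # present a random training vector
        winner = int(np.argmin(np.linalg.norm(W - x, axis=1)))   # best-matching neuron
        d_grid = np.linalg.norm(grid - grid[winner], axis=1)     # distances in the grid
        h = np.exp(-d_grid**2 / (2.0 * sigma**2))  # Gaussian neighborhood function
        W += eta * h[:, None] * (x - W)            # move weights toward the input
    return W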
13 Unsupervised Self-Organizing Learning (batch version)
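In the batch version all training vectors are presented before the weights change; a common form of the batch update (stated here as an assumption, with i(x_n) the winning neuron of training vector x_n) replaces each weight vector by a neighborhood-weighted mean of the data,

  w_j = \frac{\sum_n h(i(x_n), j) \, x_n}{\sum_n h(i(x_n), j)},

and this step is iterated until the winner assignments no longer change.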
14 Error Function
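A commonly used error (energy) function for a SOM with fixed neighborhood function h, stated here as an assumption consistent with the updates above, is

  E(w_1, \dots, w_K) = \tfrac{1}{2} \sum_n \sum_j h(i(x_n), j) \, \|x_n - w_j\|^2,

where i(x_n) denotes the winning neuron of training vector x_n.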
15 Gradients of the Error Function
From the form of the error function it follows that the gradient of the error is given by the expression below.
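With the error function assumed above, and treating the winner assignments i(x_n) as fixed, the gradient with respect to weight vector w_j is

  \nabla_{w_j} E = -\sum_n h(i(x_n), j) \, (x_n - w_j),

so a gradient-descent step moves w_j toward the inputs in whose winner's neighborhood it lies; setting the gradient to zero yields the batch update given earlier.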
16 Tuning the Learning Process
- The learning process usually consists of two phases
  - A phase in which the weight vectors reorder and become disentangled. In this phase the neighborhoods (b) must be large.
  - A phase in which the weight vectors are fine-tuned to the part of the training set for which they are the respective winners. In this phase the neighborhoods (b) must be small to avoid interference from other neurons.
- A possible radius schedule for the two phases is sketched below.
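A purely illustrative schedule for the neighborhood radius b over the two phases; the function name, phase lengths, and radius values are assumptions, not taken from the slides:

def neighborhood_radius(t, t_order=1000, t_total=10000, b_large=6.0, b_small=1.0):
    # Ordering phase: keep the radius large so the weight vectors can disentangle.
    if t < t_order:
        return b_large
    # Fine-tuning phase: shrink the radius so each neuron adapts only locally.
    frac = (t - t_order) / (t_total - t_order)
    return b_large + frac * (b_small - b_large)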
17 Phonotopic Map
- Input vectors are 15-dimensional speech samples from the Finnish language.
- Each vector component represents the average output power over a 10 ms interval in a certain range of the spectrum (200 Hz to 6400 Hz).
- Neurons are organized in an 8x12 hexagonal grid.
- After formation of the map, the individual neurons were calibrated to represent phonemes.
- The resulting map is called the phonetic typewriter.
18 Phonetic Typewriter
The phonetic typewriter was constructed by Teuvo Kohonen; see e.g. his book Self-Organizing Maps, Springer, 1995.
19 Travelling Salesman Problem
The TSP is one of the notoriously difficult (NP-complete) combinatorial optimization problems. The so-called elastic net method can be used to (approximately) solve the Euclidean version of this problem (Durbin and Willshaw). To that end one uses a SOM in which the neurons are arranged in a one-dimensional cycle; a sketch of this approach is given below.
http://www.patol.com/java/TSP/index.html
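A small sketch of this idea in Python, using a ring of neurons trained with the incremental SOM rule and a Gaussian neighborhood measured along the ring (function name, parameter values, and initialization are illustrative assumptions, not the Durbin and Willshaw elastic net itself):

import numpy as np

def tsp_ring_som(cities, n_neurons=None, n_steps=20000, seed=0):
    # cities : (N, 2) array of city coordinates in the plane
    rng = np.random.default_rng(seed)
    N = len(cities)
    K = n_neurons or 8 * N                         # ring with more neurons than cities
    angles = 2.0 * np.pi * np.arange(K) / K
    W = cities.mean(axis=0) + 0.1 * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    idx = np.arange(K)
    for t in range(n_steps):
        frac = t / n_steps
        eta = 0.8 * (1.0 - frac)                   # decaying learning rate
        sigma = max(K / 10.0 * (1.0 - frac), 1.0)  # shrinking neighborhood along the ring
        x = cities[rng.integers(N)]                # present a random city
        winner = int(np.argmin(np.linalg.norm(W - x, axis=1)))
        ring_dist = np.minimum(np.abs(idx - winner), K - np.abs(idx - winner))
        h = np.exp(-ring_dist**2 / (2.0 * sigma**2))
        W += eta * h[:, None] * (x - W)            # pull the ring toward the city
    # Read off an approximate tour: visit the cities in the order of their nearest neurons.
    order = np.argsort([int(np.argmin(np.linalg.norm(W - c, axis=1))) for c in cities])
    return order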
20 Space-Filling Curves
More space-filling curves