Unsupervised Learning - Presentation Transcript

1
Unsupervised Learning
  • Self-Organizing Maps

2
Unsupervised Competitive Learning
  • In Hebbian networks, all neurons can fire at the
    same time
  • In competitive learning, only a single neuron
    from each group fires at each time step
  • Output units compete with one another
  • These are winner-takes-all (WTA) units
    ("grandmother cells")

4
Unsupervised Competitive Learning, Cntd
  • Such networks cluster the data points
  • The number of clusters is not predefined, but is
    at most the number of output units
  • Applications include vector quantization (VQ),
    medical diagnosis, document classification, and
    more

5
Simple Competitive Learning
  • N input units, P output neurons, P × N weights

6
Simple Model, Cntd
  • All weights are positive and normalized
  • Inputs and outputs are binary
  • Only one unit fires in response to an input
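
A minimal sketch of this setup in NumPy (the array shapes and the normalization convention are illustrative assumptions, not from the slides):

  import numpy as np

  def init_weights(P, N, rng):
      # Positive weights; each unit's weight vector is
      # normalized so that sum_j w_ij = 1.
      W = rng.random((P, N))
      return W / W.sum(axis=1, keepdims=True)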

7
Network Activation
  • The unit with the highest field h_i = Σ_j w_ij ξ_j
    fires
  • i* is the winner unit
  • Geometrically, w_{i*} is the weight vector closest
    to the current input vector
  • The winning unit's weight vector is updated to be
    even closer to the current input vector
  • Possible variation: adding lateral inhibition
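
Continuing the sketch above (NumPy is imported there), winner selection is an argmax over the fields:

  def winner(W, x):
      # Fields h_i = sum_j w_ij * x_j; the unit with the
      # highest field fires. For the normalized weights
      # used here, this is also the unit whose weight
      # vector is closest to x, as the slides note.
      h = W @ x
      return int(np.argmax(h))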

8
Learning
  • Starting with small random weights, at each step:
  • a new input vector ξ is presented to the network
  • all fields h_i are calculated to find the winner i*
  • w_{i*} is updated to be closer to the input

9
Learning Rule
  • Standard competitive learning:
    Δw_{i*j} = η (ξ_j − w_{i*j}),
    with Δw_ij = 0 for all losing units i ≠ i*
  • Can be formulated as Hebbian:
    Δw_ij = η y_i (ξ_j − w_ij), with y_i = 1 for the
    winner and y_i = 0 otherwise
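
Continuing the sketch, a minimal version of this update (the parameter names are illustrative):

  def competitive_update(W, x, i_star, eta=0.1):
      # Standard competitive rule: only the winner i*
      # moves toward the input, Δw = η (x − w).
      W[i_star] += eta * (x - W[i_star])
      return W
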
10
Result
  • Each output unit moves to the center of mass of a
    cluster of input vectors → clustering

11
Competitive Learning, Cntd
  • It is important to break the symmetry in the
    initial random weights
  • The final configuration depends on the
    initialization
  • A winning unit has a better chance of winning the
    next time a similar input is seen
  • Some output units may never fire
  • This can be compensated for by also updating the
    non-winning units, with a smaller step (see the
    sketch below)
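
A hedged sketch of such "leaky" competitive learning (the two rates are illustrative assumptions):

  def leaky_update(W, x, i_star, eta=0.1, eta_lose=0.01):
      # The winner moves with rate eta; losers creep
      # toward the input with a much smaller rate, so
      # that no unit is starved forever.
      for i in range(len(W)):
          rate = eta if i == i_star else eta_lose
          W[i] += rate * (x - W[i])
      return W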

12
Model: Horizontal / Vertical Lines (Rumelhart &
Zipser, 1985)
  • Problem: identify vertical or horizontal signals
  • Inputs are 6 × 6 arrays
  • Intermediate layer with 8 WTA units
  • Output layer with 2 WTA units
  • The task cannot be solved with a single layer
    (input stimuli can be generated as sketched below)
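
A sketch of the kind of stimuli described, continuing the NumPy code above (the function name is an assumption):

  def random_line(rng, n=6):
      # A flattened n x n binary array containing a
      # single horizontal or vertical line.
      grid = np.zeros((n, n))
      idx = rng.integers(n)
      if rng.random() < 0.5:
          grid[idx, :] = 1.0   # horizontal line
      else:
          grid[:, idx] = 1.0   # vertical line
      return grid.ravel()      # 36-dim binary input vector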

13
Rumelhart & Zipser, Cntd
14
Geometrical Interpretation
  • So far, the ordering of the output units
    themselves was not necessarily informative
  • The location of the winning unit can give us
    information about similarities in the data
  • We are looking for an input-output mapping that
    conserves the topological properties of the inputs
    → feature mapping
  • Given any two spaces, it is not guaranteed that
    such a mapping exists!

15
Biological Motivation
  • In the brain, sensory inputs are represented by
    topologically ordered computational maps
  • Tactile inputs
  • Visual inputs (center-surround, ocular dominance,
    orientation selectivity)
  • Acoustic inputs

16
Biological Motivation, Cntd
  • Computational maps are a basic building block of
    sensory information processing
  • A computational map is an array of neurons
    representing slightly differently tuned processors
    (filters) that operate in parallel on sensory
    signals
  • These neurons transform input signals into a
    place-coded structure

17
Self-Organizing (Kohonen) Maps
  • Competitive networks (WTA neurons)
  • Output neurons are placed on a lattice, usually
    2-dimensional
  • Neurons become selectively tuned to various input
    patterns (stimuli)
  • The locations of the tuned (winning) neurons
    become ordered in such a way that they create a
    meaningful coordinate system for different input
    features →
  • a topographic map of the input patterns is formed

18
SOMs, Cntd
  • The spatial locations of the neurons in the map
    are indicative of statistical features present in
    the inputs (stimuli) →
  • self-organization

19
Kohonen Maps
  • Simple case: 2-d inputs and a 2-d output layer
  • No lateral connections
  • The weight update is done for the winning neuron
    and its surrounding neighborhood

20
Neighborhood Function
  • The neighborhood function Λ(i, i*) is maximal at
    the winner i* and drops to zero far from i*, for
    example a Gaussian:
    Λ(i, i*) = exp(−|r_i − r_{i*}|² / 2σ²)
  • The update Δw_i = η Λ(i, i*)(ξ − w_i) pulls the
    winning unit's weight vector closer to the input,
    and also drags the close neighbors of this unit
    along
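
A minimal sketch of such a Gaussian neighborhood over 2-d lattice coordinates (continuing the NumPy code above):

  def neighborhood(r_i, r_win, sigma):
      # 1 at the winner, falling smoothly to ~0 far away.
      d2 = float(np.sum((np.asarray(r_i) - np.asarray(r_win)) ** 2))
      return float(np.exp(-d2 / (2.0 * sigma ** 2)))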

21
  • The output layer is a sort of elastic net that
    wants to come as close as possible to the inputs
  • The output map conserves the topological
    relationships of the inputs
  • Both η and σ can be changed during learning, as in
    the training sketch below
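
Putting the pieces together, a hedged sketch of a complete Kohonen training loop (self-contained; the grid size and the linear decay schedules are illustrative assumptions):

  import numpy as np

  def train_som(data, grid=(10, 10), epochs=20,
                eta0=0.5, sigma0=3.0, seed=0):
      # data: (num_samples, dim) array of input vectors.
      rng = np.random.default_rng(seed)
      H, Wd = grid
      W = rng.random((H, Wd, data.shape[1]))   # lattice of weight vectors
      coords = np.stack(np.meshgrid(np.arange(H), np.arange(Wd),
                                    indexing="ij"), axis=-1)
      T, t = epochs * len(data), 0
      for _ in range(epochs):
          for x in rng.permutation(data):
              frac = 1.0 - t / T
              eta = eta0 * frac                # decaying learning rate
              sigma = sigma0 * frac + 1e-3     # shrinking neighborhood
              # Winner: lattice site with the closest weight vector.
              win = np.unravel_index(
                  np.argmin(np.linalg.norm(W - x, axis=-1)), (H, Wd))
              # Gaussian neighborhood over lattice distances.
              d2 = np.sum((coords - np.array(win)) ** 2, axis=-1)
              lam = np.exp(-d2 / (2.0 * sigma ** 2))
              # Drag the winner and its neighbors toward the input.
              W += eta * lam[..., None] * (x - W)
              t += 1
      return W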

22
Feature Mapping
23
Topological Maps in the Brain
  • Examples of topology-conserving mappings between
    input and output spaces:
  • Retinotopic mapping between the retina and the
    cortex
  • Ocular dominance
  • Somatosensory mapping (the homunculus)

24
Models
  • Goodhill (1993) proposed a model for the
    development of retinotopy and ocular dominance,
    based on Kohonen Maps
  • Two retinas project to a single layer of cortical
    neurons
  • Retinal inputs were modeled by random-dot
    patterns
  • Between-eye correlations were added to the inputs
  • The result is an ocular dominance map and a
    retinotopic map as well

26
Models, Cntd
  • Farah (1998) proposed an explanation for the
    spatial ordering of the homunculus, using a simple
    SOM
  • In the womb, the fetus lies with its hands close
    to its face and its feet close to its genitals
  • This would explain the ordering of the
    somatosensory areas in the homunculus

27
Other Models
  • Semantic self-organizing maps to model language
    acquisition
  • Kohonen feature mapping to model layered
    organization in the LGN
  • Combination of unsupervised and supervised
    learning to model complex computations in the
    visual cortex

28
Examples of Applications
  • Kohonen (1984): speech recognition - a map of
    phonemes in the Finnish language
  • Optical character recognition - clustering of
    letters of different fonts
  • Angéniol et al. (1988): the travelling salesman
    problem (an optimization problem)
  • Kohonen (1990): learning vector quantization
    (a pattern classification problem)
  • Ritter & Kohonen (1989): semantic maps

29
Summary
  • Unsupervised learning is very common
  • Unsupervised learning requires redundancy in the
    stimuli
  • Self-organization is a basic property of the
    brain's computational structure
  • SOMs are based on:
  • competition (WTA units)
  • cooperation
  • synaptic adaptation
  • SOMs conserve the topological relationships
    between the stimuli
  • Artificial SOMs have many applications in
    computational neuroscience