Connectionist Models: Basics

Transcript and Presenter's Notes
1
Connectionist Models
Adapted from Neural Basis of Thought and Language, Jerome Feldman, Spring 2007, feldman@icsi.berkeley.edu
2
(No Transcript)
3
Neural networks abstract from the details of real neurons
  • Conductivity delays are neglected
  • An output signal is either discrete (e.g., 0 or
    1) or it is a real-valued number (e.g., between 0
    and 1)
  • Net input is calculated as the weighted sum of
    the input signals
  • Net input is transformed into an output signal
    via a simple function (e.g., a threshold
    function)
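
A minimal sketch of such an abstract unit in Python, using a weighted sum of discrete inputs and a simple step threshold (the example weights and threshold are illustrative, not from the slides):

```python
def unit_output(inputs, weights, threshold):
    # Net input: weighted sum of the input signals.
    net = sum(w * x for w, x in zip(weights, inputs))
    # Output: a simple step (threshold) function of the net input.
    return 1 if net >= threshold else 0

# Illustrative unit with three inputs.
print(unit_output([1, 0, 1], [0.4, 0.9, 0.3], threshold=0.5))  # 1 (net = 0.7)
print(unit_output([0, 1, 0], [0.4, 0.9, 0.3], threshold=1.0))  # 0 (net = 0.9)
```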

4
The McCulloch-Pitts Neuron
Threshold
  • y_j: output from unit j
  • w_ij: weight on the connection from unit j to unit i
  • x_i = Σ_j w_ij · y_j: weighted sum of inputs to unit i

5
Mapping from neuron
Nervous System     | Computational Abstraction
Neuron             | Node
Dendrites          | Input link and propagation
Cell Body          | Combination function, threshold, activation function
Axon               | Output link
Spike rate         | Output
Synaptic strength  | Connection strength/weight
6
Simple Threshold Linear Unit
7
Simple Neuron Model
8
A Simple Example
  • a = x1·w1 + x2·w2 + x3·w3 + ... + xn·wn
  • a = 1·x1 + 0.5·x2 + 0.1·x3
  • x1 = 0, x2 = 1, x3 = 0
  • Net input a = 0.5
  • Threshold (bias) = 1
  • Net input − threshold < 0
  • Output = 0

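The same computation carried out in Python, using the weights, inputs, and threshold given on this slide:

```python
# Slide's example: weights (1, 0.5, 0.1), inputs (0, 1, 0), threshold (bias) 1.
weights = [1.0, 0.5, 0.1]
inputs = [0, 1, 0]
threshold = 1.0

# Net input: a = 1*0 + 0.5*1 + 0.1*0 = 0.5
a = sum(w * x for w, x in zip(weights, inputs))

# Net input minus threshold is negative, so the unit does not fire.
output = 1 if a - threshold >= 0 else 0
print(a, output)  # 0.5 0
```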
9
Simple Neuron Model
10
Simple Neuron Model
11
Simple Neuron Model
12
Simple Neuron Model
13
Different Activation Functions
  • Threshold Activation Function (step)
  • Piecewise Linear Activation Function
  • Sigmoid Activation Function
  • Gaussian Activation Function
  • Radial Basis Function
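
Illustrative Python definitions of the activation functions listed above; the exact parameterizations (thresholds, slopes, widths) are assumptions for the example, not taken from the slides:

```python
import math

def step(x, theta=0.0):
    # Threshold (step) activation.
    return 1.0 if x >= theta else 0.0

def piecewise_linear(x):
    # Linear between -1 and 1, clipped outside that range.
    return max(-1.0, min(1.0, x))

def sigmoid(x, k=1.0):
    # Logistic sigmoid; k controls the steepness.
    return 1.0 / (1.0 + math.exp(-k * x))

def gaussian(x, mu=0.0, sigma=1.0):
    # Bell-shaped response, largest when x is near mu.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
```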

14
Types of Activation functions
15
The Sigmoid Function
(plot: activation a on the y-axis against net input net_i on the x-axis)
16
The Sigmoid Function
(plot: the sigmoid saturates, with output near 1 for large net input and near 0 for large negative net input; y-axis a, x-axis net_i)
17
The Sigmoid Function
(plot: the same sigmoid, with the steep central region marking where the output is most sensitive to the input; outputs saturate at 0 and 1)
18
Changing the exponent k
(plots: sigmoid curves for k > 1 and k < 1)
19
Nice Property of Sigmoids
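The property typically meant here is that the logistic sigmoid's derivative can be written in terms of its own output, d/dx sigmoid(x) = k·sigmoid(x)·(1 − sigmoid(x)), which makes gradient computations cheap. A small Python check (function names are ours):

```python
import math

def sigmoid(x, k=1.0):
    return 1.0 / (1.0 + math.exp(-k * x))

def sigmoid_derivative(x, k=1.0):
    # Derivative expressed through the output itself: k * y * (1 - y).
    y = sigmoid(x, k)
    return k * y * (1.0 - y)

# Finite-difference check of the identity at x = 0.3.
x, h = 0.3, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(abs(numeric - sigmoid_derivative(x)) < 1e-6)  # True
```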
20
Radial Basis Function
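A minimal sketch of a Gaussian radial basis unit in Python: unlike a weighted-sum unit, its response depends on the distance between the input and a stored center (the center and width below are illustrative):

```python
import math

def rbf(x, center, width=1.0):
    # Response falls off with squared distance from the center.
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-dist_sq / (2 * width ** 2))

print(rbf([0.0, 0.0], [0.0, 0.0]))  # 1.0 (input at the center)
print(rbf([2.0, 0.0], [0.0, 0.0]))  # ~0.135 (response decays with distance)
```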
21
Stochastic units
  • Replace the binary threshold units by binary
    stochastic units that make biased random
    decisions.
  • The temperature T controls the amount of noise

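A sketch of such a unit in Python, assuming the usual formulation in which the firing probability is a sigmoid of the net input divided by the temperature T:

```python
import math
import random

def stochastic_unit(net_input, temperature=1.0):
    # Fire with probability given by a temperature-scaled sigmoid.
    p_fire = 1.0 / (1.0 + math.exp(-net_input / temperature))
    return 1 if random.random() < p_fire else 0

# High temperature -> more noise (probabilities pushed toward 0.5);
# low temperature -> nearly deterministic threshold behavior.
print(sum(stochastic_unit(0.5, temperature=5.0) for _ in range(1000)) / 1000)
print(sum(stochastic_unit(0.5, temperature=0.1) for _ in range(1000)) / 1000)
```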
22
Types of Neuron parameters
  • The form of the input function - e.g. linear,
    sigma-pi (multiplicative), cubic.
  • The activation-output relation - linear,
    hard-limiter, or sigmoidal.
  • The nature of the signals used to communicate
    between nodes - analog or boolean.
  • The dynamics of the node - deterministic or
    stochastic.

23
Computing various functions
  • McCulloch-Pitts neurons can compute logical functions.
  • AND, NOT, OR
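A minimal Python sketch showing one choice of weights and thresholds (not the only one that works) under which a single binary threshold unit computes each of these functions:

```python
# McCulloch-Pitts units: each gate is just a choice of weights and threshold.
def mp_unit(inputs, weights, threshold):
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= threshold else 0

def AND(a, b): return mp_unit([a, b], [1, 1], threshold=2)
def OR(a, b):  return mp_unit([a, b], [1, 1], threshold=1)
def NOT(a):    return mp_unit([a],    [-1],   threshold=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
print(NOT(0), NOT(1))  # 1 0
```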

24
Computing other functions: the OR function

i1  i2 | y0
 0   0 |  0
 0   1 |  1
 1   0 |  1
 1   1 |  1
  • Assume a binary threshold activation function.
  • What should you set w01, w02 and w0b to be so
    that you can get the right answers for y0?

25
Many answers would work
  • y0 = f(w01·i1 + w02·i2 + w0b·b)
  • recall the threshold function
  • the separation happens when w01·i1 + w02·i2 + w0b·b = 0
  • move things around and you get
  • i2 = −(w01/w02)·i1 − (w0b·b/w02)
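One concrete answer (among the many that work), sketched in Python; the particular values w01 = w02 = 1, w0b = −0.5 with bias input b = 1 are illustrative:

```python
# One choice of weights that answers the question for OR.
w01, w02, w0b, b = 1.0, 1.0, -0.5, 1.0

def y0(i1, i2):
    # Binary threshold at zero, as assumed on the slide.
    return 1 if (w01 * i1 + w02 * i2 + w0b * b) >= 0 else 0

for i1, i2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(i1, i2, y0(i1, i2))   # reproduces the OR truth table

# Corresponding decision line: i2 = -(w01/w02)*i1 - (w0b*b/w02) = -i1 + 0.5
```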

26
Decision Hyperplane
  • The two classes are therefore separated by the 'decision' line, which is defined by setting the activation equal to the threshold.
  • It turns out that it is possible to generalise this result to TLUs (threshold logic units) with n inputs.
  • In 3-D the two classes are separated by a
    decision-plane.
  • In n-D this becomes a decision-hyperplane.

27
Linearly separable patterns
The PERCEPTRON is an architecture that can solve this type of decision-boundary problem: an "on" response in the output node represents one class, and an "off" response represents the other.
Linearly Separable Patterns
28
The Perceptron
29
The Perceptron
Input Pattern
30
The Perceptron
Input Pattern
Output Classification
31
A Pattern Classification
32
Pattern Space
  • The space in which the inputs reside is referred
    to as the pattern space.
  • Each pattern determines a point in the space by
    using its component values as space-coordinates.
  • In general, for n-inputs, the pattern space will
    be n-dimensional.

33
The XOR Function
X1 \ X2 | X2 = 0 | X2 = 1
X1 = 0  |   0    |   1
X1 = 1  |   1    |   0
34
The Input Pattern Space
 
35
The Decision planes
 
From S. Harris Computer Cartoons
http://www.sciencecartoonsplus.com/galcomp2.htm
36
Multi-layer Feed-forward Network
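A sketch in Python of a two-layer feed-forward network of threshold units that computes XOR, the function a single-layer unit cannot separate; the hidden-unit weights are one illustrative choice:

```python
def unit(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def xor(x1, x2):
    h1 = unit([x1, x2], [1, 1], threshold=1)     # hidden unit: OR
    h2 = unit([x1, x2], [-1, -1], threshold=-1)  # hidden unit: NAND
    return unit([h1, h2], [1, 1], threshold=2)   # output unit: AND

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor(x1, x2))   # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```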
37
Pattern Separation and NN architecture
38
Triangle nodes and McCulloch-Pitts neurons?
(diagram: a triangle node connecting units A, B, and C)
39
Representing concepts using triangle nodes
Triangle nodes: when two of the neurons fire, the third also fires.
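A toy Python sketch of the triangle-node behavior described above: if any two of the three bound units are active, the third is driven on as well (the names and set representation are ours):

```python
def triangle_update(active):
    """active: set of currently firing units among {'A', 'B', 'C'}."""
    units = {'A', 'B', 'C'}
    if len(active & units) >= 2:
        return units          # the remaining unit is recruited
    return set(active)

print(triangle_update({'A', 'B'}))  # all three become active
print(triangle_update({'C'}))       # only C stays active
```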
40
Basic Ideas
  • Parallel activation streams.
  • Top down and bottom up activation combine to
    determine the best matching structure.
  • Triangle nodes bind features of objects to values
  • Mutual inhibition and competition between
    structures
  • Mental connections are active neural connections

41
Bottom-up vs. Top-down Processes
  • Bottom-up: when processing is driven by the stimulus
  • Top-down: when knowledge and context are used to assist and drive processing
  • Interaction: the stimulus is the basis of processing, but almost immediately top-down processes are initiated

42
Stroop Effect
  • Interference between form and meaning

43
Name the words
  • Book Car Table Box Trash Man Bed
  • Corn Sit Paper Coin Glass House Jar
  • Key Rug Cat Doll Letter Baby Tomato
  • Check Phone Soda Dish Lamp Woman

44
Name the print color of the words
  • Blue Green Red Yellow Orange Black Red
  • Purple Green Red Blue Yellow Black Red
  • Green White Blue Yellow Red Black Blue
  • White Red Yellow Green Black Purple

45
Connectionist Model: McClelland & Rumelhart (1981)
  • Knowledge is distributed and processing occurs in
    parallel, with both bottom-up and top-down
    influences
  • This model can explain the Word-Superiority
    Effect because it can account for context effects

46
Connectionist Model of Word Recognition
47
Do rhymes compete?
  • Cohort (Marslen-Wilson): onset similarity is primary because of the incremental nature of speech
  • "Cat" activates cap, cast, cattle, camera, etc.
  • NAM (Neighborhood Activation Model): global similarity is primary
  • "Cat" activates bat, rat, cot, cast, etc.
  • TRACE (McClelland & Elman): global similarity constrained by the incremental nature of speech

48
Do rhymes compete?
  • Temporal Sequence Learning in LTM
  • global similarity constrained by incremental
    nature of speech

49
A 2-step Lexical Model
50
Linking memory and tasks
From S. Harris Computer Cartoons
http://www.sciencecartoonsplus.com/galcomp2.htm
51
Distributed vs. Local Representation

Distributed representation:
John    1 1 0 0
Paul    0 1 1 0
George  0 0 1 1
Ringo   1 0 0 1

Local representation:
John    1 0 0 0
Paul    0 1 0 0
George  0 0 1 0
Ringo   0 0 0 1

  • What happens if you want to represent a group?
  • How many persons can you represent with n bits? 2^n
  • What happens if one neuron dies?
  • How many persons can you represent with n bits? n
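A small Python illustration of the counting claims above: with n units, a local (one-unit-per-item) code gives n distinct representations, while a distributed code gives 2^n patterns over the same units:

```python
from itertools import product

n = 4  # number of units/bits, as in the Beatles example above

# Local coding: one dedicated unit per person -> n representations.
local_codes = [[1 if i == j else 0 for j in range(n)] for i in range(n)]

# Distributed coding: any binary pattern over the n units is usable -> 2**n.
distributed_codes = list(product([0, 1], repeat=n))

print(len(local_codes), len(distributed_codes))  # 4 16
```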

52
Visual System
  • 1000 x 1000 visual map
  • For each location, encode
  • orientation
  • direction of motion
  • speed
  • size
  • color
  • depth
  • Blows up combinatorially!
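An illustrative back-of-the-envelope count in Python; the number of distinct values per feature is an assumption made up for the example, not taken from the slide:

```python
# If every location needed a dedicated unit for each conjunction of feature
# values, the numbers explode (feature value counts below are assumed).
locations = 1000 * 1000
values_per_feature = {"orientation": 8, "direction": 8, "speed": 4,
                      "size": 4, "color": 8, "depth": 4}

conjunctions_per_location = 1
for v in values_per_feature.values():
    conjunctions_per_location *= v

print(conjunctions_per_location)               # 32768 conjunctions per location
print(locations * conjunctions_per_location)   # ~3.3e10 dedicated units
```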

53
Computational Model of Object Recognition (Riesenhuber and Poggio, 1999)
54
Eye Movements: Beyond Feedforward Processing
  • 1) Examine scene freely
  • 2) Estimate material circumstances of family
  • 3) Give ages of the people
  • 4) Surmise what family has been doing before arrival of unexpected visitor
  • 5) Remember clothes worn by the people
  • 6) Remember position of people and objects
  • 7) Estimate how long the unexpected visitor has been away from family

55
How does activity lead to structural change?
  • The brain (pre-natal, post-natal, and adult) exhibits a surprising degree of activity-dependent tuning and plasticity.
  • To understand the nature and limits of these tuning and plasticity mechanisms, we study how activity is converted into structural changes (e.g., ocular dominance column formation).
  • It is centrally important to understand these
    mechanisms to arrive at biological accounts of
    perceptual, motor, cognitive and language
    learning
  • Biological Learning is concerned with this topic.