Title: Connectionist Models: Basics
1 Connectionist Models
Adapted from Neural Basis of Thought and Language, Jerome Feldman, Spring 2007, feldman@icsi.berkeley.edu
2 (No transcript)
3 Neural networks abstract from the details of real neurons
- Conductivity delays are neglected
- An output signal is either discrete (e.g., 0 or 1) or a real-valued number (e.g., between 0 and 1)
- Net input is calculated as the weighted sum of the input signals
- Net input is transformed into an output signal via a simple function (e.g., a threshold function)
4 The McCulloch-Pitts Neuron
- y_j: output from unit j
- w_ij: weight on the connection from j to i
- x_i: weighted sum of the inputs to unit i, x_i = Σ_j w_ij · y_j
- The unit outputs 1 when x_i reaches the threshold, and 0 otherwise
5 Mapping from neuron
Nervous System → Computational Abstraction
- Neuron → Node
- Dendrites → Input link and propagation
- Cell Body → Combination function, threshold, activation function
- Axon → Output link
- Spike rate → Output
- Synaptic strength → Connection strength/weight
6 Simple Threshold Linear Unit
7 Simple Neuron Model
(figure: diagram of the simple neuron model)
8 A Simple Example
- a = x1·w1 + x2·w2 + x3·w3 + ... + xn·wn
- a = 1·x1 + 0.5·x2 + 0.1·x3
- x1 = 0, x2 = 1, x3 = 0
- Net input = 0.5
- Threshold (bias) = 1
- Net input − threshold < 0
- Output = 0
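A minimal sketch of this computation, using the weights, inputs, and threshold from the example above:

```python
# Threshold unit from the example above; weights, inputs, and threshold are the slide's values.

def threshold_unit(inputs, weights, threshold):
    """Weighted sum of inputs, passed through a hard threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

weights = [1.0, 0.5, 0.1]   # w1, w2, w3
inputs = [0, 1, 0]          # x1, x2, x3
threshold = 1.0             # threshold (bias)

net = sum(x * w for x, w in zip(inputs, weights))
print(net)                                          # 0.5
print(threshold_unit(inputs, weights, threshold))   # 0, since 0.5 - 1 < 0
```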
9–12 Simple Neuron Model
(figure sequence: the same simple neuron diagram evaluated on several binary input patterns, showing the resulting 0/1 activations)
13 Different Activation Functions
- Threshold Activation Function (step)
- Piecewise Linear Activation Function
- Sigmoid Activation Function
- Gaussian Activation Function
- Radial Basis Function
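A rough sketch of these activation functions; the parameter defaults and the piecewise-linear breakpoints are illustrative choices, not taken from the slides:

```python
import math

def step(x, theta=0.0):
    """Threshold (step) activation."""
    return 1.0 if x >= theta else 0.0

def piecewise_linear(x):
    """Piecewise linear: a ramp clamped to [0, 1] (breakpoints chosen for illustration)."""
    return max(0.0, min(1.0, 0.5 + 0.5 * x))

def sigmoid(x, k=1.0):
    """Sigmoid activation; k is the gain (see the 'changing the exponent k' slide)."""
    return 1.0 / (1.0 + math.exp(-k * x))

def gaussian(x, mu=0.0, sigma=1.0):
    """Gaussian activation / 1-D radial basis function centred at mu."""
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

for net in (-2.0, 0.0, 2.0):
    print(net, step(net), piecewise_linear(net), round(sigmoid(net), 3), round(gaussian(net), 3))
```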
14 Types of Activation functions
15–17 The Sigmoid Function
(figure sequence: the sigmoid curve, with activation y = a on the vertical axis and net input net_i on the horizontal axis; the output approaches 1 for large positive net input and 0 for large negative net input, and the steep middle region is where the unit is most sensitive to its input)
18 Changing the exponent k
(figure: sigmoid curves for k > 1 and k < 1)
19 Nice Property of Sigmoids
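A property often highlighted for sigmoids is that the derivative can be computed from the output alone, y' = y(1 − y), which is convenient when deriving learning rules; assuming that is the property meant here, a quick numerical check:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# If y = sigmoid(x), then dy/dx = y * (1 - y).
x = 0.7
y = sigmoid(x)
analytic = y * (1.0 - y)
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(analytic, numeric)   # the two values agree to several decimal places
```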
20 Radial Basis Function
21 Stochastic units
- Replace the binary threshold units by binary stochastic units that make biased random decisions.
- The temperature controls the amount of noise.
(figure: the firing-probability formula, with the temperature labelled)
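A minimal sketch of such a unit, assuming the standard logistic form in which the firing probability depends on the net input divided by a temperature T:

```python
import math
import random

def stochastic_unit(net, temperature=1.0):
    """Binary stochastic unit: fires with probability given by a logistic of net / T.
    Higher temperature pushes probabilities toward 0.5 (more noise);
    as T approaches 0 the unit behaves like a deterministic threshold unit."""
    p_fire = 1.0 / (1.0 + math.exp(-net / temperature))
    return 1 if random.random() < p_fire else 0

# Same net input, different temperatures: low T is nearly deterministic, high T is noisy.
for T in (0.1, 1.0, 5.0):
    samples = [stochastic_unit(0.5, T) for _ in range(1000)]
    print(T, sum(samples) / len(samples))
```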
22 Types of Neuron parameters
- The form of the input function - e.g. linear, sigma-pi (multiplicative; see the sketch after this list), cubic.
- The activation-output relation - linear, hard-limiter, or sigmoidal.
- The nature of the signals used to communicate between nodes - analog or boolean.
- The dynamics of the node - deterministic or stochastic.
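The input-function distinction in the first bullet can be made concrete with a small sketch; the grouping of inputs into multiplied pairs is an illustrative assumption, not something specified on the slide:

```python
def linear_input(inputs, weights):
    """Linear input function: weighted sum of individual inputs."""
    return sum(w * x for w, x in zip(weights, inputs))

def sigma_pi_input(input_groups, weights):
    """Sigma-pi input function: weighted sum of products of groups of inputs."""
    def product(group):
        result = 1.0
        for x in group:
            result *= x
        return result
    return sum(w * product(group) for w, group in zip(weights, input_groups))

x = [1.0, 0.5, 2.0]
print(linear_input(x, [0.2, 0.4, 0.1]))                          # 0.2*1 + 0.4*0.5 + 0.1*2 = 0.6
print(sigma_pi_input([(x[0], x[1]), (x[1], x[2])], [0.5, 0.5]))  # 0.5*(1*0.5) + 0.5*(0.5*2) = 0.75
```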
23 Computing various functions
- McCulloch-Pitts neurons can compute logical functions: AND, NOT, OR.
24 Computing other functions: the OR function

i1  i2  y0
0   0   0
0   1   1
1   0   1
1   1   1

- Assume a binary threshold activation function.
- What should you set w01, w02 and w0b to be so that you can get the right answers for y0? (One workable setting is sketched below.)
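One weight setting that answers the question (the next slide notes that many would work); the particular numbers are an illustrative choice:

```python
# One of many weight settings that compute OR with a binary threshold unit.
# w01 = w02 = 1.0, bias weight w0b = -0.5 on a bias input b = 1, threshold at 0.
w01, w02, w0b, b = 1.0, 1.0, -0.5, 1.0

def or_unit(i1, i2):
    net = w01 * i1 + w02 * i2 + w0b * b
    return 1 if net >= 0 else 0

for i1 in (0, 1):
    for i2 in (0, 1):
        print(i1, i2, or_unit(i1, i2))   # reproduces the OR truth table above

# The corresponding decision line is i2 = -(w01/w02)*i1 - (w0b*b)/w02, i.e. i2 = -i1 + 0.5.
```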
25 Many answers would work
- y = f(w01·i1 + w02·i2 + w0b·b)
- Recall the threshold function.
- The separation happens when w01·i1 + w02·i2 + w0b·b = 0.
- Move things around and you get: i2 = −(w01/w02)·i1 − (w0b·b)/w02
26 Decision Hyperplane
- The two classes are therefore separated by the 'decision' line, which is defined by putting the activation equal to the threshold.
- It turns out that it is possible to generalise this result to TLUs with n inputs.
- In 3-D the two classes are separated by a decision plane.
- In n-D this becomes a decision hyperplane.
27 Linearly separable patterns
The PERCEPTRON is an architecture which can solve this type of decision boundary problem. An "on" response in the output node represents one class, and an "off" response represents the other.
(figure: linearly separable patterns)
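A minimal sketch of the on/off classification just described, with illustrative weights and data points that are not taken from the slides:

```python
# Perceptron as a single threshold unit: "on" output for one class, "off" for the other.
weights = [1.0, 1.0]
bias = -1.5   # decision line: x1 + x2 = 1.5 (illustrative choice)

def perceptron(pattern):
    net = sum(w * x for w, x in zip(weights, pattern)) + bias
    return 1 if net >= 0 else 0   # "on" vs "off" response

class_a = [(1.0, 1.0), (1.2, 0.9)]   # points above the line: "on"
class_b = [(0.0, 0.0), (0.3, 0.5)]   # points below the line: "off"
for p in class_a + class_b:
    print(p, perceptron(p))
```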
28–30 The Perceptron
(figure sequence: the perceptron architecture mapping an input pattern to an output classification)
31 A Pattern Classification
32 Pattern Space
- The space in which the inputs reside is referred to as the pattern space.
- Each pattern determines a point in the space by using its component values as space coordinates.
- In general, for n inputs, the pattern space will be n-dimensional.
33 The XOR Function

X1 \ X2   X2 = 0   X2 = 1
X1 = 0       0        1
X1 = 1       1        0
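XOR is not linearly separable, so no single threshold unit reproduces this table; a two-layer network of threshold units can, which is where the multi-layer feed-forward slides below are heading. A sketch using one hidden unit for OR and one for AND (a standard construction chosen here for illustration):

```python
def threshold(net):
    return 1 if net >= 0 else 0

def xor_net(x1, x2):
    # Hidden layer: one unit computes OR, one computes AND (weights chosen for illustration).
    h_or  = threshold(x1 + x2 - 0.5)
    h_and = threshold(x1 + x2 - 1.5)
    # Output unit: fires when OR is on but AND is off, i.e. exactly one input is 1.
    return threshold(h_or - h_and - 0.5)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_net(x1, x2))   # reproduces the XOR table above
```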
34 The Input Pattern Space
35 The Decision planes
From S. Harris Computer Cartoons, http://www.sciencecartoonsplus.com/galcomp2.htm
36 Multi-layer Feed-forward Network
37 Pattern Separation and NN architecture
38 Triangle nodes and McCulloch-Pitts Neurons?
(figure: a triangle node linking three units A, B, and C)
39 Representing concepts using triangle nodes
- Triangle nodes: when two of the neurons fire, the third also fires.
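A minimal sketch of the triangle-node behaviour just described; representing it as a plain function over three binary activations is an illustrative simplification:

```python
def triangle_node(a, b, c):
    """If at least two of the three connected units are active, the triangle node
    completes the pattern by driving the remaining unit on as well."""
    if a + b + c >= 2:
        return (1, 1, 1)   # two units firing recruit the third
    return (a, b, c)       # otherwise activations are left unchanged

print(triangle_node(1, 1, 0))   # (1, 1, 1): two units fire, the third follows
print(triangle_node(1, 0, 0))   # (1, 0, 0): a single active unit does not recruit the others
```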
40 Basic Ideas
- Parallel activation streams.
- Top-down and bottom-up activation combine to determine the best matching structure.
- Triangle nodes bind features of objects to values.
- Mutual inhibition and competition between structures.
- Mental connections are active neural connections.
41 Bottom-up vs. Top-down Processes
- Bottom-up: when processing is driven by the stimulus.
- Top-down: when knowledge and context are used to assist and drive processing.
- Interaction: the stimulus is the basis of processing, but almost immediately top-down processes are initiated.
42 Stroop Effect
- Interference between form and meaning
43 Name the words
- Book Car Table Box Trash Man Bed
- Corn Sit Paper Coin Glass House Jar
- Key Rug Cat Doll Letter Baby Tomato
- Check Phone Soda Dish Lamp Woman
44 Name the print color of the words
- Blue Green Red Yellow Orange Black Red
- Purple Green Red Blue Yellow Black Red
- Green White Blue Yellow Red Black Blue
- White Red Yellow Green Black Purple
45 Connectionist Model: McClelland & Rumelhart (1981)
- Knowledge is distributed and processing occurs in parallel, with both bottom-up and top-down influences.
- This model can explain the Word-Superiority Effect because it can account for context effects.
46 Connectionist Model of Word Recognition
47 Do rhymes compete?
- Cohort (Marslen-Wilson)
  - Onset similarity is primary because of the incremental nature of speech.
  - Cat activates cap, cast, cattle, camera, etc.
- NAM (Neighborhood Activation Model)
  - Global similarity is primary.
  - Cat activates bat, rat, cot, cast, etc.
- TRACE (McClelland & Elman)
  - Global similarity constrained by the incremental nature of speech.
48 Do rhymes compete?
- Temporal Sequence Learning in LTM
  - Global similarity constrained by the incremental nature of speech.
49 A 2-step Lexical Model
50 Linking memory and tasks
From S. Harris Computer Cartoons, http://www.sciencecartoonsplus.com/galcomp2.htm
51 Distributed vs Local Representation

Distributed representation:
John    1 1 0 0
Paul    0 1 1 0
George  0 0 1 1
Ringo   1 0 0 1

Local (one-hot) representation:
John    1 0 0 0
Paul    0 1 0 0
George  0 0 1 0
Ringo   0 0 0 1

- What happens if you want to represent a group?
- How many persons can you represent with n bits? 2^n with a distributed code.
- What happens if one neuron dies?
- How many persons can you represent with n bits? n with a local code.
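A small sketch of the capacity contrast above: with n binary units a local (one-hot) code distinguishes only n items, while a distributed code can in principle distinguish 2^n patterns, and superimposing distributed codes to represent a group introduces cross-talk. The Beatles codes are the ones from the table; the superposition step is illustrative:

```python
from itertools import product

n = 4
local_patterns = [tuple(1 if i == j else 0 for i in range(n)) for j in range(n)]
distributed_capacity = len(list(product((0, 1), repeat=n)))
print(len(local_patterns), "items with a local code,", distributed_capacity, "patterns with a distributed code")

# Representing a group by superimposing two distributed codes from the table:
john, paul = (1, 1, 0, 0), (0, 1, 1, 0)
group = tuple(max(a, b) for a, b in zip(john, paul))
print(group)   # (1, 1, 1, 0): the group pattern now shares an active unit with George's code too,
               # illustrating the cross-talk that superposition introduces
```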
52 Visual System
- 1000 x 1000 visual map
- For each location, encode:
  - orientation
  - direction of motion
  - speed
  - size
  - color
  - depth
- Blows up combinatorially! (a back-of-the-envelope sketch follows)
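A back-of-the-envelope sketch of the blow-up; the number of distinguishable values assumed for each feature is purely illustrative and not taken from the slide:

```python
# Hypothetical number of distinguishable values per feature; illustrative assumptions only.
values_per_feature = {
    "orientation": 8,
    "direction_of_motion": 8,
    "speed": 4,
    "size": 4,
    "color": 8,
    "depth": 4,
}

locations = 1000 * 1000   # the 1000 x 1000 visual map from the slide
combos_per_location = 1
for v in values_per_feature.values():
    combos_per_location *= v

print(combos_per_location)               # 32768 feature combinations per location (under these assumptions)
print(locations * combos_per_location)   # one unit per conjunction would need roughly 3.3e10 units
```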
53 Computational Model of Object Recognition (Riesenhuber and Poggio, 1999)
(figure: hierarchical model with invariance stages)
54 Eye Movements: Beyond Feedforward Processing
- 1) Examine scene freely
- 2) Estimate material circumstances of family
- 3) Give ages of the people
- 4) Surmise what family has been doing before arrival of unexpected visitor
- 5) Remember clothes worn by the people
- 6) Remember position of people and objects
- 7) Estimate how long the unexpected visitor has been away from family
55 How does activity lead to structural change?
- The brain (pre-natal, post-natal, and adult) exhibits a surprising degree of activity-dependent tuning and plasticity.
- To understand the nature and limits of the tuning and plasticity mechanisms, we study how activity is converted to structural changes (say, ocular dominance column formation).
- It is centrally important to understand these mechanisms to arrive at biological accounts of perceptual, motor, cognitive and language learning.
- Biological Learning is concerned with this topic.