Computing Spike by Spike - PowerPoint PPT Presentation


Transcript and Presenter's Notes


1
Computing Spike by Spike
Klaus Pawelzik, David Rotermund, Udo Ernst
University of Bremen
  • The brain's high performance despite constraints
  • The brain: a generative machine?
  • Dynamics from on-line estimation.
  • On-line learning rule for spike prediction.
  • Biological implementation.
  • A model for the cortex?

2
Perception: rapid, robust, sometimes optimal
Mandon, Ernst, Pawelzik, Kreiter, 2003; U. Ernst, priv. comm.; method: Williams & Thornber, Neural Computation, 2001.
S. Thorpe et al., Nature, 1996
3
The brain as a generative model
4
Judgements and prejudice
5
(Diagram: the world → a parametrization → explanations, via estimation)
6
Generative models are universal
7
Decision making in the ideal world and in the real world
Slide borrowed from Rob de Ruyter van Steveninck
8
The cortex: complex in space and time
Shadlen & Newsome, 1994
J. Nolte, 1988
9
What if rate is all there is?
Probability of next spike at neuron s_t
Bernoulli code
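If rate is all there is, the exact spike times carry no information beyond which neuron fired next. A minimal sketch of this Bernoulli code (the rate values and the normalization p(s) = r_s / Σ_j r_j are illustrative assumptions, not taken verbatim from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Poisson firing rates of N = 4 neurons (illustrative values).
rates = np.array([5.0, 20.0, 10.0, 65.0])

# Given that *some* neuron fires the next spike, its identity s_t follows
# p(s) = r_s / sum_j r_j: the Bernoulli code keeps only the sequence of
# spike identities, discarding the exact spike times.
p = rates / rates.sum()

# Observed sequence of spike identities s_t.
spikes = rng.choice(len(rates), size=1000, p=p)
```

For long sequences the empirical frequency of each identity approaches p, so the rates remain decodable from identities alone.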
10
An elementary generative scheme for local networks
11
Generation cause by cause
Dynamics: maximize likelihood w.r.t. internal variables
Learning: maximize likelihood w.r.t. weights
Constraints: positivity, normalization!
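The likelihood being maximized can be written as a mixture over hidden causes. A minimal numerical sketch, assuming the mixture form p(s) = Σ_i h_i W[s, i] together with the stated positivity and normalization constraints (variable names are mine, not the slides'):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 6, 2                          # neurons, hidden causes

# W[s, i] = p(spike at neuron s | cause i); each column is a distribution
# over neurons (the normalization constraint on the weights).
W = rng.random((N, M))
W /= W.sum(axis=0, keepdims=True)

# Internal variables: h_i >= 0, sum_i h_i = 1 (positivity + normalization).
h = np.array([0.3, 0.7])

# Probability of the next spike under the generative model.
p_spike = W @ h

# Log-likelihood of an observed spike sequence; the dynamics maximize this
# w.r.t. h, learning maximizes it w.r.t. W.
spike_ids = rng.choice(N, size=50, p=p_spike)
log_L = np.sum(np.log(p_spike[spike_ids]))
```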
12
Estimation spike-by-spike
Update
Learning
Hebbian rule maximizes likelihood (cf. Lee & Seung, 1999)
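The update and learning steps can be sketched as multiplicative, Hebbian-style rules in the spirit of Lee & Seung's algorithm. The exact form below (with learning rate ε as on slide 15) is my reconstruction under the mixture model above, not a verbatim copy of the slides:

```python
import numpy as np

def sbs_update(h, W, s, eps=0.1):
    """One on-line estimation step after observing a spike at neuron s.
    Each cause's weight h_i is scaled by how well it predicts s, mixed
    with the old estimate; positivity and sum(h) == 1 are preserved."""
    h = (1 - eps) * h + eps * h * W[s] / (h @ W[s])
    return h / h.sum()

def learn_step(W, h, s, eta=0.01):
    """Hebbian sketch of learning: strengthen W[s, i] in proportion to the
    current responsibility h_i for the observed spike s, then renormalize
    each cause's distribution over neurons (Lee & Seung-style)."""
    W = W.copy()
    W[s] += eta * h
    return W / W.sum(axis=0, keepdims=True)

# Demo: estimate hidden causes on-line from a stream of spikes.
rng = np.random.default_rng(2)
N, M = 6, 2
W_true = rng.random((N, M))
W_true /= W_true.sum(axis=0, keepdims=True)
h_true = np.array([0.8, 0.2])

h = np.full(M, 1.0 / M)                      # uninformative initial estimate
for s in rng.choice(N, size=2000, p=W_true @ h_true):
    h = sbs_update(h, W_true, s)             # estimate spike by spike
W_new = learn_step(W_true, h, s)             # one illustrative learning step
```

With small ε the estimate tracks the causes driving the spike stream; the renormalization in both rules enforces the positivity and normalization constraints of slide 11.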
13
Estimating causes on-line from stochastic spikes
14
Estimating explanations
15
Sparseness from iterating priors
M = 2, N = 100, ε = 0.1
16
Functions as generations
Learning
Execution
17
Representing Boolean functions
18
Results for 5 bit Boolean functions
Learning
Performance
19
Application: USPS handwritten digits
Learned weights
20
Digit classification with less than a spike per neuron
N = 500, batch learning
N = 100, weights annealed
Nearest neighbor
21
Combining basic networks. Example: 16-bit function
Layers
22
What've neurons got to do with it?
Context: task, expectation, attention
Output
Input
23
The neuron is a dynamical system
24
Compartmental realization
Passive dendritic potential
Inhibition
Active somatic potential
R
V
threshold → spike → reset
Spike!
25
Biophysical solution of the X-OR problem
Spikes !
26
Optimal representation of natural images ...
Ringach (2002)
... depends on speed!
27
Summary
Hypothesis: networks predict their inputs.
Constraints: neurons, spikes, speed, noise.
Model: spike-based generative model with hidden variables.
Trick: Poisson → Bernoulli process.
Algorithm: on-line estimation of hidden parameters from each spike.
Applications: Boolean functions: 100% performance, one spike/neuron enough. Handwritten digits: batch learning yields mixtures of templates and local features, 1 spike per neuron enough.
Cortex model: canonical microcircuit with two compartments per neuron and lateral inhibition realizes the algorithm; explains the speed of cortical computation with rate coding.
Natural images: weights resemble simple receptive fields; orientation tuning is contrast invariant.
Context sensitivity and characteristic dynamics are predictions for experiments.
28
Acknowledgements
Misha Tsodyks (Weizmann Institute), Matthias Bethge (Redwood Institute), Bailu Si (University of Bremen)
29
Thanks!