Transcript and Presenter's Notes

Title: Modeling How Bats Capture Targets


1
Modeling How Bats Capture Targets
  • Harry R. Erwin, PhD
  • Hybrid Intelligent Systems Group
  • School of Computing and Technology
  • University of Sunderland

2
Well, How Do Bats Capture Targets?
  • Figure from Webster and Brazier, Experimental
    Studies on Target Detection, Evaluation and
    Interception by Echo-locating Bats, 1965.
  • A bat (Myotis lucifugus) capturing a moth in
    foliage.
  • 100 millisecond intervals.
  • The bat had first detected the tree about 500
    milliseconds before the first image.
  • Data available to the bat: a few biosonar
    snapshots in the dark.

3
Evidence for Predictive Target Capture in Bats
  • Definition: "non-predictive" uses the current
    state estimate or last known target localization
    to control the capture, e.g., simple homing, lead
    pursuit, lag pursuit.
  • Antonym: "predictive" uses models of the target's
    motion and of the bat's self-motion to select a
    capture strategy to optimize the capture
    probability.
  • For the evidence, see Campbell, K.A. and R.A.
    Suthers, Predictive tracking of horizontally
    moving targets by the fishing bat, Noctilio
    leporinus, in Animal Sonar Processes and
    Performance, P.E. Nachtigall and P.W. Moore,
    Editors. 1988, Plenum Press. p. 501-506.

4
Modeling trajectory prediction using neural
microcircuits
  • Based on
  • H. Jaeger and H. Haas, "Harnessing nonlinearity:
    predicting chaotic systems and saving energy in
    wireless communication," Science, vol. 304,
    pp. 78-80, 2004.
  • W. Maass, T. Natschläger, and H. Markram,
    "Real-time computing without stable states a new
    framework for neural computation based on
    perturbations," Neural Computation, vol. 14, pp.
    2531-2560, 2002, and
  • N. Bertschinger and T. Natschläger, "Real-time
    computation at the edge of chaos in recurrent
    neural networks," Neural Computation, vol. 16,
    pp. 1413-1436, 2004.

5
Background
  • Natschläger et al. (2002) suggest that the
    stereotypical neural microcircuits of the cortex
    may be the computational units of the brain.
  • These microcircuits appear well-adapted to
    handling continuous streams of information, but
    sensorimotor integration in actively echolocating
    bats requires a computational unit that can
    generate a continuous output stream representing
    the location of a target from asynchronously
    received discrete echo returns.
  • Hence this research.

6
A Simple Task Performed by Bats
  • Model nonstationary target acceleration,
    velocity, and position accurately enough to be
    able to approach the target within 5-10
    centimeters.
  • Handle asynchronous echo return timing (with
    inter-cry intervals ranging over 2-3 orders of
    magnitude).
  • Predict forward over a variable time interval
    ranging up to a second. (This can be estimated as
    the distance to the target divided by the closing
    rate; see the sketch below.)
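
As a rough illustration of that prediction step (not part of the original model; the function name, the constant-acceleration assumption, and the example numbers are all hypothetical), the sketch below estimates the prediction horizon as range divided by closing rate and extrapolates the target state forward:

    import numpy as np

    def predict_target(position, velocity, acceleration,
                       range_to_target, closing_rate):
        """Extrapolate a target state over the estimated time-to-capture."""
        # Prediction horizon: distance to the target divided by the closing rate.
        horizon = range_to_target / closing_rate
        # Constant-acceleration dead reckoning over that horizon (an assumption;
        # the slides only require prediction over a variable interval).
        predicted = position + velocity * horizon + 0.5 * acceleration * horizon**2
        return horizon, predicted

    # Example: target 0.5 m away, closing at 2 m/s -> 0.25 s prediction horizon.
    t, p = predict_target(np.array([0.5, 0.2]), np.array([-0.3, 0.1]),
                          np.array([0.0, -0.05]), 0.5, 2.0)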

7
Modeling Approach
  • Used an Echo State Machine (Jaeger and Haas,
    2004). All cells were standard perceptrons with a
    sigmoidal output.
  • Target trajectory was represented by a pair of
    arrays of place cells organized in general
    Cartesian coordinates.
  • Sensory afference was structured similarly, but
    the input place cells signaled only when a return
    was received; a separate flag indicated that a
    return arrived during an interval.
  • Echo reservoir with 100-1000 cells.
  • Output place cells continuously reflected input
    from the echo reservoir cells, input place cells,
    previous output place cell values, the flag
    variable, and constant bias (see the sketch
    below).
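
A minimal sketch of this architecture, assuming a tanh reservoir and illustrative sizes and names (the slides specify only perceptron-like cells with sigmoidal outputs; nothing below is the authors' code):

    import numpy as np

    n_place, n_res = 200, 500   # 100 place cells per axis (X, Y); 100-1000 reservoir cells
    rng = np.random.default_rng(0)

    W_in  = rng.normal(scale=0.1, size=(n_res, n_place + 1))  # input place cells + return flag
    W_res = rng.normal(scale=0.1, size=(n_res, n_res))        # recurrent reservoir weights
    W_fb  = rng.normal(scale=0.1, size=(n_res, n_place))      # feedback from output place cells
    n_feat = n_res + n_place + 1 + n_place + 1                # reservoir, inputs, flag, prev outputs, bias
    W_out = np.zeros((n_place, n_feat))                       # trained later (see slide 11)

    def step(x, u, flag, y_prev):
        """One update: reservoir state x, input place cells u, return flag, previous outputs y_prev."""
        x_new = np.tanh(W_in @ np.append(u, flag) + W_res @ x + W_fb @ y_prev)
        # Output place cells see the reservoir, the input place cells, the flag,
        # the previous output place-cell values, and a constant bias.
        z = np.concatenate([x_new, u, [flag], y_prev, [1.0]])
        y_new = 1.0 / (1.0 + np.exp(-(W_out @ z)))            # sigmoidal output units
        return x_new, y_new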

8
Initial Parameterization
  • Based on behavioural data.
  • 100 place cells in each of X and Y dimensions
    over a 1 meter interval.
  • Radius of target motion: 25-35 cm, uniformly
    distributed (UD).
  • Center of motion: UD over a 30x30 cm square.
  • Angular velocity: 5-7 radians/sec in either
    direction, UD.
  • 20 msec interval.
  • Initial phase: UD between 0 and 2π.
  • Cry rate: 12-15.6 cries/sec, UD. (A trajectory-
    generation sketch follows below.)
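
One way to generate matching training trajectories (a sketch under these distributions; the circular-motion reading of the radius/phase parameters, the centring within a 1 m workspace, and all names are assumptions):

    import numpy as np

    rng = np.random.default_rng(1)

    def sample_trajectory(duration=2.0, dt=0.02):
        """Circular target motion plus asynchronous cry/echo times (slide 8 parameters)."""
        radius = rng.uniform(0.25, 0.35)                          # 25-35 cm
        center = rng.uniform(0.35, 0.65, size=2)                  # UD over a 30x30 cm square
        omega  = rng.choice([-1.0, 1.0]) * rng.uniform(5.0, 7.0)  # 5-7 rad/s, either direction
        phase  = rng.uniform(0.0, 2.0 * np.pi)                    # initial phase in [0, 2*pi)
        t = np.arange(0.0, duration, dt)                          # 20 msec update interval
        xy = center + radius * np.stack([np.cos(omega * t + phase),
                                         np.sin(omega * t + phase)], axis=1)
        # Echo returns arrive asynchronously at 12-15.6 cries/sec.
        cry_times, now = [], 0.0
        while True:
            now += 1.0 / rng.uniform(12.0, 15.6)
            if now >= duration:
                break
            cry_times.append(now)
        return t, xy, np.array(cry_times)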

9
Model Structure
[Diagram: Input Array, Echo State Reservoir, and Output Array; the circle on the reservoir indicates internal feedback]
10
Programming the Echo State Machine
  • Initialized with random synaptic weight values
    (other than those onto the output cells).
  • Took the internal synaptic weight values for the
    echo state reservoir and computed the spectral
    radius (treating the synaptic weights as defining
    a linear transformation). Scaled the reservoir
    weights to bring the spectral radius to 0.98.
    (This value determines how quickly the internal
    state fades; see the sketch below.)
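
A minimal sketch of that scaling step (numpy-based; the reservoir size and variable names are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)
    W_res = rng.normal(scale=0.1, size=(500, 500))   # random reservoir weights

    # Spectral radius = largest absolute eigenvalue of the recurrent weight
    # matrix, i.e. of the linear transformation the weights define.
    rho = np.max(np.abs(np.linalg.eigvals(W_res)))

    # Rescale so the spectral radius becomes 0.98; this sets how quickly the
    # reservoir's internal state fades.
    W_res *= 0.98 / rho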

11
Training the Echo State Machine
  • Ran the system to generate a large number of cell
    activation sequences with the desired input values
    and with the output cells clamped to their
    corresponding correct values. (Note: this assumed
    that state was an attractor!) Sampled only after
    the initialization transient for each sequence.
  • Used least squares estimation to determine the
    weights at the output cells needed to produce the
    clamped values (see the sketch below). As long as
    the resulting system was robust, it was expected
    to converge to the desired behavior.
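
A sketch of that readout fit (teacher forcing with clamped outputs, then a least-squares solve on states sampled after the transient; it reuses the hypothetical shapes from the earlier sketch and inverts the output sigmoid, which is an assumption, not the authors' procedure):

    import numpy as np

    def train_readout(Z, Y_target):
        """Least-squares fit of the output weights.

        Z        : (n_samples, n_features) concatenated states (reservoir cells,
                   input place cells, flag, clamped previous outputs, bias),
                   sampled after the initialization transient.
        Y_target : (n_samples, n_place) clamped output place-cell values.
        """
        eps = 1e-6
        # Fit the pre-sigmoid activations so the sigmoidal output cells
        # reproduce the clamped values: W_out @ z ~ logit(y).
        targets = np.log((Y_target + eps) / (1.0 - Y_target + eps))
        W_out, *_ = np.linalg.lstsq(Z, targets, rcond=None)
        return W_out.T     # shape (n_place, n_features)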

12
Initial Results
  • The echo state machine failed to interpolate
    between updates, and the reservoir did not
    maintain a continuous representation of a track,
    probably due to the magnitude of the time delays
    in the network.
  • In addition, as the place cells were almost
    always inactive, setting all cells inactive was a
    solution that produced a low error and so tended
    to be found by the training algorithm, suggesting
    that the assumption that the desired output state
    was a robust attractor was false.

13
Discussion
  • The failure to interpolate between state updates
    suggests persistent neural activity is necessary
    and should probably be neuron-local rather than
    generated by the network to produce changes on
    the right time scales.
  • The zero (or spontaneous activity) state acting
    as an attractor suggests that selective feedback
    strength is too weak in the system (based on
    Brunel, 2003). This is a problem with the
    training algorithm used (least squares fit).

14
Required Characteristics of Model Neurons
  • Persistent activity is necessary, probably at the
    neural (rather than network) level.
  • See Fellous, J.-M. and Sejnowski, T. J. (2003)
    Cerebral Cortex, 13, 1232-1241, where they
    discuss evidence for this based on in vitro
    experiments involving cortical slice
    preparations. Note, their results have been
    challenged.
  • In addition, there has to be some way of
    integrating forward velocity and acceleration to
    calculate a forward displacement, so simple
    persistent activity is insufficient.

15
Displacement or Local Representations?
  • It would be useful if the predicted displacement
    could be computed globally as a vector and then
    added to the latest known location of the target.
  • This would involve output place cells receiving
    two inputs: the last known location and the
    displacement representation.

16
Simplified Network Topology
[Diagram: displacement signal cells D and E, input place cells A and B, and output place cells X and Y]
17
Analysis
  • Displacement signal D should fail to activate
    cell Y if cell A is active and should succeed if
    cell B is active.
  • Displacement signal E should activate cell Y if
    cell A is active, and should fail if B is active.
  • So what happens?
  • Pay attention to cell Y, where the action is.
  • Hint: Minsky and Papert (1969)

18
Equations
  • First consider cell A acting on cell Y:
  • D·wDY + A·wAY < T (subthreshold)
  • E·wEY + A·wAY > T (suprathreshold)
  • Hence E·wEY > D·wDY.
  • Now consider cell B acting on the same cell:
  • D·wDY + B·wBY > T (suprathreshold)
  • E·wEY + B·wBY < T (subthreshold)
  • Hence D·wDY > E·wEY, a contradiction (see the
    feasibility check below).
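
To make the contradiction concrete, the sketch below (purely illustrative, assuming unit activations for active cells and using scipy's linear-programming solver) checks that no weights and threshold satisfy all four conditions at once; this is the linear-inseparability (XOR) argument the Minsky and Papert hint points to:

    import numpy as np
    from scipy.optimize import linprog

    # Variables: [w_DY, w_EY, w_AY, w_BY, T]. Each row encodes one condition as
    # A_ub @ x <= b_ub, with a small margin standing in for strict inequality.
    eps = 1e-3
    A_ub = np.array([
        [ 1,  0,  1,  0, -1],   #  w_DY + w_AY < T   (D with A: subthreshold)
        [ 0, -1, -1,  0,  1],   #  w_EY + w_AY > T   (E with A: suprathreshold)
        [-1,  0,  0, -1,  1],   #  w_DY + w_BY > T   (D with B: suprathreshold)
        [ 0,  1,  0,  1, -1],   #  w_EY + w_BY < T   (E with B: subthreshold)
    ], dtype=float)
    b_ub = -eps * np.ones(4)

    res = linprog(c=np.zeros(5), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 5)
    print(res.status)   # 2 = infeasible: no weight/threshold choice works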

19
Discussion
  • It is hard (but not impossible) to use neural
    networks to do vector arithmetic.
  • Hence trajectories are likely to be represented
    locally, e.g., by a linked list of neural elements
    (to use computing terminology). Each element would
    retain a full local state, as sketched below.
  • See Priebe, Churchland, and Lisberger,
    Reconstruction of Target Speed for the Guidance
    of Pursuit Eye Movements, Journal of Neuroscience,
    2001, 21(9), p. 3196-3206, for related issues.
  • Whether these neural elements are distributed is
    an interesting question.
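
For illustration only (none of this is from the slides), such a locally represented track might look like a linked list in which each element keeps its own kinematic state:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TrackElement:
        """One neural-element analogue holding a full local state."""
        t: float                                # time of the echo snapshot (s)
        x: float                                # estimated position (m)
        y: float
        vx: float                               # locally stored velocity estimate (m/s)
        vy: float
        prev: Optional["TrackElement"] = None   # link to the previous element

    def extend(track: Optional[TrackElement], t: float, x: float, y: float) -> TrackElement:
        """Append a new element, deriving velocity locally from the previous one."""
        if track is None:
            return TrackElement(t, x, y, 0.0, 0.0)
        dt = t - track.t
        return TrackElement(t, x, y, (x - track.x) / dt, (y - track.y) / dt, prev=track)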

20
Further Discussion
  • Note that a vector representation can be
    constructed by hand, but it requires a hidden
    layer where each neuron associates a specific
    displacement with a specific input location.
  • An output place cell is then triggered if the
    displacement plus input location equals the
    corresponding output location.
  • This layer would be 6-dimensional, requiring
    1,000,000 neurons for a 10x10x10 subdivision of
    space, and 1,000,000,000,000 for a 100x100x100
    subdivision (see the count below).
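
That count can be checked directly (a trivial sketch added for illustration; the function name is made up):

    def hidden_layer_size(n_per_axis: int, dims: int = 3) -> int:
        """One hidden neuron per (input location, displacement) pair."""
        locations = n_per_axis ** dims
        displacements = n_per_axis ** dims
        return locations * displacements      # a 2*dims-dimensional layer

    print(hidden_layer_size(10))     # 1,000,000
    print(hidden_layer_size(100))    # 1,000,000,000,000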

21
Where Do We Go Now?
  • More realistic network architectures, including:
  • Some form of short term memory
  • Local representations
  • Bistable networks
  • A sensible model of forward integration
  • Echo state networks (and liquid state networks)
    are too generalized, although something of the
    sort with more realistic models of neocortical
    neurons may be suitable.

22
Acknowledgements
  • Cynthia F. Moss, Ph.D., Department of Psychology,
    Program in Neuroscience and Cognitive Science,
    University of Maryland.
  • Steven Womble, Ph.D., John Murray, and the other
    members of the Hybrid Intelligent Systems Group,
    School of Computing and Technology, University of
    Sunderland.