Transcript and Presenter's Notes

Title: Particle Filtering


1
Particle Filtering
2
Organization of Slides
  • Part I (PF from a Dynamic Bayes Net perspective):
    understand particle filtering as a
    likelihood-weighted Monte Carlo sampling method
    on DBNs
  • Review of likelihood sampling
  • Uses R&N (Russell and Norvig)
  • Part II (PF from a general filtering perspective)
  • Uses the Arulampalam tutorial

3-8
(No Transcript)
9
Outcome: <Cloudy, Sprinkler, Rain, WetGrass>
10
(No Transcript)
11
Problem: may need many, many samples if the
required probability is very low.
12-25
(No Transcript)
26
Particle Filters: A Solution to Hard Problems in
Navigation, Target Tracking, and Perception
  • Darryl Morrell, Ya Xue
  • Department of Electrical Engineering
  • Arizona State University
  • Portions of this work supported by AFOSR under
    award number F49620-00-1-0124

27
References for More Information
  • M. Arulampalam, S. Maskell, N. Gordon, and T.
    Clapp. A tutorial on particle filters for online
    nonlinear/non-Gaussian Bayesian tracking. IEEE
    Transactions on Signal Processing, 50(2):174-188,
    February 2002.
  • This is an excellent tutorial paper; read this
    first.
  • A. Doucet, N. de Freitas, and N. Gordon, editors.
    Sequential Monte Carlo Methods in Practice.
    Springer-Verlag, 2001.
  • This is a broad-ranging collection of articles
    that will introduce you to most of the important
    particle filter developments.

28
Introduction
  • The importance of Monte Carlo methods for
    inference in science and engineering problems has
    grown steadily over the past decade. This growth
    has largely been propelled by an explosive
    increase in accessible computing power. It has
    become clear that Monte Carlo methods can
    significantly expand the class of problems that
    can be addressed practically.
  • (Introduction to Feb 2002 IEEE Transactions on
    Signal Processing special issue on Monte Carlo
    Methods)

29
Sequential Monte Carlo Techniques
  • Sequential Monte Carlo techniques have been
    developed in a wide range of disciplines, and go
    under many names
  • Bootstrap filtering
  • The condensation algorithm
  • Particle filtering
  • Interacting particle approximations
  • Survival of the fittest

30
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

31
Applications of Particle Filters
  • Particle filters have provided solutions to
    problems from many disciplines
  • image processing and understanding
  • tracking complex objects (e.g. people) in video
    sequences
  • robot navigation
  • tracking and identifying complex military targets
    (e.g. vehicle convoys)

32
Some Specific Applications
  • Terrain aided navigation
  • Car positioning using map information
  • Robot navigation
  • Tracking of articulated targets using video
  • Tracking of complex targets using distributed
    sensors.

33
Terrain Aided Navigation (http://www.control.isy.liu.se/research/sensorfusion/sensorfusion/sensorfusion.html)
  • Observations are measured ground clearance.
  • Unknowns are aircraft position and velocity.
  • The particle filter is needed because measured
    ground clearance does not uniquely determine
    position.

34
Car Positioning Using Map Info (Gustafsson et al., "Particle Filters for Positioning, Navigation, and Tracking," IEEE Transactions on SP, Feb 2002)
  • Observations are yaw rate and speed information
    computed from wheel speed sensors.
  • Vehicle position is unknown.
  • The map provides constraints on the vehicle
    position.

35
Mobile Robot Localization (http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/2)
  • Observations are sensor data (image, video,
    sonar, laser rangefinder, etc.)
  • Robot position is unknown.
  • The robot's position is estimated by correlating
    sensor data with known maps.

36
Tracking Articulated Objects (http://www.dai.ed.ac.uk/CVonline/LOCAL_COPIES/RINGER1/mocap_overview.html)
  • Observations are video sequences from two
    cameras.
  • Unknowns are positions and velocities of model
    components.

37
Tracking with Networks of Distributed Sensors (http://www.parc.xerox.com/spl/projects/cosense/)
  • Targets are tracked using an ad hoc network of
    distributed micro-sensors.

38
Other Applications
  • Channel equalization
  • Estimation of parameters of multiple chirp
    signals
  • Multiple target tracking
  • Bearings-only target tracking
  • Track before detect target tracking
  • Image segmentation

39
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

40
Fundamental Concepts
  • Bayesian inference
  • Monte Carlo samples
  • Importance Sampling
  • Resampling

41
Bayesian Inference
  • X is unknown: a random variable or set (vector)
    of random variables.
  • Z is observed: also a set of random variables.
  • We wish to infer X by observing Z.
  • The probability distribution p(x) models our
    prior knowledge of X.
  • The conditional probability distribution p(z|x)
    models the relationship between Z and X.

42
Bayes Theorem
  • The conditional distribution p(x|z) represents
    posterior information about X given Z.
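The equation on this slide is not in the transcript; in the notation above it is Bayes' theorem,

    p(x | z) = p(z | x) p(x) / p(z),    where p(z) = ∫ p(z | x) p(x) dx.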

43
Monte Carlo Samples (Particles)
  • The posterior distribution p(x|z) may be
    difficult or impossible to compute in closed
    form.
  • An alternative is to represent p(x|z) using
    Monte Carlo samples (particles).
  • Each particle has a value and a weight.

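For reference (not in the transcript): with particle values x(i) and normalized weights w(i), expectations under p(x|z) are approximated by the weighted sum

    E[g(X) | z] ≈ Σ w(i) g(x(i)).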
44
Importance Sampling
  • Ideally, the particles would represent samples
    drawn from the distribution p(x|z).
  • In practice, we usually cannot get p(x|z) in
    closed form; in any case, it would usually be
    difficult to draw samples from p(x|z).
  • Instead, we use importance sampling:
  • Particles are drawn from an importance
    distribution.
  • Particles are weighted by importance weights.
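The weights referred to here take the standard form: if the particles x(i) are drawn from an importance distribution q(x), the unnormalized importance weights are

    w(i) ∝ p(x(i) | z) / q(x(i))

and are then normalized to sum to one.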

45
Resampling
  • In inference problems, most weights tend to zero
    except a few (from particles that closely match
    observations), which become large.
  • We resample to concentrate particles in regions
    where p(x|z) is larger.

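A minimal Python sketch of the resampling step (not from the slides; multinomial resampling is assumed, which is the simplest choice but adds more variance than the alternatives discussed later):

    import numpy as np

    def resample(particles, weights, rng):
        """Multinomial resampling: draw N indices in proportion to the
        normalized weights, duplicate the selected particles, and reset
        all weights to 1/N."""
        n = len(weights)
        idx = rng.choice(n, size=n, p=weights)
        return particles[idx], np.full(n, 1.0 / n)

Here particles is a NumPy array indexed by particle, weights sums to one, and rng is a generator such as np.random.default_rng(0).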
46
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

47
Anatomy of a Simple Particle Filter
  • A simple particle filter requires the following:
  • A system state evolution model
  • An observation model
  • Particle computation processes:
  • Propagate forward in time
  • Compute weights given observations
  • Resampling

48
System State
  • The state represents the unknown whose value we
    want to infer. For example,
  • Position (and velocity) of a robot, car, plane,
    ...
  • Position of articulated model components.
  • The system state at (discrete) time k is denoted
    xk.
  • The state evolves according to the following
    dynamics equation:
  • xk+1 = fk(xk, wk)

49
Observation Model
  • The observation zk may be an image, a frame of
    video, a radar or sonar measurement, etc.
  • The relationship between the observation and the
    state is given by the conditional probability
    distribution p(zk | xk).
  • This distribution may be derived from a
    functional relationship between zk and xk:
  • zk = hk(xk, vk)
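As an example of how p(zk | xk) follows from the functional form (additive Gaussian noise is assumed here; the slide does not specify the noise model): if zk = hk(xk) + vk with vk ~ N(0, R), then

    p(zk | xk) = N(zk; hk(xk), R).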

50
Objective: Find p(xk | zk, ..., z1)
  • The objective of the particle filter is to
    compute the conditional distribution
  • p(xk | zk, ..., z1)
  • To do this analytically, we would use the
    Chapman-Kolmogorov equation and Bayes Theorem
    along with Markov model assumptions.
  • The particle filter gives us an approximate
    computational technique.
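The analytic recursion referred to here is standard (see, e.g., the Arulampalam tutorial). The Chapman-Kolmogorov equation gives the prediction step

    p(xk | zk-1, ..., z1) = ∫ p(xk | xk-1) p(xk-1 | zk-1, ..., z1) dxk-1

and Bayes' theorem gives the update step

    p(xk | zk, ..., z1) ∝ p(zk | xk) p(xk | zk-1, ..., z1).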

51
Particle Filter Algorithm
  • Create particles as samples from the initial
    state distribution p(x0).
  • For k going from 1 to K:
  • Sample each particle from a proposal
    distribution.
  • Compute weights for each particle using the
    observation value.
  • (Optionally) resample particles. (A code sketch
    of this loop follows this slide.)
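A minimal Python sketch of this loop (not from the slides). The functions sample_prior, propagate, and likelihood are assumed stand-ins for p(x0), the system dynamics used as the proposal, and p(zk | xk); particles are stored as a NumPy array with one row per particle:

    import numpy as np

    def particle_filter(observations, n_particles, sample_prior, propagate,
                        likelihood, seed=0):
        """Simple SIR particle filter: propagate, weight, resample each step."""
        rng = np.random.default_rng(seed)
        particles = sample_prior(n_particles, rng)            # samples from p(x0)
        weights = np.full(n_particles, 1.0 / n_particles)
        estimates = []
        for z in observations:
            particles = propagate(particles, rng)             # proposal = dynamics
            weights = weights * likelihood(z, particles)      # weight by p(zk | xk)
            weights /= weights.sum()                          # normalize
            estimates.append(weights @ particles)             # weighted-mean estimate
            idx = rng.choice(n_particles, size=n_particles, p=weights)  # resample
            particles = particles[idx]
            weights = np.full(n_particles, 1.0 / n_particles)
        return np.array(estimates)

Resampling is done after every observation here; the variations discussed later cover other proposals and resampling only when needed.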

52
Initial State Distribution
53
State Update
x1 = f0(x0, w0)
This is one way to sample from a proposal
distribution.
54
Compute Weights
p(z1 | x1)
(Figure: particles along x1, before and after weighting)
55
Resample
56
Particle Filter Demonstration
  • A target moves from left to right.
  • Two sensors
  • Each measures the distance from itself to the
    target.
  • Sensors at (30,0) and (0,50)
  • 4000 particles were used to track the target.
  • The animation on the following slide shows the
    particles, the true target position, and the
    estimated target position.
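A sketch of the observation likelihood for this demonstration (the Gaussian noise model and its standard deviation are assumptions; the slides give only the sensor positions):

    import numpy as np

    SENSORS = np.array([[30.0, 0.0], [0.0, 50.0]])   # sensor positions from the slide

    def range_likelihood(z, particles, sigma=1.0):
        """p(z | x) for two independent range-only sensors with assumed
        Gaussian measurement noise of standard deviation sigma.
        particles: (N, 2) candidate target positions; z: the two measured ranges."""
        dists = np.linalg.norm(particles[:, None, :] - SENSORS[None, :, :], axis=2)  # (N, 2)
        return np.exp(-0.5 * np.sum(((z - dists) / sigma) ** 2, axis=1))

A function of this form could serve as the likelihood argument in the filter sketch shown earlier.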

57
Particle Filter Demonstration
58
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

59
Variations on this Simple Implementation
  • Use a different importance distribution
  • In this implementation, the importance
    distribution is the predicted state distribution
    p(xk+1 | zk, ..., z1).
  • Several papers have pointed out that this
    distribution may not be the best one can use.
  • If the observation at time k+1 is available,
    significant improvement in performance can be
    obtained.
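For reference, the standard weight update with a general importance distribution q(xk | xk-1, zk) (as given in the Arulampalam tutorial) is

    wk ∝ wk-1 p(zk | xk) p(xk | xk-1) / q(xk | xk-1, zk)

applied per particle. Choosing q = p(xk | xk-1), as in the simple filter above, reduces this to wk ∝ wk-1 p(zk | xk); the optimal choice (minimizing weight variance) is q = p(xk | xk-1, zk), which uses the observation at time k.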

60
Variations
  • Use a different resampling technique
  • Resampling adds variance to the estimate; several
    resampling techniques are available that minimize
    this added variance.
  • Our simple resampling leaves several particles
    with the same value; methods for spreading them
    out are available.

61
Variations
  • Reduce the resampling frequency
  • Our implementation resamples after every
    observation, which may add unneeded variance to
    the estimate.
  • Alternatively, one can resample only when the
    particle weights warrant it. This can be
    determined by the effective sample size.
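The effective sample size mentioned here is usually estimated with the standard formula (not given on the slide)

    Neff ≈ 1 / Σ (w(i))^2

where the w(i) are the normalized particle weights. Resampling is then performed only when Neff falls below a threshold, commonly a fixed fraction of the number of particles such as N/2.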

62
Variations
  • Rao-Blackwellization
  • Some components of the model may have linear
    dynamics and can be well estimated using a
    conventional Kalman filter.
  • The Kalman filter is combined with a particle
    filter to reduce the number of particles needed
    to obtain a given level of performance.

63
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

64
Advantages of Particle Filters
  • Under general conditions, the particle filter
    estimate becomes asymptotically optimal as the
    number of particles goes to infinity.
  • Non-linear, non-Gaussian state update and
    observation equations can be used.
  • Multi-modal distributions are not a problem.
  • Particle filter solutions to inference problems
    are often easy to formulate.

65
Disadvantages of Particle Filters
  • Naïve formulations of problems usually result in
    significant computation times.
  • It is hard to tell if you have enough particles.
  • The best importance distribution and/or
    resampling methods may be very problem specific.

66
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

67
A Foveal Sensor
  • A foveal sensor has a high acuity area (similar
    to the fovea of the eye) that can be steered
    towards a desired location.

(Figure: foveal region and low acuity region around the target position)
68
Mathematical Model
  • The foveal sensor is modeled mathematically as:
  • zk = tan^-1(Ck(xk - dk))
  • xk is the target position.
  • dk controls the location of the center of the
    foveal region.
  • Ck controls the width of the foveal region.

(Figure: zk vs. xk, with the foveal region centered at dk)
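A sketch of this sensor model as code (only the deterministic mapping from the slide is shown; measurement noise is omitted):

    import numpy as np

    def foveal_observation(x, d, c):
        """Foveal sensor model from the slide: z = arctan(C (x - d)).
        The slope (acuity) is largest near x = d and falls off away
        from the fovea center d; C controls the width of the region."""
        return np.arctan(c * (x - d))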
69
Before 1st Observation
(Figure: position axis with the estimated initial position marked)
70
Collecting 1st Observation
1. Foveal sensor is configured using predicted
   values.
2. Observation is obtained.
3. Position (and velocity) estimates are computed.
71
Collecting 2nd Observation
1. Foveal sensor is configured using predicted
   values.
2. Observation is obtained.
3. Position (and velocity) estimates are computed.
72
Collecting 3rd Observation
1. Foveal sensor is configured using predicted
   values.
2. Observation is obtained.
3. Position (and velocity) estimates are computed.
73
Implementation
  • We implemented a particle filter to estimate the
    target position from observations.
  • The foveal region is centered on the predicted
    target position.
  • The gain is either set to a constant value or
    adjusted to include a certain percentage of the
    particles in the foveal region.
  • The implementation took a few hours.
  • Tuning the filter has taken a few weeks.

74
Comparison with Previous Foveal Sensor
  • A two-dimensional linear system dynamics model is
    used.
  • The system state transition matrix is stable.
  • The following plot shows curves of constant
    estimation error as a function of process and
    observation noise variance:
  • Stat fixed gain: Kalman filter implementation of
    a fixed gain sensor
  • PF fixed gain: Particle filter implementation of
    a fixed gain sensor
  • PF Var. gain: Particle filter implementation of
    an adaptive gain sensor

75
Curves of Constant Error
76
Discussion of Results
  • Adaptive acuity gives better performance than
    fixed acuity.
  • The particle filter implementations do not
    perform well with very small observation noise
    variances.
  • The number of particles is too small for very
    sharply peaked observation densities: few
    particles fall within the peaks.
  • Several approaches to improve the performance for
    small observation variances are currently under
    investigation.

77
Fixed vs. Adaptive Acuity
  • The foveal sensor collects observations of
    position.
  • The acuity of the foveal region is adjusted so
    that 80% of the predicted particle positions fall
    into the foveal region.
  • The gain of the foveal region is smoothed using a
    low-pass filter with an exponentially decaying
    impulse response.
  • We show plots of the average squared estimate
    error as a function of time for
  • Fixed acuity foveal sensor for gains of 0.25, 1,
    and 4
  • Adaptive acuity foveal sensor

78
Average Estimate Error
79
Presentation Outline
  • Applications of particle filters
  • Fundamental concepts
  • Anatomy of a simple particle filter
  • Variations on the simple particle filter
  • Pros and Cons of particle filters
  • Application to configuration of a foveal sensor
  • Conclusions

80
Conclusions
  • Particle filters (and other Monte Carlo methods)
    are a powerful tool to solve difficult inference
    problems.
  • Formulating a filter is now a tractable exercise
    for many previously difficult or impossible
    problems.
  • Implementing a filter effectively may require
    significant creativity and expertise to keep the
    computational requirements tractable.