Probabilistic Localization And Mapping for Mobile Robotics

1
Probabilistic Localization And Mapping for Mobile
Robotics
  • Ranjith Unnikrishnan
  • Marc Zinck
  • Monday December 3, 2001

2
Outline
  • Probability in robotics
  • Localization
  • Mapping
  • Monte-Carlo Localization

3
Probability in Robotics
  • Pays Tribute to Inherent Uncertainty
  • Know your own ignorance
  • Robustness
  • Scalability
  • No need for perfect world model
  • Relieves programmers

4
Limitations of Probability
  • Computationally inefficient: must consider
    entire probability densities
  • Approximation: representing continuous
    probability distributions

5
Probabilistic Localization
(Simmons & Koenig 95; Kaelbling et al. 96; Burgard et al. 96)
6
Probabilistic Action Model
  • Continuous probability density Bel(s_t) after
    moving 40 m (left figure) and 80 m (right figure).
    Darker areas have higher probability.

7
Probabilistic Sensor Model
Probabilistic sensor model for laser range finders
8
Bayes Rule
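The slide's equation image did not survive the transcript. Bayes' rule, presumably in the form the slide showed, gives the posterior over the state s given an observation o:

$$ P(s \mid o) = \frac{P(o \mid s)\, P(s)}{P(o)} $$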
9
Law of Total Probability
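The equation image is likewise missing; the law of total probability, presumably as shown, marginalizes over an intermediate variable:

$$ P(s) = \sum_{s'} P(s \mid s')\, P(s') $$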
10
Markov Assumption
Future is Independent of Past Given Current State
Assume Static World
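The slide's formula is missing from the transcript; the standard statement of the assumption (assumed here, consistent with the notation on later slides) is that the state transition and the observation each depend only on the current state:

$$ p(s_t \mid s_{0:t-1}, o_{1:t-1}, a_{0:t-1}) = p(s_t \mid s_{t-1}, a_{t-1}),
\qquad
p(o_t \mid s_{0:t}, o_{1:t-1}, a_{0:t-1}) = p(o_t \mid s_t) $$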
11
Probabilistic Model
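The slide content did not survive the transcript; judging from the action and sensor models used on the neighboring slides, it presumably introduced the two conditional densities of the model:

$$ p(s_t \mid a_{t-1}, s_{t-1}, m) \ \text{(action model)},
\qquad
p(o_t \mid s_t, m) \ \text{(sensor model)} $$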
12
Derivation: Markov Localization
(Kalman 60; Rabiner 85)
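The derivation itself is lost with the slide image; applying Bayes' rule, the law of total probability, and the Markov assumption to Bel(s_t) yields the standard Markov localization recursion, presumably the slide's result:

$$ Bel(s_t) = \eta\, p(o_t \mid s_t, m) \int p(s_t \mid a_{t-1}, s_{t-1}, m)\, Bel(s_{t-1})\, ds_{t-1} $$

where $\eta$ is a normalizing constant.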
13
Example: Markov Localization
(Burgard et al. 96; Fox 99)
14
Summary of Localization
  • "The most fundamental problem to provide a mobile
    robot with autonomous capabilities" (I. J. Cox)

15
Where am I?
  • Building a map with an accurate set of sensors:
    easy!
  • Localization with an accurate map: simple!
  • Fact: you start off with noisy sensors.

16
A similar problem: line fitting
  • Goal: group a set of points into two best-fit
    line segments

17
Chicken-and-egg problem
  • If I knew which line each point belonged to, I
    could compute the best-fit lines.

18
Chicken-and-egg problem
  • If I knew what the two best-fit lines were, I
    could find out which line each point belonged to.

19
Expectation-Maximization (EM) Algorithm
  • Initialize: make a random guess for the two lines
  • Repeat:
    – Find the line closest to each point and group
      the points into two sets (Expectation step)
    – Find the best-fit line to each of the two sets
      (Maximization step)
  • Iterate until convergence
  • The algorithm is guaranteed to converge to a
    local optimum (see the sketch after this list)
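A minimal sketch of the two-line EM fit described above, in Python with NumPy. All names are illustrative, not from the slides, and for simplicity it fits unbounded lines y = a*x + b (rather than segments) using vertical distances:

```python
import numpy as np

def fit_line(pts):
    """Least-squares fit of y = a*x + b through pts (an N x 2 array)."""
    A = np.column_stack([pts[:, 0], np.ones(len(pts))])
    (a, b), *_ = np.linalg.lstsq(A, pts[:, 1], rcond=None)
    return a, b

def distances(line, pts):
    """Vertical distance of every point from the line."""
    a, b = line
    return np.abs(pts[:, 1] - (a * pts[:, 0] + b))

def em_two_lines(pts, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize: random guess for the two lines
    lines = [tuple(rng.normal(size=2)) for _ in range(2)]
    labels = None
    for _ in range(iters):
        # E-step: assign each point to its closest line
        d = np.stack([distances(l, pts) for l in lines])
        labels = np.argmin(d, axis=0)
        # M-step: refit each line to the points assigned to it
        new = [fit_line(pts[labels == k]) if np.any(labels == k) else lines[k]
               for k in range(2)]
        if np.allclose(new, lines):
            break  # converged to a local optimum
        lines = new
    return lines, labels
```

On points drawn from two noisy lines this typically recovers both, though, as the slide notes, a bad initialization can converge to a poor local optimum.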

20
Example
21
Example
22
Example
23
Example
24
Example
Converged!
25
Probabilistic Mapping
Maximum Likelihood Estimation
  • E-step: use the current best map and the data to
    find the belief probabilities
  • M-step: compute the most likely map based on the
    probabilities computed in the E-step
  • Alternate the two steps to get better map and
    localization estimates
  • Convergence to a local optimum is guaranteed, as
    before.

26
The E-Step
  • $P(s_t \mid d, m) = P(s_t \mid o_1, a_1, \ldots, o_t, m) \cdot P(s_t \mid a_t, \ldots, o_T, m)$

27
The M-Step
  • Updates the occupancy grid
  • $P(m_{xy} \mid d)$

28
Probabilistic Mapping
  • Addresses the Simultaneous Localization and
    Mapping (SLAM) problem
  • Robust
  • Hacks for easing the computational and processing
    burden:
    – Caching
    – Selective computation
    – Selective memorization

29
Results
30
Localization flashback
  • Several algorithms with vastly different
    properties:
  • The Kalman filter
  • Topological Markov Localization
  • Grid-based Markov Localization
  • Sampling-based methods

31
Localization flashback
  • The Kalman Filter
    – Concise, closed-form equations
    – Robust and accurate for tracking position
    – Does not handle non-Gaussian or non-linear
      motion and measurement models
    – Restricted sub-optimal extensions with varying
      success
  • Topological Markov Localization
  • Grid-based Markov Localization
  • Sampling-based methods

32
Localization flashback
  • The Kalman Filter
  • Topological Markov Localization
    – Feature-based localization
    – Bayesian Landmark Learning (BaLL)
    – Very coarse resolution
    – Low accuracy
  • Grid-based Markov Localization
  • Sampling-based methods

33
Localization flashback
  • The Kalman filter
  • Topological Markov Localization
  • Grid-based Markov Localization
    – Fine resolution by discretizing the state space
    – Very robust
    – A priori commitment to precision
    – Very high computational burden, with effects on
      accuracy
  • Sampling-based methods

34
What is the right representation?
???
35
Sampling the Action Model
[Figure: sample cloud propagated from the Start position]
  • Sampling-based model of position belief

36
Sampling-based Methods
  • Invented in the 70s!
  • Rediscovered independently in the target-tracking,
    statistics, and computer-vision literature under
    various names:
    – Bootstrap filter
    – Monte-Carlo filter
    – Condensation algorithm
    – Particle filters

37
Monte-Carlo Localization
  • Represent the probability density Bel(s_t) by a
    set of randomly drawn samples
  • From the samples we can always approximately
    reconstruct the density (e.g., as a histogram)
  • Reason: the discrete distribution defined by the
    samples approximates the desired one
  • Goal: recursively compute, at each time step t,
    the sample set S_t drawn from Bel(s_t) (a toy
    illustration follows below)
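A toy illustration (assumed, not from the slides) of approximately reconstructing a density from samples:

```python
import numpy as np

# 1000 samples standing in for a sample-based Bel(s_t)
samples = np.random.default_rng(1).normal(loc=2.0, scale=0.5, size=1000)

# A histogram approximately reconstructs the underlying density
density, edges = np.histogram(samples, bins=20, density=True)
```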

38
Algorithm: Prediction phase
  • 1. Draw a random sample s_{t-1} from the current
    belief Bel(s_{t-1})

39
Algorithm: Update phase - I
  • 2. For this s_{t-1}, guess a successor pose s_t
    according to the distribution
    p(s_t | a_{t-1}, s_{t-1}, m); the successors form
    the set S'_t
40
Algorithm: Update phase - II
  • 3. Weight each sample in S'_t by the importance
    factor m_t = p(o_t | s_t, m)
41
Algorithm: Resampling
  • 4. Draw each element s^j_t of S'_t with
    probability proportional to its weight m^j_t to
    form the new set S_t
42
Algorithm
  • 5. Normalize the importance factors and repeat
    from step (2). (A complete MCL step is sketched
    below.)
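A minimal sketch of one MCL step (steps 1-5 above) in Python with NumPy. It uses a hypothetical 1-D robot whose Gaussian motion and sensor models stand in for the models on the earlier slides; all names and noise parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def motion_model(s_prev, a):
    """Sample s_t ~ p(s_t | a_{t-1}, s_{t-1}): move by a, plus Gaussian noise."""
    return s_prev + a + rng.normal(0.0, 0.1, size=s_prev.shape)

def importance_factor(o, s):
    """m_t = p(o_t | s_t): Gaussian sensor model around the reading o."""
    return np.exp(-0.5 * ((o - s) / 0.2) ** 2)

def mcl_step(samples, a, o):
    # Steps 1-2 (prediction): propagate every sample through the action model
    proposed = motion_model(samples, a)          # the set S'_t
    # Step 3 (update): weight each sample by its importance factor
    w = importance_factor(o, proposed)
    # Step 5: normalize the importance factors
    w /= w.sum()
    # Step 4 (resampling): draw with probability proportional to weight -> S_t
    idx = rng.choice(len(proposed), size=len(proposed), p=w)
    return proposed[idx]

# Usage: global localization of a robot moving +1.0 per step along a line
samples = rng.uniform(0.0, 10.0, size=500)       # uniform initial belief Bel(s_0)
for t in range(1, 6):
    samples = mcl_step(samples, a=1.0, o=float(t))
print(samples.mean())                            # converges near the true pose
```

Starting from a uniform belief, the sample cloud collapses onto the true pose after a few observations, which is the global-localization behavior claimed on the later MCL slide.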
43
Justification
  • The prediction phase yields an empirical
    predictive density (stratified sampling) that
    approximates the real one.
  • The update phase yields an empirical posterior
    density (importance sampling) by weighting more
    likely states.
  • The entire procedure is called Sampling /
    Importance Resampling (SIR)

44
Animation
45
Monte-Carlo Localization (MCL)
  • Solves the global localization and kidnapped-robot
    problems
  • Multi-modal (unlike the Kalman filter)
  • Drastic reduction in memory requirements
  • More accurate than grid-based Markov localization
    with a fixed cell size
  • Easy to implement
  • Fast

46
References
  • AAAI Tutorial on Probabilistic Robotics
    (Sebastian Thrun)
  • Probabilistic Algorithms in Robotics (Thrun)
  • Robust Monte Carlo Localization for Mobile Robots
    (Thrun, Fox, Burgard, Dellaert)
  • Monte Carlo Localization for Mobile Robots
    (Dellaert, Fox, Burgard, Thrun)

47
Mapping with MCL
48
Performance comparison
Monte Carlo localization
Markov localization (grids)
49
Cons
  • Insufficient samples in key areas because of the
    small sample-set size
  • Counter-intuitively, performance degrades with
    better sensors
  • Hack: injecting noise
  • Mixture-MCL: MCL with a mixture proposal
    distribution