Title: Map Building without Localization by Dimensionality Reduction Techniques
1. Map Building without Localization by Dimensionality Reduction Techniques
Session: VISION, GRAPHICS AND ROBOTICS
- Takehisa YAIRI
- RCAST, University of Tokyo
2. Outline
- Background
  - Motivation, purpose, and problem to consider
- Related Works
  - SLAM, and mapping with DR methods
- Proposed Framework: LFMDR
  - Basic idea, assumptions, formalization
- Experiment
  - Visibility-only and bearing-only mappings
- Conclusion
3. Motivation
- Map building
  - An essential capability for intelligent agents
- SLAM (Simultaneous Localization and Mapping)
  - Has been the mainstream approach for many years
  - Very successful both in theory and practice
- I like SLAM too, but I feel something is missing...
  - Are mapping and localization really inseparable?
  - Are the motion and measurement models really necessary?
  - What about the aspect of map building as an abstraction of the world?
  - Is there another map building framework?
4. Purpose
- Reconsider robot map building from the viewpoint of dimensionality reduction and propose an alternative framework
  - Localization-Free Mapping by Dimensionality Reduction (LFMDR)
  - No localization, no motion or measurement models
- Heuristic: closely located objects tend to share similar histories of being observed by the robot
[Figure: reduce dimensionality while preserving locality (observations at times t1, t2, ..., tN)]
5. Map Building Problem to Consider
- Feature-based map (i.e., not topological, not occupancy-grid)
  - A map is represented by the 2-D coordinates of objects
- Motion and measurement models EXIST
  - But they are not necessarily known in advance
[Figure: state-space model linking the map m (positions of objects) and the robot state; the motion model (state transition model) governs "move" and the measurement model (observation model) governs "observation"; both exist but may be unknown]
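A hedged formalization of this state-space model, using the f, g, m, x, y symbols that appear on the later slides (the control input u_t is my addition and is not named in the slides):

\[
x_{t+1} = f(x_t, u_t) \ \ \text{(motion / state transition model)}, \qquad
y_t = g(m, x_t) \ \ \text{(measurement / observation model)},
\]

where m is the map (object positions), x_t the robot state, and y_t the observation at time t; both f and g exist but may be unknown.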
6. Related Works: SLAM [Thrun 02]
- Problem: estimate m and x_{1:t} from y_{1:t}, given f and g
- Solutions
  - Kalman filter with extended state
  - Incremental maximum likelihood [Thrun et al. 98]
  - Rao-Blackwellized particle filter [Montemerlo et al. 02]
- Motion and measurement models must be given
- Estimates of the map and the robot position are coupled
[Diagram: given the motion and measurement models, SLAM takes the measurement data as input and outputs the map and the robot trajectory]
7. Related Works: Dimensionality Reduction and Mapping (1)
- The idea of using DR for robot map building is not new in itself...
- [Brunskill & Roy 05]
  - PPCA to extract low-dimensional geometric features (line segments) from range measurements
- [Pierce & Kuipers 97]
  - PCA to obtain low-level mappings between a robot's actions and perceptions (sensorimotor mapping)
[Figure: point features (high-dimensional) reduced by DR to line segments (low-dimensional)]
8. Related Works: Dimensionality Reduction and Mapping (2)
- Another existing idea is to estimate the robot's states (locations, poses) from a sequence of high-dimensional observation data
  - Appearance manifolds [Ham et al. 05]: LLP Kalman filter
  - Action respecting embedding [Bowling et al. 05]: SDE
  - WiFi-SLAM [Ferris et al. 07]: GP-LVM
[Figure: dimensionality reduction from the observation space to the state space (x1, x2, θ)]
9. Related Works: Dimensionality Reduction and Mapping (cont.)
- Treat the row vectors of the observation data matrix (rows: time 1 to N, columns: feature dimensions) as data points
- Estimate x_{1:N} and g from y_{1:N}, given f
[Figure: DR maps the observation data to a trajectory in the state space (x1, x2, θ)]
10. Proposed Framework LFMDR (1): Assumptions
- All objects are uniquely identifiable
- The measurement model can be decomposed into homogeneous submodels for the individual objects
- The locations of at least 3 objects are known in advance (anchor objects)
Decomposable: an observation about an object depends (roughly) only on that object's location, given the map and the robot's position.
The second assumption may look too restrictive, but... (a formal sketch is given below)
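A hedged formalization of the decomposability assumption, writing g_o for the common per-object submodel and m_j for the position of object j (this notation is my assumption, not the slides'):

\[
y_t = g(m, x_t) = \bigl( g_o(m_1, x_t),\; g_o(m_2, x_t),\; \dots,\; g_o(m_M, x_t) \bigr),
\]

so the component of the observation concerning object j depends only on m_j and the robot state x_t.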
11. Proposed Framework LFMDR (2): Interpretation as a DR Problem
- Imagine a mapping between an object's position and its history of observations
[Figure: each object's XY coordinates (2-dimensional) map to one column of the observation data matrix, i.e., a point in the N-dimensional observation history space]
- If two objects are closely located, their histories of observations are similar
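Under the decomposability assumption above, and assuming one scalar observation per object per time step, this mapping can be sketched as

\[
m_j \in \mathbb{R}^2 \;\longmapsto\; y^{(j)}_{1:N} = \bigl( g_o(m_j, x_1),\, \dots,\, g_o(m_j, x_N) \bigr) \in \mathbb{R}^N .
\]

Recovering the map then amounts to inverting this unknown embedding: reduce the N-dimensional observation history vectors back to 2-D while preserving locality.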
12. Proposed Framework LFMDR (3): Illustration
13. Proposed Framework LFMDR (4): Procedure
- Explore the environment and obtain the observation history data Y_{1:N}
- Break Y_{1:N} into a set of column vectors { y^{(j)}_{1:N} }, j = 1, ..., M
- Apply a DR method to these vectors and obtain a set of 2-D vectors
- Perform the optimal affine transformation w.r.t. the anchor objects, and obtain the final estimates
(A code sketch of this procedure follows.)
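A minimal Python sketch of this procedure, not the author's implementation: it assumes the observation history matrix Y (N rows of time steps, M columns of objects) is already collected, uses scikit-learn's Isomap as one example of the DR methods compared later, and assumes the anchor indices and their known 2-D positions are given; all names are illustrative.

```python
import numpy as np
from sklearn.manifold import Isomap  # one of the DR methods compared in the talk

def lfmdr_map(Y, anchor_idx, anchor_xy, n_neighbors=6):
    """Estimate 2-D object positions from an N x M observation history matrix Y.

    Y          : (N, M) array; column j is the observation history of object j.
    anchor_idx : indices of the objects whose true positions are known.
    anchor_xy  : (len(anchor_idx), 2) array of those known positions.
    """
    # Step 2: treat each column (one object's observation history) as a data point.
    points = Y.T                                                                # (M, N)

    # Step 3: reduce to 2-D while preserving locality.
    Z = Isomap(n_neighbors=n_neighbors, n_components=2).fit_transform(points)  # (M, 2)

    # Step 4: least-squares affine transform that maps the embedded anchors onto
    # their known coordinates, applied to all embedded points.
    Za = np.hstack([Z[anchor_idx], np.ones((len(anchor_idx), 1))])              # (K, 3)
    W, *_ = np.linalg.lstsq(Za, anchor_xy, rcond=None)                          # (3, 2)
    return np.hstack([Z, np.ones((len(Z), 1))]) @ W                             # (M, 2) estimated map
```

With at least three non-collinear anchors the affine fit in the last step is well posed, which matches the third assumption above.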
14. Features of LFMDR (1): Comparison with SLAM
- Common
  - Based on a state-space model
- Different
  - Advantages
    - No assumption that the motion and measurement models are known
    - The map is estimated directly, without robot localization (localization-free mapping)
  - Disadvantages
    - Off-line procedure
    - A larger amount of data is required
    - Assumes no missing data
15. Features of LFMDR (2): Comparison with Other DR-based Approaches
- Comparison with [Brunskill & Roy 05]
  - Global vs. local
- Comparison with [Ham et al. 05], [Bowling et al. 05], [Ferris et al. 07]
  - Column vectors vs. row vectors (i.e., object positions vs. robot positions)
[Figure: DR applied to the columns of the data matrix (LFMDR) vs. DR applied to its rows (existing approaches)]
16. Experiment
- Applied to 2 different situations
  - Case 1: visibility-only mapping
  - Case 2: bearing-only mapping
- Common settings (WEBOTS simulator)
  - 2.5 m × 2.5 m square region
  - 50 objects (incl. 4 anchors)
  - Exploration with random direction changes and obstacle avoidance
- Evaluation (a sketch of the metrics follows)
  - Mean Position Error (MPE)
  - Mean Orientation Error (MOE)
  - Averaged over 25 runs
[Figure: triangle orientation difference: a triple of objects estimated in the order A-C-B instead of the true A-B-C has a flipped orientation]
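The slides do not spell out formulas for MPE and MOE. A plausible reading, sketched below under that assumption, is that MPE is the mean Euclidean distance between estimated and true object positions (after the anchor-based alignment), and MOE is the fraction of object triples whose estimated triangle orientation (clockwise vs. counter-clockwise) disagrees with the true one.

```python
import numpy as np
from itertools import combinations

def mean_position_error(est, true):
    """Mean Euclidean distance between estimated and true positions ((M, 2) arrays)."""
    return np.linalg.norm(est - true, axis=1).mean()

def triangle_orientation(P, i, j, k):
    """+1 / -1 for counter-clockwise / clockwise order of points i, j, k in P."""
    v1, v2 = P[j] - P[i], P[k] - P[i]
    return np.sign(v1[0] * v2[1] - v1[1] * v2[0])

def mean_orientation_error(est, true):
    """Fraction of object triples whose orientation differs between estimate and truth."""
    triples = list(combinations(range(len(true)), 3))
    flips = sum(triangle_orientation(est, i, j, k) != triangle_orientation(true, i, j, k)
                for i, j, k in triples)
    return flips / len(triples)
```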
17. DR Methods
- Linear PCA
- SMACOF-MDS [de Leeuw 77]
  - (a) Equal weights, (b) kNN-based weighting
- Kernel PCA [Schölkopf et al. 98]
  - (a) Gaussian, (b) polynomial
- ISOMAP [Tenenbaum et al. 00]
- LLE [Roweis & Saul 00]
- Laplacian Eigenmap [Belkin & Niyogi 02]
- Hessian LLE [Donoho & Grimes 03]
- SDE [Weinberger et al. 05]
Parameters (k, σ², d) were tuned manually.
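As a rough illustration only (the talk's own implementations are not described), most of these methods have counterparts in scikit-learn; SDE and the kNN-weighted SMACOF variant do not, and the parameters shown here are placeholders rather than the tuned values from the experiments.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.manifold import MDS, Isomap, LocallyLinearEmbedding, SpectralEmbedding

# Placeholder data: 50 objects, each with an N-dimensional observation history.
points = np.random.rand(50, 2000)

dr_methods = {
    "LPCA":   PCA(n_components=2),
    "SMACOF": MDS(n_components=2),                                   # equal-weight SMACOF-MDS
    "KPCA":   KernelPCA(n_components=2, kernel="rbf", gamma=1.0),    # Gaussian kernel
    "Isomap": Isomap(n_neighbors=6, n_components=2),
    "LLE":    LocallyLinearEmbedding(n_neighbors=8, n_components=2),
    "HLLE":   LocallyLinearEmbedding(n_neighbors=8, n_components=2, method="hessian"),
    "LEM":    SpectralEmbedding(n_components=2, n_neighbors=6),      # Laplacian Eigenmap
}
embeddings = {name: m.fit_transform(points) for name, m in dr_methods.items()}
```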
18. Case 1: Visibility-Only Mapping (Description)
- Building a map using only visibility information
  - i.e., whether each object is visible (1) or not (0)
- An assumption in this simulation
  - An object is visible if the horizontal visual angle of its non-occluded part is larger than 5 deg
19. Case 1: Visibility-Only Mapping (Visibility Measurements)
- The visibility observation data form a binary matrix (rows: time, columns: object ID); each column is the observation history vector of one object
- Each column is normalized by its Euclidean norm, to compensate for the varying frequencies with which the objects are observed
- The normalized columns are the points in the observation history space (a preprocessing sketch follows)
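A minimal sketch of this column normalization, assuming V is the N x M binary visibility matrix (names illustrative):

```python
import numpy as np

def normalize_visibility(V, eps=1e-12):
    """Normalize each column of the binary visibility matrix V (N x M) to unit
    Euclidean norm, so frequently and rarely observed objects become comparable."""
    norms = np.linalg.norm(V, axis=0, keepdims=True)   # (1, M) per-object norms
    return V / np.maximum(norms, eps)                  # never-seen objects stay all-zero
```

Each normalized column is then one point in the observation history space fed to the DR step.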
20. Case 1: Visibility-Only Mapping (Maps After 2000 Time Steps)
[Figure: estimated maps by LPCA, SMACOF (k=5), KPCA (Gaussian, σ²=0.5), Isomap (k=6), LLE (k=8), LEM (k=6), HLLE (k=8), and SDE (k=7)]
21. Case 1: Visibility-Only Mapping (Mean Position Errors)
22. Case 1: Visibility-Only Mapping (Final Map Errors)
23. Case 2: Bearing-Only Mapping (Description)
- Building a map using only bearing measurements (relative direction angles to objects)
- Motivated by the recent popularity of bearing-only SLAM
- Assuming all objects are always visible (no missing observations)
24. Case 2: Bearing-Only Mapping (Bearing Measurements)
- The original bearing data form an N x M matrix of angles (rows: time 1, ..., N; columns: object ID 1, ..., M), where θ_{j,t} is the bearing of object j at time t:

\[
\begin{pmatrix}
\theta_{1,1} & \theta_{2,1} & \cdots & \theta_{M,1} \\
\theta_{1,2} & \theta_{2,2} & \cdots & \theta_{M,2} \\
\vdots & \vdots & & \vdots \\
\theta_{1,N} & \theta_{2,N} & \cdots & \theta_{M,N}
\end{pmatrix}
\]

- Raw angles have a discontinuity at ±π, so each angle is replaced by the unit directional vector (cos θ, sin θ):

\[
\begin{pmatrix}
\cos\theta_{1,1} & \cos\theta_{2,1} & \cdots & \cos\theta_{M,1} \\
\sin\theta_{1,1} & \sin\theta_{2,1} & \cdots & \sin\theta_{M,1} \\
\vdots & \vdots & & \vdots \\
\cos\theta_{1,N} & \cos\theta_{2,N} & \cdots & \cos\theta_{M,N} \\
\sin\theta_{1,N} & \sin\theta_{2,N} & \cdots & \sin\theta_{M,N}
\end{pmatrix}
\]

- Each column of the resulting matrix is a 2N-dimensional point in the observation history space (a conversion sketch follows)
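A minimal sketch of this conversion, assuming theta is the N x M matrix of bearing angles in radians (names illustrative):

```python
import numpy as np

def bearings_to_history_vectors(theta):
    """Convert an N x M matrix of bearing angles (rows: time, columns: objects)
    into a 2N x M matrix of stacked unit direction vectors (cos, sin), avoiding
    the +/- pi discontinuity of raw angles."""
    N, M = theta.shape
    Y = np.empty((2 * N, M))
    Y[0::2] = np.cos(theta)   # even rows hold cos(theta) for t = 1, ..., N
    Y[1::2] = np.sin(theta)   # odd rows hold sin(theta) for t = 1, ..., N
    return Y                  # column j is object j's 2N-dimensional history vector
```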
25. Case 2: Bearing-Only Mapping (Maps After 2000 Time Steps)
[Figure: estimated maps by LPCA, SMACOF (k=8), Isomap (k=9), LLE (k=8), LEM (k=7), and SDE (k=7)]
26. Case 2: Bearing-Only Mapping (Mean Position Errors)
27. Case 2: Bearing-Only Mapping (Final Map Errors)
(*) This might imply that the distribution becomes nearly linear
28. Conclusion
- Reconsidered robot map building from the viewpoint of dimensionality reduction
- Proposed a new framework named LFMDR
  - Motion and measurement models are not required
  - No need to estimate the robot's poses (localization-free)
  - However, a larger amount of data is needed
- Tested on two types of sensor measurements
  - Visibility information and bearing angles
- Compared a variety of DR methods
29. Future Work
- Relaxation of restrictions
  - Missing measurements
  - Data association problem
- Scalability
  - Mapping of a larger number of objects
- On-line algorithm
  - Tracking of moving objects
- Multi-sensor fusion
  - e.g., mapping with bearing and range measurements