Title: Life in the Atacama Project Workshop

Life in the Atacama Project Workshop
- Visual Odometry and Visualization
- July 28-30, 2003
- Dennis Strelow, Sanjiv Singh, and Ivan Kirigin, Carnegie Mellon

Introduction (1)
- Mars rovers need to know where they are
  - to navigate to new locations
  - to tag sensor data with the location where it was observed
- Difficult to know for long traverses
  - location estimates from odometry drift
  - GPS not available on Mars

Introduction (2)
- Image and inertial measurements might provide a good mechanism for motion estimation
  - Image observations of objects in the environment make it possible to reduce long-term drift in motion estimates
  - Inertial observations remove ambiguities in image-only estimates

Introduction (3)
- Investigating algorithms for
  - six degree-of-freedom (i.e., nonplanar) vehicle motion estimation
  - from image and inertial measurements
  - without a priori knowledge of the environment
- Emphasis on operation with
  - small and inexpensive inertial sensors

Introduction (4)
- Logged measurements on Hyperion so that we could determine
  1. whether our existing sensors and algorithms are suitable for visual odometry on the rover
  2. if not, what changes should be made to make them suitable

Outline
- Atacama data acquisition
  - Sensors
  - Observations
- Method
- Experimental results
- Recommendations
- Visualization

Atacama data acquisition (1) Sensors
[figure]

Atacama data acquisition (2) Sensors, cont.
- Sony FireWire video camera
  - 1024 x 768 color images
  - 15 images per second

Atacama data acquisition (3) Sensors, cont.
[figure]

Atacama data acquisition (4) Sensors, cont.
- Silicon Sensing Systems CRS04 rate gyros
  - 1 axis, so we have 3
  - ±150 degrees/s
  - 1 cm x 3 cm x 3 cm
  - 12 grams
  - approximately $300 each
  - sampled at 200 Hz

Atacama data acquisition (5) Sensors, cont.
- Crossbow CXL04LP3 accelerometer
  - 3 axes, so we have 1
  - ±4 g
  - 2 cm x 4.75 cm x 2.5 cm
  - 46 grams
  - approximately $300
  - sampled at 200 Hz

Atacama data acquisition (6) Sensors, cont.
- Omnidirectional cameras combine
  - a conventional camera
  - a convex mirror
- to provide a wide field of view
  - 360° azimuth
  - 90°-140° elevation
- Wide field of view is promising for motion estimation

Atacama data acquisition (7-8) Sensors, cont.
[figures]

Atacama data acquisition (9) Observations
- April 2003 acquisition days
[figure]

Atacama data acquisition (10) Observations, cont.
[figure]

Atacama data acquisition (11) Observations, cont.
- April 2003 omnidirectional
[figure]

Atacama data acquisition (12) Observations, cont.
- Best conventional dataset
  - April 16
  - 5003 images at 1.5 Hz
  - 55 minutes 45 seconds
  - Images cover 690 m (of 1338 m traveled that day)

Atacama data acquisition (13-14) Observations, cont.
[figures]

Atacama data acquisition (15) Observations, cont.
- Best omnidirectional dataset
  - April 24
  - 4515 images at 1.5 Hz
  - 50 minutes 18 seconds
  - Images cover 402 m (of 1143 m traveled that day)

Atacama data acquisition (16-17) Observations, cont.
[figures]

Outline
- Atacama data acquisition
- Method
  - Overview
  - Batch method
  - Online method
- Experimental results
- Recommendations
- Visualization

Method (1) Overview
- Suite of algorithms for
  - 6 degree-of-freedom sensor (vehicle) motion estimation
  - from image, gyro, and accelerometer measurements
  - in scenarios where the environment contains no fiducials whose appearance or location is known a priori

Method (2) Overview, cont.
- If you know the sensor's location
  - then it's easy to determine the location of objects in the environment
- If you know the location of objects in the environment
  - then it's easy to determine the sensor's location
- If you know neither, then simultaneous estimation is required

Method (3) Overview, cont.
- Simultaneous localization and mapping (SLAM)
- Typically recovers
  - vehicle position in the plane
  - landmark positions in the plane
- from
  - range data
  - vehicle odometry

Method (4) Overview, cont.
- Shape-from-motion (SFM)
- Typically recovers
  - 3D camera rotations and translations
  - 3D positions of imaged points
- from
  - tracked image features
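
For concreteness, the standard SFM measurement model relates each tracked feature to the unknowns. This is a generic sketch in assumed notation (the slides do not give their exact parameterization):

```latex
% Generic SFM measurement model (notation assumed, not from the slides):
% x_ij is the observed 2D location of point j in image i,
% (R_i, t_i) is the camera rotation and translation for image i,
% X_j is the 3D world position of point j.
\[
  x_{ij} \approx \pi\left( R_i X_j + t_i \right),
  \qquad
  \pi\left( [\,x \;\; y \;\; z\,]^\top \right) = \left( x/z,\; y/z \right)
\]
```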

Method (7) Overview, cont.
- Our problem
- Recover
  - 3D camera rotations and translations
  - other unknowns
- from
  - image, gyro, and accelerometer measurements
- What exactly are the measurements?

Method (8) Overview, cont.
- Camera: tracked 2D image features
[figure]

Method (9) Overview, cont.
- Camera: correctly tracked 2D image features
[figure]

Method (10) Overview, cont.
- Gyro: biased camera-coordinate-system angular velocities
- Accelerometer: biased camera-coordinate-system apparent accelerations
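
A common form of these measurement models, written as a sketch with assumed symbols: the gyro reports angular velocity plus a slowly varying bias, and the accelerometer reports specific force (acceleration minus gravity, expressed in the sensor frame) plus a bias:

```latex
% Assumed notation: omega(t) and a(t) are the true angular velocity and
% acceleration, R(t) rotates world to camera coordinates, g is gravity,
% b_g and b_a are biases, and the nu terms are measurement noise.
\[
  \tilde{\omega}(t) = \omega(t) + b_g + \nu_g,
  \qquad
  \tilde{a}(t) = R(t)\left( a(t) - g \right) + b_a + \nu_a
\]
```

Note that a stationary sensor still reads the gravity reaction, which is why the slide calls these "apparent" accelerations.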

Method (11) Overview, cont.
- Image and inertial streams have different rates
- No direct measurements of position or linear velocity

Method (12) Batch algorithm
- Optimal estimates of
  - motion
  - related unknowns
- using the entire image and inertial observation sequence at once

Method (13) Batch algorithm, cont.
- Uses Levenberg-Marquardt to minimize
  - a combined image and inertial error function
- with respect to
  - the 6 DOF camera position at the time of each image
  - the linear velocity at the time of each image
  - the 3D world position of each tracked point feature
  - the gravity vector w.r.t. the world coordinate system
  - the gyro and accelerometer biases
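
To make the batch structure concrete, here is a minimal 1-D toy in Python. It is entirely illustrative, not the authors' implementation: the real method estimates 6-DOF motion, 3-D points, gravity, and biases, none of which appear here. What it does show is the pattern the slide describes: pack all unknowns into one vector and let Levenberg-Marquardt minimize a single stacked image-plus-inertial residual.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
dt, n_frames, n_points = 0.1, 20, 5

# Ground truth: sensor accelerating at 1 m/s^2 along a line, plus
# stationary landmarks (stand-ins for tracked feature points).
t = np.arange(n_frames) * dt
true_pos = 0.5 * t**2
true_lmk = rng.uniform(5.0, 10.0, n_points)

# Simulated measurements with noise: landmark offsets play the role of
# image observations; accelerations play the role of inertial data.
img_meas = (true_lmk[None, :] - true_pos[:, None]
            + 0.01 * rng.standard_normal((n_frames, n_points)))
acc_meas = 1.0 + 0.05 * rng.standard_normal(n_frames - 2)

def residuals(x):
    pos, lmk = x[:n_frames], x[n_frames:]
    # "Image" term: predicted minus measured landmark offsets.
    r_img = (lmk[None, :] - pos[:, None] - img_meas).ravel()
    # "Inertial" term: the second finite difference of the trajectory
    # should match the measured acceleration.
    r_acc = (pos[2:] - 2.0 * pos[1:-1] + pos[:-2]) / dt**2 - acc_meas
    # Anchor the first position to fix the translation gauge freedom.
    return np.concatenate([r_img, r_acc, [pos[0]]])

x0 = np.zeros(n_frames + n_points)          # crude initial estimate
sol = least_squares(residuals, x0, method="lm")
print("max position error:", np.abs(sol.x[:n_frames] - true_pos).max())
```

The key design point carries over to the real problem: all frames and all points are solved jointly, so image and inertial information constrain each other through the shared trajectory.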

Method (14) Batch algorithm, cont.
- The batch algorithm can produce good motion estimates
  - even when the estimates from image or inertial measurements alone are poor

Method (15) Online algorithm
- The batch algorithm is not appropriate for online motion estimation
  - it requires that all measurements be available beforehand

Method (16) Online algorithm, cont.
- Our online algorithm is an iterated extended Kalman filter (IEKF) that estimates
  - the current camera position and linear velocity
  - the 3D positions of all currently visible points
  - the gravity vector and sensor biases
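
For readers unfamiliar with the IEKF, here is a generic measurement update as a sketch (the state vector, models, and tuning of the authors' filter are not given here, and the range-beacon toy below is hypothetical). The iteration relinearizes the measurement function around the refined estimate rather than the prediction, which is the IEKF's key difference from the plain EKF:

```python
import numpy as np

def numeric_jacobian(h, x, eps=1e-6):
    """Finite-difference Jacobian of the measurement function h at x."""
    z0 = h(x)
    J = np.zeros((z0.size, x.size))
    for k in range(x.size):
        dx = np.zeros_like(x)
        dx[k] = eps
        J[:, k] = (h(x + dx) - z0) / eps
    return J

def iekf_update(x_pred, P_pred, z, h, R, n_iter=5):
    """One iterated-EKF update: relinearize h around the refined estimate."""
    eta = x_pred.copy()
    for _ in range(n_iter):
        H = numeric_jacobian(h, eta)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        # Standard IEKF recursion: innovation evaluated at the current
        # iterate, with a correction term for the relinearization point.
        eta = x_pred + K @ (z - h(eta) - H @ (x_pred - eta))
    P = (np.eye(x_pred.size) - K @ H) @ P_pred
    return eta, P

# Hypothetical usage: refine a 2-D position from two noisy ranges.
beacons = np.array([[0.0, 0.0], [4.0, 0.0]])
h = lambda x: np.linalg.norm(beacons - x, axis=1)   # nonlinear measurement
x_pred, P_pred = np.array([1.0, 1.0]), np.eye(2)
z = h(np.array([2.0, 1.5])) + 0.01 * np.random.default_rng(1).standard_normal(2)
x_upd, P_upd = iekf_update(x_pred, P_pred, z, h, R=0.01**2 * np.eye(2))
print(x_upd)
```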

Method (17) Online algorithm, cont.
- Handles the same class of problems as the batch method
  - but inherits some problems from the IEKF framework

Outline
- Atacama data acquisition
- Method
- Experimental results
  - Initial Atacama experiments
  - PRB Crane
- Recommendations
- Visualization

Experimental results (1) Initial Atacama experiments
[figure]

Experimental results (2) Initial Atacama experiments, cont.
- Recovered rotations, translations, and points
  - viewed from above
[figure]

Experimental results (3) Initial Atacama experiments, cont.
- Recovered rotations, translations, and points
  - viewed from ground level
[figure]

Experimental results (4) PRB Crane
- With the addition of
  - better tracking data
  - inertial data
  - online operation
- it is possible to generate accurate motion estimates over a longer sequence

Experimental results (5-8) PRB Crane, cont.
[figures]

Experimental results (10) PRB Crane, cont.
- Average rotation error: 0.14 radians
- Average translation error: 23.9 (41.8) cm
  - 0.7% of 34.9 m traveled
- (Ask me about the details of the error analysis)
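
As a quick check, the average translation error expressed as a fraction of the distance traveled is consistent with the figure on the slide:

```latex
\[
  \frac{23.9\ \text{cm}}{34.9\ \text{m}}
  = \frac{0.239\ \text{m}}{34.9\ \text{m}}
  \approx 0.0068
  \approx 0.7\%
\]
```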

Outline
- Atacama data acquisition
- Method
- Experimental results
- Recommendations
  - Tracking
  - Camera
  - Inertial sensors
- Visualization

Recommendations (1) Tracking
- Image feature tracking is the weakest link

Recommendations (2) Camera
- The Sony DFW-V700 FireWire camera produces too much data
  - 640 x 480 greyscale is more appropriate for tracking
- DirectX image acquisition may not timestamp the images properly
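
A rough data-rate estimate supports this recommendation (assuming 24-bit color at the full resolution and 8-bit greyscale for the alternative; the slides do not state the pixel formats):

```latex
\[
  1024 \times 768 \times 3\ \tfrac{\text{B}}{\text{px}} \times 15\ \tfrac{1}{\text{s}}
  \approx 35.4\ \tfrac{\text{MB}}{\text{s}},
  \qquad
  640 \times 480 \times 1\ \tfrac{\text{B}}{\text{px}} \times 15\ \tfrac{1}{\text{s}}
  \approx 4.6\ \tfrac{\text{MB}}{\text{s}}
\]
```

Under these assumptions the smaller greyscale format cuts the stream by nearly a factor of eight.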

Recommendations (3) Inertial sensors
- Different inertial sensors allow different approaches to integrating image and inertial measurements

Recommendations (4) Inertial sensors, cont.
- Current paradigm:
  - image measurements dominate
  - inertial measurements disambiguate the resulting motion

Recommendations (5) Inertial sensors, cont.
- Best paradigm with high-quality inertial sensors:
  - inertial measurements give good short-range motion
  - image measurements are primarily used to reduce drift

Recommendations (6) Inertial sensors, cont.
- Result:
  - fewer image features required
  - remaining features tracked more reliably using high-quality inertial information
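
One way high-quality inertial data can make tracking more reliable is by predicting where features will appear in the next frame. The sketch below is a hypothetical illustration, not the authors' tracker: it assumes a calibrated camera, rotation-dominant inter-frame motion, and made-up intrinsics, and maps features through the infinite homography K R K^{-1} built from a single gyro sample:

```python
import numpy as np

K = np.array([[800.0, 0.0, 512.0],      # hypothetical camera intrinsics
              [0.0, 800.0, 384.0],
              [0.0, 0.0, 1.0]])

def rotation_from_gyro(omega, dt):
    """Small-angle rotation matrix from one angular-velocity sample (rad/s)."""
    wx, wy, wz = omega * dt
    # First-order approximation of the exponential of the skew matrix [w]_x.
    return np.eye(3) + np.array([[0.0, -wz, wy],
                                 [wz, 0.0, -wx],
                                 [-wy, wx, 0.0]])

def predict_features(pts, omega, dt):
    """Predict next-frame pixel locations under rotation-only motion."""
    H = K @ rotation_from_gyro(omega, dt) @ np.linalg.inv(K)
    pts_h = np.column_stack([pts, np.ones(len(pts))])   # homogeneous pixels
    proj = pts_h @ H.T
    return proj[:, :2] / proj[:, 2:3]

features = np.array([[100.0, 200.0], [512.0, 384.0], [900.0, 600.0]])
print(predict_features(features, omega=np.array([0.0, 0.05, 0.0]), dt=1.0 / 15.0))
```

Seeding the tracker with such predictions shrinks the search window around each feature, which is one plausible route to the slide's claim of fewer, more reliably tracked features.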