Life in the Atacama Project Workshop
Transcript and Presenter's Notes
1
Life in the Atacama Project Workshop
  • Visual Odometry and Visualization
    July 28-30, 2003
  • Dennis Strelow, Sanjiv Singh, and Ivan Kirigin
    Carnegie Mellon

3
Introduction (1)
  • Mars rovers need to know where they are
  • to navigate to new locations
  • to tag sensor data with the location where it
    was observed
  • Difficult to know for long traverses
  • location estimates from odometry drift
  • GPS not available on Mars

4
Introduction (2)
  • Image and inertial measurements might provide a
    good mechanism for motion estimation
  • Image observations of objects in the environment
    make it possible to reduce long-term drift in
    motion estimates
  • Inertial observations remove ambiguities in
    image-only estimates

6
Introduction (3)
  • Investigating algorithms for
  • six-degree-of-freedom (i.e., non-planar)
    vehicle motion estimation
  • from image and inertial measurements
  • without a priori knowledge of the environment
  • Emphasis on operation with
  • small and inexpensive inertial sensors

8
Introduction (4)
  • Logged measurements on Hyperion so that we could
    determine
  • 1. Whether our existing sensors and algorithms
    are suitable for visual odometry on the rover
  • 2. If not, what changes should be made to make
    them suitable

9
Outline
  • Atacama data acquisition
  • Method
  • Experimental results
  • Recommendations
  • Visualization

10
Outline
  • Atacama data acquisition
  • Sensors
  • Observations
  • Method
  • Experimental results
  • Recommendations
  • Visualization

11
Atacama data acquisition (1) Sensors
12
Atacama data acquisition (2) Sensors, cont.
  • Sony firewire video camera
  • 1024 x 768 color images
  • 15 images per second

13
Atacama data acquisition (3) Sensors, cont.
14
Atacama data acquisition (4) Sensors, cont.
  • Silicon Sensing Systems CRS04 rate gyros
  • 1 axis, so we have 3
  • ±150 degrees/s
  • 1 cm x 3 cm x 3 cm
  • 12 grams
  • approximately $300 each
  • acquire at 200 Hz

15
Atacama data acquisition (5) Sensors, cont.
  • Crossbow CXL04LP3 accelerometer
  • 3 axis, so we have 1
  • ±4 g
  • 2 cm x 4.75 cm x 2.5 cm
  • 46 grams
  • approximately $300
  • acquire at 200 Hz

16
Atacama data acquisition (6) Sensors, cont.
  • Omnidirectional cameras combine
  • conventional camera
  • convex mirror
  • To provide a wide field of view
  • 360° azimuth
  • 90°-140° elevation
  • Wide field of view promising for motion
    estimation
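
A practical note on working with such imagery: the convex mirror appears as a circular region in each frame, and a common first step is to unwrap that region into a rectangular panorama. Below is a minimal Python/OpenCV sketch of that unwrapping; the frame, mirror center, and radius are placeholder values, and this is not necessarily how the project processes its omnidirectional images.

```python
import cv2
import numpy as np

# Stand-in for a raw 1024x768 omnidirectional frame; in a real run this would be a
# captured image in which the convex mirror occupies a circular region.
frame = np.zeros((768, 1024, 3), np.uint8)
cx, cy, r_max = 512.0, 384.0, 370.0   # assumed mirror center and outer radius (pixels)

# Map the circular mirror region to polar coordinates: in the result, rows sweep
# azimuth (0-360 degrees) and columns sweep radius (roughly elevation).
polar = cv2.warpPolar(frame, (256, 1024), (cx, cy), r_max, cv2.WARP_POLAR_LINEAR)

# Rotate so azimuth runs along the horizontal axis, as in a conventional panorama.
panorama = cv2.rotate(polar, cv2.ROTATE_90_COUNTERCLOCKWISE)
```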

17
Atacama data acquisition (7) Sensors, cont.
18
Atacama data acquisition (8) Sensors, cont.
19
Atacama data acquisition (9) Observations
  • April 2003 acquisition days

20
Atacama data acquisition (10) Observations,
cont.
  • April 2003 conventional

21
Atacama data acquisition (11) Observations,
cont.
  • April 2003 omnidirectional

22
Atacama data acquisition (12) Observations,
cont.
  • Best conventional dataset
  • April 16
  • 5003 images at 1.5 Hz
  • 55 minutes 45 seconds
  • Images cover 690 m (of 1338 m traveled that day)

23
Atacama data acquisition (13) Observations, cont.
26
Atacama data acquisition (14) Observations, cont.
27
Atacama data acquisition (15) Observations,
cont.
  • Best omnidirectional dataset
  • April 24
  • 4515 images at 1.5 Hz
  • 50 minutes 18 seconds
  • Images cover 402 m (of 1143 m traveled that day)

28
Atacama data acquisition (16) Observations, cont.
30
Atacama data acquisition (17) Observations, cont.
31
Outline
  • Atacama data acquisition
  • Method
  • Overview
  • Batch method
  • Online method
  • Experimental results
  • Recommendations
  • Visualization

33
Method (1) Overview
  • Suite of algorithms for
  • 6 degree of freedom sensor (vehicle) motion
    estimation
  • from image, gyro, and accelerometer
    measurements
  • in scenarios where
  • the environment contains no fiducials whose
    appearance or location is known a priori

36
Method (2) Overview, cont.
  • If you know the sensor's location
  • then it's easy to determine the location of
    objects in the environment
  • If you know the location of objects in the
    environment
  • then it's easy to determine the sensor's
    location
  • If neither, then simultaneous estimation

37
Method (3) Overview, cont.
  • Simultaneous localization and mapping (SLAM)
  • Typically recover
  • vehicle position in the plane
  • landmark positions in the plane
  • from
  • range data
  • vehicle odometry

38
Method (4) Overview, cont.
  • Shape-from-motion (SFM)
  • Typically recover
  • 3D camera rotations and translations
  • 3D positions of imaged points
  • from
  • tracked image features
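
Concretely, SFM is usually posed as minimizing reprojection error. A minimal sketch of the per-observation residual follows, assuming an idealized unit-focal pinhole camera; a calibrated camera model (or an omnidirectional projection model) would replace it in practice.

```python
import numpy as np

def reprojection_residual(R, t, X, x_obs):
    """Difference between a tracked 2D feature and the projection of its 3D point.

    R : 3x3 world-to-camera rotation    t : 3-vector camera translation
    X : 3-vector world point            x_obs : 2-vector tracked feature location
    An idealized unit-focal pinhole projection is assumed here.
    """
    Xc = R @ X + t              # point expressed in camera coordinates
    return Xc[:2] / Xc[2] - x_obs

# SFM stacks this residual over every camera/point pair with a tracked feature and
# minimizes the total squared error over all rotations, translations, and points.
```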

40
Method (7) Overview, cont.
  • Our problem
  • Recover
  • 3D camera rotations and translations
  • other unknowns
  • from
  • image, gyro, accelerometer
  • What exactly are the measurements?

41
Method (8) Overview, cont.
  • Camera tracked 2D image features

42
Method (9) Overview, cont.
  • Camera correctly tracked 2D image features
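
For reference, here is a minimal sketch of producing such tracked features with OpenCV's pyramidal Lucas-Kanade tracker; the frames are synthetic stand-ins, and this is not necessarily the tracker used on Hyperion.

```python
import cv2
import numpy as np

# Two consecutive greyscale frames (synthetic stand-ins so the sketch is self-contained).
prev_frame = np.zeros((480, 640), np.uint8)
curr_frame = np.zeros((480, 640), np.uint8)

# Detect corner features in the previous frame, then track them into the current frame.
corners = cv2.goodFeaturesToTrack(prev_frame, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
if corners is not None:
    tracked, status, err = cv2.calcOpticalFlowPyrLK(prev_frame, curr_frame, corners, None,
                                                    winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    # Each surviving (previous location, current location) pair is one tracked 2D feature.
    matches = list(zip(corners[ok].reshape(-1, 2), tracked[ok].reshape(-1, 2)))
```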

43
Method (10) Overview, cont.
  • Gyro: biased camera-coordinate-system angular
    velocities
  • Accelerometer: biased camera-coordinate-system
    apparent accelerations
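
In equation form, a numpy sketch of the two measurement models; the frame conventions and symbols here are illustrative assumptions rather than the exact formulation used.

```python
import numpy as np

def gyro_measurement(omega_cam, gyro_bias):
    """Gyro output: camera-frame angular velocity corrupted by a slowly varying bias."""
    return omega_cam + gyro_bias

def accel_measurement(accel_world, R_world_to_cam, gravity_world, accel_bias):
    """Accelerometer output: apparent (specific) acceleration in the camera frame,
    i.e. true linear acceleration minus gravity, rotated into the camera frame,
    plus a slowly varying bias."""
    return R_world_to_cam @ (accel_world - gravity_world) + accel_bias

# Example: a sensor at rest, camera frame aligned with the world frame, reads about
# +9.81 m/s^2 along the axis opposing gravity.
g = np.array([0.0, 0.0, -9.81])
print(accel_measurement(np.zeros(3), np.eye(3), g, np.zeros(3)))   # [0. 0. 9.81]
```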
44
Method (11) Overview, cont.
  • Image and inertial streams have different rates
  • No direct measurements of position or linear
    velocity
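
In practice the rate mismatch means inertial samples must be grouped by image timestamps before being integrated between frames. A small sketch with synthetic timestamps standing in for logged data:

```python
import numpy as np

# Synthetic timestamps (seconds): images at roughly 1.5 Hz, inertial samples at 200 Hz.
image_times = np.arange(0.0, 10.0, 1.0 / 1.5)
imu_times = np.arange(0.0, 10.0, 1.0 / 200.0)
gyro_samples = np.zeros((imu_times.size, 3))    # stand-in gyro readings (rad/s)

# Collect the inertial samples that fall between consecutive images so they can be
# integrated (or filtered) over each inter-frame interval.
segments = []
for t_prev, t_curr in zip(image_times[:-1], image_times[1:]):
    mask = (imu_times > t_prev) & (imu_times <= t_curr)
    segments.append(gyro_samples[mask])          # roughly 133 samples per interval here
```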

45
Method (12) Batch algorithm
  • Optimal estimates of
  • motion
  • related unknowns
  • using the entire image and inertial observation
    sequence at once

47
Method (13) Batch algorithm, cont.
  • Uses Levenberg-Marquardt to minimize
  • a combined image and inertial error function
  • with respect to
  • 6 DOF camera position at the time of each image
  • Linear velocity at the time of each image
  • 3D world position of each tracked point feature
  • Gravity vector w.r.t. world coordinate system
  • Gyro and accelerometer biases
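
A toy, runnable sketch of the batch idea using scipy's Levenberg-Marquardt solver: camera positions and 3D points are refined jointly against stacked image and inertial-style residuals. Rotations, velocities, gravity, and biases are omitted to keep it short, and none of the names or values come from the actual system.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy problem: a few camera positions and 3D points, rotations fixed to the identity.
n_frames, n_points, dt = 5, 8, 1.0 / 1.5
rng = np.random.default_rng(0)
true_cams = np.column_stack([np.linspace(0.0, 2.0, n_frames),
                             np.zeros(n_frames), np.zeros(n_frames)])
true_pts = rng.uniform([-2, -2, 4], [2, 2, 8], size=(n_points, 3))

def project(cam, pt):
    d = pt - cam                      # point in camera coordinates (identity rotation)
    return d[:2] / d[2]               # unit-focal pinhole projection

# Synthetic measurements: tracked image features, plus finite-difference
# "accelerations" standing in for the inertial terms.
image_obs = np.array([[project(c, p) for p in true_pts] for c in true_cams])
accel_obs = (true_cams[2:] - 2.0 * true_cams[1:-1] + true_cams[:-2]) / dt**2

def combined_residuals(x):
    cams = x[:n_frames * 3].reshape(n_frames, 3)
    pts = x[n_frames * 3:].reshape(n_points, 3)
    r_img = np.array([[project(c, p) for p in pts] for c in cams]) - image_obs
    r_imu = (cams[2:] - 2.0 * cams[1:-1] + cams[:-2]) / dt**2 - accel_obs
    return np.concatenate([r_img.ravel(), r_imu.ravel()])

# Start near the truth and let Levenberg-Marquardt refine all unknowns at once.
x0 = np.concatenate([true_cams.ravel(), true_pts.ravel()])
x0 = x0 + 0.05 * rng.standard_normal(x0.size)
solution = least_squares(combined_residuals, x0, method="lm")
```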

48
Method (14) Batch algorithm, cont.
  • Batch algorithm can produce good motion
    estimates
  • even when the estimates from image or inertial
    measurements alone are poor

49
Method (15) Online algorithm
  • The batch algorithm is not appropriate for online
    motion estimation
  • requires that all measurements be available
    beforehand

50
Method (16) Online algorithm, cont.
  • Our online algorithm is an iterated extended
    Kalman filter (IEKF) that estimates
  • the current camera position and linear velocity
  • the 3D positions of all currently visible
    points
  • the gravity vector and sensor biases
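
For orientation, here is a generic sketch of the iterated measurement update that gives the IEKF its name: the measurement model is relinearized about the refined estimate a few times rather than only at the predicted state. The state layout and measurement function are placeholders, not the filter actually used on the rover data.

```python
import numpy as np

def iekf_update(x_pred, P_pred, z, h, H_jac, R, n_iters=3):
    """Iterated EKF measurement update.

    x_pred, P_pred : predicted state mean and covariance
    z              : measurement vector (e.g., stacked feature observations)
    h(x), H_jac(x) : measurement model and its Jacobian
    R              : measurement noise covariance
    """
    x = x_pred.copy()
    for _ in range(n_iters):
        H = H_jac(x)
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        # Relinearized innovation; the (x_pred - x) term keeps the prior in the update.
        x = x_pred + K @ (z - h(x) - H @ (x_pred - x))
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

# Tiny illustrative use: observe the first component of a two-dimensional state.
h = lambda x: x[:1]
H_jac = lambda x: np.array([[1.0, 0.0]])
x, P = iekf_update(np.zeros(2), np.eye(2), np.array([0.3]), h, H_jac, 0.1 * np.eye(1))
```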

51
Method (17) Online algorithm, cont.
  • Handles the same class of problems as the batch
    method
  • but inherits some problems from the IEKF
    framework

52
Outline
  • Atacama data acquisition
  • Method
  • Experimental results
  • Initial Atacama experiments
  • PRB Crane
  • Recommendations
  • Visualization

53
Experimental results (1) Initial Atacama
experiments
54
Experimental results (2) Initial Atacama
experiments, cont.
  • Recovered rotations, translations, points
  • from above

55
Experimental results (3) Initial Atacama
experiments, cont.
  • Recovered rotations, translations, points
  • from ground level

56
Experimental results (4) PRB Crane
  • With addition of
  • better tracking data
  • inertial data
  • online operation
  • it is possible to generate accurate motion
    estimates over a longer sequence

57
Experimental results (5) PRB Crane, cont.
58
Experimental results (6) PRB Crane, cont.
59
Experimental results (7) PRB Crane, cont.
60
Experimental results (8) PRB Crane, cont.
62
Experimental results (10) PRB Crane, cont.
  • Average rotation error: 0.14 radians
  • Average translation error: 23.9 (41.8) cm
  • 0.7% of 34.9 m traveled
  • (Ask me about the details of the error analysis)

63
Outline
  • Atacama data acquisition
  • Method
  • Experimental results
  • Recommendations
  • Tracking
  • Camera
  • Inertial sensors
  • Visualization

64
Recommendations (1) Tracking
  • Image feature tracking is the weakest link

66
Recommendations (2) Camera
  • Sony DFW-V700 firewire camera produces too much
    data
  • 640 x 480 greyscale is more appropriate for
    tracking
  • DirectX image acquisition may not timestamp the
    images properly
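
The data-rate argument can be made concrete with a little arithmetic, assuming uncompressed 8-bit-per-channel pixels (the camera's actual on-wire format may differ).

```python
# Approximate uncompressed data rates in bytes per second (8 bits per channel assumed).
full_color = 1024 * 768 * 3 * 15      # 1024x768 color at 15 images/s (camera rate quoted earlier)
reduced_grey = 640 * 480 * 1 * 15     # recommended: 640x480 greyscale at 15 images/s

print(full_color / 1e6)               # about 35.4 MB/s
print(reduced_grey / 1e6)             # about 4.6 MB/s
print(full_color / reduced_grey)      # roughly a 7.7x reduction
```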

67
Recommendations (3) Inertial sensors
  • Different inertial sensors allow different
    approaches to integrating image and inertial
    measurements
68
Recommendations (4) Inertial sensors, cont.
  • Current paradigm
  • Image measurements dominate
  • Inertial measurements disambiguate the resulting
    motion
69
Recommendations (5) Inertial sensors, cont.
  • Best paradigm with high-quality inertial sensors
  • Inertial measurements give good short-range
    motion estimates
  • Image measurements primarily used to reduce drift
70
Recommendations (6) Inertial sensors, cont.
  • Result
  • Fewer image features required
  • Remaining features tracked more reliably using
    high-quality inertial information