Tracking for Scene Augmentation & Visualization

Transcript and Presenter's Notes
1
Tracking for Scene Augmentation & Visualization
  • Ulrich Neumann
  • Computer Science Department
  • Integrated Media Systems Center
  • University of Southern California
  • July 2000

2
Research Goals
  • Basic science and engineering needed for
    wide-area, unencumbered, real-time person
    tracking
  • Why person tracking?
  • Position/orientation of people in the field makes
    them smart sensors
  • Augment the observed scene with 3D information
    gleaned from distributed sources
  • Enable a shared command/control visualization of
    distributed spatial information/sensing sources
  • Spatial relationships are critical in sensing,
    display, and data fusion

3
Person/Head Tracking vs. Object or Vehicle
Tracking
  • Tracking objects from fixed sensors
  • Objects emit signatures like sound, light, or
    force that are detected (illumination is possible)
  • Person/head tracking uses body-worn moving
    sensors
  • Passive sensing of environmental
    signatures/measures that are difficult to model,
    sense, or measure
  • Vehicle tracking (ground, air, water)
  • Similar sensors and fusion ideas (e.g., EKF)
  • Inverse component motion rates: less translation,
    more rotation
  • Lack of velocity measures (e.g., wheel rotations,
    flow)
  • Man-portable operation imposes severe
    weight/power constraints

4
Current Outdoor Person Tracking
  • S. Feiner @ Columbia - MARS project
  • L. Rosenblum @ ONR/NRL
  • GPS/compass (gyro w/USC)
  • E. Foxlin @ Intersense Corp.
  • compass/gyro IS-300
  • R. Azuma @ HRL
  • compass/gyro (vision w/USC)
  • U. Neumann, S. You @ USC
  • gyro/vision, panoramic
  • Land Warrior - Army/Point Research Corp.
  • GPS/compass/accelerometer (body-only)

5
Technical Approach/Strategy
  • Estimate 6DOF pose in real time by fusing
    multiple sensor data streams, each possessing
    variable uncertainty
  • Fusion strategies: EKF, data-driven models
  • GPS for 3DOF position (intermittent)
  • Inertial (MEMS) gyros and accelerometers
  • Vision - planar and panoramic projections
  • Compass, pedometer, laser-range-finder,
    human-aided, ...

6
Proposed Research and Development
  • Sensor fusion algorithms
  • 3DOF orientation from gyro/compass/vision
  • 3DOF position from GPS/accel/vision/LRF
  • Real-time portable prototype
  • Outdoor performance/annotation tests/metrics
  • Command/control visualization testbed
  • Oriented images
  • Precise target localization (man-sighted, UAVs)

7
Data Fusion Methods
  • Extended Kalman Filters (see the sketch below)
  • Explicit closed-loop models
  • Fuzzy models
  • Implicit data-driven models of noise, sensor
    bias, and sensor correlations that are hard to
    model explicitly
  • E.g., device-to-device variations, application
    usage variations (crawling vs. running),
    user-to-user variations
  • Hybrid combinations of the above
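
As a concrete reference for the EKF bullet above, the sketch below shows one predict/update cycle. The interfaces are hypothetical placeholders (process model f with Jacobian F, measurement model h with Jacobian H, noise covariances Q and R), not the deck's actual filter.

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One EKF cycle: propagate the state, then correct it with a
    measurement weighted by its uncertainty R. f/h are the process
    and measurement models, F/H their Jacobians linearized at the
    current estimate, Q/R the noise covariances (all assumed)."""
    # Predict: push state and covariance through the motion model.
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update: blend prediction and measurement by relative uncertainty.
    y = z - h(x_pred)                    # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Each stream (intermittent GPS, 1 kHz gyro, 30 Hz video, 100 Hz compass) can run this update with its own h, H, and R whenever a sample arrives, which is how streams of differing rate and uncertainty fuse into a single estimate.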

8
Explicit Closed-Loop Models for Inertial/Vision
Data Fusion
9
USC Research Status
  • Autocalibration
  • Calibrate 3D positions of features (sparse
    modeling)
  • Extend tracking and stabilize pose
  • Panoramic imaging
  • Track from reference images
  • Visualize/monitor the 360° scene
  • Gyro/vision fusion
  • Stable orientation (3DOF)
  • Track through high speed motions and blur

10
Autocalibration
Estimate camera pose: P = K(Fc, S, Fn)
Features (Fc); gyro, accelerometer, and other
sensors (s): S = ∫ s(t) dt
Detect and calibrate new features: Fn = K(P, fn)
Convergence in an iterative Extended Kalman Filter
framework supports autocalibration and
multiple-sensor fusion (see the sketch below)
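
A schematic sketch of the state augmentation implied above, assuming a 6-element pose block; the dimensions and initial uncertainties are illustrative, not USC's implementation.

```python
import numpy as np

pose_dim = 6                    # 3 position + 3 orientation (assumed)
x = np.zeros(pose_dim)          # camera pose estimate
P = np.eye(pose_dim) * 0.1      # pose covariance

def add_feature(x, P, guess_xyz, sigma=5.0):
    """Append a new feature's 3D position to the filter state with
    large initial uncertainty; EKF updates then shrink it."""
    x = np.concatenate([x, guess_xyz])
    n = len(x)
    P_new = np.zeros((n, n))
    P_new[:P.shape[0], :P.shape[1]] = P
    P_new[-3:, -3:] = np.eye(3) * sigma**2   # uncertain until observed
    return x, P_new

x, P = add_feature(x, P, np.array([1.0, 2.0, 10.0]))
# Each sighting of the feature runs a normal EKF update: repeated
# observations shrink the feature block of P (calibration) while
# simultaneously constraining the pose block (extended tracking).
```
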
11
Autocalibration Demonstration
12
Motion Estimation/Tracking with Panoramic Images
  • Panoramic images are more robust to partial
    occlusions than planar images
  • Adapt iterative EKF for 5DOF motion estimates
  • Motion direction and rotation between images
  • Good results for small motions (R, T)
  • Accuracy similar to the popular 8-point method
    (see the sketch below)
  • Least squares solution with more points
  • EKF framework has advantages
  • Sensor fusion framework
  • No minimum number of features
  • Flexibility in rejecting uncorrelated noise
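
For reference, the 8-point baseline the slide compares against can be sketched in a few lines. This assumes calibrated correspondences given as unit viewing rays, which panoramic images provide naturally; it is the classical method, not the deck's EKF formulation.

```python
import numpy as np

def essential_8point(rays1, rays2):
    """Classic 8-point estimate of the essential matrix E from N >= 8
    corresponding unit viewing rays (calibrated/panoramic cameras).
    E encodes the 5DOF motion: rotation plus translation direction."""
    # Each correspondence gives one linear constraint r2^T E r1 = 0.
    A = np.array([np.outer(r2, r1).ravel()
                  for r1, r2 in zip(rays1, rays2)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto the essential manifold: two equal singular values,
    # one zero.
    U, _, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt
```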

13
Large Motion Estimates
  • Large R and T cause motion estimate errors
  • Yet large R or T is desirable for high SNR
  • Errors arise in separating R and T
  • Recursive Rotation Factorization (RRF)
  • Builds on the iEKF framework for small motions
  • Takes advantage of the property that feature
    motions are identical for a given R, regardless
    of scene depth
  • Estimate R: I2 = (T R) I1
  • Factor R from the image: I2' = T R R⁻¹ I1 = T I1
  • Estimate T: I2' = T I1
  • Iterate until R and T converge (see the sketch
    below)
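
A sketch of the RRF loop just described; estimate_R and estimate_T are assumed stand-ins for the small-motion iEKF estimators, not the actual USC code.

```python
import numpy as np

def rrf(rays1, rays2, estimate_R, estimate_T, iters=10):
    """Recursive Rotation Factorization: alternately estimate the
    rotation, factor it out of image 2's feature rays, and estimate
    the translation from the de-rotated residual motion."""
    R_total = np.eye(3)
    rays = rays2.copy()
    for _ in range(iters):
        dR = estimate_R(rays1, rays)   # residual rotation estimate
        rays = rays @ dR               # de-rotate rows: applies dR^-1
        R_total = R_total @ dR         # accumulate total rotation
        T = estimate_T(rays1, rays)    # translation from residual
        # A convergence test on dR and T would terminate early.
    return R_total, T
```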

14
RRF Large-Motion Estimation
  • RRF motion estimation with various noise levels
    (1 m displacement and 10-90 degree rotation about
    the up-axis).
  • Left and right side charts describe translation
    and rotation error respectively.
  • Noise levels are 0.3, 1.5, and 3.0 degrees, top
    to bottom

15
Panoramic 6DOF Tracking
  • 6DOF tracking of a moving camera (red) is
    obtained, without requiring any calibrated
    features in the scene, from multiple 5DOF-motion
    estimates relative to reference images (blue);
    see the sketch below
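
One plausible reading of how the 5DOF estimates fix absolute position: each reference panorama yields a bearing ray toward the moving camera, and the least-squares intersection of those rays recovers position and scale. A sketch under that assumption, not necessarily the slide's exact formulation.

```python
import numpy as np

def locate_camera(ref_positions, bearings):
    """Least-squares intersection of bearing rays: reference panorama
    i at known position p_i sees the camera along unit direction d_i.
    Minimizing the summed squared distance to all rays fixes the
    camera's absolute position (and hence the scale of T)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(ref_positions, bearings):
        M = np.eye(3) - np.outer(d, d)   # projector orthogonal to d
        A += M                           # normal eqs: (sum M) x = sum M p
        b += M @ p
    return np.linalg.solve(A, b)

# e.g., two references 5 m apart, each reporting a bearing direction:
# pos = locate_camera([np.zeros(3), np.array([5.0, 0, 0])], [dA, dB])
```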

16
6DOF Tracking Simulation
RRF motion estimation is computed and integrated
over a sequence of images. The left graph shows
the absolute angular error in translation
direction and the right graph shows the absolute
rotation error. The lower figure shows the
simulated motion path of the camera. Points A
and B are two reference positions. The camera
starts at A and moves along the path.
17
Panoramic Images/Video for Visualization
  • Panoramic image from conventional video
  • Video processed in the field to produce a
    panoramic image
  • Highly compressed scene capture - no redundancy
  • Panoramic video camera (3200x480 @ 24 fps)
  • Transmit a 360° field of view from a remote site
    (see the data-rate sketch below)
  • Video motion detection in all directions
  • Real-time view for desktop or HMD
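
To make the transmission constraint concrete, a back-of-the-envelope data rate for the camera spec above; the 24-bit color depth is an assumption, not from the slide.

```python
# Raw data rate for a 3200x480 @ 24 fps panoramic camera,
# assuming uncompressed 24-bit RGB (3 bytes per pixel).
width, height, fps, bytes_per_px = 3200, 480, 24, 3
raw = width * height * bytes_per_px * fps    # bytes per second
print(f"{raw / 1e6:.1f} MB/s uncompressed")  # ~110.6 MB/s
```

That raw rate is far beyond field links, which is why the slide emphasizes compressed, redundancy-free scene capture for remote transmission.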

18
Panorama from Video
19
Gyro/Vision Orientation Tracking
  • Gyro angular-rate sensors
  • drift and bias (1 kHz)
  • Video angular-rate sensing
  • tracking loss/drift (30 Hz)
  • Compass orientation sensor
  • jitter and magnetic-field distortion (100 Hz)

20
Orientation Tracking Test
  • Predict 2D feature motion from gyro data (see
    the sketch below)
  • Refine gyro data by vision feature tracking
  • Stabilizes gyro drift and makes 2D tracking more
    robust
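
A sketch of the prediction step above: a small gyro-measured rotation induces a depth-independent image displacement at each feature, which seeds the vision tracker's search window. This uses the standard rotational optical-flow model; sign conventions depend on the camera axis definitions.

```python
def predict_feature_motion(x, y, omega, dt, f):
    """Predicted image displacement of a feature at (x, y), in pixels
    from the principal point, for a small camera rotation
    omega = (wx, wy, wz) rad/s over dt seconds, focal length f in
    pixels. Rotational flow is depth-independent, so no scene model
    is needed; signs follow one common camera-axis convention."""
    wx, wy, wz = omega
    du = ((x * y / f) * wx - (f + x * x / f) * wy + y * wz) * dt
    dv = ((f + y * y / f) * wx - (x * y / f) * wy - x * wz) * dt
    return du, dv

# The predicted (du, dv) seeds the vision tracker's search window;
# the refined visual match in turn corrects accumulated gyro drift.
```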

21
Orientation-Tracking Demonstration
22
Gyro/Vision Fusion Examples
(Figure: timeline of the video tracking process over one 33 ms frame interval)
23
Cooperation within MURI Team
  • Algorithms for sensor fusion and uncertainty
    management
  • Portable prototype and testbed for visualization
    demonstration and outdoor tracking tests
  • Shared visualizations of spatial annotations and
    panoramic imagery
  • Tracking and modeling are strongly related
  • Scene modeling aids tracking and vice versa