Title: Tracking as Dimensionality Reduction
1. Tracking as Dimensionality Reduction
- Ali Rahimi (Intel Research Seattle)
- Ben Recht (Caltech)
- Brian Ferris (University of Washington)
- Dieter Fox (University of Washington)
2. Tracking as Dimensionality Reduction
[Figure: low-dimensional states ("want these") map through a 1:1 and smooth function to high-dimensional observations ("given only this")]
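In symbols, the setup the figure depicts (a restatement in my own notation, consistent with later slides):

```latex
% Problem setup (my restatement of the slide, not verbatim):
% a 1:1, smooth map f takes latent states to observations.
\[
  y_t = f(x_t), \qquad f \ \text{one-to-one and smooth};
\]
\[
  \text{given only } y_1, \dots, y_T, \ \text{recover } x_1, \dots, x_T.
\]
```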
3. Ferris, Fox, Lawrence, IJCAI 06
4. Taylor, Rahimi, Bachrach, Shrobe, IPSN 06
5. Rahimi, Recht, NIPS 06
6. Rahimi, Recht, CVPR 05
7. WiFi localization, RFID tags, sensor networks, RF tracking devices, video
8. Lessons
- Utilize time information
- Avoid local minima
- Use plausible models
9. Lesson 1: Use Time
[Figure: graphical model; low-dimensional latent variables pass through the sensor to produce high-dimensional observations]
10. Lesson 1: Use Time
- For smooth time series data, time-aware variants beat their time-agnostic counterparts (see the sketch below):
- ST-Isomap (Jenkins) > Isomap
- DGPLVM (Wang, Fleet, Hertzmann) > GPLVM
- Rahimi & Recht > KPCA
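A minimal sketch of this setting on toy data (not the talk's Sensetable data): a smooth 2-D latent trajectory is lifted to high dimensions by a smooth map, and the time-agnostic baselines see only the unordered point cloud, discarding the temporal structure the time-aware methods exploit.

```python
# Toy instance of the Lesson 1 setting (illustrative data, not the talk's).
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
T = 500
t = np.linspace(0, 4 * np.pi, T)
X = np.column_stack([np.cos(t) + 0.1 * np.sin(3 * t),   # smooth latent path x_t
                     np.sin(t) + 0.1 * np.cos(5 * t)])

W = rng.normal(size=(2, 20))                             # random lift to 20-D
Y = np.tanh(X @ W) + 0.01 * rng.normal(size=(T, 20))     # y_t = f(x_t) + noise

# Time-agnostic baselines: neither uses the ordering of y_1..y_T.
X_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5).fit_transform(Y)
X_iso = Isomap(n_components=2, n_neighbors=10).fit_transform(Y)
```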
11. The Sensetable
[Figure: the Sensetable, which reads signal-strength measurements from an RFID tag]
12. Sensetable: Manifold Learning
[Figure: 2-D tag trajectories recovered by LLE, KPCA, Isomap, ST-Isomap, and Rahimi & Recht 06, each compared against ground truth]
13. Lesson 2: Avoid Local Minima
- What good is a local max of the posterior?
(It may be arbitrarily improbable.)
14. Lesson 2: Avoid Local Minima
[Figure: Wi-Fi localization ground truth vs. the trajectory recovered by DGPLVM with a PPCA initializer]
15. Lesson 2: Avoid Local Minima
[Figure: Wi-Fi localization ground truth vs. the trajectory recovered by DGPLVM with an Isomap initializer]
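The initializer dependence shown in the two figures above is generic to nonconvex objectives. A toy illustration (not DGPLVM itself; the objective here is made up for the example): the same descent procedure started from two initializers lands in different basins with different objective values.

```python
# Toy illustration of initializer sensitivity (not DGPLVM): gradient descent
# on a nonconvex objective converges to different basins depending on init.
import numpy as np

def loss(x):
    return np.sin(3 * x) + 0.1 * x ** 2          # nonconvex: several local minima

def grad(x):
    return 3 * np.cos(3 * x) + 0.2 * x

def descend(x, lr=0.01, steps=2000):
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

x_a = descend(-2.0)    # one "initializer"
x_b = descend(+2.0)    # another "initializer"
print(x_a, loss(x_a))  # different local minima,
print(x_b, loss(x_b))  # with different objective values
```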
16. Lesson 2: Avoid Local Minima
17. Lesson 3: Manifold Learning is a Red Herring for Tracking
- Pros: no local minima, fast.
- Cons:
- Inapplicable assumptions (isometry for Isomap, local linearity for LLE)
- Only asymptotic guarantees (not a MAP estimate, for example)
18. An Algorithm
- Dynamical model (sketched below)
- MAP estimate of f
- No local minima
- Approximation guarantees
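A plausible form of the model, reconstructed from the authors' NIPS 06 paper rather than from the slide itself: a linear-Gaussian dynamical prior on the states, with observations produced by the unknown 1:1 smooth map f.

```latex
% Reconstructed state-space model (assumed, after Rahimi & Recht, NIPS 06):
\[
  x_{t+1} = A x_t + \omega_t, \qquad \omega_t \sim \mathcal{N}(0, \Sigma_\omega),
\]
\[
  y_t = f(x_t) + \nu_t, \qquad \nu_t \sim \mathcal{N}(0, \Sigma_\nu).
\]
% Sought: the MAP estimate of f given y_1..y_T, with X marginalized out.
```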
19. Why It's Hard
The posterior over f involves a hairy integral over the whole state sequence:
p(f | Y) ∝ p(f) ∫ p(Y | X, f) p(X) dX
This integral is hard in general. EM can't even get us a local max of this!
20. Steal Some Good Ideas
- From manifold learning: f is 1:1.
- From nonlinear system ID (EM): a prior on X.
- Y is obtained via a change of variables on X.
21. Two Assumptions to Simplify Marginalization
22. Final Cost Function
- Outputs should be likely.
- The mapping can't collapse to zero.
- Represent g with kernels on the observations.
(A reconstruction of the resulting objective appears below.)
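A hedged reconstruction of the objective these three ingredients suggest (my notation, following the authors' NIPS 06 formulation, not copied from the slide): states come from a kernel regressor g applied to the observations, the state sequence should be likely under the dynamical prior, and a moment constraint rules out the trivial solution g ≡ 0.

```latex
% Reconstructed objective (assumed, following Rahimi & Recht, NIPS 06):
\[
  \max_{g}\ \sum_{t} \log p\big(x_t \mid x_{t-1}\big)
  \quad \text{with} \quad
  x_t = g(y_t) = \sum_{i=1}^{T} c_i\, k(y_i, y_t),
\]
\[
  \text{subject to} \quad \frac{1}{T}\sum_{t} x_t x_t^\top = I
  \quad \text{(prevents collapse to } g \equiv 0\text{)}.
\]
```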
23. Approximate Optimum 1: Dual Relaxation
Not convex in C, but convex in Z = CC^T. Solve by SDP; extract the solution by SVD. Gives bounds on the solution. (A toy instance of this lift appears below.)
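A toy instance of the lift-and-relax recipe (illustrative only, not the talk's actual cost): a quadratic that is nonconvex in c becomes linear, hence convex, in Z = cc^T, and the resulting SDP bounds the nonconvex optimum.

```python
# Toy lift-and-relax example (not the talk's cost function).
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 5))
Q = (Q + Q.T) / 2                      # indefinite symmetric cost matrix

# Nonconvex problem: minimize c^T Q c subject to ||c|| = 1.
# Relaxation: Z >= 0 stands in for Z = c c^T; trace(Z) = 1 for ||c|| = 1.
Z = cp.Variable((5, 5), PSD=True)
prob = cp.Problem(cp.Minimize(cp.trace(Q @ Z)), [cp.trace(Z) == 1])
prob.solve()                           # SDP value lower-bounds the nonconvex optimum

# Extract a candidate c from Z by eigendecomposition (rank-1 rounding).
w, V = np.linalg.eigh(Z.value)
c = V[:, -1] * np.sqrt(max(w[-1], 0))
```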
Approximate Optimum 2: Spectral Relaxation
Find the g that gives the most likely X while matching the moments of the prior. Results in dynamics-augmented KPCA. Exact in the linear case. (Sketched below.)
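A sketch of what "dynamics-augmented KPCA" might look like in code (my reconstruction, not the paper's exact algorithm): score candidate state sequences X = KC by smoothness under a first-order dynamical prior, normalize by the prior's moments, and solve the resulting generalized eigenproblem.

```python
# Sketch of a dynamics-augmented KPCA (assumed form, not the exact algorithm).
import numpy as np
from scipy.linalg import eigh
from sklearn.metrics.pairwise import rbf_kernel

def dyn_kpca(Y, n_components=2, gamma=0.5):
    T = Y.shape[0]
    K = rbf_kernel(Y, Y, gamma=gamma)      # kernel on the observations
    D = np.eye(T) - np.eye(T, k=-1)        # first differences: (Dx)_t = x_t - x_{t-1}
    S = D.T @ D                            # penalizes rough state sequences
    # States x_t = sum_i C[i] k(y_i, y_t), i.e. X = K C.
    # Minimize the roughness tr(C^T K S K C) subject to the moment
    # constraint C^T K K C = I, solved as a generalized eigenproblem.
    A = K @ S @ K
    B = K @ K + 1e-8 * np.eye(T)           # jitter for numerical stability
    w, V = eigh(A, B)                      # eigenvalues in ascending order
    C = V[:, :n_components]                # smoothest directions
    return K @ C                           # recovered states X

# X_hat = dyn_kpca(Y)  # Y: (T, d) array of time-ordered observations
```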
24. The Sensetable
[Figure: the Sensetable and its RFID tag, repeated from slide 11]
25. Sensetable: Manifold Learning
[Figure: 2-D tag trajectories recovered by LLE, KPCA, Isomap, ST-Isomap, and Rahimi & Recht 06 vs. ground truth, repeated from slide 12]
26. Applying g
27. Tracking with KPCA
28. Unsupervised Learning in Sensor Networks
- Unknown sensor locations
- Unknown smooth measurement model
- Unknown smooth path
[Figure: sensor output as a function of distance, for various sensors]
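A toy simulation of this setting (all specifics here are my own, for illustration): sensors at unknown locations each report a smooth, decaying function of distance to the target as it moves along a smooth path; the learner sees only the time-ordered scalar outputs.

```python
# Toy simulation of the uncalibrated sensor-network setting (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
T, n_sensors = 400, 30

sensors = rng.uniform(0, 10, size=(n_sensors, 2))     # unknown sensor locations
t = np.linspace(0, 2 * np.pi, T)
path = 5 + np.column_stack([3 * np.cos(t), 2 * np.sin(2 * t)])  # unknown smooth path

# Unknown smooth measurement model: each sensor's output decays with distance,
# with per-sensor gain and range (also unknown to the learner).
gains = rng.uniform(0.5, 2.0, size=n_sensors)
scales = rng.uniform(1.0, 3.0, size=n_sensors)
dists = np.linalg.norm(path[:, None, :] - sensors[None, :, :], axis=2)  # (T, n)
Y = gains * np.exp(-dists / scales) + 0.01 * rng.normal(size=(T, n_sensors))

# The learner sees only Y and must recover `path` up to an affine ambiguity.
```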
29. [Figure: target position]
30. [Figure: true trajectory vs. recovered trajectory, which matches up to scale and a 90° rotation]
31. Unknown Measurement Process
[Figure: measurements feed into g; unknown sensor locations, unknown smooth measurement model, unknown smooth path]
32. Conclusion
- Lessons for learning trackers:
- Utilize time
- Avoid local minima
- Use plausible models
- Requests:
- Quantify errors
- Online updates
34. Request: Online Updates
[Figure: RF signal strengths (voltage vs. time) before and after a chair moves]
35. Request: Quantify Errors
- Needed for:
- Algorithm development
- Parameter tuning
36. Visualization vs. Tracking
(Figure from the Isomap page.)
37. Request: Quantify Errors
- Don't penalize errors in scale, translation, rotation.
- Do penalize incorrect choice of subspace, folding, changes in topology, nonlinear stretching, etc.
- We use affine registration error (sketched below).
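A minimal sketch of an affine registration error in this spirit (my implementation; the talk does not specify the exact form): fit the best affine map from the recovered trajectory to ground truth by least squares, then report the residual. Scale, translation, and rotation are absorbed by the affine fit; folding, topology changes, and nonlinear stretching are not.

```python
# Sketch of an affine registration error (assumed form, not the talk's exact metric).
import numpy as np

def affine_registration_error(X_rec, X_true):
    """RMS error after the best affine map from recovered to true trajectory.

    X_rec, X_true: (T, d) arrays of time-aligned positions.
    """
    T = X_rec.shape[0]
    A = np.hstack([X_rec, np.ones((T, 1))])         # homogeneous coordinates
    M, *_ = np.linalg.lstsq(A, X_true, rcond=None)  # least-squares affine fit
    residual = A @ M - X_true                       # what no affine map can explain
    return np.sqrt(np.mean(np.sum(residual ** 2, axis=1)))
```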
38. LEARNING TO TRACK in UNCALIBRATED SENSOR NETWORKS: mapping scalar sensor outputs to location
[Figure: a test trajectory through the sensor net and the measurements from all nodes; trajectories recovered by applying g, and by applying the g from KPCA]
39. LEARNING TO TRACK in VIDEOS: mapping appearance to pose
[Figure: an observed image sequence; true rotation compared with the rotation recovered unsupervised with dynamics]