Title: Alignment

1
Alignment
  • Introduction

Notes courtesy of Funk et al., SIGGRAPH 2004
2
  • Outline
  • Challenge
  • General Approaches
  • Specific Examples

3
Alignment
  • Challenge
  • The shape of a model does not change when acted
    on by similarity transformations


4
Alignment
  • Challenge
  • However, the shape descriptors can change if the model is
  • Translated
  • Scaled
  • Rotated
  • How do we match shape descriptors across
    transformations that do not change the shape of a
    model?

5
  • Outline
  • Challenge
  • General Approaches
  • Specific Examples

6
Alignment
  • Approaches
  • Given the shape descriptors of two models, find
    the transformation(s) -- translation, scale, and
    rotation -- that minimize the distance between
    the two models
  • Exhaustive search
  • Closed form solution
  • Minimization
  • Normalization
  • Invariance

7
Alignment
  • Aside
  • Because translations and rotations preserve
    distances, applying such a transformation to one
    model is equivalent to applying the inverse
    transformation to the other one


8
Alignment
  • Aside
  • For translations and rotations we can therefore simplify the alignment
    equation: rather than transforming both models, it suffices to minimize
    the descriptor distance over a single transformation applied to one of them

9
Exhaustive Search
  • Approach
  • Compare the descriptors at all possible
    transformations.
  • Find the transformation at which the distance is
    minimal.
  • Define the model similarity as the value at the
    minimum.

Exhaustive search for the optimal rotation (figure; a code sketch follows below)
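A minimal sketch of the exhaustive-search idea in Python/NumPy, under two assumptions that are not in the slides: the models are point sets scaled into the unit ball, and the descriptor is a toy histogram of z-coordinates. Any rotation-dependent descriptor and any rotation sampling could be substituted.

import numpy as np
from itertools import product

def euler_rotation(a, b, c):
    # rotation matrix from Z-Y-Z Euler angles
    Rz = lambda t: np.array([[np.cos(t), -np.sin(t), 0],
                             [np.sin(t),  np.cos(t), 0],
                             [0, 0, 1]])
    Ry = lambda t: np.array([[ np.cos(t), 0, np.sin(t)],
                             [0, 1, 0],
                             [-np.sin(t), 0, np.cos(t)]])
    return Rz(a) @ Ry(b) @ Rz(c)

def descriptor(points, bins=16):
    # toy rotation-dependent descriptor: histogram of the z-coordinates
    # (assumes the model has been scaled to lie in the unit ball)
    hist, _ = np.histogram(points[:, 2], bins=bins, range=(-1, 1), density=True)
    return hist

def exhaustive_rotation_search(points_a, points_b, steps=12):
    # compare the descriptors at all sampled rotations of model B and
    # return the minimal distance (the similarity) and the best rotation
    target = descriptor(points_a)
    angles = np.linspace(0, 2 * np.pi, steps, endpoint=False)
    best_dist, best_R = np.inf, np.eye(3)
    for a, b, c in product(angles, angles[: steps // 2], angles):
        R = euler_rotation(a, b, c)
        dist = np.linalg.norm(target - descriptor(points_b @ R.T))
        if dist < best_dist:
            best_dist, best_R = dist, R
    return best_dist, best_R

The triple loop over sampled Euler angles is what makes the run-time cost high, as the properties on the next slide note.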
12
Exhaustive Search
  • Properties
  • Always gives the correct answer
  • Needs to be performed at run-time and can be very
    slow to compute
  • Computes the measure of similarity for every
    transform. We only need the value at the best one.

13
Closed Form Solution
  • Approach
  • Explicitly find the transformation(s) that minimize the alignment equation
  • Properties
  • Always gives the correct answer
  • Only computes the measure of similarity for the best transformation.
  • A closed form solution does not always exist.
  • Often needs to be computed at run-time.

14
Minimization
  • Approach
  • Coarsely align the models using low frequency
    information.
  • Progressively refine the alignment by comparing
    higher frequency components and adjusting the
    alignment.
  • Converge to the (locally) optimal alignment.
  • Example: Light Field Descriptors

Spherical Extent Function (figure)
(figures: initial models, low-frequency alignment, high-frequency refinement, aligned models; a 1D coarse-to-fine sketch follows below)
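A 1D analogue of the coarse-to-fine idea, offered as a hedged sketch rather than the course's method: two periodic signals (think of circular descriptors) are aligned by searching all shifts of their low-frequency parts, then refining the shift over a shrinking window as higher frequencies are added. The band schedule and window shrinkage are illustrative choices.

import numpy as np

def bandlimit(f, k):
    # keep only the k lowest frequencies of a sampled periodic signal
    F = np.fft.rfft(f)
    F[k + 1:] = 0
    return np.fft.irfft(F, n=len(f))

def coarse_to_fine_align(f, g, bands=(2, 8, 32)):
    # coarsely align g to f using low frequencies, then refine the shift
    # by searching a shrinking window with higher-frequency content
    n = len(f)
    best, window = 0, n
    for k in bands:
        fk, gk = bandlimit(f, k), bandlimit(g, k)
        candidates = range(best - window // 2, best + window // 2)
        best = min(candidates,
                   key=lambda s: np.sum((fk - np.roll(gk, s)) ** 2)) % n
        window = max(4, window // 4)
    return best

# usage: recover a known circular shift
n = 256
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.cos(theta) + 0.3 * np.cos(5 * theta) + 0.1 * np.cos(17 * theta)
g = np.roll(f, -40)
print(coarse_to_fine_align(f, g))   # prints 40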
17
Minimization
  • Properties
  • Can be applied to any type of transformation
  • Needs to be computed at run-time.
  • Difficult to do robustly
  • Given the low frequency alignment and the
    computed high-frequency alignment, how do you
    combine the two?
  • Considerations can include
  • Relative size of high and low frequency info
  • Distribution of info across the low frequencies
  • Speed of oscillation

18
Normalization
  • Approach
  • Place every model into a canonical coordinate
    frame and assume that two models are optimally
    aligned when each is in its own canonical frame.
  • Examples: center of mass (COM), max radius, PCA (the translation and scale steps are sketched below)
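A short sketch of the translation and scale normalization, assuming the model is represented by uniformly sampled surface points (for a mesh, the center of mass should be area-weighted). The PCA rotation step is sketched later, after slide 25.

import numpy as np

def normalize_translation_and_scale(points):
    # canonical frame for translation and scale:
    # move the center of mass to the origin, then scale the max radius to 1
    centered = points - points.mean(axis=0)
    return centered / np.linalg.norm(centered, axis=1).max()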

19
Normalization
  • Properties
  • Can be computed in a pre-processing stage.
  • For some transformations this is guaranteed to
    give the optimal alignment.
  • For other transformations the approach is only a
    heuristic and may fail.

Failure of PCA-normalization in aligning rotations (figure)
20
Invariance
  • Approach
  • Represent every model by a descriptor that is unchanged when the model
    is transformed, by discarding the information that is
    transformation-dependent.

Transformation-invariant descriptor (figure; a D2 sketch follows below)
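As an illustration (a sketch, using the D2 shape distribution listed in the table on the next slide): the descriptor keeps only the histogram of pairwise distances and discards the positions themselves, so it is unchanged by translations and rotations, though not by scaling.

import numpy as np

def d2_descriptor(points, samples=50000, bins=64, seed=0):
    # D2 shape distribution: histogram of distances between random point
    # pairs; absolute positions are discarded, so the descriptor is
    # invariant to translation and rotation (but not to scale)
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(points), samples)
    j = rng.integers(0, len(points), samples)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    # fixed histogram range assumes the model was scaled to max radius 1
    hist, _ = np.histogram(d, bins=bins, range=(0.0, 2.0), density=True)
    return hist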
21
Invariance
  • Review
  • Is there a general method for addressing these
    basic types of transformations?

Descriptor Translation Scale Rotation
Shape Distributions (D2) -
Extended Gaussian Images - -
Shape Histograms (Shells) - -
Shape Histograms (Sectors) - -
Spherical Parameterizations - -
22
Invariance
  • Properties
  • Can be computed in a pre-processing stage.
  • Works for translation, scale and rotation.
  • Gives a more compact representation.
  • Tends to discard valuable, discriminating information.

(figure: descriptors with no invariance vs. rotation and translation invariance)
23
  • Outline
  • Challenge
  • General Approaches
  • Specific Examples
  • Normalization: PCA
  • Closed Form Solution: Ordered Point Sets

24
PCA Alignment
  • Treat a surface as a collection of points $\{p_i\}$ and define the
    variance function $\mathrm{Var}(v)=\sum_i\langle p_i-c,\,v\rangle^2$,
    where $c$ is the center of mass and $v$ is a unit direction

25
PCA Alignment
  • Define the covariance matrix $M=\sum_i (p_i-c)(p_i-c)^t$
  • Find the eigen-values and eigen-vectors of $M$, and align the model so
    that the eigen-vectors (ordered by decreasing eigen-value) map to the
    x-, y-, and z-axes (sketched in the code below)
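A sketch of PCA alignment in NumPy, assuming the surface is given as a uniformly sampled point set (so sums stand in for surface integrals):

import numpy as np

def pca_align(points):
    # covariance matrix of the centered points
    centered = points - points.mean(axis=0)
    M = centered.T @ centered / len(points)
    # eigen-vectors, ordered by decreasing eigen-value, map to x-, y-, z-axes
    eigvals, eigvecs = np.linalg.eigh(M)
    axes = eigvecs[:, np.argsort(eigvals)[::-1]]
    if np.linalg.det(axes) < 0:          # optional: keep a proper rotation
        axes[:, 2] *= -1
    return centered @ axes               # model coordinates in its PCA frame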

26
PCA Alignment
  • Limitations
  • Eigen-vectors are only defined up to sign! PCA alignment is only
    well-defined up to axial flips about the x-, y-, and z-axes
    (the flips are enumerated in the sketch below).
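Because of this sign ambiguity, one workaround (a sketch, not necessarily what the course does) is to enumerate the axial flips of the canonical frame and compare descriptors across all of them:

import numpy as np
from itertools import product

def axial_flips(proper_only=True):
    # the 8 sign choices for the PCA axes; 4 of them are proper rotations
    flips = [np.diag(s) for s in product((1, -1), repeat=3)]
    return [F for F in flips if not proper_only or np.linalg.det(F) > 0]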

27
PCA Alignment
  • Limitations
  • Assumes that the eigen-values are distinct and
    therefore the eigen-vectors are well-defined (up
    to sign).
  • This is not always true and can make PCA
    alignment unstable.

28
  • Outline
  • Challenge
  • General Approaches
  • Specific Examples
  • Normalization: PCA
  • Closed Form Solution: Ordered Point Sets

29
Ordered Point Sets
  • Challenge
  • Given ordered point sets $P=\{p_1,\dots,p_n\}$ and $Q=\{q_1,\dots,q_n\}$,
    find the rotation/reflection $R$ minimizing the sum of squared
    differences $\sum_{i=1}^{n}\|p_i-R(q_i)\|^2$

(figure: the rotation R maps the points q_i onto the corresponding points p_i)
30
Review
  • Vector dot-product
  • If $v=(v_1,\dots,v_n)$ and $w=(w_1,\dots,w_n)$ are two n-dimensional
    vectors, the dot-product of $v$ with $w$ is the sum of the products of
    the coefficients: $\langle v,w\rangle=\sum_{i=1}^{n}v_iw_i$

31
Review
  • Trace
  • The trace of an nxn matrix $M$ is the sum of the diagonal entries of
    $M$: $\mathrm{Trace}(M)=\sum_{i=1}^{n}M_{ii}$
  • Properties: the trace is linear, and it is invariant under cyclic
    permutation of a product, $\mathrm{Trace}(MN)=\mathrm{Trace}(NM)$

32
Review
  • Trace
  • If $M$ is any nxn matrix and $D$ is a diagonal nxn matrix, then the
    trace of $MD$ is the sum of the products of the diagonal entries:
    $\mathrm{Trace}(MD)=\sum_{i=1}^{n}M_{ii}D_{ii}$

33
Review
  • Matrix multiplication
  • If $M$ and $N$ are two matrices of compatible dimensions, then the
    (i,j)-th entry of the matrix $MN$ is the dot-product of the i-th row
    vector of $M$ with the j-th column vector of $N$.

34
Review
  • Matrix dot-product
  • If $M$ and $N$ are two mxn matrices, then the i-th diagonal entry of
    the matrix $M^tN$ is the dot-product of the i-th column vector of $M$
    with the i-th column vector of $N$: $(M^tN)_{ii}=\langle M_i,N_i\rangle$

35
Review
  • Matrix dot-product
  • We define the dot-product of two mxn matrices, $M$ and $N$, to be the
    trace of the matrix product:
    $\langle M,N\rangle=\mathrm{Trace}(M^tN)=\sum_{i=1}^{n}\langle M_i,N_i\rangle$
  • (the sum of the dot-products of the column vectors; checked numerically below).
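A quick numeric check of this identity with NumPy (illustrative only):

import numpy as np

rng = np.random.default_rng(0)
M, N = rng.standard_normal((5, 3)), rng.standard_normal((5, 3))
# <M, N> = Trace(M^t N) = sum of dot-products of corresponding columns
assert np.isclose(np.trace(M.T @ N),
                  sum(M[:, i] @ N[:, i] for i in range(3)))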

36
Review
  • SVD Factorization
  • If $M$ is an mxm matrix, then $M$ can be factored as the product
    $M=UDV^t$
  • where $D$ is a diagonal mxm matrix with non-negative entries and $U$
    and $V$ are orthonormal (i.e. rotation/reflection) mxm matrices.

37
Trace Maximization
  • Claim
  • If $M$ is an mxm matrix whose SVD factorization is $M=UDV^t$,
  • then $R=VU^t$ is the orthonormal transformation maximizing the trace
    $\mathrm{Trace}(RM)$

38
Trace Maximization
  • Proof
  • We can rewrite the trace equation as
    $\mathrm{Trace}(RM)=\mathrm{Trace}(RUDV^t)=\mathrm{Trace}(V^tRUD)$
  • If we set $R_0$ to be the rotation $R_0=V^tRU$, we get
    $\mathrm{Trace}(RM)=\mathrm{Trace}(R_0D)$

39
Trace Maximization
  • Proof
  • Since $R_0$ is a rotation, each of its entries can have absolute value
    no larger than one.
  • Since $D$ is diagonal, the value of the trace is the sum of the
    products of the diagonal elements of $D$ and $R_0$:
    $\mathrm{Trace}(R_0D)=\sum_i (R_0)_{ii}D_{ii}$

40
Trace Maximization
  • Proof
  • To maximize the trace we want $R_0$ to be maximal on the diagonal
    (i.e. have only 1s there).
  • Thus, $R_0$ is the identity matrix and we have $V^tRU=\mathrm{Id}$
  • So the rotation/reflection $R$ that maximizes $\mathrm{Trace}(RM)$ is
    $R=VU^t$ (checked numerically below).
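A numeric sanity check of the claim with NumPy: R = VU^t attains Trace(D), and random orthonormal matrices (from QR factorizations) never exceed it.

import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
U, D, Vt = np.linalg.svd(M)               # M = U diag(D) V^t
R = Vt.T @ U.T                            # R = V U^t
assert np.isclose(np.trace(R @ M), D.sum())   # attains Trace(D)
for _ in range(1000):                     # random orthonormal matrices
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    assert np.trace(Q @ M) <= np.trace(R @ M) + 1e-9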

41
Ordered Point Sets
  • Challenge
  • Given ordered point sets $P=\{p_1,\dots,p_n\}$ and $Q=\{q_1,\dots,q_n\}$,
    find the rotation/reflection $R$ minimizing the sum of squared
    differences $\sum_{i=1}^{n}\|p_i-R(q_i)\|^2$

42
Ordered Point Sets
  • Solution
  • Use the fact that we can expand the squared difference,
    $\|p_i-R(q_i)\|^2=\|p_i\|^2-2\langle p_i,R(q_i)\rangle+\|R(q_i)\|^2$,
  • to rewrite the equation as
    $\sum_i\|p_i-R(q_i)\|^2=\sum_i\bigl(\|p_i\|^2-2\langle p_i,R(q_i)\rangle+\|R(q_i)\|^2\bigr)$

43
Ordered Point Sets
  • Solution
  • Use the fact that rotations preserve lengths, $\|R(q_i)\|=\|q_i\|$,
  • to rewrite the equation as
    $\sum_i\|p_i-R(q_i)\|^2=\sum_i\bigl(\|p_i\|^2+\|q_i\|^2\bigr)-2\sum_i\langle p_i,R(q_i)\rangle$

44
Ordered Point Sets
  • Solution
  • Since the value $\sum_i\bigl(\|p_i\|^2+\|q_i\|^2\bigr)$ does not depend
    on the choice of rotation $R$,
  • minimizing the sum of squared distances is equivalent to maximizing the
    sum of dot-products $\sum_i\langle p_i,R(q_i)\rangle$

45
Ordered Point Sets
  • Solution
  • If we let $M_P$ (respectively $M_Q$) be the 3xn matrix whose columns
    are the points $p_i$ (respectively $q_i$), then we can rewrite the sum
    of the vector dot-products as the matrix dot-product
    $\sum_i\langle p_i,R(q_i)\rangle=\langle M_P,RM_Q\rangle=\mathrm{Trace}\bigl(M_P^tRM_Q\bigr)=\mathrm{Trace}\bigl(RM_QM_P^t\bigr)$
  • and we can find the maximizing rotation/reflection $R$ by trace
    maximization applied to $M=M_QM_P^t$ (a code sketch follows below).
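Putting the derivation together, a sketch of the closed-form solution in NumPy; P and Q are the 3xn matrices M_P and M_Q of corresponding points, as on this slide.

import numpy as np

def align_ordered_point_sets(P, Q):
    # find the rotation/reflection R minimizing sum_i ||p_i - R q_i||^2:
    # maximize Trace(R M) with M = M_Q M_P^t, so R = V U^t from M's SVD
    M = Q @ P.T
    U, _, Vt = np.linalg.svd(M)
    return Vt.T @ U.T

# usage: recover a known rotation
rng = np.random.default_rng(2)
Q = rng.standard_normal((3, 10))
c, s = np.cos(0.7), np.sin(0.7)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
P = R_true @ Q
assert np.allclose(align_ordered_point_sets(P, Q), R_true)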