Title: SVD (Singular Value Decomposition) and Its Applications
1. SVD (Singular Value Decomposition) and Its Applications

2. The plan today
- Singular Value Decomposition
  - Basic intuition
  - Formal definition
- Applications

3. LCD Defect Detection
- Defect types of TFT-LCD: Point, Line, Scratch, Region

4. Shading
- Scanners give us raw point cloud data
- How to compute normals to shade the surface?
[Figure: scanned point cloud with a surface normal at one point]

5. Hole filling

6. Eigenfaces
- The same principal components analysis can be applied to images

7. Video Copy Detection
- Same image content, different source
- Same video source, different image content
[Figure: the same face image from AVI and MPEG sources, compared by hue histograms]

8. 3D animations
- Connectivity is usually constant (at least on large segments of the animation)
- The geometry changes in each frame → a vast amount of data, huge file size!
  (13 seconds, 3000 vertices/frame, 26 MB)

9. Geometric analysis of linear transformations
- We want to know what a linear transformation A does
- We need some simple and comprehensible representation of the matrix of A.
- Let's look at what A does to some vectors
- Since A(αv) = αA(v), it is enough to look at vectors v of unit length

10. The geometry of linear transformations
- A linear (non-singular) transform A always takes hyper-spheres to hyper-ellipses.

11. The geometry of linear transformations
- Thus, one good way to understand what A does is to find which vectors are mapped to the main axes of the ellipsoid.

12. Geometric analysis of linear transformations
- If we are lucky, A = VΛVᵀ with V orthogonal (true if A is symmetric)
- The eigenvectors of A are the axes of the ellipse

13. Symmetric matrix: eigendecomposition
- In this case A is just a scaling matrix. The eigendecomposition of A tells us which orthogonal axes it scales, and by how much.
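
To make this concrete, here is a minimal NumPy sketch (the matrix A is a made-up example): for a symmetric matrix, numpy.linalg.eigh returns orthonormal eigenvectors, which are exactly the axes along which A scales.

```python
import numpy as np

# A small symmetric matrix (made-up example).
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# eigh is for symmetric matrices: A = V @ diag(lam) @ V.T
lam, V = np.linalg.eigh(A)
print(lam)                               # [2. 4.] -- the scaling factors
print(np.allclose(V.T @ V, np.eye(2)))   # True: the axes are orthonormal

# A scales by 4 along V[:, 1]:
print(np.allclose(A @ V[:, 1], lam[1] * V[:, 1]))   # True
```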

14. General linear transformations: SVD
- In general A will also contain rotations, not just scales
[Figure: the unit circle is mapped by A to a rotated ellipse with semi-axes σ1 and σ2]

15. General linear transformations: SVD
[Figure: an orthonormal pair v1, v2 is mapped by A to an orthonormal pair u1, u2, scaled by σ1 and σ2]

16. SVD: more formally
- SVD exists for any matrix
- Formal definition:
  For square matrices A ∈ R^{n×n}, there exist orthogonal matrices U, V ∈ R^{n×n} and a diagonal matrix Σ, such that all the diagonal values σi of Σ are non-negative and A = UΣVᵀ

17. SVD: more formally
- The diagonal values of Σ (σ1, ..., σn) are called the singular values. It is customary to sort them: σ1 ≥ σ2 ≥ ... ≥ σn
- The columns of U (u1, ..., un) are called the left singular vectors. They are the axes of the ellipsoid.
- The columns of V (v1, ..., vn) are called the right singular vectors. They are the preimages of the axes of the ellipsoid.
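
A quick NumPy check of the definition (the matrix is an arbitrary made-up example); numpy.linalg.svd returns the singular values already sorted in descending order:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # any square matrix works

U, s, Vt = np.linalg.svd(A)
print(s)                          # sigma_1 >= sigma_2 >= 0
print(np.allclose(U @ np.diag(s) @ Vt, A))   # True: A = U Sigma V^T

# Columns of U: axes of the ellipse that A maps the unit circle to.
# Columns of V (rows of Vt): their preimages on the unit circle.
```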

18. SVD is the workhorse of linear algebra
- There are numerical algorithms to compute SVD. Once you have it, you have many things:
  - Matrix inverse → can solve square linear systems
  - Numerical rank of a matrix
  - Can solve least-squares systems
  - PCA
  - Many more

19. Matrix inverse and solving linear systems
- Matrix inverse: A⁻¹ = (UΣVᵀ)⁻¹ = VΣ⁻¹Uᵀ
- So, to solve Ax = b: x = VΣ⁻¹Uᵀb
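
A sketch of solving a square system this way in NumPy (random A and b, assumed non-singular):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # assumed non-singular
b = rng.standard_normal(4)

U, s, Vt = np.linalg.svd(A)
x = Vt.T @ ((U.T @ b) / s)        # x = V Sigma^{-1} U^T b
print(np.allclose(A @ x, b))      # True
```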

20. Matrix rank
- The rank of A is the number of non-zero singular values
[Figure: the m×n matrix Σ with σ1, σ2, ..., σn on its diagonal]

21. Numerical rank
- If there are very small singular values, then A is close to being singular. We can set a threshold t, so that numeric_rank(A) = #{i : σi > t}
- If rank(A) < n then A is singular. It maps the entire space R^n onto some subspace, like a plane (so A is some sort of projection).
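
The thresholding rule translates directly to code; a minimal sketch with a made-up rank-1 matrix:

```python
import numpy as np

def numeric_rank(A, t=1e-10):
    """Count the singular values above the threshold t."""
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > t))

# An outer product => exactly one non-zero singular value.
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])
print(numeric_rank(A))            # 1
print(np.linalg.matrix_rank(A))   # 1 -- NumPy's built-in uses the same idea
```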

22. Solving least-squares systems
- We tried to solve Ax = b when A was rectangular
- Seeking solutions in the least-squares sense
[Figure: the overdetermined system Ax = b, with a tall matrix A]

23. Solving least-squares systems
- We proved that when A is full-rank, the solution satisfies AᵀAx = Aᵀb (the normal equations).
- So: AᵀA = (UΣVᵀ)ᵀ(UΣVᵀ) = VΣᵀΣVᵀ = V diag(σ1², ..., σn²) Vᵀ

24. Solving least-squares systems
- Substituting in the normal equations:
  x = (AᵀA)⁻¹Aᵀb = V diag(1/σ1², ..., 1/σn²) Vᵀ V diag(σ1, ..., σn) Uᵀb = V diag(1/σ1, ..., 1/σn) Uᵀb

25. Pseudoinverse
- The matrix we found is called the pseudoinverse of A: A⁺ = V diag(1/σ1, ..., 1/σn) Uᵀ
- Definition using the reduced SVD: if some of the σi are zero, put zero instead of 1/σi.

26. Pseudoinverse
- The pseudoinverse A⁺ exists for any matrix A.
- Its properties:
  - If A is m×n then A⁺ is n×m
  - It acts a little like a real inverse: AA⁺A = A and A⁺AA⁺ = A⁺

27. Solving least-squares systems
- When A is not full-rank, AᵀA is singular
- There are multiple solutions to the normal equations
- Thus, there are multiple solutions to the least-squares problem.
- The SVD approach still works! In this case it finds the minimal-norm solution
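
A NumPy sketch of the whole pipeline (the system is a made-up overdetermined example): build A⁺ from the reduced SVD, putting zero in place of the reciprocals of (near-)zero σi, and compare against the library routines:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])        # rectangular: 3 equations, 2 unknowns
b = np.array([1.0, 1.0, 0.0])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
s_inv = np.where(s > 1e-10, 1.0 / s, 0.0)    # zero instead of 1/sigma_i
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

x = A_pinv @ b                    # the least-squares solution
print(np.allclose(x, np.linalg.pinv(A) @ b))                  # True
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # True
# For a rank-deficient A, the same formula returns the minimal-norm solution.
```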

28. PCA: the general idea
- PCA finds an orthogonal basis that best represents a given data set.
- The sum of squared distances from the x axis is minimized.
[Figure: the same 2D point set in the original axes and in the rotated, better-fitting axes]

29. PCA: the general idea
[Figure: principal axes v1, v2 of a 2D point set, and the points projected onto v1]
- The projected data set approximates the original data set
- This line segment approximates the original data set

30. PCA: the general idea
- PCA finds an orthogonal basis that best represents a given data set.
- PCA finds the best approximating plane (again, in terms of the sum of squared distances)
[Figure: 3D point set in the standard basis x, y, z]

31. For approximation
- In general dimension d, the eigenvalues are sorted in descending order: λ1 ≥ λ2 ≥ ... ≥ λd
- The eigenvectors are sorted accordingly.
- To get an approximation of dimension d' < d, we take the d' first eigenvectors and look at the subspace they span (d' = 1 is a line, d' = 2 is a plane)

32. For approximation
- To get an approximating set, we project the original data points onto the chosen subspace:
  xi = m + α1v1 + α2v2 + ... + αd'vd' + ... + αdvd
- Projection:
  xi' = m + α1v1 + α2v2 + ... + αd'vd' + 0·vd'+1 + ... + 0·vd

33. Principal components
- Eigenvectors that correspond to big eigenvalues are the directions in which the data has strong components (= large variance).
- If the eigenvalues are more or less the same, there is no preferable direction.
- Note: the eigenvalues are always non-negative. Think why!

34. Principal components
- There is no preferable direction:
  - S looks like this: S = λI
  - Any vector is an eigenvector
- There is a clear preferable direction:
  - S is diagonal in the rotated basis, and λ2 is close to zero, much smaller than λ1
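
A minimal PCA-via-SVD sketch on made-up 2D data with one strong direction; the right singular vectors of the centered data are the principal axes, and σi²/n are the eigenvalues of the scatter matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.standard_normal(200)
# Elongated 2D cloud: a strong direction plus a little noise.
X = np.column_stack([t, 0.2 * t + 0.05 * rng.standard_normal(200)])

m = X.mean(axis=0)                # centroid
U, s, Vt = np.linalg.svd(X - m, full_matrices=False)

print(Vt[0])                      # v1: the dominant direction
print(s**2 / len(X))              # lambda_1 >> lambda_2: a clear direction
```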

35. Application: finding a tight bounding box
- An axis-aligned bounding box agrees with the axes
[Figure: axis-aligned box given by minX, maxX, minY, maxY]

36. Application: finding a tight bounding box
- Oriented bounding box: we find better axes!
[Figure: the same point set with a box aligned to its principal axes]

37. Application: finding a tight bounding box
- Oriented bounding box: we find better axes!
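
A sketch of the idea in NumPy (made-up 2D points): use the PCA axes as the box axes and take the min/max extents in that basis:

```python
import numpy as np

def oriented_bbox(points):
    """Center, PCA axes (rows), and min/max extents along those axes."""
    m = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - m, full_matrices=False)
    local = (points - m) @ Vt.T   # coordinates in the PCA basis
    return m, Vt, local.min(axis=0), local.max(axis=0)

rng = np.random.default_rng(2)
pts = rng.standard_normal((500, 2)) @ np.array([[3.0, 1.0],
                                                [0.0, 0.5]])
center, axes, lo, hi = oriented_bbox(pts)
print(hi - lo)                    # side lengths of the oriented box
```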

38. Scanned meshes

39. Point clouds
- Scanners give us raw point cloud data
- How to compute normals to shade the surface?
[Figure: scanned point cloud with a surface normal at one point]

40. Point clouds
- Local PCA: take the third eigenvector (the one with the smallest eigenvalue) as the normal
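
A sketch in NumPy (a made-up, nearly planar patch stands in for a point's neighborhood): the third eigenvector of the local scatter is the estimated normal:

```python
import numpy as np

def estimate_normal(neighbors):
    """Normal = direction of least variance of the local neighborhood."""
    m = neighbors.mean(axis=0)
    _, _, Vt = np.linalg.svd(neighbors - m, full_matrices=False)
    return Vt[-1]                 # the "third vector" of local PCA

# Points roughly on the plane z = 0: the normal should be ~(0, 0, +-1).
rng = np.random.default_rng(3)
patch = np.column_stack([rng.standard_normal(50),
                         rng.standard_normal(50),
                         0.01 * rng.standard_normal(50)])
print(estimate_normal(patch))
```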

41. Eigenfaces
- The same principal components analysis can be applied to images

42. Eigenfaces
- Each image is a vector in R^{250·300}
- Want to find the principal axes: vectors that best represent the input database of images

43. Reconstruction with a few vectors
- Represent each image by the first few (n) principal components

44. Face recognition
- Given a new image of a face, w ∈ R^{250·300}
- Represent w using the first n PCA vectors
- Now find an image in the database whose representation in the PCA basis is the closest: the angle between w and w' is the smallest
[Figure: w and its nearest database image w' compared in the PCA basis]
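
A toy sketch of the recognition step (random vectors stand in for the 250×300 face database; n = 20 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(4)
faces = rng.standard_normal((100, 250 * 300)).astype(np.float32)

mean_face = faces.mean(axis=0)
_, _, Vt = np.linalg.svd(faces - mean_face, full_matrices=False)
n = 20
eigenfaces = Vt[:n]                              # the first n principal axes

db_coords = (faces - mean_face) @ eigenfaces.T   # database in the PCA basis

# A noisy copy of face #7 should match entry 7.
w = faces[7] + 0.1 * rng.standard_normal(faces.shape[1]).astype(np.float32)
w_coords = eigenfaces @ (w - mean_face)
match = int(np.argmin(np.linalg.norm(db_coords - w_coords, axis=1)))
print(match)                                     # 7
```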

45. SVD for animation compression
[Video: chicken animation]

46. 3D animations
- Each frame is a 3D model (mesh)
- Connectivity: mesh faces

47. 3D animations
- Each frame is a 3D model (mesh)
- Connectivity: mesh faces
- Geometry: 3D coordinates of the vertices

48. 3D animations
- Connectivity is usually constant (at least on large segments of the animation)
- The geometry changes in each frame → a vast amount of data, huge file size!
  (13 seconds, 3000 vertices/frame, 26 MB)

49. Animation compression by dimensionality reduction
- The geometry of each frame is a vector in R^{3N} (N vertices); the whole animation is a 3N × f matrix, one column per frame

50. Animation compression by dimensionality reduction
- Find a few vectors of R^{3N} that will best represent our frame vectors!
  A = U Σ Vᵀ, where U is 3N×f, Σ is f×f, and Vᵀ is f×f

51. Animation compression by dimensionality reduction
- The first principal components are the important ones
[Figure: the first principal components u1, u2, u3]

52. Animation compression by dimensionality reduction
- Approximate each frame by a linear combination of the first principal components
- The more components we use, the better the approximation
- Usually, the number of components needed is much smaller than f.
[Figure: a frame approximated as α1u1 + α2u2 + α3u3]

53. Animation compression by dimensionality reduction
- Compressed representation:
  - The chosen principal component vectors ui
  - The coefficients αi for each frame
[Video: animation with only 2 principal components vs. animation with 20 out of 400 principal components]
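
A sketch of the compression in NumPy (synthetic animation data with low intrinsic dimension; N, f, and k are made-up numbers): keep only the first k components of the 3N × f geometry matrix:

```python
import numpy as np

N, f = 3000, 400                  # vertices per frame, frames
rng = np.random.default_rng(5)
# Synthetic animation: low-dimensional motion plus small noise.
A = rng.standard_normal((3 * N, 5)) @ rng.standard_normal((5, f))
A += 0.01 * rng.standard_normal((3 * N, f))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 20                            # keep 20 of 400 components
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

print(np.linalg.norm(A - A_k) / np.linalg.norm(A))   # tiny relative error
# Store U[:, :k] (3N x k) plus k coefficients per frame instead of 3N x f.
```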

54. LCD Defect Detection
- Defect types of TFT-LCD: Point, Line, Scratch, Region

55. Pattern Elimination Using SVD

56. Pattern Elimination Using SVD

57. Pattern Elimination Using SVD
- Singular value determination for the image reconstruction

58. Pattern Elimination Using SVD
- Defect detection using SVD
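
The slides give no details here, so the following is only a hedged sketch of the usual approach (synthetic stripe image, made-up defect): the regular background pattern concentrates in the largest singular values, so reconstructing the image without them leaves the defect visible in the residual.

```python
import numpy as np

def eliminate_pattern(image, k=1):
    """Zero the k largest singular values: the regular background
    pattern lives there, so the residual highlights defects."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    s[:k] = 0.0
    return U @ np.diag(s) @ Vt

# Synthetic LCD image: periodic stripes plus one small defect.
x = np.linspace(0.0, 20.0 * np.pi, 256)
img = np.tile(np.sin(x), (256, 1))       # rank-1 stripe pattern
img[100:104, 100:104] += 2.0             # the defect

residual = eliminate_pattern(img)
r, c = np.unravel_index(np.abs(residual).argmax(), residual.shape)
print(r, c)                              # inside the defect block
```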