Face Recognition using PCA (Eigenfaces) and LDA (Fisherfaces)

Transcript and Presenter's Notes
1
Face Recognition using PCA (Eigenfaces) and LDA
(Fisherfaces)
  • Pradeep Buddharaju

U of H
COSC 6397
2
Principal Component Analysis
  • An N × N pixel image of a face, represented
    as a vector, occupies a single point in
    N²-dimensional image space.
  • Because images of faces are similar in overall
    configuration, they are not randomly
    distributed in this huge image space.
  • Therefore, they can be described by a
    low-dimensional subspace.
  • Main idea of PCA (cutler96):
  • Find the vectors that best account for the variation
    of face images in the entire image space.
  • These vectors are called eigenvectors.
  • Construct a face space and project the images
    into this face space (eigenfaces).

3
Image Representation
  • A training set of images of size N × N is
    represented by vectors of size N²:
  • Γ₁, Γ₂, Γ₃, …, Γ_M
  • Example: see the sketch below
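A minimal sketch of this representation in NumPy (the file names are hypothetical; Pillow is assumed for image loading):

    import numpy as np
    from PIL import Image

    # Hypothetical N x N grayscale face images.
    files = ["face1.png", "face2.png", "face3.png"]

    # Flatten each image into a vector Gamma_i of length N^2.
    gammas = [np.asarray(Image.open(f).convert("L"), dtype=np.float64).ravel()
              for f in files]
    A = np.column_stack(gammas)   # shape (N^2, M): one column per training image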

4
Average Image and Difference Images
  • The average face of the training set is defined by
  • Ψ = (1/M) Σᵢ Γᵢ, i = 1, …, M
  • Each face differs from the average by the vector
  • Φᵢ = Γᵢ − Ψ  (see the sketch below)
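Continuing the NumPy sketch above, the average face and the difference images are one line each:

    psi = A.mean(axis=1)             # average face Psi, shape (N^2,)
    Phi = A - psi[:, np.newaxis]     # columns are the difference images Phi_i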

5
Covariance Matrix
  • A covariance matrix is constructed as
  • C = AAᵀ, where A = [Φ₁, …, Φ_M] is of size N² × M,
    so C is of size N² × N²
  • Finding the eigenvectors of the N² × N² matrix C is
    intractable. Hence, use the matrix AᵀA, of size
    M × M, and find the eigenvectors of this small matrix.


6
Eigenvalues and Eigenvectors - Definition
  • If v is a nonzero vector and λ is a number such
    that
  • Av = λv, then
  • v is said to be an eigenvector of A with
    eigenvalue λ.
  • Example: A = [[2, 0], [0, 3]] has eigenvalues
    λ₁ = 2 and λ₂ = 3, with eigenvectors
    v₁ = (1, 0)ᵀ and v₂ = (0, 1)ᵀ.
7
How to Calculate Eigenvectors?
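The standard recipe is to solve the characteristic equation det(A − λI) = 0 for the eigenvalues λ, then solve (A − λI)v = 0 for each eigenvector v. A minimal NumPy sketch, using the example matrix from the previous slide:

    import numpy as np

    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])
    eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the v_i
    print(eigvals)    # [2. 3.]
    print(eigvecs)    # identity matrix: v_1 = (1, 0)^T, v_2 = (0, 1)^T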
8
Eigenvectors of Covariance Matrix
  • Consider the eigenvectors vᵢ of AᵀA such that
  • AᵀAvᵢ = λᵢvᵢ
  • Premultiplying both sides by A, we have
  • AAᵀ(Avᵢ) = λᵢ(Avᵢ)
  • Hence Avᵢ is an eigenvector of C = AAᵀ with the
    same eigenvalue λᵢ.
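A sketch of this trick with the Phi matrix from the earlier sketch (eigh is used because AᵀA is symmetric; components with near-zero eigenvalues would be dropped in practice):

    L = Phi.T @ Phi                    # the small M x M matrix A^T A
    lams, V = np.linalg.eigh(L)        # eigenvalues ascending, columns of V are v_i
    U = Phi @ V                        # columns u_i = A v_i, eigenvectors of A A^T
    U /= np.linalg.norm(U, axis=0)     # normalize each column to unit length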

9
Face Space
  • The eigenvectors of the covariance matrix are
  • uᵢ = Avᵢ
  • The uᵢ resemble ghostly facial images, hence
    they are called eigenfaces.
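Each column of U can be reshaped back into an N × N image for display; these are the eigenfaces. A sketch, assuming the U computed above and Matplotlib:

    import matplotlib.pyplot as plt

    N = int(np.sqrt(U.shape[0]))              # recover the image side length
    for i in range(min(5, U.shape[1])):
        plt.subplot(1, 5, i + 1)
        plt.imshow(U[:, i].reshape(N, N), cmap="gray")
        plt.axis("off")
    plt.show()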

10
Projection into Face Space
  • A face image can be projected into this face
    space by
  • Ωₖ = Uᵀ(Γₖ − Ψ), k = 1, …, M

[Figure: projection of Image 1 into the face space]
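In the sketch, projecting all training faces at once is a single matrix product:

    Omega = U.T @ Phi      # column k is the projection Omega_k of face k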
11
Recognition
  • The test image, Γ, is projected into the face
    space to obtain a vector, Ω:
  • Ω = Uᵀ(Γ − Ψ)
  • The distance of Ω to each face class is defined
    by
  • εₖ² = ‖Ω − Ωₖ‖², k = 1, …, M
  • A distance threshold, θ_c, is half the largest
    distance between any two face images:
  • θ_c = ½ max over j,k of ‖Ωⱼ − Ωₖ‖, j, k = 1, …, M
    (see the sketch below)
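A sketch of these distances, where gamma_test is a hypothetical test-image vector of length N²:

    omega = U.T @ (gamma_test - psi)                              # projection of test image
    eps_k = np.linalg.norm(Omega - omega[:, np.newaxis], axis=0)  # distance to each class

    # Threshold: half the largest distance between any two training projections.
    diffs = Omega[:, :, np.newaxis] - Omega[:, np.newaxis, :]
    theta_c = 0.5 * np.linalg.norm(diffs, axis=0).max()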

12
Recognition
  • Find the distance, ε, between the original
    image, Γ, and its reconstruction from the
    eigenface space, Γ_f:
  • ε² = ‖Γ − Γ_f‖², where Γ_f = U·Ω + Ψ
  • Recognition process:
  • IF ε ≥ θ_c THEN the input image is not a face image.
  • IF ε < θ_c AND εₖ ≥ θ_c for all k THEN the input
    image contains an unknown face.
  • IF ε < θ_c AND εₖ* = minₖ εₖ < θ_c THEN the input
    image contains the face of individual k*.
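The decision rule translates directly, continuing the sketch above:

    gamma_f = U @ omega + psi                    # reconstruction Gamma_f
    eps = np.linalg.norm(gamma_test - gamma_f)   # distance from face space

    if eps >= theta_c:
        result = "not a face image"
    elif eps_k.min() >= theta_c:
        result = "unknown face"
    else:
        result = "individual %d" % eps_k.argmin()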

13
Limitations of Eigenfaces Approach
  • Variations in lighting conditions
    - Different lighting conditions for enrolment and query.
    - Bright light causing image saturation.
  • Differences in pose (head orientation)
    - 2D feature distances appear distorted.
  • Expression
    - Changes in feature location and shape.

14
Linear Discriminant Analysis
  • PCA does not use class information.
  • PCA projections are optimal for reconstruction
    from a low-dimensional basis, but they may not
    be optimal from a discrimination standpoint.
  • LDA is an enhancement to PCA
  • LDA constructs a discriminant subspace that
    minimizes the scatter between images of the same
    class and maximizes the scatter between images
    of different classes.

15
Mean Images
  • Let X₁, X₂, …, X_c be the face classes in the
    database, and let each face class Xᵢ, i = 1, 2, …, c
    have k facial images xⱼ, j = 1, 2, …, k.
  • We compute the mean image μᵢ of each class Xᵢ as
  • μᵢ = (1/k) Σⱼ xⱼ, j = 1, …, k
  • Now, the mean image μ of all the classes in the
    database can be calculated as
  • μ = (1/c) Σᵢ μᵢ, i = 1, …, c
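A sketch of the mean computations, assuming the database is a Python list classes in which classes[i] is an (N², k) array holding the k image vectors of class Xᵢ:

    mus = [X.mean(axis=1) for X in classes]   # class means mu_i
    mu = np.mean(mus, axis=0)                 # overall mean of the c class means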

16
Scatter Matrices
  • We calculate the within-class scatter matrix as
  • S_W = Σᵢ Σ over xⱼ ∈ Xᵢ of (xⱼ − μᵢ)(xⱼ − μᵢ)ᵀ
  • We calculate the between-class scatter matrix as
  • S_B = Σᵢ (μᵢ − μ)(μᵢ − μ)ᵀ  (see the sketch below)
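Continuing that sketch, the scatter matrices follow the formulas directly (in practice they are formed in a PCA-reduced space, as the next slide notes):

    d = mu.shape[0]
    S_W = np.zeros((d, d))
    S_B = np.zeros((d, d))
    for X, mu_i in zip(classes, mus):
        D = X - mu_i[:, np.newaxis]           # within-class deviations
        S_W += D @ D.T                        # accumulate within-class scatter
        diff = (mu_i - mu)[:, np.newaxis]
        S_B += diff @ diff.T                  # accumulate between-class scatter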

17
Projection
  • We find the product of S_W⁻¹ and S_B and then
    compute the eigenvectors of this product (S_W⁻¹·S_B).
  • Use the same technique as in the eigenfaces
    approach to reduce the dimensionality of the
    scatter matrices before computing the eigenvectors.
  • Form a matrix U that represents all eigenvectors
    of S_W⁻¹·S_B by placing each eigenvector uᵢ as
    a column in that matrix.
  • Each face image xⱼ ∈ Xᵢ can be projected into
    this face space by the operation
  • Ωᵢ = Uᵀ(xⱼ − μ)  (see the sketch below)
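A sketch of the projection step, assuming S_W is invertible (guaranteed after the PCA reduction mentioned above); np.linalg.solve computes S_W⁻¹·S_B without forming the inverse explicitly, and eig (not eigh) is used because the product is not symmetric. x_j is a hypothetical face vector:

    lams, W = np.linalg.eig(np.linalg.solve(S_W, S_B))
    order = np.argsort(-lams.real)            # sort by decreasing eigenvalue
    U_lda = W[:, order].real                  # columns are the discriminant directions

    omega = U_lda.T @ (x_j - mu)              # projection of face image x_j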

18
Testing
  • Same as Eigenfaces Approach

19
References
  • Turk, M., Pentland, A.: Eigenfaces for
    recognition. Journal of Cognitive Neuroscience 3
    (1991) 71–86
  • Belhumeur, P.N., Hespanha, J.P., Kriegman, D.J.:
    Eigenfaces vs. Fisherfaces: Recognition using
    class specific linear projection. IEEE
    Transactions on Pattern Analysis and Machine
    Intelligence 19 (1997) 711–720