1
Partial & Holistic Face Recognition on FRGC-II
data using Support Vector Machine Kernel
Correlation Feature Analysis (a brief summary)
  • M. Savvides, R. Abiantun, J. Heo, S. Park, C.
    Xie, and B.V.K. Vijaya Kumar

2
Presentation Outline
  • Background Material
  • Correlation Filters → MACE → Kernel Correlation
    Filters
  • Support Vector Machines
  • Partial vs. Holistic Face Recognition
  • Results

3
Correlation Pattern Recognition
  • Matched Spatial Filter (MSF)
  • Operates in the spatial frequency domain
  • Optimal detector in additive white noise
  • Proven in automatic target recognition (ATR)
  • No need for image segmentation
  • Has a closed-form expression
  • - Drawback: a new filter is needed for each object
    orientation (a frequency-domain correlation sketch
    follows this list)
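
The slides give no formulas for the MSF, so here is a minimal sketch of frequency-domain correlation using only NumPy (the function and variable names are illustrative, not from the presentation). It shows why no segmentation is needed: the correlation plane is computed over the whole scene, and its peak locates the target.

```python
import numpy as np

def matched_filter_correlate(scene, reference):
    """Cross-correlate a scene with a reference template via the FFT.

    The matched filter is the complex conjugate of the reference's
    spectrum; multiplying spectra and inverting the FFT yields the
    full correlation plane in closed form.
    """
    F_scene = np.fft.fft2(scene)
    F_ref = np.fft.fft2(reference, s=scene.shape)  # zero-pad to scene size
    return np.real(np.fft.ifft2(F_scene * np.conj(F_ref)))

# Locate a small patch hidden inside a larger scene.
rng = np.random.default_rng(0)
scene = rng.standard_normal((128, 128))
patch = scene[40:56, 70:86].copy()              # "target" at row 40, col 70
corr = matched_filter_correlate(scene, patch)
print(np.unravel_index(np.argmax(corr), corr.shape))  # (40, 70)
```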

4
Synthetic Discriminant Function (SDF),
Hester and Casasent (1980)
  • Uses multiple views of the same object as
    reference functions
  • A system of linear equations determines an
    average (composite) filter, sketched below
  • Retains shift invariance; the constraints fix the
    correlation output only at the origin
  • Relative peak values are used to detect the object
  • - Sidelobes can lead to misclassifications (the
    constraints control only values near the origin)
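
A minimal sketch of the SDF synthesis this slide describes, using the standard closed form h = X (X^T X)^{-1} u (shapes and names are illustrative; it is the same average-filter idea, written as a linear system).

```python
import numpy as np

def sdf_filter(X, u):
    """Synthesize an SDF composite ("average") filter.

    X : (d, N) matrix whose columns are vectorized training views.
    u : (N,) desired correlation values at the origin
        (e.g. all ones for the true class).
    The filter is a linear combination h = X a of the training views,
    with a solved from the N x N system (X^T X) a = u, so the
    correlation output is constrained only at the origin.
    """
    a = np.linalg.solve(X.T @ X, u)
    return X @ a

# Five 32x32 training views, all constrained to a peak value of 1.
rng = np.random.default_rng(1)
X = rng.standard_normal((32 * 32, 5))
h = sdf_filter(X, np.ones(5))
print(np.allclose(X.T @ h, np.ones(5)))  # True: constraints are satisfied
```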

6
Peak Sidelobe Ratio (PSR)
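This slide is a figure. As a companion, here is a minimal sketch of the peak-to-sidelobe ratio as commonly defined in the correlation-filter literature; the 5x5 peak mask and 20x20 sidelobe window are typical choices, not values taken from the slides.

```python
import numpy as np

def psr(corr, mask=5, window=20):
    """PSR = (peak - sidelobe mean) / sidelobe standard deviation."""
    p0, p1 = np.unravel_index(np.argmax(corr), corr.shape)
    peak = corr[p0, p1]
    r = window // 2
    top, left = max(p0 - r, 0), max(p1 - r, 0)
    region = corr[top:p0 + r + 1, left:p1 + r + 1].copy()
    # Blank a small area around the peak so only sidelobes remain.
    m, c0, c1 = mask // 2, p0 - top, p1 - left
    region[max(c0 - m, 0):c0 + m + 1, max(c1 - m, 0):c1 + m + 1] = np.nan
    side = region[~np.isnan(region)]
    return (peak - side.mean()) / side.std()
```
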
7
Min. Avg. Corr. Energy (MACE),
Mahalanobis, Kumar, Casasent (1987)
  • Minimizes average sidelobe energy, giving fewer
    misclassifications (see the synthesis sketch below)
  • Kallman's (optimal) construction is too
    computationally intensive
  • With only a few training images, it outperforms
    most other filters (in less time)
  • Can be shown to be preprocessing invariant
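
A minimal sketch of MACE synthesis using its standard frequency-domain closed form, h = D^{-1} X (X^+ D^{-1} X)^{-1} u, where D is the diagonal average power spectrum of the training images (names are illustrative).

```python
import numpy as np

def mace_filter(images, u):
    """images : list of equally sized 2-D training images.
       u      : (N,) desired correlation peaks (e.g. all ones).
    Returns the MACE filter H in the frequency domain: it satisfies the
    peak constraints while minimizing average correlation-plane energy,
    which is what suppresses the sidelobes.
    """
    shape = images[0].shape
    X = np.stack([np.fft.fft2(im).ravel() for im in images], axis=1)  # d x N
    D = np.mean(np.abs(X) ** 2, axis=1)           # average power spectrum
    Dinv_X = X / D[:, None]                       # D^{-1} X
    A = X.conj().T @ Dinv_X                       # X^+ D^{-1} X, an N x N system
    H = Dinv_X @ np.linalg.solve(A, u.astype(complex))
    return H.reshape(shape)

# To score a test image: corr = np.real(np.fft.ifft2(np.fft.fft2(test) *
# np.conj(H))), then evaluate the PSR of the correlation plane.
```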

12
Correlation Pattern Recognition, Cambridge
University Press, 2005 (Kumar, Mahalanobis, and Juday)
16
Class-Dependent Feature Analysis (CFA)
  • One filter is designed for each of the 222 people
    in the training set
  • Each filter is trained one-against-all over the
    12,776 training images
  • A closed-form solution produces a correlation of
    1 for the authentic class and 0 for everyone else
    (see the sketch after this list)
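
A minimal sketch of this one-against-all design, written as an SDF-style closed form over the whole training set (shapes, names, and the small ridge term are illustrative; the real system has 222 classes over 12,776 images).

```python
import numpy as np

def cfa_filters(X, labels, classes, ridge=1e-6):
    """X : (d, N) matrix of vectorized (prewhitened) training images.
       labels : length-N array giving each column's class.
    Returns a (d, C) filter bank in which each filter is constrained to
    correlate to 1 with its own class and 0 with every other image.
    """
    N = X.shape[1]
    G = X.T @ X + ridge * np.eye(N)   # Gram matrix, ridge for stability
    U = np.stack([(labels == c).astype(float) for c in classes], axis=1)
    return X @ np.linalg.solve(G, U)  # d x C

def cfa_features(filters, y):
    """Inner products with the filter bank -> C-dimensional feature."""
    return filters.T @ y
```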

18
In general, linear subspace methods (e.g. LDA, PCA)
are extended into higher dimensions to compensate
for non-linear distortions. Here the authors use the
Kernel Trick.
19
Kernel Trick
  • Uses inner products of functions that map to
    higher dimensions
  • The higher-dimensional mappings never have to be
    computed explicitly (verified numerically below)
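
A small numerical check of the trick for a degree-2 polynomial kernel: the one-line kernel evaluation matches the inner product of explicit 6-dimensional feature maps, which never need to be built in practice.

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for a 2-D vector (comparison only)."""
    x1, x2 = x
    return np.array([x1 * x1, x2 * x2, np.sqrt(2) * x1 * x2,
                     np.sqrt(2) * x1, np.sqrt(2) * x2, 1.0])

def poly_kernel(x, y, d=2):
    return (x @ y + 1.0) ** d

x, y = np.array([1.0, 2.0]), np.array([3.0, 0.5])
print(poly_kernel(x, y))   # 25.0, computed in the input dimension
print(phi(x) @ phi(y))     # 25.0, same value via the explicit mapping
```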

20
Kernel Correlation Filters
  • The 222 kernel versions of the CFA filters are
    now used to obtain KCFA filters (sketched below)
  • Nearly a threefold improvement in verification
    performance over other linear methods
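
One plausible reading of "kernel versions of the CFA filters", sketched minimally: the Gram matrix X^T X in the CFA solution is replaced by a kernel Gram matrix, so a test image's 222-dimensional feature vector is computed from kernel evaluations alone. The Gaussian kernel and ridge term are illustrative assumptions, not details given on the slides.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix k(a_i, b_j) for the columns of A (d x N) and B (d x M)."""
    d2 = np.sum(A * A, 0)[:, None] + np.sum(B * B, 0)[None, :] - 2.0 * A.T @ B
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kcfa_train(X, labels, classes, sigma=1.0, ridge=1e-6):
    """Solve (K + ridge*I) alpha = U in kernel space; returns N x C alphas."""
    K = gaussian_kernel(X, X, sigma)
    U = np.stack([(labels == c).astype(float) for c in classes], axis=1)
    return np.linalg.solve(K + ridge * np.eye(K.shape[0]), U)

def kcfa_features(X_train, alphas, y, sigma=1.0):
    """Kernel evaluations against the training set -> C-dim feature."""
    k = gaussian_kernel(X_train, y[:, None], sigma)   # N x 1
    return (alphas.T @ k).ravel()
```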

21
Support Vector Machines
  • Instead of training the SVM on the images
    directly, they are trained on the KCFA
    coefficients (see the sketch below)
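
A minimal sketch of this final stage, assuming scikit-learn and placeholder data; in the actual system each row of `features` would be a 222-dimensional KCFA coefficient vector like the one produced by `kcfa_features` above.

```python
import numpy as np
from sklearn.svm import SVC

# Placeholder data standing in for 222-dim KCFA coefficient vectors.
rng = np.random.default_rng(2)
features = rng.standard_normal((500, 222))
labels = rng.integers(0, 10, size=500)

svm = SVC(kernel="linear")   # trained on the coefficients, not the pixels
svm.fit(features, labels)
print(svm.predict(features[:5]))
```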

23
Final Holistic Results
24
Partial vs. Holistic Face Recognition
  • Experiments using the above techniques were run
    on three regions of the face (illustrative crops
    are sketched below):
  • Mouth region
  • Nose region
  • Eyes and eyebrow region
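
A minimal sketch of extracting the three regions by fixed crops, assuming roughly aligned face images; the crop fractions are illustrative guesses, not the coordinates used in the experiments.

```python
import numpy as np

def face_regions(face):
    """Split an aligned (H x W) face image into the three test regions."""
    H, W = face.shape
    return {
        "eyes_and_brows": face[int(0.15 * H):int(0.45 * H), :],
        "nose": face[int(0.40 * H):int(0.70 * H), int(0.25 * W):int(0.75 * W)],
        "mouth": face[int(0.65 * H):int(0.90 * H), int(0.20 * W):int(0.80 * W)],
    }
```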

26
ROC Curves by Region
30
Conclusions
  • MACE filters are less dependent on alignment
  • KCFA allows working in a lower-dimensional space
  • SVMs offer a general improvement over other
    similarity metrics and can be trained on the
    lower-dimensional data
  • Fusion-based techniques show improved results for
    correlation filters as well