Christopher M. Bishop

1
Generative vs. Discriminative Approaches to
Object Recognition
  • Christopher M. Bishop

Microsoft Research Cambridge
International Workshop on Object Recognition
Sicily, October 2004
2
Collaborator
  • Ilkay Ulusoy (METU)

3
Generative vs. Discriminative Models
  • Generative: model the joint distribution of data
    and classes, then evaluate posterior
    probabilities using Bayes' theorem
  • Discriminative: directly model the posterior
    probabilities
  • In both cases we usually work in a feature space
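The generative route can be sketched numerically. Everything below (1-D Gaussian class-conditionals, the particular means, variance, and priors) is an assumed toy setup, not the talk's model; it only illustrates turning p(x|C_k) and p(C_k) into a posterior via Bayes' theorem.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    # 1-D Gaussian density p(x | C_k); toy class-conditional model
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def posterior(x, means=(0.0, 2.0), var=1.0, priors=(0.5, 0.5)):
    # Bayes' theorem: p(C_k|x) = p(x|C_k) p(C_k) / sum_j p(x|C_j) p(C_j)
    joint = np.array([gaussian_pdf(x, m, p_var) * p
                      for m, p_var, p in zip(means, (var, var), priors)])
    return joint / joint.sum()
```

A discriminative model would instead parameterize `posterior` directly and never represent the class-conditional densities at all.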

4
Weakly Labelled Images
  • Images labelled with object category only
  • Patch-based models, no spatial relationships
  • Illustrative 2-class experiments: cows vs. sheep

5
Interest Points and Features
  • Difference of Gaussians or Harris
  • SIFT features, optionally with colour
  • Interest point code from Cordelia Schmid

6
A Discriminative Approach
  • Graphical representation

7
A Discriminative Approach
  • Patch labels are hidden
  • Class labels (not mutually exclusive) for whole
    image
  • Image carries class label k if at least one patch
    does, so posterior probability for image is
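The slide's formula is not in the transcript, but the "at least one patch" rule suggests a noisy-OR combination of patch posteriors. The sketch below is a hedged reconstruction under an independence assumption over patch labels.

```python
import numpy as np

# Reconstruction (assumed): if the image carries class k when at
# least one patch does, treating patch labels as independent gives
# the noisy-OR image-level posterior:
#   P(image has k) = 1 - prod_j (1 - P(patch_j has k))
def image_posterior(patch_probs):
    patch_probs = np.asarray(patch_probs, dtype=float)
    return 1.0 - np.prod(1.0 - patch_probs)
```

Note the two limiting cases: if any patch posterior is 1 the image posterior is 1, and if all are 0 it is 0, matching the "at least one patch" rule.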

8
A Discriminative Approach
  • Error function given by the negative log
    likelihood
  • Gradient-based optimization
  • Model learns to predict class for each patch,
    even though training labels are only given for
    the whole image
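A minimal sketch of such training, assuming a logistic patch classifier combined by noisy-OR and a plain gradient-descent loop; the talk's actual features, model form, and optimizer are not specified in the transcript. The point it illustrates is the one above: only image-level labels enter the loss, yet the learned weights score individual patches.

```python
import numpy as np

def sigmoid(z):
    # Clipping avoids overflow in exp for large |z|
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def image_prob(w, patches):
    # Noisy-OR over patch-level posteriors (assumed combination rule)
    return 1.0 - np.prod(1.0 - sigmoid(patches @ w))

def nll(w, images, labels, eps=1e-9):
    # Negative log likelihood of the image-level labels
    probs = np.array([image_prob(w, X) for X in images])
    return -np.sum(labels * np.log(probs + eps)
                   + (1 - labels) * np.log(1 - probs + eps))

def train(images, labels, lr=0.2, steps=500):
    w = np.zeros(images[0].shape[1])
    for _ in range(steps):
        # Central-difference numerical gradient keeps the sketch short;
        # a real implementation would use analytic gradients
        g = np.zeros_like(w)
        for i in range(len(w)):
            e = np.zeros_like(w)
            e[i] = 1e-5
            g[i] = (nll(w + e, images, labels)
                    - nll(w - e, images, labels)) / 2e-5
        w -= lr * g
    return w
```

After training on toy bags where only positive images contain a distinctive patch, the patch-level scores `sigmoid(patches @ w)` separate discriminative patches from background, even though no patch was ever labelled.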

9
A Discriminative Approach
  • Only patches which help to discriminate between
    the object classes become labelled with the
    object category
  • All others, including most foreground patches,
    are classified as background, meaning
    non-discriminative
  • This discriminative approach therefore labels the
    image, but does not segment the object

10–12
(Figure-only slides: no transcript)
13
An Alternative Discriminative Approach
  • Cluster feature vectors from all patches in all
    training images using K-means (K = 100)
  • For each image assign each patch to closest
    prototype
  • Gives fixed-length histogram feature vector
  • Use (normalized) histogram as feature vector for
    classifier
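The four steps above can be sketched end to end. The tiny 2-D features and K = 2 below are stand-ins for the talk's SIFT descriptors and K = 100; the farthest-point initialization is an assumption made to keep the toy deterministic.

```python
import numpy as np

def kmeans(X, K, iters=20):
    # Farthest-point initialization, then standard Lloyd iterations
    centers = [X[0]]
    for _ in range(K - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # Assign every patch to its closest prototype, then re-estimate
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(K):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return centers

def histogram_feature(patches, centers):
    # Fixed-length, normalized histogram of prototype assignments
    d = np.linalg.norm(patches[:, None] - centers[None], axis=2)
    counts = np.bincount(d.argmin(axis=1), minlength=len(centers))
    return counts / counts.sum()
```

Because the histogram length is K regardless of how many patches an image contains, any standard classifier can consume it directly.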

14
Automatic Relevance Determination
  • Select relevant features using automatic
    relevance determination (MacKay and Neal)
  • Prior distribution over parameters governed by
    hyper-parameters
  • Optimize hyper-parameters by maximizing marginal
    likelihood
  • A high hyper-parameter value implies low
    relevance for the corresponding feature
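A hedged sketch of ARD in the simplest setting: a linear model with one precision hyper-parameter per feature, re-estimated MacKay-style so as to (approximately) maximize the marginal likelihood. The noise precision `beta`, iteration count, and cap are assumptions, not the talk's procedure.

```python
import numpy as np

def ard(X, y, beta=100.0, n_iter=50):
    n, d = X.shape
    alpha = np.ones(d)                        # per-feature precision hyper-parameters
    for _ in range(n_iter):
        A = np.diag(alpha) + beta * X.T @ X   # posterior precision over weights
        Sigma = np.linalg.inv(A)
        m = beta * Sigma @ (X.T @ y)          # posterior mean over weights
        # MacKay re-estimation: gamma_i measures how well-determined w_i is
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = gamma / (m ** 2 + 1e-12)
        alpha = np.minimum(alpha, 1e6)        # huge alpha means irrelevant feature
    return alpha, m
```

Running this on data where only the first feature matters drives that feature's alpha to a small value and the irrelevant feature's alpha toward the cap, which is exactly the "high value implies low relevance" reading above.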

15–21
(Figure-only slides: no transcript)
22
Preliminary Comparative Results
  • 167 images from each class
  • 10-fold cross-validation
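The evaluation protocol can be sketched as follows; only "167 images per class" and "10-fold" come from the slide, while the shuffling and splitting details are assumptions.

```python
import numpy as np

def kfold_indices(n, k=10, seed=0):
    # Shuffle once, cut into k near-equal folds, hold each out in turn
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# 167 cows + 167 sheep = 334 examples
splits = list(kfold_indices(334, k=10))
```

Each of the 334 images appears in exactly one test fold, so every reported accuracy averages over predictions for the whole dataset.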

23
A Generative Approach
24
A Generative Approach
  • Joint distribution specified by
  • Determine parameters by maximum likelihood
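The slide's joint-distribution formula is omitted from the transcript, so the following stand-in only illustrates the second bullet: maximum-likelihood fitting of an assumed factorization p(x, c) = p(c) p(x | c) with Gaussian class-conditionals.

```python
import numpy as np

def fit_ml(X, y, n_classes=2):
    # Assumed generative factorization, fit by maximum likelihood
    params = []
    for c in range(n_classes):
        Xc = X[y == c]
        params.append({
            "prior": len(Xc) / len(X),      # ML estimate of p(c)
            "mean": Xc.mean(axis=0),        # ML mean of Gaussian p(x|c)
            "var": Xc.var(axis=0) + 1e-6,   # ML (diagonal) variance
        })
    return params
```

Once fitted, such a model yields posteriors through Bayes' theorem, which is the link back to slide 3's generative recipe.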

25–26
(Figure-only slides: no transcript)
27
Discussion
  • Generative model required good initialization:
    can we use a discriminative model to do this?
  • Generative models can exploit mix of strongly
    labelled, weakly labelled and unlabelled data
  • Would be much more satisfactory to learn both
    interest point detector and local descriptors