Title: Pattern Recognition


1
Pattern Recognition
  • Why?
  • To provide machines with perception and cognition
    capabilities so that they can interact
    independently with their environments.
  • Pattern Recognition
  • a natural ability of humans
  • based on some description of an object; such a
    description is termed a pattern.

2
Patterns and Pattern Classes
  • Almost anything within the reach of our five
    senses can be chosen as a pattern.
  • Sensory patterns: speech, odors, tastes
  • Spatial patterns: characters, fingerprints,
    pictures
  • Temporal patterns: waveforms, electrocardiograms,
    movies
  • Conceptual recognition: for abstract items
  • (We will limit ourselves to physical
    objects/events, NOT abstract entities such as
    concepts.)
  • A pattern class is a group of patterns with
    certain common characteristics.

3
Pattern Recognition
  • Pattern Recognition is the science of assigning an
    object/event of interest to one of several
    prespecified categories/classes based on certain
    measurements or observations.
  • Measurements are usually problem dependent.
  • e.g., weight or height for basketball
    players vs. jockeys
  • color for apples vs. oranges
  • Feature vectors: represent measurements as
    coordinates of points in a vector space (the
    feature space); see the sketch below.
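
  For the basketball player vs. jockey example above, a minimal Python
  sketch (with invented numbers) of what such 2-D feature vectors look like:

    # A minimal sketch (invented numbers): height [cm] and weight [kg] as a
    # 2-D feature vector for the basketball player vs. jockey example.
    import numpy as np

    basketball_players = np.array([[201.0, 98.0], [210.0, 110.0], [198.0, 95.0]])
    jockeys            = np.array([[158.0, 51.0], [162.0, 54.0], [155.0, 49.0]])

    # Each row is one point in the 2-D feature space; the two groups form
    # well-separated clusters, which is what makes these measurements useful.
    print(basketball_players.mean(axis=0))   # [203. 101.]
    print(jockeys.mean(axis=0))              # [158.33  51.33]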

4
Pattern Recognition Systems
5
Statistical Pattern Recognition
  • Taps into the vast body of statistical knowledge
    to provide a formal treatment of PR.
  • Observations are assumed to be generated by a
    state of nature:
  • the data can be described by a statistical model,
  • the model by a set of probability functions.
  • Strength: many powerful mathematical tools from
    the theory of probability and statistics.
  • Shortcoming: it is usually impossible to design
    (statistically) error-free systems (see the
    decision rule sketched below).
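
  One way to make the "set of probability functions" concrete (a standard
  formulation, not spelled out on the slide) is the Bayes decision rule:
  assign an observation x to the class with the largest posterior
  probability. Even this optimal rule has a nonzero error (the Bayes error)
  whenever the class-conditional densities overlap, which is why error-free
  systems are usually impossible:

    \hat{\omega}(x) \;=\; \arg\max_{i} P(\omega_i \mid x)
                    \;=\; \arg\max_{i} \, p(x \mid \omega_i)\, P(\omega_i)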

6
Example: OCR
7
Major Steps
8
Raw Features Example
9
Feature Extraction: OCR Example
10
Feature Extraction
  • Objective: to remove irrelevant information and
    extract distinctive, representative information
    about the objects.
  • discriminative
  • invariant
  • data compression → dimension reduction (see the
    sketch below)
  • It is not easy!
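
  A hypothetical illustration of "data compression → dimension reduction"
  for OCR, using a simple zoning feature; the 28×28 image size and 4×4
  zones are assumptions of the sketch, not taken from the slides:

    # Zoning feature sketch: average ink density per 4x4 zone reduces a
    # 28x28 binary character image (784 raw pixels) to 49 numbers.
    import numpy as np

    def zoning_features(img):
        # img is a 28x28 array; reshape into 7x7 blocks of 4x4 pixels and
        # average within each block.
        return img.reshape(7, 4, 7, 4).mean(axis=(1, 3)).ravel()

    img = (np.random.rand(28, 28) > 0.5).astype(float)   # placeholder "character"
    print(zoning_features(img).shape)                     # (49,)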

11
Data Modeling
  • To build statistical models that describe the
    data.
  • Parametric models:
  • single probability density function, e.g.,
    Gaussian
  • mixture density function, e.g., Gaussian mixture
    model (GMM) --- see the sketch below
  • hidden Markov model (HMM) --- can cope with data
    of different duration/length
  • Nonparametric models:
  • k-nearest neighbor
  • Parzen window
  • neural network
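
  As an example of fitting one of the parametric models above, a minimal
  sketch of a two-component Gaussian mixture model; scikit-learn and the
  toy 1-D data are assumptions of the sketch, not choices made by the
  slides:

    # Fit a 2-component GMM to toy 1-D data drawn from two Gaussians.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    X = np.concatenate([rng.normal(-2.0, 1.0, 300),
                        rng.normal( 3.0, 0.5, 200)]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, covariance_type='full').fit(X)
    print(gmm.means_.ravel())    # estimated component means (near -2 and 3)
    print(gmm.weights_)          # estimated mixing weights (near 0.6 and 0.4)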

12
Training
  • Training data
  • The model is learned from a set of training data.
  • Data collection: should cover data from various
    regions of the pattern space.
  • Do you know the whole pattern space?
  • Training algorithm: can be iterative.
  • When to stop training? (see the sketch below)
  • Generalization: models trained on a finite set
    of data should also generalize well to unseen
    data.
  • How to ensure that?
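
  One common answer to "when to stop training?" is early stopping on
  held-out data; the toy data, the SGD linear classifier, and the patience
  value below are assumptions of the sketch, not prescriptions from the
  slides:

    # Early-stopping sketch: keep training while held-out accuracy improves.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)            # toy labels
    X_tr, y_tr = X[:400], y[:400]                      # training set
    X_val, y_val = X[400:], y[400:]                    # held-out set

    clf = SGDClassifier(random_state=0)
    best_acc, bad_epochs = 0.0, 0
    for epoch in range(100):                           # iterative training
        clf.partial_fit(X_tr, y_tr, classes=[0, 1])
        acc = clf.score(X_val, y_val)                  # proxy for generalization
        if acc > best_acc:
            best_acc, bad_epochs = acc, 0
        else:
            bad_epochs += 1
            if bad_epochs >= 5:                        # stop once held-out
                break                                  # accuracy stalls
    print(epoch, best_acc)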

13
Supervised vs. Unsupervised
  • Supervised PR:
  • Representative patterns from each pattern class
    under consideration are available.
  • Supervised learning.
  • Unsupervised PR:
  • A set of training patterns of unknown
    classification is given.
  • Unsupervised learning (both settings are
    contrasted in the sketch below).
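
  A minimal sketch contrasting the two settings on the same toy data;
  logistic regression and k-means are illustrative choices, not methods
  prescribed by the slides:

    # Supervised: labels are given.  Unsupervised: groups must be discovered.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, (100, 2)),
                   rng.normal( 2.0, 1.0, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)        # known only in the supervised case

    supervised = LogisticRegression().fit(X, y)            # uses the labels
    unsupervised = KMeans(n_clusters=2, n_init=10).fit(X)  # finds clusters itself

    print(supervised.score(X, y))              # accuracy w.r.t. the given labels
    print(np.bincount(unsupervised.labels_))   # cluster sizes (labels arbitrary)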

14
Classification
  • Classification into N classes can be thought of as
    partitioning the feature space into N regions, as
    non-overlapping as possible, so that each region
    represents one of the N classes. This is often
    called the decision-theoretic approach.
  • Decision boundaries: the boundaries between the
    class regions in the feature space.
  • Discriminant functions: mathematical functions
    that describe the decision boundaries (see the
    example below).
  • Types of classifiers: depending on the functional
    form of the decision boundary, classifiers may be
    categorized into
  • Linear classifier
  • Quadratic classifier
  • Piecewise classifier
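
  For example, a linear discriminant function in the two-class case has the
  general form below; the w, w_0 notation is a common convention, not
  quoted from the slides. The decision boundary is the set of points where
  the discriminant equals zero, and a quadratic classifier adds a term
  x^T W x, which yields curved boundaries:

    g(\mathbf{x}) = \mathbf{w}^{T}\mathbf{x} + w_{0},
    \qquad \text{decide } \omega_1 \text{ if } g(\mathbf{x}) > 0,\ \omega_2 \text{ otherwise};
    \qquad \text{decision boundary: } g(\mathbf{x}) = 0.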

15
Decision Boundary
16
Summary
  • Three main components: features, data model, and
    recognition algorithm.
  • Make sure you identify a good set of features to
    work with before you build data models.
  • Data modeling requires knowledge of statistics
    and optimization.
  • Recognition requires classifier design (i.e., the
    discriminant functions), search, and algorithm
    design.
  • Evaluation involves testing on unseen test data,
    which must be large enough to claim statistical
    significance (see the sketch below).
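
  As a rough guide to "large enough", a minimal sketch (with invented
  numbers) of a normal-approximation 95% confidence interval for a measured
  test accuracy; halving the interval width requires roughly four times as
  much test data:

    # 95% confidence interval for test accuracy (invented counts).
    import math

    n_test, n_correct = 1000, 932
    acc = n_correct / n_test
    half_width = 1.96 * math.sqrt(acc * (1.0 - acc) / n_test)
    print(f"accuracy = {acc:.3f} +/- {half_width:.3f}")   # ~0.932 +/- 0.016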