A Graphical Model For Simultaneous Partitioning And Labeling
Transcript and Presenter's Notes



1
A Graphical Model For Simultaneous Partitioning And Labeling
  • Philip Cowans
  • Martin Szummer
  • AISTATS, Jan 2005

Cambridge
2
Motivation: Interpreting Ink
[Figure: a hand-drawn diagram and its machine interpretation]
3
Graph Construction
Vertices, V, and edges, E, form the graph G.
Vertices are grouped into parts.
Each part is assigned a label.
4
Labeled Partitions
  • We assume:
  • Parts are contiguous.
  • The graph is triangulated.
  • We're interested in probability distributions
    over labeled partitions conditioned on observed
    data (a concrete sketch follows below).

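As a concrete picture of the central object, here is a minimal
sketch (hypothetical Python, not from the paper) of a labeled
partition: a partition of the vertex set together with one label
per part.

    # Hypothetical sketch: a labeled partition is a set of disjoint
    # parts covering the vertices, plus one label per part.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LabeledPartition:
        parts: tuple   # tuple of frozensets of vertices
        labels: tuple  # labels[i] is the label of parts[i]

        def part_of(self, v):
            """Index of the part containing vertex v."""
            for i, p in enumerate(self.parts):
                if v in p:
                    return i
            raise KeyError(v)

        def label_of(self, v):
            """Label of the part containing vertex v."""
            return self.labels[self.part_of(v)]

    # Example: parts {1,2,3} and {4,5}, labeled container/connector.
    Y = LabeledPartition(parts=(frozenset({1, 2, 3}), frozenset({4, 5})),
                         labels=('container', 'connector'))
    assert Y.label_of(4) == 'connector'
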
5
Conditional Random Fields
  • CRFs (Lafferty et al.) provide a joint labeling of
    graph vertices.
  • Idea: define parts to be contiguous regions with
    the same label.
  • But:
  • A large number of labels is needed.
  • Symmetry problems / bias.

[Figure: example graph with distinct integer labels
(1, 2, 3, -1) assigned per part]
6
A Better Approach
  • Extend the CRF framework to work directly with
    labeled partitions.
  • Complexity is improved: we don't need to deal with
    so many labels.
  • No symmetry problem: we're working directly with
    the representation in which the problem is posed.

7
Consistency
  • Let G, H ⊆ V.
  • Y(G) and Y(H) are consistent if and only if:
  • For any vertex in G ∩ H, Y(G) and Y(H) agree on
    its label.
  • For any pair of vertices in G ∩ H, Y(G) and
    Y(H) agree on their part membership.
  • This relation is denoted Y(G) ∼ Y(H) (a
    programmatic version is sketched below).

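Stated programmatically, a minimal sketch of this test
(hypothetical Python, reusing the LabeledPartition class sketched
earlier):

    def consistent(YG, YH, common):
        """Check consistency of two labeled partitions on the shared
        vertices `common` (the intersection of G and H)."""
        common = list(common)
        # Labels must agree on every shared vertex.
        for v in common:
            if YG.label_of(v) != YH.label_of(v):
                return False
        # Part co-membership must agree on every shared pair.
        for i, u in enumerate(common):
            for v in common[i + 1:]:
                if ((YG.part_of(u) == YG.part_of(v)) !=
                        (YH.part_of(u) == YH.part_of(v))):
                    return False
        return True
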
8
Projection
  • Projection maps labeled partitions onto smaller
    subgraphs.
  • If G ⊆ V, then the projection of Y onto G is the
    unique labeled partition of G that is consistent
    with Y (sketched below).

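Concretely, projection restricts each part to the subgraph and
keeps its label. A minimal sketch (hypothetical Python, same
assumed class as above):

    def project(Y, G):
        """Project labeled partition Y onto vertex subset G: intersect
        each part with G and drop parts that become empty."""
        G = frozenset(G)
        parts, labels = [], []
        for p, lab in zip(Y.parts, Y.labels):
            q = p & G
            if q:
                parts.append(q)
                labels.append(lab)
        return LabeledPartition(parts=tuple(parts), labels=tuple(labels))

    # project(Y, {2, 3, 4}) keeps {2,3} as 'container', {4} as 'connector'.
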
9
Notation
10
Potentials
11
The Model
  • Unary potentials
  • Pairwise potentials

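The equations on this slide are not in the transcript. A plausible
reconstruction of the conditional distribution, assuming the usual
log-linear CRF form extended to labeled partitions (our notation,
not verbatim from the slides):

    P(Y \mid x) \;=\; \frac{1}{Z(x)}
        \prod_{i \in V} \psi_i\bigl(Y(i),\, x\bigr)
        \prod_{(i,j) \in E} \psi_{ij}\bigl(Y(i,j),\, x\bigr)

Here Y(i) is the projection of Y onto vertex i (its label),
Y(i,j) is the projection onto the pair (the endpoint labels plus
whether they share a part), and Z(x) sums over all labeled
partitions of G.
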
12
The Model
13
Training
  • Train by finding MAP weights on example data under
    a Gaussian prior (optimized with BFGS).
  • We require the value and gradient of the log
    posterior; these involve normalization (for the
    value) and marginalization (for the gradient), as
    reconstructed below.
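The slide's equation is not transcribed. Assuming log-linear
potentials with weight vector w, feature vector f and a Gaussian
prior of variance \sigma^2, the standard CRF gradient would be:

    \nabla_{\mathbf{w}} \log p(\mathbf{w} \mid \mathcal{D})
      \;=\; \sum_{n} \Bigl( \mathbf{f}(Y_n, x_n)
        \;-\; \mathbb{E}_{P(Y \mid x_n,\, \mathbf{w})}
          \bigl[\mathbf{f}(Y, x_n)\bigr] \Bigr)
      \;-\; \frac{\mathbf{w}}{\sigma^{2}}

Evaluating the log posterior needs the normalization constant
Z(x_n); the expectation term needs clique marginals. Both are
supplied by the message-passing scheme described below.
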
14
Prediction
  • New data is processed by finding the most
    probable labeled partition.
  • This is the same as normalization with the
    summation replaced by a maximization.

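In symbols (our notation), prediction computes

    Y^{\star} \;=\; \operatorname*{arg\,max}_{Y} \; P(Y \mid x)

evaluated by the same message-passing recursion with each sum
replaced by a max (max-product).
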
15
Inference
  • These operations require summation or
    maximization over all possible labeled
    partitions.
  • The number of terms grows super-exponentially
    with the size of G.
  • Efficient computation is possible using message
    passing, because the distribution factors.
  • The proof is based on Shenoy and Shafer (1990).

16
Factorization
  • A distribution factors if it can be written as a
    product of potentials over cliques of the graph
    (reconstructed below).
  • This is the case for the (un-normalized) model.
  • This allows efficient computation using message
    passing.

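The factorization equation is not in the transcript; in standard
form (our notation) it would read:

    P(Y \mid x) \;\propto\;
        \prod_{C \in \mathcal{C}} \psi_{C}\bigl(Y(C),\, x\bigr)

where \mathcal{C} is the set of cliques of G and Y(C) is the
projection of Y onto C. The unary and pairwise potentials of the
model are a special case.
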
17
Message Passing
[Figure: example graph on vertices 1–9]
18
Message Passing
[Figure: junction tree with cliques {2,9}, {1,7,8},
{1,2,3,4}, {2,3,4,5}, {4,5,6}]
The junction tree is constructed from the cliques of the
original graph. A message summarizes the contribution from
upstream to the sum, for a given configuration of the
separator.
19
Message Passing
[Figure: the same junction tree, illustrating a message
passed between neighboring cliques]
20
Message Update Rule
  • Update messages (for summation) according to the
    rule reconstructed below.
  • Marginals are found using the clique beliefs.
  • Z can be found explicitly.

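The update equations are not transcribed. The standard
Shenoy-Shafer forms, adapted so that sums range over labeled
partitions of a clique that are consistent with the separator
configuration (our notation, an assumption rather than the
slide's exact equations):

    \mu_{i \to j}\bigl(Y(S_{ij})\bigr)
      \;=\; \sum_{\,Y(C_i)\,\sim\,Y(S_{ij})}
        \psi_{C_i}\bigl(Y(C_i)\bigr)
        \prod_{k \in \mathrm{ne}(i) \setminus \{j\}}
          \mu_{k \to i}\bigl(Y(S_{ki})\bigr)

    p\bigl(Y(C_i)\bigr) \;\propto\;
        \psi_{C_i}\bigl(Y(C_i)\bigr)
        \prod_{k \in \mathrm{ne}(i)} \mu_{k \to i}\bigl(Y(S_{ki})\bigr)

    Z \;=\; \sum_{Y(C_i)} \psi_{C_i}\bigl(Y(C_i)\bigr)
        \prod_{k \in \mathrm{ne}(i)} \mu_{k \to i}\bigl(Y(S_{ki})\bigr)

where C_i is a clique, S_{ij} = C_i ∩ C_j is a separator, and ∼
denotes consistency. Replacing each sum with a max gives the
prediction variant.
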
21
Complexity
22
Experimental Results
  • We tested the algorithm on hand-drawn ink
    collected using a Tablet PC.
  • The task is to partition the ink fragments into
    perceptual objects, and to label them as
    containers or connectors.
  • The data set was 40 diagrams from 17 subjects,
    with a total of 2157 fragments.
  • We used 3 random splits (20 training and 20 test
    examples each).

23
Example 1
24
Example 1
25
Example 2
26
Example 2
27
Example 3
28
Example 3
29
Labeling Results
  • Labeling error: the fraction of fragments labeled
    incorrectly.
  • Grouping error: the fraction of edges locally
    incorrect.

30
Conclusions
  • We have presented a conditional model defined
    over labeled partitions of an undirected graph.
  • Efficient exact inference is possible in our
    model using message passing.
  • Labeling and grouping simultaneously can improve
    labeling performance.
  • Our model performs well when applied to the task
    of parsing hand-drawn ink diagrams.

31
Acknowledgements
  • Thanks to:
  • Thomas Minka, Yuan Qi and Michel Gangnet for
    useful discussions and for providing software.
  • Hannah Pepper for collecting our ink database.