PROBE/Workshop on graph partitioning in Vision and Machine Learning

1
PROBE/Workshop on graph partitioning in Vision
and Machine Learning
Organizers:
  • Avrim Blum (CMU, Algs/ML)
  • Jon Kleinberg (Cornell, Algs)
  • John Lafferty (CMU, ML)
  • Jianbo Shi (U. Penn, Vision)
  • Eva Tardos (Cornell, Algs)
  • Ramin Zabih (Cornell, Vision)
2
PROBE/Workshop on graph partitioning in Vision
and Machine Learning
Executive summary
  • Approaches that view the task as a kind of graph
    partitioning problem have come up recently in both
    Computer Vision and Machine Learning.
  • Our goal: bring the communities together. Relate
    objectives, techniques, experiences.
  • Successful workshop Jan 9-11, 2003 (2164
    attendees). Ongoing research projects.

3
PROBE/Workshop on graph partitioning in Vision
and Machine Learning
Grad students
4
Background/history
  • Graph partitioning problems have a long history in
    algorithms and optimization:
  • max flow / min cut
  • balanced separators, min-ratio cuts, ...
  • k-median, facility location, ...

5
Background/history
Graph partitioning problems have a long history in
algorithms and optimization.
  • Recent use in computer vision:
  • Stereo image reconstruction.
  • Greig-Porteous-Seheult, Boykov-Veksler-Zabih, ...
  • Image segmentation.
  • Shi-Malik, ...

6
(No Transcript)
7
What's going on?
  • Fix up the initial match using the idea that most
    neighboring pixels should be at the same depth.
  • Minimize an energy function: a cost for flipping a
    pixel's label plus a pairwise smoothness cost.

8
What's going on?
  • A min-cut solves this exactly for the case of 2
    labels (a minimal construction is sketched below).
    Approximate (or good locally optimal) solutions are
    possible for multiple labels. [BVZ, KT]
  • Empirically wins big over previous methods.
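
The 2-label construction is concrete enough to sketch. The following is a
minimal, hypothetical example (not code from the workshop), using networkx
only to keep it self-contained: unary costs become capacities on edges to
the two terminals, a Potts smoothness penalty becomes capacities between
neighboring pixels, and a minimum s-t cut gives the minimum-energy
labeling. The pixel names, costs, and penalty below are invented.

import networkx as nx

# unary[p] = (cost of giving pixel p label 0, cost of giving it label 1)
unary = {
    "p0": (0.0, 4.0),
    "p1": (1.0, 3.0),
    "p2": (3.0, 1.0),
    "p3": (4.0, 0.0),
}
neighbors = [("p0", "p1"), ("p1", "p2"), ("p2", "p3")]   # a tiny 1-D "image"
lam = 2.0                                # Potts penalty for disagreeing neighbors

G = nx.DiGraph()
for p, (c0, c1) in unary.items():
    G.add_edge("s", p, capacity=c1)      # cut iff p ends up with label 1
    G.add_edge(p, "t", capacity=c0)      # cut iff p ends up with label 0
for p, q in neighbors:
    G.add_edge(p, q, capacity=lam)       # cut iff p and q get different labels
    G.add_edge(q, p, capacity=lam)

energy, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
labels = {p: 0 if p in source_side else 1 for p in unary}
print(labels, energy)                    # the cut value equals the minimum energy

Real stereo systems use dedicated max-flow implementations; the library
call here is just to keep the sketch short.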

9
Also in Machine Learning
  • An important topic in recent years: can a large
    unlabeled dataset help with learning?
  • Often, we have reason to believe two examples
    probably have the same label, even if unsure what
    that label is.
  • Could be just similarity, or additional features.
  • Examples take the role of pixels.

10
Graph partitioning in ML
  • Define a graph with edges between similar examples
    (perhaps weighted).
  • Solve for the best labeling (e.g., minimize the
    weight of bad edges), as sketched below.
  • View as an MRF problem or a graph cut problem.
  • Blum-Chawla; Zhu-Ghahramani; Zhu-Ghahramani-Lafferty; ...
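
A rough sketch of this formulation, in the spirit of the mincut approach of
Blum-Chawla (the example names, labels, and similarity weights below are
invented): labeled examples are clamped to two terminal nodes by uncuttable
edges, similar examples are joined by weighted edges, and a minimum cut
gives the labeling with the least total weight of bad edges.

import networkx as nx

known_pos = ["x1"]                       # examples labeled positive
known_neg = ["x5"]                       # examples labeled negative
similarity = [                           # weighted edges between similar examples
    ("x1", "x2", 1.0), ("x2", "x3", 0.8),
    ("x3", "x4", 0.7), ("x4", "x5", 1.0), ("x2", "x4", 0.2),
]

G = nx.DiGraph()
for u, v, w in similarity:
    G.add_edge(u, v, capacity=w)         # a cut edge is a "bad" (disagreeing) edge
    G.add_edge(v, u, capacity=w)
for x in known_pos:
    G.add_edge("+", x)                   # no capacity attribute = infinite: label is clamped
for x in known_neg:
    G.add_edge(x, "-")

bad_weight, (pos_side, neg_side) = nx.minimum_cut(G, "+", "-")
print("predicted positive:", sorted(pos_side - {"+"}))
print("weight of bad edges:", bad_weight)
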
11
Graph partitioning in ML
  • But then other issues:
  • What is "similar", anyway?
  • Other assumptions/beliefs not modeled by the graph
    structure or the min-cut objective.
  • Features, other info.
  • ...

12
The PROBE
  • Given all this, it seemed high time to get these
    groups/communities together.
  • Workshop on Graph Partitioning in Vision and
    Machine Learning
  • Research collaborations

13
Results of workshop
  • Better understanding of similarities and
    differences (objectives, side information).
  • Improved communication; crystallized some of the
    key problems.
  • Ideas to try.
  • Discussions started.

14
Research projects
  • Balcan, Zhu: learning from visual data.
  • Zhu, Ghahramani, Lafferty: Gaussian random fields
    (the harmonic solution is sketched after this list).
  • Rwebangira, Reddy: use a standard learning
    algorithm to set priors.
  • Bansal, Blum, Chawla, Cohen, McCallum: correlation
    clustering (a formulation of the learning-based
    clustering problem).
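
For the Gaussian random fields work, the harmonic-function solution is
compact enough to show. The sketch below solves f_u = -L_uu^{-1} L_ul f_l,
where L = D - W is the graph Laplacian, to get soft labels for the
unlabeled examples; the tiny similarity matrix and labels are invented for
illustration.

import numpy as np

# Symmetric similarity matrix over 5 examples; the first two are labeled.
W = np.array([
    [0.0, 0.9, 0.1, 0.0, 0.0],
    [0.9, 0.0, 0.0, 0.2, 0.0],
    [0.1, 0.0, 0.0, 0.8, 0.3],
    [0.0, 0.2, 0.8, 0.0, 0.7],
    [0.0, 0.0, 0.3, 0.7, 0.0],
])
labeled, unlabeled = [0, 1], [2, 3, 4]
f_l = np.array([1.0, 0.0])               # known labels as 0/1 values

D = np.diag(W.sum(axis=1))
L = D - W                                # combinatorial graph Laplacian

L_uu = L[np.ix_(unlabeled, unlabeled)]
L_ul = L[np.ix_(unlabeled, labeled)]
f_u = np.linalg.solve(L_uu, -L_ul @ f_l) # harmonic solution
print(f_u)                               # soft labels in [0, 1]; threshold at 0.5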

15
Correlation clustering
Cohen and McCallum learning for entity-name
clustering
Bansal-Blum-Chawla formulate as graph problem.
Apx algs
McCallum and Wellner NLP coreference
Demaine-Immorlica, Charikar et al, Emanuel-Fiat
improved LP-based algs, generalize results.
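
To make the objective concrete: given pairwise "same" / "different"
judgments, correlation clustering asks for a partition minimizing
disagreements (positive edges split apart plus negative edges kept
together). The sketch below just illustrates the objective with a simple
random-pivot heuristic; it is not the Bansal-Blum-Chawla algorithm or the
LP-based algorithms above, and the items and judgments are invented.

import random

items = ["a", "b", "c", "d", "e"]
positive = {("a", "b"), ("b", "c"), ("d", "e")}   # "same cluster" judgments
negative = {("a", "d"), ("c", "e"), ("b", "e")}   # "different cluster" judgments

def similar(u, v):
    return (u, v) in positive or (v, u) in positive

def pivot_cluster(nodes):
    """Repeatedly pick a random pivot and group it with its '+' neighbors."""
    nodes, clusters = list(nodes), []
    while nodes:
        pivot = random.choice(nodes)
        cluster = [pivot] + [v for v in nodes if v != pivot and similar(pivot, v)]
        clusters.append(cluster)
        nodes = [v for v in nodes if v not in cluster]
    return clusters

def disagreements(clusters):
    """Count '+' edges split across clusters plus '-' edges kept together."""
    where = {v: i for i, c in enumerate(clusters) for v in c}
    bad = sum(1 for u, v in positive if where[u] != where[v])
    bad += sum(1 for u, v in negative if where[u] == where[v])
    return bad

clusters = pivot_cluster(items)
print(clusters, disagreements(clusters))
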
16
From: Andrew McCallum <mccallum@cs.umass.edu>
Subject: graph partitioning
Date: 26 Jun 2003 16:28:51 -0400

Hi Avrim, Nikhil and Shuchi, I realized the
other day that I hadn't yet sent you the paper on
using graph partitioning for NLP coreference
analysis... this is the paper related to the
conference call we had a while ago earlier this
year. We successfully used a minor variant on
your "minimizing disagreements" correlational
clustering algorithm to train the parameters of
an undirected graphical model that performs NLP
coreference. We are still making further
feature-representation improvements, but already
we are strongly out-performing several
alternative algorithms that use identical feature
sets, and also (barely) beating best-in-the-world
performance on noun coreference in the MUC-7
dataset from a group at Cornell. I am becoming
increasingly interested in graph partitioning
algorithms, and would love to talk further. In
particular, I'm especially interested in

  • algorithms that will scale nicely to
    thousands of nodes
  • randomized algorithms whose posterior
    distribution over partitionings corresponds to
    the posterior distribution of the corresponding
    Markov random field...

Have you been thinking further in this area?
Let's find some time to talk! Best wishes, Andrew
17
Future plans
  • Continued research interactions.
  • Locally, working together with some DARPA
    projects in Vision and AI.
  • A second workshop in a year or so.

18
Now, on to Jerry's presentation