1
Introduction to Belief Propagation
  • Xiang Zeng

2
Content
  • Background
  • The BP Algorithm
  • Applications

3
Bayesian models in low-level vision
  • A statistical description of an estimation
    problem.
  • Given data d, we want to estimate unknown
    parameters u
  • Two components:
  • Prior model p(u): captures known information about
    the unknown data and is independent of the observed
    data; a distribution over probable solutions.
  • Sensor model p(d|u): describes the relationship
    between the sensed measurements d and the unknown
    hidden data u.
  • Combine the two using Bayes' rule to give the
    posterior p(u|d).
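Written out, the combination on this slide is the standard Bayes' rule posterior:

```latex
p(u \mid d) = \frac{p(d \mid u)\, p(u)}{p(d)} \propto p(d \mid u)\, p(u)
```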

4
Markov Random Fields
The pairwise Markov Random Field model is commonly
used to represent images.




(Figure: grid of hidden nodes ui in a pairwise MRF.)
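For the pairwise MRF above, the joint distribution factorizes over pairwise and single-node terms. The symbols below follow the usual convention (ψ for pairwise compatibility, φ for local evidence) rather than the slide itself:

```latex
P(u, d) = \frac{1}{Z} \prod_{(i,j)} \psi_{ij}(u_i, u_j) \prod_i \phi_i(u_i, d_i)
```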

5
Bayesian Networks
  • The Asia example
  • Joint probability
  • Marginal probability
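The joint and marginal probabilities referred to here take the standard Bayesian-network form, with pa(x_i) denoting the parents of node i:

```latex
P(x_1, \dots, x_n) = \prod_i P(x_i \mid \mathrm{pa}(x_i)),
\qquad
P(x_i) = \sum_{\mathbf{x} \setminus x_i} P(x_1, \dots, x_n)
```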

6
Inference On Bayesian Networks
  • The joint probability distribution for the
    unknown variable xi
  • Infer the belief at node i, which approximates the
    marginal
  • bi(xi) ≈ P(xi)

7
Belief Propagation
  • A message-passing algorithm
  • A message mij(xj) is the information passed from
    node i to node j about what state node j should
    be in.
  • A node's belief is the product of its local
    evidence and all its incoming messages.
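In the standard sum-product form, the message and belief described on this slide are:

```latex
m_{ij}(x_j) \leftarrow \sum_{x_i} \phi_i(x_i)\, \psi_{ij}(x_i, x_j)
  \prod_{k \in N(i) \setminus j} m_{ki}(x_i),
\qquad
b_i(x_i) \propto \phi_i(x_i) \prod_{k \in N(i)} m_{ki}(x_i)
```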

8
Belief Propagation
  • The algorithm
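As a minimal sketch of the algorithm (hypothetical random potentials, not taken from the slides), here is sum-product BP on a three-node chain, where the resulting beliefs can be checked against brute-force marginals of the joint:

```python
import numpy as np

np.random.seed(0)
K = 2  # number of states per node
phi = [np.random.rand(K) for _ in range(3)]     # local evidence phi_i(x_i)
psi = [np.random.rand(K, K) for _ in range(2)]  # pairwise psi(x_i, x_{i+1})

# Forward messages m_{i->i+1} and backward messages m_{i+1->i}.
fwd = [None, None]
bwd = [None, None]
fwd[0] = psi[0].T @ phi[0]              # m_{0->1}(x1) = sum_x0 phi0 psi01
fwd[1] = psi[1].T @ (phi[1] * fwd[0])   # m_{1->2}(x2)
bwd[1] = psi[1] @ phi[2]                # m_{2->1}(x1)
bwd[0] = psi[0] @ (phi[1] * bwd[1])     # m_{1->0}(x0)

# Belief at each node: local evidence times all incoming messages.
b = [phi[0] * bwd[0], phi[1] * fwd[0] * bwd[1], phi[2] * fwd[1]]
b = [bi / bi.sum() for bi in b]

# Brute-force marginals of the joint, for comparison (exact on a tree).
joint = np.einsum('i,j,k,ij,jk->ijk', phi[0], phi[1], phi[2], psi[0], psi[1])
joint /= joint.sum()
print(np.allclose(b[0], joint.sum(axis=(1, 2))))  # True
```

Because the chain is singly connected, the beliefs match the exact marginals, illustrating the guarantee mentioned on a later slide.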

9
Belief Propagation
(Figure: message-passing example with messages m4 and m5 arriving at a node.)
10
Belief Propagation
  • The Theory
  • The Bethe free energy
  • Equivalence of BP to the Bethe Approximation
  • Equivalence of BP to Dynamic Programming
  • Solving MRF energy function using BP
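For reference, the Bethe free energy whose stationary points correspond to BP fixed points (Yedidia, Freeman, and Weiss) is usually written as follows, with d_i the degree of node i:

```latex
F_{\text{Bethe}} =
\sum_{(i,j)} \sum_{x_i, x_j} b_{ij}(x_i, x_j)
  \ln \frac{b_{ij}(x_i, x_j)}{\psi_{ij}(x_i, x_j)\,\phi_i(x_i)\,\phi_j(x_j)}
\;-\; \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln \frac{b_i(x_i)}{\phi_i(x_i)}
```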

11
Belief Propagation
  • Loopy vs. non-loopy graphs
  • BP works for singly connected networks: it is
    guaranteed to converge to the correct answers.
  • BP does not always work for loopy networks,
    because the same evidence is passed around the
    network multiple times and mistaken for new
    evidence.

12
Belief Propagation
  • Loopy graphs sometimes work
  • Although evidence is double-counted, all evidence
    may be double-counted equally, in which case the
    result can still be correct.
  • Single loop: BP is guaranteed to generate the
    most likely state sequence.
  • Multiple loops: a balanced network will still work.

13
Belief Propagation - Improvement
  • Rescheduling the BP visiting order
  • In the traditional BP algorithm, messages are
    passed and updated between nodes without any
    priority.
  • This is inefficient because nodes with weak
    evidence provide less useful information to their
    neighbors; messages from these nodes should be
    passed at a later stage than those from nodes
    with strong evidence.
  • A new node visiting order passes messages between
    graph nodes more effectively:
  • Step 1: rank the nodes according to the belief of
    their local evidence (breadth-first search); the
    most informative node passes its message first.
  • Step 2: reverse the order of step 1 and pass the
    messages back.
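The two scheduling steps above can be sketched as follows. This is a hypothetical illustration: the entropy-based ranking of local evidence is an assumption, not the exact criterion from the cited papers.

```python
import numpy as np

def evidence_entropy(phi):
    """Entropy of normalized local evidence: low entropy = strong evidence."""
    p = phi / phi.sum()
    return -(p * np.log(p + 1e-12)).sum()

# Hypothetical local evidence for three nodes.
phi = {0: np.array([0.9, 0.1]),   # strong evidence -> low entropy
       1: np.array([0.5, 0.5]),   # weak evidence   -> high entropy
       2: np.array([0.7, 0.3])}

# Step 1: most informative (lowest-entropy) nodes pass messages first.
order = sorted(phi, key=lambda i: evidence_entropy(phi[i]))
# Step 2: reverse the order and pass messages back.
reverse_order = order[::-1]
print(order)  # [0, 2, 1]
```

In a full implementation, messages would be sent along `order` and then along `reverse_order`, so that strong evidence propagates outward before weakly informed nodes reply.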

14
Belief Propagation - Improvement
  • Key papers
  • Efficient Belief Propagation for Early Vision
    (CVPR 2003)
  • Globally Optimal Solutions for Energy Minimization
    in Stereo Vision Using Reweighted Belief
    Propagation (ICCV 2005)

15
Application in vision
  • Example: Optical Flow

16
Application in graphics
17
Applications
  • Very similar to graph cuts in the problems it
    addresses
  • But generally not as good as graph cuts

18
  • Thanks