Efficient Belief Propagation for Early Vision (PowerPoint Presentation Transcript)

1
Efficient Belief Propagation for Early Vision
  • Pedro F. Felzenszwalb, University of Chicago
  • Daniel P. Huttenlocher, Cornell University
  • International Journal of Computer Vision, October 2006

2
Preface
  • Early vision problems are problems defined at the pixel level,
    such as stereo and image restoration.
  • Early vision problems can be formulated as maximum a posteriori
    estimation on a Markov random field (MAP-MRF).
  • Classical methods for solving MAP-MRF, such as simulated annealing,
    can often take an unacceptably long time, even though global
    optimization is achieved in theory.

3
Outline
  • Markov random field (MRF) models
  • Message passing
  • Three techniques to reduce running time:
    1. Computing a message update in linear time
    2. A bipartite-graph message passing schedule
    3. Multi-grid BP

4
MRF models
  • Let P be the set of pixels in an image.
  • Let L be a set of labels (e.g. intensities).
  • Let N be the edges in the four-connected image grid graph.
  • A labeling f assigns a label f_p in L to each pixel p in P.
  • Energy function:
    E(f) = \sum_{(p,q) \in N} V(f_p, f_q) + \sum_{p \in P} D_p(f_p)

5
MRF models
  • V(f_p, f_q) is the cost of assigning labels f_p and f_q to two
    neighboring pixels (discontinuity cost); in low-level vision
    problems it is generally based on the difference between labels,
  • so we use V(f_p - f_q).
  • D_p(f_p) is the cost of assigning label f_p to pixel p
    (data cost).

Energy function:
E(f) = \sum_{(p,q) \in N} V(f_p - f_q) + \sum_{p \in P} D_p(f_p)
6
Message passing
  • Finding a labeling with minimum energy corresponds to the MAP
    estimation problem.
  • The max-product BP algorithm is used to approximate the MAP
    solution.
  • Working with negative log probabilities converts max-product to
    min-sum, which is less sensitive to numerical artifacts.
  • Max-product BP works by passing messages around the graph defined
    by the four-connected image grid.

7
Message passing
  • In general, a node's outgoing message summarizes the information it
    has received from the rest of the graph.
  • Ex 1. [Figure: numeric example of passing messages along a chain of
    five nodes]
8
Message passing
  • Rules: [equations for combining the incoming messages at a node]
  • Ex 2. (complex) [Figure: message-passing example on a small graph
    with nodes A, B and C]
9
Message passing
[Figure: worked numeric example of message passing on a small graph]
10
Message passing
  • Let m^t_{p \to q} be the message that node p sends to a neighboring
    node q at iteration t.
  • All messages are initialized to 0.
  • Each message is a vector of dimension k, the number of labels.
  • Step 1. Compute messages:
    m^t_{p \to q}(f_q) = \min_{f_p} \big( V(f_p - f_q) + D_p(f_p) +
    \sum_{s \in N(p) \setminus q} m^{t-1}_{s \to p}(f_p) \big)
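As a minimal NumPy sketch of this update (the function name and array
layout are illustrative, not from the paper's code):

import numpy as np

def message_update(V, h):
    # Min-sum message m(f_q) = min over f_p of ( V[f_p, f_q] + h[f_p] ),
    # where h = D_p plus the sum of the messages into p from its
    # neighbors other than q. With a general (k, k) cost table V this
    # takes O(k^2) work per message.
    return (V + h[:, None]).min(axis=0)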

11
(No Transcript)
12
(No Transcript)
13
Message passing
Step 2. Compute the belief vector for each node after T iterations:
b_q(f_q) = D_q(f_q) + \sum_{p \in N(q)} m^T_{p \to q}(f_q)
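Continuing the same illustrative sketch, the belief computation is a
simple sum:

import numpy as np

def belief(D_q, incoming):
    # Belief vector at node q: its data cost plus the final messages
    # arriving from all of q's grid neighbors (min-sum form).
    return D_q + sum(incoming)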
14
  • Step 3. The label f_q* that minimizes b_q(f_q) individually at each
    node is selected.
  • Time complexity: O(n k^2 T)
  • n: number of pixels
  • k: number of possible labels
  • T: number of iterations

This takes too long.
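In the same sketch, Step 3 is a per-node argmin, assuming the beliefs
are stacked into an (n, k) array (illustrative):

labels = beliefs.argmin(axis=-1)  # minimizing label at each node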
15
Three techniques
  • 1. Computing a message update in linear time (O(k^2) to O(k))
  • 2. A bipartite-graph message passing schedule (halves the number of
    message updates)
  • 3. Multi-grid BP (reduces the number of iterations T needed)

16
1. Computing Messages
  • In most low-level vision applications the discontinuity cost is
    based on the difference between the labels f_p and f_q, so the
    message update takes the form
    m^t_{p \to q}(f_q) = \min_{f_p} \big( V(f_p - f_q) + h(f_p) \big)   (3)
    where h(f_p) = D_p(f_p) + \sum_{s \in N(p) \setminus q} m^{t-1}_{s \to p}(f_p).

The form of equation (3) is commonly referred to as a min convolution.
17
1. Computing Messages
  • Potts model: V(x) = 0 if x = 0, and d otherwise.
  • The message update simplifies to
    m(f_q) = \min\big( h(f_q), \min_{f_p} h(f_p) + d \big)

Compute \min_{f_p} h(f_p) first, once for all f_q; the time complexity
then turns to O(k).
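A minimal sketch of this O(k) Potts update (illustrative names,
assuming NumPy):

import numpy as np

def potts_message(h, d):
    # m(f_q) = min( h(f_q), min_fp h(f_p) + d ): compute h.min() once,
    # then take an elementwise minimum. O(k) total.
    return np.minimum(h, h.min() + d)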
18
1. Computing Messages (stereo and image restoration)
  • Truncated linear model: V(x) = \min(c|x|, d)
  • First, consider the pure linear model V(x) = c|x|.
  • The message computation becomes
    m(f_q) = \min_{f_p} \big( c|f_p - f_q| + h(f_p) \big)

The minimization can be seen as the lower envelope of upward cones in
the figure.
19
[Figure: cones of slope c rooted at (f_p, h(f_p)); the message values
are given by their lower envelope]
A two-pass algorithm can be used to compute the message.
20
two-pass algorithm
  • Initialize the message with m(f_q) = h(f_q).
  • Forward pass:
    for f_q from 1 to k-1: m(f_q) <- min( m(f_q), m(f_q - 1) + c )
  • Backward pass:
    for f_q from k-2 down to 0: m(f_q) <- min( m(f_q), m(f_q + 1) + c )

This computes messages with the linear cost in O(k) time.
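A minimal Python sketch of the two-pass update (illustrative):

def linear_message(h, c):
    # Two-pass (distance-transform style) message for V(x) = c * |x|;
    # each pass propagates minima with slope c, so total work is O(k).
    m = list(h)
    for i in range(1, len(m)):            # forward pass
        m[i] = min(m[i], m[i - 1] + c)
    for i in range(len(m) - 2, -1, -1):   # backward pass
        m[i] = min(m[i], m[i + 1] + c)
    return m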
21
two-pass algorithm
  • Ex. (k = 4 labels)
  • 1. Initialize m = h.
  • 2. Forward pass:
    for f_q from 1 to 3: m(f_q) <- min( m(f_q), m(f_q - 1) + c )
  • 3. Backward pass:
    for f_q from 2 down to 0: m(f_q) <- min( m(f_q), m(f_q + 1) + c )
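A hypothetical run of this example (the slide's original numbers did
not survive, so these values are made up), using linear_message from
the sketch above:

h = [4.0, 1.0, 5.0, 3.0]          # h(f_p) for k = 4 labels (made up)
print(linear_message(h, c=1.0))
# after forward pass:  [4.0, 1.0, 2.0, 3.0]
# after backward pass: [2.0, 1.0, 2.0, 3.0]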

22
1. Computing Messages
Second, apply the Potts-style truncation to the pure linear message:
m(f_q) = \min\big( m_{linear}(f_q), \min_{f_p} h(f_p) + d \big)
The time complexity stays O(k).
  • Truncated quadratic model: V(x) = \min(c x^2, d), handled
    similarly via the lower envelope of parabolas.
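A sketch of the truncated linear message, reusing the linear_message
sketch above (illustrative):

def truncated_linear_message(h, c, d):
    # V(x) = min(c * |x|, d): take the pure linear message, then cap it
    # at min_fp h(f_p) + d, as in the Potts case. Still O(k).
    cap = min(h) + d
    return [min(v, cap) for v in linear_message(h, c)]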
23
[Figure: plots of the Potts, truncated linear and truncated quadratic
discontinuity costs; truncation makes them robust to large label
differences]
24
2. BP on the Grid Graph
  • BP can be performed more efficiently on a bipartite graph.
  • The grid graph is bipartite: color the nodes in a checkerboard
    pattern into sets A and B, so that every edge connects a node in A
    to a node in B.
  • In a bipartite graph with nodes A ∪ B, the messages sent from nodes
    in A depend only on the messages sent from nodes in B, and vice
    versa.

25
2. BP on the Grid Graph
  • Update: the messages from nodes in A at iteration t depend only on
    the messages from nodes in B at iteration t-1, so they can be
    computed without computing the messages from A at iteration t-1.
26
2. BP on the Grid Graph
  • So we alternately update the messages from A and from B.
  • Scheme: when t is odd, update the messages sent from nodes in A;
    when t is even, update the messages sent from nodes in B.
  • If t is odd (even), the messages from B (A) simply keep their
    previous values: m^t_{p \to q} = m^{t-1}_{p \to q}.
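A minimal sketch of the scheduling logic alone, with the per-pixel
message update abstracted behind a callback (all names are
illustrative):

def checkerboard_schedule(H, W, T, update_node):
    # At iteration t only the pixels whose checkerboard parity matches
    # t send messages; the other half implicitly keep their previous
    # messages.
    for t in range(T):
        for y in range(H):
            for x in range(W):
                if (y + x) % 2 == t % 2:   # node is in the active half
                    update_node(y, x, t)

# Stub usage; a real update_node would recompute the four outgoing
# messages of pixel (y, x) from its neighbors' stored messages.
checkerboard_schedule(2, 2, 2, lambda y, x, t: print(t, (y, x)))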

27
2. BP on the Grid Graph
  • Moreover, new messages can be stored in the memory space of the old
    ones, since the two halves are independent; this also halves the
    memory.
  • The new messages are nearly the same as the messages in the
    standard scheme: when BP converges, both schedules converge to the
    same fixed point.
  • The number of message updates is cut in half.

28
3. Multi-Grid BP
  • BP requires a large T to produce good results, since information
    propagates only one step per iteration.
  • If the messages are initialized close to a fixed point, they
    converge much more rapidly, so a small fixed number of iterations T
    suffices.
  • Solve the problem in a coarse-to-fine manner:
  • long-range interactions between pixels can be captured by short
    paths in coarse graphs.

29
3. Multi-Grid BP
  • Details:
  • 0-th level: the original labeling problem.
  • i-th level: blocks of 2^i x 2^i pixels grouped together; only
    labelings where all the pixels in a block are assigned the same
    label are considered.
  • Level i provides estimates for the messages at level i-1 (a good
    initial value).

30
3. Multi-Grid BP
[Figure: illustration of two levels (level 0 and level 1) in the
multi-grid method]
31
3. Multi-Grid BP
  • Let r^t_p be the message that node p sends to the right at
    iteration t.
  • Let l^t_p, u^t_p and d^t_p be the messages sent to the left, up and
    down, respectively.
  • Example:

32
Initialization from the coarser level: r^{0,i-1}_p = r^{T,i}_{p'},
where p' at level i is the block containing node p at level i-1
(similarly for l, u and d).

[Figure: a block at level i and the nodes it contains at level i-1]
33
3. Multi-Grid BP
  • The cost of assigning label f_b to a block b is the sum of the data
    costs of the pixels in the block (block data cost):
    D_b(f_b) = \sum_{p \in b} D_p(f_b)
  • Block discontinuity costs: the cost between two neighboring blocks
    accounts for all the grid edges crossing the block boundary.
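A minimal NumPy sketch of one coarsening step for the block data cost
(illustrative; assumes even image dimensions):

import numpy as np

def coarsen_data_costs(D):
    # The cost of giving a 2x2 block a single label is the sum of its
    # four pixels' data costs. D has shape (H, W, k).
    return D[0::2, 0::2] + D[1::2, 0::2] + D[0::2, 1::2] + D[1::2, 1::2]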

34
3. Multi-Grid BP
  • Potts model: the same form is used at every level.
  • Truncated linear model: the slope and truncation are rescaled at
    each level to account for the block size.
  • Truncated quadratic model: rescaled analogously.
35
3. Multi-Grid BP
  • Run BP for a small number of iterations at each level (between 5
    and 10).
  • Note that the total number of nodes in the hierarchy is just 4/3
    times the number of nodes at the finest level, so the coarse levels
    add little overhead.

36
Experiments 1
  • Stereo results for the Tsukuba image pair. [Figure]

37
[Figure: number of message update iterations for the multi-grid BP
method versus the standard algorithm]
38
[Figure: running BP with all speedup techniques versus running BP with
all but one of the techniques; the min convolution method provides an
important speedup]
39
Experiments 2
  • Restoration results with an input that has missing values. [Figure]

40
(No Transcript)