Title: Efficient Belief Propagation for Early Vision
1. Efficient Belief Propagation for Early Vision
- Pedro F. Felzenszwalb, University of Chicago
- Daniel P. Huttenlocher, Cornell University
- International Journal of Computer Vision, October 2006
2. Preface
- Early vision problems are defined on pixels, such as stereo and image restoration.
- These problems can be formulated as maximum a posteriori estimation on a Markov random field (MAP-MRF).
- Classical methods for solving MAP-MRF, such as simulated annealing, achieve global optimization in theory but can often take an unacceptably long time.
3. Outline
- Markov random field (MRF) models
- Message passing
- Three techniques to reduce the running time:
  - 1. Computing message updates in linear time
  - 2. BP with a bipartite-graph message passing schedule
  - 3. Multi-Grid BP
4. MRF models
- Let P be the set of pixels in an image.
- Let L be a set of labels (e.g., intensities or disparities).
- Let N be the edges in the four-connected image grid graph.
- A labeling f assigns a label f_p ∈ L to each pixel p ∈ P.
- The quality of a labeling is measured by an energy function.
5. MRF models
- V(f_p, f_q) is the cost of assigning labels f_p and f_q to two neighboring pixels (discontinuity cost). In low-level vision problems it is generally based on the difference between labels, so we use V(f_p − f_q).
- D_p(f_p) is the cost of assigning label f_p to pixel p (data cost).
- Energy function:
  E(f) = Σ_{(p,q)∈N} V(f_p − f_q) + Σ_{p∈P} D_p(f_p)
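As a concrete sketch, the energy above can be evaluated directly; the quadratic data cost and truncated linear discontinuity cost below are illustrative choices, not fixed by the slides:

```python
# Minimal sketch of the MRF energy E(f) on a 4-connected grid,
# assuming a quadratic data cost and a truncated linear V.
import numpy as np

def energy(labels, observed, lam=1.0, d=2.0):
    """E(f) = sum_{(p,q) in N} V(f_p - f_q) + sum_p D_p(f_p)."""
    labels = np.asarray(labels, dtype=float)
    observed = np.asarray(observed, dtype=float)
    # Data cost: squared difference between label and observation.
    data = np.sum((labels - observed) ** 2)
    # Discontinuity cost: truncated linear over horizontal and vertical edges.
    v = lambda diff: np.minimum(np.abs(diff), d)
    smooth = np.sum(v(labels[:, 1:] - labels[:, :-1]))
    smooth += np.sum(v(labels[1:, :] - labels[:-1, :]))
    return data + lam * smooth
```

A constant labeling matching the observation has zero energy; any deviation pays a data cost, and any label jump between neighbors pays up to the truncation value d.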
6. Message passing
- Finding a labeling with minimum energy corresponds to the MAP estimation problem.
- Max-product BP is used to approximate the MAP solution.
- Max-product can be converted to min-sum (by working with negative log probabilities), which is less sensitive to numerical artifacts.
- Max-product BP works by passing messages around the graph defined by the four-connected image grid.
7. Message passing
- (Worked numeric example of messages accumulated along a chain of nodes.)
8. Message passing
- (Worked example of message passing between nodes A, B, and C.)
9. Message passing
- (Worked numeric example of min-sum message updates on a grid.)
10. Message passing
- Let m^t_{p→q} be the message that node p sends to a neighboring node q at iteration t.
- All messages are initialized to 0.
- Each message is a vector whose dimension is the number of labels k.
- Step 1. Compute messages:
  m^t_{p→q}(f_q) = min_{f_p} ( V(f_p − f_q) + D_p(f_p) + Σ_{s∈N(p)\q} m^{t−1}_{s→p}(f_p) )
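A direct implementation of the Step 1 update is easy to sketch; the function name and array layout below are assumptions, and the brute-force minimization is exactly the O(k²)-per-message cost the later slides remove:

```python
# Hypothetical direct min-sum message update: O(k^2) work per message.
import numpy as np

def message_update(V, D_p, incoming):
    """m_{p->q}(f_q) = min_{f_p} [ V[f_p, f_q] + D_p(f_p) + sum of other messages ].

    V        : (k, k) array of discontinuity costs
    D_p      : (k,) data cost vector at node p
    incoming : list of (k,) messages from neighbors of p other than q
    """
    h = D_p + np.sum(incoming, axis=0) if incoming else np.array(D_p, dtype=float)
    # Minimize over f_p (rows) for every f_q (columns): k^2 work.
    return np.min(V + h[:, None], axis=0)
```

With a Potts cost and no incoming messages, the result is just the data cost clipped by the penalty of switching labels.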
13. Message passing
- Step 2. Compute the belief vector for each node after T iterations:
  b_q(f_q) = D_q(f_q) + Σ_{p∈N(q)} m^T_{p→q}(f_q)
14. Message passing
- Step 3. The label that minimizes b_q(f_q) individually at each node is selected.
- Time complexity: O(n k² T)
  - n: number of pixels
  - k: number of possible labels
  - T: number of iterations
- This takes too long for practical use.
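Steps 2 and 3 can be sketched in a few lines; the array layout (one message array per neighbor direction, already aligned to the receiving node) is an assumption for illustration:

```python
# Sketch of belief computation and label selection:
# b_q = D_q + sum of final incoming messages, then argmin per node.
import numpy as np

def select_labels(D, incoming_messages):
    """D: (n, k) data costs; incoming_messages: list of (n, k) arrays,
    one per neighbor direction. Returns the selected label per node."""
    beliefs = D + np.sum(incoming_messages, axis=0)
    return np.argmin(beliefs, axis=1)
```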
15. Three techniques
- 1. Computing a message update in linear time (O(k²) per message to O(k)).
- 2. The bipartite-graph message passing schedule (half the message updates per iteration).
- 3. Multi-Grid BP.
16. 1. Computing messages
- In most low-level vision applications, the discontinuity cost is based on the difference between the labels: V(f_p, f_q) = V(f_p − f_q).
- The message update then takes the form
  m(f_q) = min_{f_p} ( V(f_p − f_q) + h(f_p) ),
  which is commonly referred to as a min convolution.
17. 1. Computing messages
- First compute h(f_p) = D_p(f_p) + Σ_{s∈N(p)\q} m^{t−1}_{s→p}(f_p).
- For costs based on the difference between labels, the min convolution can be computed in O(k) time, so the overall complexity turns to O(n k T).
18. 1. Computing messages
- For stereo and image restoration, the truncated linear model is used:
  V(f_p − f_q) = min(s·|f_p − f_q|, d)
- First, consider the pure linear model V(f_p − f_q) = s·|f_p − f_q|.
- The message computation becomes
  m(f_q) = min_{f_p} ( s·|f_p − f_q| + h(f_p) ).
- The minimization can be seen as a lower envelope, as in the figure.
19. Example: with s = 1, cones rooted at (f_p, h(f_p)); the message is their lower envelope.
- A two-pass algorithm can be used to compute the message.
20. Two-pass algorithm
- Initialize the message with m(f_q) = h(f_q).
- Forward pass: for f_q from 1 to k − 1, m(f_q) ← min(m(f_q), m(f_q − 1) + s).
- Backward pass: for f_q from k − 2 down to 0, m(f_q) ← min(m(f_q), m(f_q + 1) + s).
- This computes messages with the linear cost in O(k) time.
21. Two-pass algorithm
- Example with k = 4:
- 1. Initialize m(f_q) = h(f_q).
- 2. Forward pass: for f_q from 1 to 3, m(f_q) ← min(m(f_q), m(f_q − 1) + s).
- 3. Backward pass: for f_q from 2 down to 0, m(f_q) ← min(m(f_q), m(f_q + 1) + s).
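The two-pass min convolution above can be sketched directly; the function name is an assumption:

```python
# Sketch of the two-pass algorithm for the pure linear cost
# V(f_p - f_q) = s * |f_p - f_q|: it computes the min convolution
#   m(f_q) = min_{f_p} ( s * |f_p - f_q| + h(f_p) )
# in O(k) instead of O(k^2).
import numpy as np

def linear_min_convolution(h, s=1.0):
    m = np.array(h, dtype=float)       # initialize m(f_q) = h(f_q)
    for q in range(1, len(m)):         # forward pass
        m[q] = min(m[q], m[q - 1] + s)
    for q in range(len(m) - 2, -1, -1):  # backward pass
        m[q] = min(m[q], m[q + 1] + s)
    return m
```

For h = [5, 0, 3, 10] and s = 1, the result is [1, 0, 1, 2], which matches brute-force minimization over all f_p.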
22. 1. Computing messages
- Second, handle the truncation: compute the message under the pure linear model, then take
  m(f_q) = min( m(f_q), min_{f_p} h(f_p) + d ).
- For the Potts model the computation is even simpler:
  m(f_q) = min( h(f_q), min_{f_p} h(f_p) + d ).
- The time complexity per message turns to O(k).
- The truncated quadratic model is handled analogously, using a quadratic lower envelope.
23. Cost models
- (Plots of the truncated linear model, the Potts model, and the truncated quadratic model.)
- Truncation makes these discontinuity costs robust.
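The truncated messages follow from the un-truncated ones plus a single global minimum; this sketch (function names assumed) combines the two-pass linear convolution with the truncation step:

```python
# Sketch of O(k) message computation for the Potts and truncated linear models.
import numpy as np

def potts_message(h, d):
    # m(f_q) = min( h(f_q), min_f h(f) + d )
    return np.minimum(h, np.min(h) + d)

def truncated_linear_message(h, s, d):
    # Two-pass linear min convolution, then truncate at min_f h(f) + d.
    m = np.array(h, dtype=float)
    for q in range(1, len(m)):
        m[q] = min(m[q], m[q - 1] + s)
    for q in range(len(m) - 2, -1, -1):
        m[q] = min(m[q], m[q + 1] + s)
    return np.minimum(m, np.min(h) + d)
```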
24. 2. BP on the Grid Graph
- BP can be performed more efficiently on a bipartite graph.
- The grid graph is bipartite: split the nodes into sets A and B in a checkerboard pattern, so that every edge connects nodes in different sets.
- Messages sent from nodes in A depend only on the messages sent from nodes in B, and vice versa.
25. 2. BP on the Grid Graph
- (Illustration: messages from one set can be computed without computing new messages from the other set.)
26. 2. BP on the Grid Graph
- So we alternate: update the messages from A, then the messages from B.
- Scheme: when t is odd, update the messages sent from nodes in A; when t is even, update the messages sent from nodes in B.
- If t is odd (even), the messages from B (A) simply keep their values from iteration t − 1.
27. 2. BP on the Grid Graph
- Moreover, new messages can be stored in the memory of the old ones, since the two sets are independent.
- The new messages are nearly the same as the messages in the standard scheme.
- When BP converges, this schedule converges to the same fixed point.
- The number of message updates turns to half that of the standard schedule.
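The alternating schedule is easiest to check on a chain rather than the full grid: a chain is also bipartite (even vs. odd nodes), and min-sum BP is exact on it, so the in-place parity schedule can be verified against brute force. This is a sketch, not the paper's grid implementation:

```python
# Min-sum BP on a chain with an in-place alternating (bipartite) schedule.
import numpy as np

def bp_chain_bipartite(D, V, iters=10):
    """D: (n, k) data costs; V: (k, k) symmetric discontinuity costs.
    Returns the (n, k) belief vectors b_i(f)."""
    n, k = D.shape
    right = np.zeros((n, k))  # right[i]: message i -> i+1
    left = np.zeros((n, k))   # left[i]:  message i -> i-1
    for t in range(iters):
        for i in range(t % 2, n, 2):  # update one parity class, in place
            if i + 1 < n:             # message to the right neighbor
                h = D[i] + (right[i - 1] if i > 0 else 0.0)
                right[i] = np.min(V + h[:, None], axis=0)
            if i - 1 >= 0:            # message to the left neighbor
                h = D[i] + (left[i + 1] if i + 1 < n else 0.0)
                left[i] = np.min(V + h[:, None], axis=0)
    beliefs = D.astype(float).copy()
    beliefs[1:] += right[:-1]
    beliefs[:-1] += left[1:]
    return beliefs
```

At the fixed point the beliefs are min-marginals, so min_f b_i(f) equals the optimal energy at every node, which a brute-force search confirms on small instances.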
28. 3. Multi-Grid BP
- Standard BP requires a large T to produce good results.
- If the messages are initialized close to a fixed point, they converge more rapidly (a small fixed number of iterations T suffices).
- Work in a coarse-to-fine manner: long-range interactions between pixels can be captured by short paths in coarse graphs.
29. 3. Multi-Grid BP
- Details:
  - The 0-th level is the original labeling problem.
  - At the i-th level, blocks of pixels are grouped together.
  - Only labelings where all the pixels in a block are assigned the same label are considered.
  - The messages at level i give estimates for the messages at level i − 1 (a good initial value).
30. 3. Multi-Grid BP
- (Illustration of two levels, level 1 and level 0, in the multi-grid method.)
31. 3. Multi-Grid BP
- Let r^t_p be the message that node p sends to the right at iteration t; let l^t_p, u^t_p, and d^t_p be the messages sent to the left, up, and down, respectively.
- For instance:
32. 3. Multi-Grid BP
- If p' at level i is the block containing node p at level i − 1, the messages of p at level i − 1 are initialized from the converged messages of p' at level i.
- (Illustration: corresponding nodes at level i and level i − 1.)
33. 3. Multi-Grid BP
- The cost of assigning label l to a block b (block data cost):
  D_b(l) = Σ_{p∈b} D_p(l)
- Block discontinuity costs: two neighboring blocks at level i are joined by several grid edges, so the discontinuity cost is scaled accordingly.
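Since every pixel in a block shares one label, the block data cost is just a per-block sum; this sketch (function name assumed) computes one coarsening step with 2×2 blocks:

```python
# Sketch of the coarse-level data cost D_b(l) = sum over pixels p in block b
# of D_p(l), for 2x2 blocks.
import numpy as np

def block_data_cost(D):
    """D: (H, W, k) per-pixel data costs; returns (H//2, W//2, k)."""
    H, W, k = D.shape
    # Trim odd borders, then sum each 2x2 block for every label channel.
    return D[:H - H % 2, :W - W % 2].reshape(H // 2, 2, W // 2, 2, k).sum(axis=(1, 3))
```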
34. 3. Multi-Grid BP
- Potts model: the block discontinuity cost keeps the same form.
- Truncated linear model and truncated quadratic model: the block costs keep the same form with rescaled parameters.
35. 3. Multi-Grid BP
- Run BP for a small number of iterations at each level (between 5 and 10).
- Note that the total number of nodes in the hierarchy is just 4/3 times the number of nodes at the finest level, so the overhead is small.
36. Experiments 1
- Stereo results for the Tsukuba image pair.
37. Number of message update iterations for the multi-grid BP method versus the standard algorithm.
38. Running BP with all speedup techniques versus running BP with all but one of the techniques; the min convolution method provides an important speedup.
39. Experiments 2
- Restoration results with an input that has missing values.
40 ?????????????????????? ??????????????????(message
)????(belief) ????????????????? ?????????????????
?????? ??????????????????????????? ???????????????
????????,???????? ???????????????????,???????????
?????????????