Title: Belief Propagation on Markov Random Fields
Slide 1: Belief Propagation on Markov Random Fields
Slide 2: Outline
- Graphical Models
- Markov Random Fields (MRFs)
- Belief Propagation
Slide 3: Graphical Models
- Diagrams
  - Nodes: random variables
  - Edges: statistical dependencies among random variables
- Advantages
  - Better visualization
  - Conditional independence properties
  - Design of new models
  - Factorization
Slide 4: Types of Graphical Models
- Directed
  - causal relationships
  - e.g., Bayesian networks
- Undirected
  - no constraints imposed on the causality of events (weak dependencies)
  - e.g., Markov Random Fields (MRFs)
Slide 5: Example MRF Application: Image Denoising
- Noisy image (e.g., 10% noise)
- Original image (binary)
- Question: How can we retrieve the original image given the noisy one?
Slide 6: MRF Formulation
- Nodes
  - For each pixel i:
    - xi: latent variable (value in the original image)
    - yi: observed variable (value in the noisy image)
  - xi, yi ∈ {0, 1}
[Figure: latent nodes x1, x2, ..., xi, ..., xn, each connected to its observed node y1, y2, ..., yi, ..., yn]
Slide 7: MRF Formulation
- Edges
  - xi and yi of each pixel i are correlated
    - local evidence function φ(xi, yi)
    - e.g., φ(xi, yi) = 0.9 if xi = yi and φ(xi, yi) = 0.1 otherwise (10% noise)
  - Neighboring pixels have similar values
    - compatibility function ψ(xi, xj)
Slide 8: MRF Formulation
P(x1, x2, ..., xn) = (1/Z) ∏_{(ij)} ψ(xi, xj) ∏_i φ(xi, yi)
- Question: What are the marginal distributions of xi, for i = 1, ..., n?
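On a small graph the marginals can be obtained directly from this factorization by summing the unnormalized joint over all configurations. A minimal brute-force sketch for a hypothetical 3-pixel binary chain x1 - x2 - x3 (the 0.8/0.2 compatibility values are assumed, not from the slides):

```python
import itertools

def phi(x, y):                 # local evidence: 10% noise model, as on the slides
    return 0.9 if x == y else 0.1

def psi(a, b):                 # compatibility: neighboring pixels prefer equal values (assumed)
    return 0.8 if a == b else 0.2

y = [1, 1, 0]                  # observed (noisy) pixel values

def joint(x):                  # unnormalized P(x) = prod psi(xi, xj) * prod phi(xi, yi)
    p = phi(x[0], y[0]) * phi(x[1], y[1]) * phi(x[2], y[2])
    return p * psi(x[0], x[1]) * psi(x[1], x[2])

configs = list(itertools.product([0, 1], repeat=3))
Z = sum(joint(x) for x in configs)                        # partition function
marg_x1 = [sum(joint(x) for x in configs if x[0] == v) / Z for v in (0, 1)]
```

This enumeration costs 2^n and motivates belief propagation, which computes the same marginals by local message passing.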
Slide 9: Belief Propagation
- Goal: compute the marginals of the latent nodes of the underlying graphical model
- Attributes
  - iterative algorithm
  - message passing between neighboring latent-variable nodes
- Question: Can it also be applied to directed graphs?
- Answer: Yes, but here we will apply it to MRFs
Slide 10: Belief Propagation Algorithm
1. Select random neighboring latent nodes xi, xj
2. Send message mi→j from xi to xj
3. Update the belief about the marginal distribution at node xj
4. Go to step 1, until convergence
- How is convergence defined?
[Figure: neighboring latent nodes xi and xj with observations yi and yj; message mi→j sent along the edge xi-xj]
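The steps above can be sketched as an outer loop. The slides leave "convergence" open; one common choice (an assumption here, not from the slides) is to stop when no message changes by more than a small tolerance:

```python
import math

def run_bp(states, edges, update, tol=1e-6, max_iters=50):
    """edges: directed (i, j) pairs; update(i, j, msgs) returns the new message m_i->j."""
    uniform = {s: 1.0 / len(states) for s in states}
    msgs = {e: dict(uniform) for e in edges}        # messages start uniform
    for it in range(1, max_iters + 1):
        delta = 0.0
        for e in edges:
            new = update(*e, msgs)
            delta = max(delta, max(abs(new[s] - msgs[e][s]) for s in new))
            msgs[e] = new
        if delta < tol:                             # no message moved: converged
            return msgs, it
    return msgs, max_iters

# Tiny two-pixel demo; potential values are hypothetical, not from the slides
phi = lambda x, y: 0.9 if x == y else 0.1
psi = lambda a, b: 0.8 if a == b else 0.2
y = {0: 1, 1: 1}

def update(i, j, msgs):
    m = {xj: sum(phi(xi, y[i]) * psi(xi, xj)
                 * math.prod(msgs[(k, t)][xi] for (k, t) in msgs if t == i and k != j)
                 for xi in (0, 1))
         for xj in (0, 1)}
    s = sum(m.values())
    return {k: v / s for k, v in m.items()}

msgs, iters = run_bp((0, 1), [(0, 1), (1, 0)], update)
```

On this two-node tree the messages settle after the first sweep, so the loop detects convergence on the second.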
Slide 11: Step 2: Message Passing
- Message mi→j from xi to xj: what node xi thinks about the marginal distribution of xj
[Figure: nodes xi and xj with observations yi and yj; messages from the neighbors N(i)\j arrive at xi]
mi→j(xj) = Σ_{xi} φ(xi, yi) ψ(xi, xj) ∏_{k ∈ N(i)\j} mk→i(xi)
- Messages are initially uniformly distributed
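The message equation transcribes almost directly into code. A sketch assuming binary states and plain Python callables for the potentials (the 0.8/0.2 compatibility values are assumed):

```python
import math

def message(i, j, phi, psi, y, incoming, states=(0, 1)):
    """m_i->j(xj) = sum_xi phi(xi, yi) * psi(xi, xj) * prod over messages from N(i)\\j."""
    m = {xj: sum(phi(xi, y[i]) * psi(xi, xj)
                 * math.prod(mk[xi] for mk in incoming)   # messages m_k->i, k in N(i)\j
                 for xi in states)
         for xj in states}
    s = sum(m.values())            # normalize; messages matter only up to scale
    return {xj: v / s for xj, v in m.items()}

phi = lambda x, y: 0.9 if x == y else 0.1   # local evidence (10% noise, as on the slides)
psi = lambda a, b: 0.8 if a == b else 0.2   # compatibility values (assumed)
m = message(0, 1, phi, psi, y=[1, 1], incoming=[])   # leaf node: no messages from N(0)\1
```

Normalizing each message is optional in exact BP but keeps the products numerically stable.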
Slide 12: Step 3: Belief Update
- Belief b(xj): what node xj thinks its marginal distribution is
[Figure: node xj with observation yj and messages arriving from all neighbors N(j)]
b(xj) = k φ(xj, yj) ∏_{q ∈ N(j)} mq→j(xj), where k is a normalization constant
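A matching sketch of the belief update; the incoming message values below are hypothetical:

```python
def belief(phi, yj, incoming, states=(0, 1)):
    """b(xj) = k * phi(xj, yj) * prod over messages m_q->j, q in N(j)."""
    b = {xj: phi(xj, yj) for xj in states}
    for m in incoming:                       # one message per neighbor q in N(j)
        for xj in states:
            b[xj] *= m[xj]
    k = 1.0 / sum(b.values())                # k is the normalization constant
    return {xj: k * v for xj, v in b.items()}

phi = lambda x, y: 0.9 if x == y else 0.1    # 10% noise evidence, as on the slides
b = belief(phi, yj=1, incoming=[{0: 0.3, 1: 0.7}])   # single hypothetical message
```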
Slide 13: Belief Propagation Algorithm
1. Select random neighboring latent nodes xi, xj
2. Send message mi→j from xi to xj
3. Update the belief about the marginal distribution at node xj
4. Go to step 1, until convergence
[Figure: neighboring latent nodes xi and xj with observations yi and yj; message mi→j sent along the edge xi-xj]
Slide 14: Example
- Compute the belief at node 1.
[Figure: Fig. 12 (Yedidia et al.): node 2 connected to nodes 1, 3, and 4; messages m3→2 and m4→2 arrive at node 2, then m2→1 is sent to node 1]
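Since the graph of Fig. 12 is a tree, one leaves-to-root pass gives the exact belief at node 1. A sketch with hypothetical potentials and observations (none of the numeric values are from the slides):

```python
import math

phi = lambda x, y: 0.9 if x == y else 0.1   # local evidence (10% noise model)
psi = lambda a, b: 0.8 if a == b else 0.2   # compatibility (assumed values)
y = {1: 1, 2: 1, 3: 0, 4: 1}                # hypothetical observations
states = (0, 1)

def msg(i, j, incoming):
    m = {xj: sum(phi(xi, y[i]) * psi(xi, xj)
                 * math.prod(mk[xi] for mk in incoming)
                 for xi in states)
         for xj in states}
    s = sum(m.values())
    return {k: v / s for k, v in m.items()}

m32 = msg(3, 2, [])            # leaf 3 -> 2
m42 = msg(4, 2, [])            # leaf 4 -> 2
m21 = msg(2, 1, [m32, m42])    # 2 -> 1, folding in both leaf messages
b = {x: phi(x, y[1]) * m21[x] for x in states}
Z = sum(b.values())
b1 = {x: v / Z for x, v in b.items()}       # belief at node 1
```

With these values the evidence at nodes 1, 2, and 4 outweighs the dissenting leaf 3, so the belief at node 1 concentrates on state 1.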
Slide 15: Does graph topology matter?
- The BP procedure is the same!
- Performance
  - Failure to converge / predict accurate beliefs [Murphy, Weiss, Jordan 1999]
  - Success at:
    - decoding error-correcting codes [Frey and MacKay 1998]
    - computer vision problems where the underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]
[Figure: tree topology vs. loopy topology]
Slide 16: How long does it take?
- No explicit reference in the paper
- In my opinion, it depends on:
  - the number of nodes in the graph
  - the graph topology
- There is work on improving the running time of BP (for specific applications)
- Next time?
Slide 17: Questions?
Slide 18: Next time?
- BP on directed graphs
- Improving the running time of BP
- More about loopy BP
  - Can an initial (non-uniform) estimate of the messages alleviate the problem?