Image Analysis and Markov Random Fields (MRFs) - PowerPoint PPT Presentation
1
Image Analysis and Markov Random Fields (MRFs)
2
Statistical models
  • Some image structures are not deterministic and are
    best characterized by their statistical properties.
  • For example, textures can be represented by their
    first- and second-order statistics.
  • Images are often distorted by statistical noise.
    To restore the true image, images are often
    treated as realizations of a random process.

3
Uses of Markov Random Fields
  • MRFs are a kind of statistical model.
  • They can be used to model spatial constraints:
  • smoothness of image regions
  • spatial regularity of textures in a small region
  • depth continuity in stereo reconstruction

4
What are MRFs?
  • Neighbors and cliques: Let S be a set of
    locations; for simplicity, assume S is a grid:
    S = {(i, j) : i, j are integers}.
    The neighbors of s = (i, j) ∈ S are defined as
    ∂((i, j)) = {(k, l) : 0 < (k − i)² + (l − j)² ≤ r},
    where r is a constant.
  • A subset C of S is a clique if any two distinct
    elements of C are neighbors. The set of all
    cliques of S is denoted by 𝒞.
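As a concrete sketch of the neighborhood definition above, the set ∂((i, j)) can be computed by brute force over the grid. The helper name and signature are illustrative, not from the slides:

```python
# Hypothetical helper: compute the neighbors of site s = (i, j) on a
# height x width grid using the rule 0 < (k - i)^2 + (l - j)^2 <= r.
def neighbors(s, r, height, width):
    i, j = s
    result = []
    for k in range(height):
        for l in range(width):
            d2 = (k - i) ** 2 + (l - j) ** 2
            if 0 < d2 <= r:     # strict 0 < excludes the site itself
                result.append((k, l))
    return result

# r = 1 yields the 4-neighborhood, r = 2 the 8-neighborhood (slides 5-6).
print(neighbors((1, 1), 1, 3, 3))       # the 4 axis-aligned neighbors
print(len(neighbors((1, 1), 2, 3, 3)))  # 8
```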

5
Examples of neighborhood
  • 4-neighborhood (r = 1)

cliques
6
Examples of neighborhood
  • 8-neighborhood (r = 2)

cliques
7
Random fields
  • The random vector X = (Xs)s∈S on S is called a
    random field and is assumed to have density p(x).
  • Images as random fields: If the vector X represents
    the intensity values of an image, then its component
    Xs is the intensity value at location s = (i, j).


[Figure: the 640×480 grid S and the corresponding random vector X; X has 640 × 480 components.]
8
Markov Random Fields
  • If p(x) of a random field fulfills the so-called
    Markov condition w.r.t. a neighborhood system, it
    is called a Markov Random Field.

That is, the value Xs at location s depends only on
its neighbors: p(x_s | x_{S\{s}}) = p(x_s | x_{∂(s)}).
9
Markov Random Fields
  • p(x) can also be factorized over cliques, due to
    its Markov properties, i.e.
    p(x) ∝ ∏_{C ∈ 𝒞} Φ_C(x),
    where Φ_C is a function of x determined by clique C.

10
Markov Random Fields
  • MRFs are equivalent to Gibbs fields, and p(x) has
    the following form:
    p(x) = exp(−H(x)) / Z, where Z = Σ_x exp(−H(x)).
  • H(x) is called the energy function.
  • The summation in the denominator Z is over all
    possible configurations on S; in our case, over
    all possible images. For 256 grey values and a
    640×480 grid, it has 256^(640×480) terms, so Z is
    impractical to evaluate.
  • Hence p(x) is only known up to the constant Z.
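To make the intractability of Z concrete: the unnormalized weight exp(−H(x)) is cheap to compute for any single configuration, but Z sums that weight over every configuration. A minimal sketch with illustrative names:

```python
import math

# Unnormalized Gibbs weight exp(-H(x)) for one configuration: cheap.
def unnormalized_p(x, energy):
    return math.exp(-energy(x))

# Z, by contrast, sums this weight over every configuration. Even a tiny
# binary 4x4 grid already needs 2**16 terms in the sum; 256 grey values
# on a 640x480 grid would need 256**(640*480) terms.
n_terms_tiny = 2 ** (4 * 4)
print(n_terms_tiny)  # 65536
```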

11
Local Characteristics of MRFs
  • For every I ⊂ S, we have
    p(y_I | x_{S\I}) = exp(−H(y_I, x_{S\I})) / Z_I,
    where S\I denotes the complement of I and Z_I sums
    only over configurations on I.
  • If I is a small set then, since the configuration
    only changes over I, Z_I can be evaluated in
    reasonable time.
  • So p(y_I | x_{S\I}) is known.

12
Using MRFs in Image Analysis
  • In image analysis, p(x) is often the posterior
    probability of a Bayesian inference, i.e.
    p(x) = p(x | y0).
  • For example, y0 may be the observed image with
    noise, and we want to compute an estimate x̂0 of
    the true image x0 based on p(x) = p(x | y0).

13
Using MRFs in Image Analysis
[Diagram: an MRF model (either learned or known) produces, via sampling, the estimate x̂0.]
14
Difficulties in computing x̂0
  • One way to compute the estimate x̂0 is to take the
    posterior mean:
    x̂0 = E[X] = ∫ x p(x | y0) dx.
  • But p(x | y0) is only known up to the constant Z.
    How can the above integration be carried out?

15
Monte Carlo integration
  • One solution is to construct a Markov chain having
    p(x) as its limiting distribution.
  • If the Markov chain starts at state X0 and passes
    through states X1, X2, X3, ..., Xt, ..., then
    E_{p(x)}[X] can be approximated by
    E_{p(x)}[X] ≈ (1/n) Σ_{t=m+1}^{m+n} Xt.
  • Here m is a sufficiently long burn-in time; Xm+1,
    Xm+2, ... can be considered as samples drawn
    from p(x).
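The burn-in-then-average recipe can be sketched as follows. Here `step` stands in for one transition of the chain; the demo uses a placeholder that draws i.i.d. standard normals (stationary mean 0), whereas in practice it would be a Gibbs-sampler update:

```python
import random

def mc_estimate(step, x0, m, n):
    """Approximate E[X] by averaging states X_{m+1}, ..., X_{m+n}."""
    x = x0
    for _ in range(m):      # burn-in: discard the first m states
        x = step(x)
    total = 0.0
    for _ in range(n):      # average the next n states
        x = step(x)
        total += x
    return total / n

# Placeholder "chain": i.i.d. N(0, 1) draws, so the estimate should be near 0.
random.seed(1)
print(mc_estimate(lambda x: random.gauss(0.0, 1.0), 0.0, 100, 10000))
```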

16
Gibbs Sampler
  • X is a high-dimensional vector (for a 640×480
    image its dimension is 640 × 480).
  • It is therefore not practical to update all
    components of Xt to obtain Xt+1 in one iteration.
  • One version of the Metropolis-Hastings algorithm,
    called the Gibbs sampler, builds the Markov chain
    while updating only a single component of Xt in
    each iteration.

17
Gibbs Sampler Algorithm
  • Let the vector X have k components,
    X = (X1, X2, ..., Xk), and suppose it is presently
    in state Xt = (x1, x2, ..., xk).
  • An index i that is equally likely to be any of
    1, ..., k is chosen.
  • A random variable w with density
    P{w = x} = P{Xi = x | Xj = xj, j ≠ i} is generated.
  • If w = x, the updated state is
    Xt+1 = (x1, ..., xi−1, x, xi+1, ..., xk).
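The steps above can be sketched for a concrete case: a binary (+1/−1) field with the Ising energy H(x) = −β Σ x_s x_t over neighboring pairs, the model chosen on slide 19. The function name and parameters are illustrative assumptions:

```python
import math
import random

def gibbs_step(x, beta):
    """One Gibbs-sampler iteration: resample one uniformly chosen site."""
    h, w = len(x), len(x[0])
    i, j = random.randrange(h), random.randrange(w)
    # Sum the 4-neighborhood values (r = 1, slide 5), respecting borders.
    s = 0
    for k, l in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
        if 0 <= k < h and 0 <= l < w:
            s += x[k][l]
    # Full conditional P(X_ij = +1 | neighbors) under H(x) = -beta * sum x_s x_t:
    # exp(beta*s) / (exp(beta*s) + exp(-beta*s)) = 1 / (1 + exp(-2*beta*s)).
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
    x[i][j] = 1 if random.random() < p_plus else -1
    return x
```

Note that only component (i, j) changes between Xt and Xt+1, exactly as in the update rule above.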

18
Two aspects of using MRFs
  • Find an appropriate model class, i.e. the general
    form of H(x).
  • Identify suitable parameters in H(x) from
    observed samples of X.
  • This is the most difficult part of applying MRFs.

19
An Example
  • Suppose we want to restore a binary (+1/−1) image
    corrupted by salt-and-pepper noise.
  • The Ising model is chosen, whose energy is
    H(x) = −β Σ_{<s,t>} x_s x_t, with the sum over
    neighboring pairs <s,t>.
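A restoration sketch under this model: take the posterior p(x | y0) ∝ exp(β Σ x_s x_t + h Σ x_s y0_s), where the second (data) term ties x to the noisy observation y0, and run the Gibbs sampler on it. β, h, and all names here are illustrative assumptions, not values from the slides:

```python
import math
import random

def denoise(y0, beta=2.0, h=2.0, sweeps=10, seed=0):
    """Gibbs-sample the posterior of an Ising prior plus a data term."""
    rng = random.Random(seed)
    H, W = len(y0), len(y0[0])
    x = [row[:] for row in y0]          # start the chain at the observation
    for _ in range(sweeps * H * W):
        i, j = rng.randrange(H), rng.randrange(W)
        s = sum(x[k][l]
                for k, l in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if 0 <= k < H and 0 <= l < W)
        # Full conditional for X_ij = +1 given its neighbors and the data.
        a = 2.0 * (beta * s + h * y0[i][j])
        x[i][j] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-a)) else -1
    return x

# Flipping one pixel of a constant image and denoising should restore it
# with high probability at these parameter values.
noisy = [[1] * 5 for _ in range(5)]
noisy[2][2] = -1
restored = denoise(noisy)
```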

20
Open Issues / Discussion
  • Code development
  • What should our MRF library look like?
  • Challenge: build an MRF model from image samples
    and then generate new images using the Gibbs
    sampler.
  • We need a way to determine the parameters in H(x)
    from image samples.

21
My Idea
  • Image segmentation in the wavelet domain with
    Markov Random Fields:
  • Wavelet transform.
  • Image enhancement.
  • Extract edge information.
  • Build an MRF model based on image samples.
  • Image segmentation using the Markov Random Field
    scheme.

22
Reference
  • Ross Kindermann and J. Laurie Snell, Markov
    Random Fields and Their Applications, AMS, 1980.
    http://www.ams.org/online_bks/conm1/
  • Gerhard Winkler, Image Analysis, Random Fields
    and Markov Chain Monte Carlo Methods, Springer,
    2003.
  • W. R. Gilks, Markov Chain Monte Carlo in
    Practice, Chapman & Hall/CRC, 1998.
  • Sheldon M. Ross, Introduction to Probability
    Models, Academic Press, 2003.