Gibbs Sampling - PowerPoint PPT Presentation

1
Gibbs Sampling
  • For Graduate Seminar (ENEE698A)
  • Presented by Hongqiang Zhang
  • Oct. 22, 2003
  • Dept. of ECE, Univ. of MD

2
Outline
  • Motivation of GS (Gibbs Sampling)
  • What is GS?
  • Example of GS in Mixture problem
  • Conclusion

3
Motivation of GS
  • Want to draw samples from the posterior
    distribution in order to make inferences
  • The joint distribution is often difficult to
    compute directly
  • GS is a way to draw samples without computing
    the joint distribution

4
What is GS?
  • Given random variables X1, ..., Xn
  • Want to draw samples from their joint
    distribution p(x1, ..., xn)
  • Instead, draw each Xi in turn from its
    conditional distribution given the current
    values of all the other variables
  • Iterate until the process stabilizes
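The iteration above can be sketched on a hypothetical target chosen because both full conditionals are known in closed form: a bivariate normal with zero means, unit variances, and correlation RHO.

```python
import random

# Gibbs sampling sketch for a bivariate normal target with zero means,
# unit variances, and correlation RHO: each full conditional is itself
# normal, so we can alternate exact draws from p(x1|x2) and p(x2|x1).
RHO = 0.8

def gibbs_bivariate_normal(n_iter, seed=0):
    rng = random.Random(seed)
    x1, x2 = 0.0, 0.0            # arbitrary starting point
    cond_sd = (1.0 - RHO ** 2) ** 0.5
    samples = []
    for _ in range(n_iter):
        x1 = rng.gauss(RHO * x2, cond_sd)   # draw from p(x1 | x2)
        x2 = rng.gauss(RHO * x1, cond_sd)   # draw from p(x2 | x1)
        samples.append((x1, x2))
    return samples

samples = gibbs_bivariate_normal(20000)
burned = samples[1000:]          # discard burn-in before estimating
mean_x1 = sum(s[0] for s in burned) / len(burned)
corr = sum(s[0] * s[1] for s in burned) / len(burned)
print(mean_x1, corr)             # should settle near 0 and near RHO
```

After the chain stabilizes, averages over the draws estimate moments of the joint distribution without that distribution ever being sampled from directly.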

6
Example
7
Underlying Theory
  • The process is equivalent to generating a
    homogeneous Markov chain whose transition
    kernel is the product of the full conditional
    distributions
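The equation on the slide was not transcribed; for a systematic scan over n variables, the standard Gibbs transition kernel (presumably what was shown) is

```latex
K(x, x') \;=\; \prod_{i=1}^{n}
  p\!\left(x'_i \,\middle|\, x'_1, \ldots, x'_{i-1},\,
           x_{i+1}, \ldots, x_n\right)
```

Each factor updates one coordinate conditioned on the most recent values of the others, and the joint distribution p(x) is invariant under this kernel.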

8
Convergence
  • If the Markov chain is ergodic, then the
    stationary distribution is reached from any
    initial state
  • Ergodic means the chain is
  • - Aperiodic
  • - Irreducible
  • - Positive recurrent

9
Other Issues
  • How to choose realizations?
  • How to judge the accuracy of estimates?
  • How to choose initial distribution?
  • R.M. Neal, "Probabilistic inference using Markov
    chain Monte Carlo methods," Tech. Rep.
    CRG-TR-93-1, University of Toronto, 1993

10
GS in Gaussian Mixture Problem
  • Given a set of data points, try to fit a model
    that is a mixture of two normal distributions,
    p(y) = p N(y; mu1, s1^2) + (1 - p) N(y; mu2, s2^2)
  • Goal: fit the model to the given data, i.e.,
    find the means, variances, and mixing
    proportion
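A minimal sketch of such a sampler, under simplifying assumptions made only for illustration (unit component variances, a known mixing weight of 0.5, flat priors on the two means, and synthetic data with hypothetical true means -2 and 3):

```python
import math
import random

# Gibbs sampler sketch for a two-component Gaussian mixture.  Each sweep
# alternates two conditional draws:
#   (a) draw each point's component label given the current means
#   (b) draw each mean given the current labels
rng = random.Random(1)

# Synthetic data: two well-separated components, true means -2 and 3.
data = [rng.gauss(-2.0, 1.0) for _ in range(100)] + \
       [rng.gauss(3.0, 1.0) for _ in range(100)]

def unnorm_pdf(y, mu):
    # unnormalised N(mu, 1) density; constants cancel in the label ratio
    return math.exp(-0.5 * (y - mu) ** 2)

mu = [0.0, 1.0]                  # initial guesses for the two means
for sweep in range(500):
    # (a) sample each label z_i from p(z_i = k | y_i, mu)
    z = []
    for y in data:
        p0 = unnorm_pdf(y, mu[0])
        p1 = unnorm_pdf(y, mu[1])
        z.append(0 if rng.random() < p0 / (p0 + p1) else 1)
    # (b) sample each mean from its conditional posterior,
    #     N(sample mean of its points, 1/n_k) under a flat prior
    for k in (0, 1):
        pts = [y for y, zi in zip(data, z) if zi == k]
        if pts:
            mu[k] = rng.gauss(sum(pts) / len(pts),
                              1.0 / math.sqrt(len(pts)))

print(sorted(mu))                # draws should land near the true means
```

After many sweeps the mean draws fluctuate around the posterior, so they can be averaged (after burn-in) or inspected directly.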

13
Comparison to EM
  • In step (2a), GS draws the labels from their
    conditional distributions
  • In the E-step, EM computes the ML
    responsibilities
  • In step (2b), GS draws the parameters from
    their conditional distribution
  • In the M-step, EM computes the maximizers of
    the posterior

15
Comment
  • The example was simplified for the comparison
  • More realistically, put a prior distribution
    on each of the parameters, then do separate GS
    steps in which we sample each parameter from
    its posterior distribution conditioned on the
    other parameters

16
Conclusion
  • GS is a good way to draw samples without
    computing the joint distribution
  • There are also other methods, such as
  • - Rejection sampling
  • - Importance sampling
  • - Metropolis-Hastings sampling
  • which are useful in different situations
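For contrast with GS, here is a minimal sketch of one of the alternatives listed above, a random-walk Metropolis-Hastings sampler, targeting a standard normal (a hypothetical target for illustration). Unlike GS it needs no conditional distributions, only the ability to evaluate the unnormalised target density.

```python
import math
import random

# Random-walk Metropolis-Hastings: propose a symmetric perturbation of
# the current state, then accept or reject based on the density ratio.
rng = random.Random(0)

def log_target(x):
    return -0.5 * x * x          # log of unnormalised N(0, 1) density

x = 5.0                          # deliberately poor starting point
samples = []
for _ in range(20000):
    proposal = x + rng.gauss(0.0, 1.0)   # symmetric random-walk proposal
    delta = log_target(proposal) - log_target(x)
    # accept with probability min(1, target(proposal) / target(x))
    if delta >= 0 or rng.random() < math.exp(delta):
        x = proposal
    samples.append(x)

burned = samples[1000:]          # discard burn-in
mean = sum(burned) / len(burned)
var = sum(s * s for s in burned) / len(burned) - mean ** 2
print(mean, var)                 # should approach 0 and 1
```

The rejection step is what lets MH work with only pointwise density evaluations, at the cost of sometimes repeating the current state; GS, by contrast, accepts every draw but requires tractable full conditionals.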

17
References
  • Tanner, M. and Wong, W., "The calculation of
    posterior distributions by data augmentation,"
    Journal of the American Statistical Association,
    82:528-550, 1987
  • Gelfand, A. and Smith, A., "Sampling based
    approaches to calculating marginal densities,"
    Journal of the American Statistical Association,
    85:398-409, 1990