Expectation Maximization - PowerPoint PPT Presentation

About This Presentation

Title: Expectation Maximization

Slides: 8
Provided by: richa6

Transcript and Presenter's Notes

1
Expectation Maximization
  • First introduced in 1977
  • Lots of mathematical derivation
  • Problem: given a set of data that is
    incomplete or has missing values.
  • Goal: assuming the data come from an
    underlying distribution, estimate the most
    likely (maximum-likelihood) parameters of
    that model.

2
Example
  • Given a set of data points in R2
  • Assume underlying distribution is mixture of
    Gaussians
  • Goal: estimate the parameters of each Gaussian
    distribution
  • θ denotes the parameters; here it consists of
    the means and variances, and K is the number of
    Gaussian components.

3
Steps of EM algorithm(1)
  • randomly pick initial values for θk (mean and variance)
  • for each xn, associate it with a responsibility
    value r
  • rn,k measures how likely the nth point comes
    from (belongs to) the kth mixture
  • how do we find r?

Assume the data come from these two distributions
4
Steps of EM algorithm(2)
For a mixture of K Gaussians with means μk and variances σk², the
responsibility of the kth mixture for point xn is

    rn,k = N(xn | μk, σk²) / Σj N(xn | μj, σj²)

N(xn | μk, σk²) is the probability that we observe xn in the data set
provided it comes from the kth mixture; the squared distance (xn − μk)²
between xn and the center of the kth mixture appears in its exponent.
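The responsibility computation can be sketched as follows for the 1-D case, assuming equal mixing weights for the components (the function names here are illustrative, not from the slides):

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian N(mean, var) at point x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def responsibilities(x_n, means, variances):
    """E-step for one point: r[k] is how likely x_n belongs to the kth mixture."""
    densities = [gaussian_pdf(x_n, m, v) for m, v in zip(means, variances)]
    total = sum(densities)
    # Normalize so the responsibilities over all K components sum to 1.
    return [d / total for d in densities]
```

A point lying near one component's mean receives a responsibility close to 1 for that component and close to 0 for the others.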
5
Steps of EM algorithm(3)
  • each data point is now associated with (rn,1,
    rn,2, …, rn,K); rn,k is how likely it belongs
    to the kth mixture, with 0 < rn,k < 1
  • using r, compute the weighted mean and variance for
    each Gaussian model
  • we get a new θ, set it as the new parameter, and
    iterate the process (find new r → new θ → …)
  • the algorithm consists of an expectation step and a
    maximization step
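One full iteration (E-step plus the weighted mean-and-variance M-step) can be sketched as below; this is a minimal 1-D version assuming equal mixing weights, and the names are illustrative:

```python
import math

def _pdf(x, mean, var):
    """Density of a 1-D Gaussian N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_step(data, means, variances):
    """One EM iteration for a 1-D Gaussian mixture (equal mixing weights)."""
    K, N = len(means), len(data)
    # E-step: r[n][k] = responsibility of the kth mixture for the nth point
    r = []
    for x in data:
        dens = [_pdf(x, means[k], variances[k]) for k in range(K)]
        total = sum(dens)
        r.append([d / total for d in dens])
    # M-step: responsibility-weighted mean and variance for each Gaussian
    new_means, new_vars = [], []
    for k in range(K):
        weight = sum(r[n][k] for n in range(N))
        mu = sum(r[n][k] * data[n] for n in range(N)) / weight
        var = sum(r[n][k] * (data[n] - mu) ** 2 for n in range(N)) / weight
        new_means.append(mu)
        new_vars.append(max(var, 1e-6))  # guard against variance collapsing to zero
    return new_means, new_vars
```

Feeding the returned θ back into the next call implements the iterate-until-convergence loop the slide describes.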

6
Ideas and Intuition
  • given a set of incomplete (observed) data
  • assume the observed data come from a specific model
  • formulate some parameters for that model, and use
    them to guess the missing values/data (expectation
    step)
  • from the missing data and the observed data, find
    the most likely parameters (maximization step)
  • iterate the expectation and maximization steps
    until convergence

7
Application
  • Parameter estimation for Gaussian mixture (demo)
  • Baum-Welch algorithm used in Hidden Markov
    Models
  • Difficulties:
  • How do we model the missing data?
  • How do we determine the number of Gaussian components?
  • Which model should be used?