1
Forward-backward algorithm
  • LING 572
  • Fei Xia
  • 02/23/06

2
Outline
  • Forward and backward probability
  • Expected counts and update formulae
  • Relation to EM

3
HMM
  • An HMM is a tuple consisting of:
  • A set of states S = {s1, s2, ..., sN}
  • A set of output symbols Σ = {w1, ..., wM}
  • Initial state probabilities π = {πi}
  • State transition probabilities A = {aij}
  • Symbol emission probabilities B = {bijk}
  • State sequence: X1 ... X(T+1)
  • Output sequence: o1 ... oT
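For reference, the joint probability of an output sequence and a state sequence under this arc-emission parameterization (where bijk is the probability of emitting wk on the arc from si to sj) is standardly written as

    P(O_{1,T}, X_{1,T+1}) = \pi_{X_1} \prod_{t=1}^{T} a_{X_t X_{t+1}} \, b_{X_t X_{t+1} o_t}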

4
Constraints
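In the usual formulation, the constraints require each set of probabilities to sum to one:

    \sum_{i=1}^{N} \pi_i = 1
    \sum_{j=1}^{N} a_{ij} = 1        for every state i
    \sum_{k=1}^{M} b_{ijk} = 1       for every pair of states i, j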
5
Decoding
  • Given the observation O(1,T) = o1 ... oT, find the state
    sequence X(1,T+1) = X1 ... X(T+1) that maximizes
    P(X(1,T+1) | O(1,T))
  • → Viterbi algorithm (a sketch of the recursion follows below)
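A sketch of the Viterbi recursion in the same notation (the slide only names the algorithm): let \delta_j(t) be the probability of the best partial state sequence ending in state sj at time t; then

    \delta_i(1) = \pi_i
    \delta_j(t+1) = \max_{1 \le i \le N} \delta_i(t) \, a_{ij} \, b_{ij o_t}

and the most likely state sequence is recovered by backtracing the maximizing i at each step, starting from \arg\max_i \delta_i(T+1).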

6
Notation
  • A sentence: O(1,T) = o1 ... oT
  • T: the sentence length
  • The state sequence: X(1,T+1) = X1 ... X(T+1)
  • t: time index, ranging from 1 to T+1
  • Xt: the state at time t
  • i, j: states si, sj
  • k: word wk in the vocabulary

7
Forward and backward probabilities
8
Forward probability
  • The probability of producing O(1,t-1) while ending up in
    state si at time t

9
Calculating forward probability
Initialization
Induction
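In a standard formulation consistent with the notation above, the forward probability is \alpha_i(t) = P(o_1 \cdots o_{t-1}, X_t = s_i), computed by

    Initialization:  \alpha_i(1) = \pi_i,  1 \le i \le N
    Induction:       \alpha_j(t+1) = \sum_{i=1}^{N} \alpha_i(t) \, a_{ij} \, b_{ij o_t},  1 \le t \le T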
10
Backward probability
  • The probability of producing the sequence O(t,T), given
    that at time t we are in state si

11
Calculating backward probability
Initialization
Induction
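Correspondingly, in the same standard formulation, the backward probability is \beta_i(t) = P(o_t \cdots o_T \mid X_t = s_i), computed by

    Initialization:  \beta_i(T+1) = 1,  1 \le i \le N
    Induction:       \beta_i(t) = \sum_{j=1}^{N} a_{ij} \, b_{ij o_t} \, \beta_j(t+1),  1 \le t \le T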
12
Calculating the prob of the observation
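Under the definitions above, the observation probability can be read off either pass, or from both combined at any time step:

    P(O_{1,T}) = \sum_{i=1}^{N} \alpha_i(T+1)
               = \sum_{i=1}^{N} \pi_i \, \beta_i(1)
               = \sum_{i=1}^{N} \alpha_i(t) \, \beta_i(t)   for any 1 \le t \le T+1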
13
Estimating parameters
  • The probability of traversing the arc from state si to state
    sj at time t, given O (denoted by pt(i, j) in MS)
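In that notation (MS here presumably refers to Manning and Schütze, whose arc-emission conventions these slides follow), the arc-traversal probability is

    p_t(i,j) = P(X_t = s_i, X_{t+1} = s_j \mid O_{1,T})
             = \frac{\alpha_i(t) \, a_{ij} \, b_{ij o_t} \, \beta_j(t+1)}{P(O_{1,T})}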

14
The prob of being in state i at time t, given O
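In the same notation this is usually written

    \gamma_i(t) = P(X_t = s_i \mid O_{1,T})
                = \frac{\alpha_i(t) \, \beta_i(t)}{P(O_{1,T})}
                = \sum_{j=1}^{N} p_t(i,j)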
15
Expected counts
  • Sum over the time index t
  • Expected number of transitions from state i to j in O
  • Expected number of transitions from state i in O
    (spelled out below)
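Written out (the standard Baum-Welch expected counts, matching the two bullets above):

    expected number of transitions from si to sj in O:  \sum_{t=1}^{T} p_t(i,j)
    expected number of transitions from si in O:        \sum_{t=1}^{T} \gamma_i(t)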

16
Update parameters
17
Final formulae
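In this arc-emission setting the re-estimation formulae are standardly

    \hat{\pi}_i = \gamma_i(1)
    \hat{a}_{ij} = \frac{\sum_{t=1}^{T} p_t(i,j)}{\sum_{t=1}^{T} \gamma_i(t)}
    \hat{b}_{ijk} = \frac{\sum_{t : o_t = w_k} p_t(i,j)}{\sum_{t=1}^{T} p_t(i,j)}

i.e., each parameter is an expected count divided by the corresponding total.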
18
Emission probabilities
Arc-emission HMM
19
The inner loop of the forward-backward algorithm
  • Given an input sequence and the current parameter values:
  • Calculate the forward probability
  • Base case
  • Recursive case
  • Calculate the backward probability
  • Base case
  • Recursive case
  • Calculate the expected counts
  • Update the parameters (a Python sketch of this loop follows below)
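Below is a minimal runnable sketch of this inner loop for a single training sequence, directly implementing the recursions and update formulae given above. The function name, the choice of NumPy, and the array layout are illustrative assumptions rather than anything from the slides; scaling, smoothing, and zero-count handling are omitted.

import numpy as np

def forward_backward_step(obs, pi, A, B):
    """One E-step + M-step for an arc-emission HMM (illustrative sketch).
    obs: observation indices o_1..o_T (length T)
    pi : initial state probabilities, shape (N,)
    A  : transition probabilities a_ij, shape (N, N)
    B  : arc-emission probabilities b_ijk, shape (N, N, M)
    Returns re-estimated (pi, A, B)."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)

    # Forward pass: alpha[t, i] = P(o_1 .. o_t, X_{t+1} = i)  (0-based t)
    alpha = np.zeros((T + 1, N))
    alpha[0] = pi                          # base case: alpha_i(1) = pi_i
    for t in range(T):
        alpha[t + 1] = alpha[t] @ (A * B[:, :, obs[t]])

    # Backward pass: beta[t, i] = P(o_{t+1} .. o_T | X_{t+1} = i)
    beta = np.zeros((T + 1, N))
    beta[T] = 1.0                          # base case: beta_i(T+1) = 1
    for t in range(T - 1, -1, -1):
        beta[t] = (A * B[:, :, obs[t]]) @ beta[t + 1]

    prob_obs = alpha[T].sum()              # P(O) = sum_i alpha_i(T+1)

    # Expected counts: p[t, i, j] = p_t(i, j), gamma[t, i] = gamma_i(t)
    p = np.zeros((T, N, N))
    for t in range(T):
        p[t] = (alpha[t][:, None] * A * B[:, :, obs[t]]
                * beta[t + 1][None, :]) / prob_obs
    gamma = p.sum(axis=2)

    # M-step: each parameter = expected count / corresponding total
    new_pi = gamma[0]
    new_A = p.sum(axis=0) / gamma.sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[2]):
        new_B[:, :, k] = p[obs == k].sum(axis=0)
    new_B = new_B / p.sum(axis=0)[:, :, None]
    return new_pi, new_A, new_B

Iterating forward_backward_step until the data likelihood stops improving gives one (locally optimal) parameter estimate, matching the per-iteration behaviour described on the Iterations slide.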

20
Relation to EM
21
Relation to EM
  • An HMM is a PM (Product of Multinomials) model
  • The forward-backward algorithm is a special case of the
    EM algorithm for PM models
  • X (observed data): each data point is an output sequence O(1,T)
  • Y (hidden data): the state sequence X(1,T+1)
  • Θ (parameters): aij, bijk, πi
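Concretely, in the standard reading of this correspondence, the E-step computes the expected counts \sum_t p_t(i,j) and \sum_t \gamma_i(t) under the current parameters, and the M-step maximizes the expected complete-data log-likelihood

    Q(\mu, \mu^{old}) = \sum_{X_{1,T+1}} P(X_{1,T+1} \mid O_{1,T}, \mu^{old}) \, \log P(O_{1,T}, X_{1,T+1} \mid \mu)

whose maximizing values are exactly the update formulae above.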

22
Relation to EM (cont)
23
(No Transcript)
24
Iterations
  • Each iteration provides values for all the
    parameters
  • The new model always improves the likelihood of
    the training data
  • The algorithm is not guaranteed to reach a global
    maximum
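This is the usual EM guarantee: writing \mu^{(k)} for the parameters after iteration k,

    P(O_{1,T} \mid \mu^{(k+1)}) \ge P(O_{1,T} \mid \mu^{(k)})

so the training-data likelihood never decreases, although the limit may be only a local maximum.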

25
Summary
  • A way of estimating parameters for HMMs
  • Define forward and backward probabilities, which
    can be calculated efficiently with dynamic programming (DP)
  • Given an initial parameter setting, we
    re-estimate the parameters at each iteration
  • The forward-backward algorithm is a special case
    of the EM algorithm for PM models

26
Additional slides
27
Definitions so far
  • α_i(t): the prob of producing O(1,t-1) and ending in state
    si at time t
  • β_i(t): the prob of producing the sequence O(t,T), given
    that at time t we are in state si
  • γ_i(t): the prob of being in state i at time t, given O

28
(No Transcript)
29
Emission probabilities
Arc-emission HMM
State-emission HMM
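For comparison, the standard definitions of the two conventions: in an arc-emission HMM the symbol probability depends on both ends of the transition,

    b_{ijk} = P(o_t = w_k \mid X_t = s_i, X_{t+1} = s_j)

while in a state-emission HMM it depends only on the current state,

    b_{ik} = P(o_t = w_k \mid X_t = s_i)

so a state-emission HMM can be viewed as the special case in which b_{ijk} does not vary with j.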