1
Probabilistic reasoning over time
  • This sentence is likely to be untrue in the
    future!

2
The basic problem
  • What do we know about the state of the world now,
    given a history of the world before?
  • The only evidence we have is probabilistic.
  • Past performance may not be a guide to future
    performance.

3
Simplifying assumptions and notations
  • States are our events.
  • (Partial) states can be measured at reasonable
    time intervals.
  • Xt: the unobservable state variables at time t.
  • Et (evidence): the observable state variables at
    time t.
  • Vm:n: the variables Vm, Vm+1, ..., Vn

4
Stationary, Markovian (transition model)
  • Stationary: the laws of probability don't change
    over time
  • Markovian: the current unobservable state depends
    on a finite number of past states
  • First-order: the current state depends only on
    the previous state, i.e.
  • P(Xt | X0:t-1) = P(Xt | Xt-1)
  • Second-order: depends on the two previous states,
    and so on
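
A minimal sketch of a first-order transition model (the two-state
rain/dry example and all numbers are illustrative assumptions, not
from the slides):

    # First-order Markov transition model: P(Xt | Xt-1).
    P_transition = {
        "rain": {"rain": 0.7, "dry": 0.3},  # P(Xt | Xt-1 = rain)
        "dry":  {"rain": 0.3, "dry": 0.7},  # P(Xt | Xt-1 = dry)
    }

    def predict(prior):
        """One-step prediction: P(Xt) = sum over x of P(Xt | x) P(x)."""
        return {s: sum(P_transition[x][s] * p for x, p in prior.items())
                for s in ("rain", "dry")}

    print(predict({"rain": 0.5, "dry": 0.5}))  # {'rain': 0.5, 'dry': 0.5}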

5
Observable variables (the sensor model)
  • Observable variables depend only on the current
    state (by definition, essentially); these are the
    sensors.
  • The current state causes the sensor values.
  • P(Et | X0:t, E0:t-1) = P(Et | Xt)
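
A matching sensor-model sketch (the umbrella observation and its
probabilities are assumptions for illustration): the evidence
distribution is indexed by the current state only.

    # Sensor model: P(Et | Xt). The observation depends only on
    # whether it rains now, not on any earlier state or evidence.
    P_sensor = {
        "rain": {"umbrella": 0.9, "no_umbrella": 0.1},
        "dry":  {"umbrella": 0.2, "no_umbrella": 0.8},
    }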

6
Start it up (the prior probability model)
  • What is P(X0)?
  • At time t, the joint distribution is completely
    determined:
  • P(X0, X1, ..., Xt, E1, ..., Et) = P(X0)
    Π_{i=1..t} P(Xi | Xi-1) P(Ei | Xi)
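
A sketch of evaluating this joint for one concrete trajectory, using
the toy P_transition and P_sensor tables from the sketches above
(names and numbers are illustrative):

    # P(X0..Xt, E1..Et) = P(X0) * prod over i of P(Xi | Xi-1) P(Ei | Xi)
    def joint(prior, states, evidence):
        p = prior[states[0]]                             # P(X0)
        for i in range(1, len(states)):
            p *= P_transition[states[i - 1]][states[i]]  # P(Xi | Xi-1)
            p *= P_sensor[states[i]][evidence[i - 1]]    # P(Ei | Xi)
        return p

    print(joint({"rain": 0.5, "dry": 0.5},
                ["dry", "rain", "rain"],
                ["umbrella", "umbrella"]))               # 0.08505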

7
Better predictions?
  • More state variables (temperature, humidity,
    pressure, season)
  • Higher order Markov processes (take more of the
    past into account).
  • Tradeoffs?

8
What's it good for?
  • Belief/monitoring the current state
  • Prediction about the next state
  • Hindsight about previous states
  • Explanation of possible causes
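
The slides do not give the update rules; as a standard point of
reference, belief monitoring (filtering) is usually computed with the
forward recursion

    P(Xt+1 | e1:t+1) ∝ P(et+1 | Xt+1) Σ_{xt} P(Xt+1 | xt) P(xt | e1:t)

i.e., project the current belief through the transition model, then
weight by the new evidence and normalize.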

9
Example
10
Hidden Markov Models (HMMs)
  • Further simplification
  • Only one state variable.
  • We can use matrices now.
  • Ti,j = P(Xt = j | Xt-1 = i)
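
A minimal numpy sketch of one filtering step in this matrix form (the
2x2 T and the diagonal observation matrix O are assumed toy values):

    import numpy as np

    # Transition matrix: T[i, j] = P(Xt = j | Xt-1 = i)
    T = np.array([[0.7, 0.3],
                  [0.3, 0.7]])

    # Diagonal observation matrix for one evidence value e:
    # O[j, j] = P(Et = e | Xt = j)
    O = np.diag([0.9, 0.2])

    def forward_step(f, T, O):
        """One filtering update: f' is proportional to O @ T.T @ f."""
        f = O @ T.T @ f
        return f / f.sum()           # normalize to a distribution

    f0 = np.array([0.5, 0.5])        # prior P(X0)
    print(forward_step(f0, T, O))    # [0.818..., 0.181...]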

11
Speech Recognition
  • P(words | signal) ∝ P(signal | words) P(words)
  • P(words): the language model
  • P(signal | words): the acoustic model
  • "Every time I fire a linguist, the recognition
    rate goes up."

12
Model 1: Speech
  • Sample the speech signal
  • Decide the most likely sequence of speech symbols
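
"Most likely sequence" is conventionally found with the Viterbi
algorithm; a hedged sketch follows (toy two-state tables reused for
brevity; in speech the hidden states would be phones and the
observations acoustic features):

    # Viterbi: most likely hidden state sequence given the observations.
    states = ["rain", "dry"]
    start = {"rain": 0.5, "dry": 0.5}
    trans = {"rain": {"rain": 0.7, "dry": 0.3},
             "dry":  {"rain": 0.3, "dry": 0.7}}
    emit = {"rain": {"umbrella": 0.9, "no_umbrella": 0.1},
            "dry":  {"umbrella": 0.2, "no_umbrella": 0.8}}

    def viterbi(obs):
        # best[s] = (probability, path) of the best sequence ending in s
        best = {s: (start[s] * emit[s][obs[0]], [s]) for s in states}
        for e in obs[1:]:
            best = {s: max(((p * trans[prev][s] * emit[s][e], path + [s])
                            for prev, (p, path) in best.items()),
                           key=lambda t: t[0])
                    for s in states}
        return max(best.values(), key=lambda t: t[0])

    print(viterbi(["umbrella", "umbrella", "no_umbrella"]))
    # -> (0.06804, ['rain', 'rain', 'dry'])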

13
Phonetic alphabet
  • Phonemes: minimal units of sound that make a
    meaning difference (beat vs. bit; fit vs. bit)
  • Phones: normalized articulation results (e.g.,
    the p in paid vs. tap)
  • English has about 40
  • Co-articulation effects are modeled as new
    symbols, e.g., the w in sweet: w(s,iy)

14
Models 2, 3: Words, Sentences
  • Given the phones, what is the most likely word,
    or word in the sentence?
  • "Give me all your money. I have a gub."
  • "Gub" is unlikely to be a word,
  • and if it were, it would be less likely than
    "gun" (compare the toy scores below).
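
A toy illustration of slide 11's decomposition deciding this case
(all probabilities are invented for the example): even if the
acoustics slightly favor "gub", the language-model prior P(words)
pulls the decision to "gun".

    # P(word | signal) is proportional to P(signal | word) * P(word)
    acoustic = {"gub": 0.55, "gun": 0.45}   # P(signal | word), made up
    prior    = {"gub": 1e-9, "gun": 1e-4}   # P(word), language model

    scores = {w: acoustic[w] * prior[w] for w in acoustic}
    print(max(scores, key=scores.get))      # -> gun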

15
Lots of tricky bits