Transcript and Presenter's Notes

Title: Probabilistic Inference


1
Probabilistic Inference
2
Agenda
  • Review of probability theory
  • Joint, conditional distributions
  • Independence
  • Marginalization and conditioning
  • Intro to Bayesian Networks

3
Probabilistic Belief
  • Consider a world where a dentist agent D meets
    with a new patient P
  • D is interested only in whether P has a cavity,
    so a state is described with a single
    proposition, Cavity
  • Before observing P, D does not know if P has a
    cavity, but from years of practice, he believes
    Cavity with some probability p and ¬Cavity with
    probability 1 - p
  • The proposition is now a Boolean random variable
    and (Cavity, p) is a probabilistic belief

4
Probabilistic Belief State
  • The world has only two possible states, which are
    respectively described by Cavity and ¬Cavity
  • The probabilistic belief state of an agent is a
    probability distribution over all the states
    that the agent thinks possible
  • In the dentist example, D's belief state is
    (Cavity: p, ¬Cavity: 1 - p)

5
Multivariate Belief State
  • We now represent the world of the dentist D using
    three propositions: Cavity, Toothache, and
    PCatch
  • D's belief state consists of 2^3 = 8 states, each
    with some probability: Cavity∧Toothache∧PCatch,
    ¬Cavity∧Toothache∧PCatch,
    Cavity∧¬Toothache∧PCatch, ...

6
The belief state is defined by the full joint
probability of the propositions

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
7
Probabilistic Inference

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
P(Cavity ∨ Toothache) = 0.108 + 0.012 + ... = 0.28
8
Probabilistic Inference

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
P(Cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
9
Probabilistic Inference

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
Marginalization: P(c) = Σt Σpc P(c ∧ t ∧ pc), using
the conventions that c = Cavity or ¬Cavity and
that Σt is the sum over t = Toothache, ¬Toothache
(see the sketch below)
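
The following is a minimal Python sketch, not part of the original
slides, showing how these marginals fall out of the joint table above;
the dictionary representation and names are my own.

  # Full joint distribution from the table above, keyed by
  # (Cavity, Toothache, PCatch) truth values.
  joint = {
      (True,  True,  True):  0.108, (True,  True,  False): 0.012,
      (True,  False, True):  0.072, (True,  False, False): 0.008,
      (False, True,  True):  0.016, (False, True,  False): 0.064,
      (False, False, True):  0.144, (False, False, False): 0.576,
  }

  def prob(event):
      # Marginalization: sum the joint over every world where `event` holds.
      return sum(p for world, p in joint.items() if event(*world))

  print(prob(lambda c, t, pc: c))        # P(Cavity) -> 0.2
  print(prob(lambda c, t, pc: c or t))   # P(Cavity v Toothache) -> 0.28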
10
Kolmogorov's Probability Axioms
  • 0 ≤ P(a) ≤ 1
  • P(true) = 1, P(false) = 0
  • P(a ∨ b) = P(a) + P(b) - P(a ∧ b)  (see the check below)
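
As a quick check of the third axiom against the joint table above:
P(Cavity) = 0.2, P(Toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2,
and P(Cavity ∧ Toothache) = 0.108 + 0.012 = 0.12, so
P(Cavity ∨ Toothache) = 0.2 + 0.2 - 0.12 = 0.28, matching the direct
sum on slide 7.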

11
Decoding Probability Notation
  • Capital letters A, B, C denote random variables
  • Each random variable X can take one of a set of
    possible values x ∈ Val(X)
  • Confusingly, this is often left implicit
  • Probabilities are defined over logical statements,
    e.g., P(X = x)
  • Confusingly, this is often written P(x)

12
Decoding Probability Notation
  • The interpretation of P(A) depends on whether A is
    treated as a logical statement or as a random
    variable
  • As a logical statement:
  • Just the probability that A is true
  • P(A) = 1 - P(¬A)
  • As a random variable:
  • Denotes both P(A is true) and P(A is false)
  • For non-Boolean variables, denotes P(A = a) for all
    a ∈ Val(A)
  • Computation requires marginalization
  • Belief space A, B, C, D, ... often left implicit
  • Marginalize the joint distribution over all
    combinations of B, C, D (the event space)

13
Decoding Probability Notation
  • How to interpret P(A∧B) = P(A,B)?
  • As a logical statement:
  • The probability that both A and B are true
  • As random variables:
  • P(A=0, B=0), P(A=1, B=0), P(A=0, B=1), P(A=1, B=1)
  • P(A=a, B=b) for all a ∈ Val(A), b ∈ Val(B) in general
  • Equal to P(A)P(B)?

No!!! (except under very special conditions)
14
Quiz: What does this mean?
  • P(A∨B) = P(A) + P(B) - P(A∧B)
  • P(A=a ∨ B=b) = P(A=a) + P(B=b) - P(A=a ∧ B=b)
  • For all a ∈ Val(A) and b ∈ Val(B)

15
Marginalization
  • Express P(A,B) in terms of P(A,B,C)
    (see the sketch below)
  • If C is Boolean:
  • P(A,B) = P(A,B,C=0) + P(A,B,C=1)
  • In general:
  • P(A,B) = Σc∈Val(C) P(A,B,C=c)
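
A minimal generic sketch of this operation, assuming (as in the earlier
sketch) that a joint distribution is stored as a dictionary from value
tuples to probabilities; the function and variable names are my own.

  def marginalize_out(joint, var_names, drop):
      # Sum a joint P(X1,...,Xn) over one variable, returning the joint
      # over the remaining variables.
      keep = [i for i, name in enumerate(var_names) if name != drop]
      out = {}
      for assignment, p in joint.items():
          reduced = tuple(assignment[i] for i in keep)
          out[reduced] = out.get(reduced, 0.0) + p
      return out

  # e.g., P(Cavity, Toothache) from the dentist joint of slide 6:
  # marginalize_out(joint, ["Cavity", "Toothache", "PCatch"], "PCatch")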

16
Conditional Probability
  • P(A∧B) = P(A|B) P(B) = P(B|A) P(A)
  • P(A|B) is the posterior probability of A given B
  • Axiomatic definition: P(A|B) = P(A,B)/P(B)

17

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
  • P(Cavity|Toothache) = P(Cavity∧Toothache)/P(Toothache)
    = (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6
  • Interpretation: after observing Toothache, the
    patient is no longer an average one, and the
    prior probability (0.2) of Cavity is no longer
    valid
  • P(Cavity|Toothache) is calculated by keeping the
    ratios of the probabilities of the 4 cases
    unchanged, and normalizing their sum to 1

18

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
  • P(Cavity|Toothache) = P(Cavity∧Toothache)/P(Toothache)
    = (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6
  • P(¬Cavity|Toothache) = P(¬Cavity∧Toothache)/P(Toothache)
    = (0.016 + 0.064)/(0.108 + 0.012 + 0.016 + 0.064) = 0.4
  • P(c|Toothache) = (P(Cavity|Toothache), P(¬Cavity|Toothache))
    (see the sketch below)
    = α P(c ∧ Toothache) = α Σpc P(c ∧ Toothache ∧ pc)
    = α [(0.108, 0.016) + (0.012, 0.064)]
    = α (0.12, 0.08) = (0.6, 0.4)
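
A small Python sketch, reusing the `joint` dictionary from the earlier
sketch, that reproduces this filter-and-renormalize computation; the
function and argument names are my own.

  def conditional(joint, query, evidence):
      # P(query | evidence): restrict the joint to worlds consistent
      # with the evidence, then renormalize.
      p_evidence = sum(p for w, p in joint.items() if evidence(*w))
      p_both = sum(p for w, p in joint.items() if evidence(*w) and query(*w))
      return p_both / p_evidence

  print(conditional(joint, lambda c, t, pc: c,     lambda c, t, pc: t))  # 0.6
  print(conditional(joint, lambda c, t, pc: not c, lambda c, t, pc: t))  # 0.4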

19
Conditioning
  • Express P(A) in terms of P(A|B) and P(B)
    (see the check below)
  • If B is Boolean:
  • P(A) = P(A|B=0) P(B=0) + P(A|B=1) P(B=1)
  • In general:
  • P(A) = Σb∈Val(B) P(A|B=b) P(B=b)
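
As a check with the dentist numbers: P(Cavity) =
P(Cavity|Toothache) P(Toothache) + P(Cavity|¬Toothache) P(¬Toothache)
= 0.6 × 0.2 + 0.1 × 0.8 = 0.2, which agrees with the marginal computed
on slide 8.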

20
Conditional Probability
  • P(A∧B) = P(A|B) P(B) = P(B|A) P(A)
  • P(A∧B∧C) = P(A|B,C) P(B∧C) = P(A|B,C) P(B|C) P(C)
  • P(Cavity) = Σt Σpc P(Cavity∧t∧pc)
    = Σt Σpc P(Cavity|t,pc) P(t∧pc)

21
Independence
  • Two random variables A and B are independent if
    P(A∧B) = P(A) P(B), hence if P(A|B) = P(A)
  • Two random variables A and B are independent
    given C if P(A∧B|C) = P(A|C) P(B|C), hence if
    P(A|B,C) = P(A|C)

22
Updating the Belief State

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
  • Let D now observe Toothache with probability 0.8
    (e.g., the patient says so)
  • How should D update its belief state?

23
Updating the Belief State

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
  • Let E be the evidence such that P(Toothache|E) = 0.8
  • We want to compute P(c∧t∧pc|E) = P(c∧pc|t,E) P(t|E)
  • Since E is not directly related to the cavity or
    the probe catch, we consider that c and pc are
    independent of E given t, hence P(c∧pc|t,E) = P(c∧pc|t)

24
Updating the Belief State

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
  • Let E be the evidence such that P(Toothache|E) = 0.8
  • We want to compute P(c∧t∧pc|E) = P(c∧pc|t,E) P(t|E)
  • Since E is not directly related to the cavity or
    the probe catch, we consider that c and pc are
    independent of E given t, hence P(c∧pc|t,E) = P(c∧pc|t)
    (see the sketch below)
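
A minimal sketch of this update, reusing the `joint` dictionary from
the earlier sketches; it assumes, as the slide does, that c and pc are
independent of E given t, so each joint entry is simply rescaled by
P(t|E)/P(t). The function name is my own.

  def update_soft_evidence(joint, p_t_given_E=0.8):
      # P(c,t,pc | E) = P(c,pc | t) P(t|E) = P(c,t,pc) / P(t) * P(t|E)
      p_t = sum(p for (c, t, pc), p in joint.items() if t)  # P(Toothache) = 0.2
      updated = {}
      for (c, t, pc), p in joint.items():
          new_t = p_t_given_E if t else 1.0 - p_t_given_E
          old_t = p_t         if t else 1.0 - p_t
          updated[(c, t, pc)] = p / old_t * new_t
      return updated   # entries still sum to 1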

25
Issues
  • If a state is described by n propositions, then a
    belief state contains 2^n states (possibly, some
    have probability 0)
  • → Modeling difficulty: many numbers must be
    entered in the first place
  • → Computational issue: memory size and time

26
Bayes' Rule and Other Probability Manipulations
  • P(A∧B) = P(A|B) P(B) = P(B|A) P(A)
  • P(A|B) = P(B|A) P(A) / P(B)
  • Can manipulate distributions, e.g., P(B) = Σa P(B|A=a) P(A=a)
  • Can derive P(A|B) and P(B) using only P(B|A) and P(A)
    (see the worked example below)
  • So, if any variables are conditionally independent,
    we should be able to decompose the joint
    distribution to take advantage of it
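
For instance, with the dentist numbers: P(Cavity|Toothache) =
P(Toothache|Cavity) P(Cavity) / P(Toothache) = 0.6 × 0.2 / 0.2 = 0.6,
matching the direct computation on slide 17.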

27

             Toothache             ¬Toothache
          PCatch   ¬PCatch      PCatch   ¬PCatch
Cavity     0.108    0.012        0.072    0.008
¬Cavity    0.016    0.064        0.144    0.576
  • Toothache and PCatch are independent given Cavity
    (or ¬Cavity), but this relation is hidden in the
    numbers! Verify this (see the sketch below)
  • Bayesian networks explicitly represent
    independence among propositions to reduce the
    number of probabilities defining a belief state
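
One way to do the verification asked for above is the following Python
sketch, reusing the `joint` dictionary from the earlier sketches; the
names are my own.

  def check_cond_independence(joint):
      # Check P(t ∧ pc | c) == P(t | c) * P(pc | c) for both values of Cavity.
      for cavity in (True, False):
          p_c    = sum(p for (c, t, pc), p in joint.items() if c == cavity)
          p_t    = sum(p for (c, t, pc), p in joint.items()
                       if c == cavity and t) / p_c
          p_pc   = sum(p for (c, t, pc), p in joint.items()
                       if c == cavity and pc) / p_c
          p_t_pc = sum(p for (c, t, pc), p in joint.items()
                       if c == cavity and t and pc) / p_c
          # The equality holds for both values of Cavity.
          print(cavity, abs(p_t_pc - p_t * p_pc) < 1e-12)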

28
Bayesian Network
  • Notice that Cavity is the cause of both
    Toothache and PCatch, and represent these
    causal links explicitly
  • Give the prior probability distribution of Cavity
  • Give the conditional probability tables of
    Toothache and PCatch

P(c∧t∧pc) = P(t∧pc|c) P(c)
          = P(t|c) P(pc|c) P(c)

Network: Cavity → Toothache,  Cavity → PCatch

P(Cavity) = 0.2
P(Toothache|c):   Cavity 0.6    ¬Cavity 0.1
P(PCatch|c):      Cavity 0.9    ¬Cavity 0.2

5 probabilities, instead of 7
29
Conditional Probability Tables
P(c∧t∧pc) = P(t∧pc|c) P(c) = P(t|c) P(pc|c) P(c)

Network: Cavity → Toothache,  Cavity → PCatch
P(Cavity) = 0.2
P(Toothache|c):   Cavity 0.6    ¬Cavity 0.1
P(PCatch|c):      Cavity 0.9    ¬Cavity 0.2

Full CPT for Toothache:
             P(t|c)   P(¬t|c)
  Cavity      0.6       0.4
  ¬Cavity     0.1       0.9
Rows sum to 1
If X takes n values, just store n-1 entries
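
A minimal sketch of how these CPTs reproduce the full joint of slide 6;
it uses P(PCatch|¬Cavity) = 0.2, the value consistent with that table.
The names are my own.

  P_cavity = 0.2
  P_tooth  = {True: 0.6, False: 0.1}   # P(Toothache | Cavity = c)
  P_catch  = {True: 0.9, False: 0.2}   # P(PCatch    | Cavity = c)

  def bn_joint(c, t, pc):
      # P(c ∧ t ∧ pc) = P(t|c) P(pc|c) P(c)
      p  = P_cavity   if c  else 1 - P_cavity
      p *= P_tooth[c] if t  else 1 - P_tooth[c]
      p *= P_catch[c] if pc else 1 - P_catch[c]
      return p

  print(bn_joint(True, True, True))   # 0.2 * 0.6 * 0.9 = 0.108, as in the table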
30
Naïve Bayes Models
  • P(Cause, Effect1, ..., Effectn) = P(Cause) Πi P(Effecti | Cause)

Network: Cause → Effect1,  Cause → Effect2,  ...,  Cause → Effectn
31
Naïve Bayes Classifier
  • P(Class, Feature1, ..., Featuren) = P(Class) Πi P(Featurei | Class)
    (see the sketch below)

Network: Class → Feature1,  Class → Feature2,  ...,  Class → Featuren
Class: e.g., Spam / Not Spam, or English / French / Latin
Features: e.g., word occurrences
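
A minimal sketch of scoring one class under this model; the function is
my own and works in log space to avoid underflow when there are many
features. It assumes `p_feature_given_class` maps each observed feature
to P(feature | class).

  import math

  def naive_bayes_log_score(p_class, p_feature_given_class, observed_features):
      # log [ P(class) * prod_i P(feature_i | class) ]
      score = math.log(p_class)
      for f in observed_features:
          score += math.log(p_feature_given_class[f])
      return score

  # Classification picks the class with the highest score:
  #   argmax over classes c of naive_bayes_log_score(P(c), P(.|c), features)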
32
Equations Involving Random Variables
  • C = A ∨ B
  • C = max(A, B)
  • Constrains the joint probability P(A,B,C)
  • Nicely encoded as a causality relationship

Network: A → C ← B
The conditional probability of C is given by the equation rather than
by a CPT
33
A More Complex BN
Intuitive meaning of an arc from x to y: x has
direct influence on y
Directed acyclic graph
34
A More Complex BN
Network: Burglary → Alarm ← Earthquake,  Alarm → JohnCalls,  Alarm → MaryCalls

P(B) = 0.001      P(E) = 0.002

  B  E | P(A)         A | P(J)        A | P(M)
  T  T | 0.95         T | 0.90        T | 0.70
  T  F | 0.94         F | 0.05        F | 0.01
  F  T | 0.29
  F  F | 0.001

Size of the CPT for a node with k parents: 2^k
10 probabilities, instead of 31
35
What does the BN encode?
P(b∧j) ≠ P(b) P(j)        P(b∧j|a) = P(b|a) P(j|a)
  • Each of the beliefs JohnCalls and MaryCalls is
    independent of Burglary and Earthquake given
    Alarm or ¬Alarm

For example, John does not observe any
burglaries directly
36
What does the BN encode?
P(b∧j|a) = P(b|a) P(j|a)        P(j∧m|a) = P(j|a) P(m|a)
A node is independent of its non-descendants
given its parents
  • The beliefs JohnCalls and MaryCalls are
    independent given Alarm or ¬Alarm

For instance, the reasons why John and Mary may
not call if there is an alarm are unrelated
37
What does the BN encode?
Burglary and Earthquake are independent
A node is independent of its non-descendants
given its parents
  • The beliefs JohnCalls and MaryCalls are
    independent given Alarm or ¬Alarm

For instance, the reasons why John and Mary may
not call if there is an alarm are unrelated
38
Locally Structured World
  • A world is locally structured (or sparse) if each
    of its components interacts directly with
    relatively few other components
  • In a sparse world, the CPTs are small and the BN
    contains far fewer probabilities than the full
    joint distribution
  • If the number of entries in each CPT is bounded by
    a constant, i.e., O(1), then the number of
    probabilities in a BN is linear in n, the number
    of propositions, instead of 2^n for the joint
    distribution

39
But does a BN represent a belief state? In other
words, can we compute the full joint distribution
of the propositions from it?
40
Calculation of Joint Probability
P(j∧m∧a∧¬b∧¬e) = ??

Network: Burglary → Alarm ← Earthquake,  Alarm → JohnCalls,  Alarm → MaryCalls
P(B) = 0.001      P(E) = 0.002
P(A|B,E):  B=T,E=T: 0.95   B=T,E=F: 0.94   B=F,E=T: 0.29   B=F,E=F: 0.001
P(J|A):    A=T: 0.90   A=F: 0.05
P(M|A):    A=T: 0.70   A=F: 0.01
41
  • P(j∧m∧a∧¬b∧¬e)
    = P(j∧m|a,¬b,¬e) × P(a∧¬b∧¬e)
    = P(j|a,¬b,¬e) × P(m|a,¬b,¬e) × P(a∧¬b∧¬e)
    (J and M are independent given A)
  • P(j|a,¬b,¬e) = P(j|a)
    (J and B, and J and E, are independent given A)
  • P(m|a,¬b,¬e) = P(m|a)
  • P(a∧¬b∧¬e) = P(a|¬b,¬e) × P(¬b∧¬e)
    = P(a|¬b,¬e) × P(¬b) × P(¬e)
    (B and E are independent)
  • P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
    (see the sketch below)
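
A minimal Python sketch of this calculation using the CPTs from
slide 34; the dictionaries and names are my own.

  P_B = 0.001
  P_E = 0.002
  P_A = {(True, True): 0.95, (True, False): 0.94,
         (False, True): 0.29, (False, False): 0.001}   # P(A | B, E)
  P_J = {True: 0.90, False: 0.05}                       # P(J | A)
  P_M = {True: 0.70, False: 0.01}                       # P(M | A)

  def alarm_joint(j, m, a, b, e):
      # P(J,M,A,B,E) = P(J|A) P(M|A) P(A|B,E) P(B) P(E)
      p  = P_J[a]      if j else 1 - P_J[a]
      p *= P_M[a]      if m else 1 - P_M[a]
      p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
      p *= P_B         if b else 1 - P_B
      p *= P_E         if e else 1 - P_E
      return p

  print(alarm_joint(True, True, True, False, False))   # ~ 0.00062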

42
Calculation of Joint Probability
P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
               = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062

(Network and CPTs as on slide 34)
43
Calculation of Joint Probability
P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
               = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062

(Network and CPTs as on slide 34)
44
Calculation of Joint Probability
Since a BN defines the full joint distribution of
a set of propositions, it represents a belief
state
P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
               = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062

(Network and CPTs as on slide 34)
45
Querying the BN
  • The BN gives P(t|c)
  • What about P(c|t)?
  • P(Cavity|t) = P(Cavity ∧ t)/P(t)
    = P(t|Cavity) P(Cavity) / P(t)   (Bayes' rule)
  • P(c|t) = α P(t|c) P(c)
  • Querying a BN is just applying the trivial Bayes'
    rule on a larger scale; algorithms next time
    (see the sketch below)

P(C) = 0.1
P(T|C):   C=T: 0.4    C=F: 0.01111
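
A minimal sketch of this query on the small Cavity → Toothache network
above (P(C) = 0.1, P(T|C=true) = 0.4, P(T|C=false) = 0.01111); the
normalization constant α is computed explicitly, and the names are my own.

  P_C = 0.1
  P_T = {True: 0.4, False: 0.01111}    # P(T | C = c)

  # P(c|t) = alpha * P(t|c) P(c): compute unnormalized terms, then normalize.
  unnorm = {c: P_T[c] * (P_C if c else 1 - P_C) for c in (True, False)}
  alpha = 1.0 / sum(unnorm.values())
  posterior = {c: alpha * v for c, v in unnorm.items()}
  print(posterior[True])   # P(C = true | T = true), about 0.8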
46
More Complicated Singly-Connected Belief Net
47
Some Applications of BN
  • Medical diagnosis
  • Troubleshooting of hardware/software systems
  • Fraud/uncollectible debt detection
  • Data mining
  • Analysis of genetic sequences
  • Data interpretation, computer vision, image
    understanding

48
Example: image-understanding network with region variables R1, R2, R3,
R4, each taking a value in {Sky, Tree, Grass, Rock}, linked by "Above"
relations between adjacent regions
49
BN to evaluate insurance risks
50
Purposes of Bayesian Networks
  • Efficient and intuitive modeling of complex
    causal interactions
  • Compact representation of joint distributions:
    O(n) rather than O(2^n)
  • Algorithms for efficient inference with new
    evidence (more on this next time)

51
Reminder
  • HW4 due on Thursday!