Probabilistic Reasoning and Bayesian Belief Networks

1
  • Chapter 12
  • Probabilistic Reasoning and Bayesian Belief
    Networks

2
An Exercise
  • You are working with a bit string of length 8.
  • You are interested in two events...
  • A: First two bits are '11'
  • B: There are at least two consecutive 0s
  • Suppose you calculate
  • P(A) = 64/256
  • P(B) = 201/256
  • P(neither A nor B) = 34/256
  • Calculate
  • P(A ∧ B) = ?
  • P(A ∨ B) = ?
  • P(A | B) = ?
  • P(B | A) = ?

3
An Exercise
  • You are working with a bit string of length 8.
  • You are interested in two events...
  • A: First two bits are '11'
  • B: There are at least two consecutive 0s
  • Suppose you calculate
  • P(A) = 64/256
  • P(B) = 201/256
  • P(neither A nor B) = 34/256
  • Calculate
  • P(A ∧ B) = 43/256
  • P(A ∨ B) = 222/256
  • P(A | B) = 43/201
  • P(B | A) = 43/64
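These counts can be verified by brute-force enumeration of all 256 bit strings. A minimal Python sketch (the function names are just for illustration):

from itertools import product

def first_two_are_ones(bits):
    # Event A: the first two bits are '11'
    return bits[0] == 1 and bits[1] == 1

def has_two_consecutive_zeros(bits):
    # Event B: at least two consecutive 0s somewhere in the string
    return any(bits[i] == 0 and bits[i + 1] == 0 for i in range(len(bits) - 1))

n_a = n_b = n_ab = n_neither = 0
for bits in product((0, 1), repeat=8):       # all 256 bit strings of length 8
    a = first_two_are_ones(bits)
    b = has_two_consecutive_zeros(bits)
    n_a += a
    n_b += b
    n_ab += a and b
    n_neither += (not a) and (not b)

print(n_a, n_b, n_ab, n_neither)             # 64 201 43 34
print("P(A | B) =", n_ab, "/", n_b)          # 43 / 201
print("P(B | A) =", n_ab, "/", n_a)          # 43 / 64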

4
Bayes' Rule
  • P(A ∧ B) = P(A | B) P(B) = P(B | A) P(A)
  • Rearranging gives Bayes' rule: P(B | A) = P(A | B) P(B) / P(A)
  • Bayes' rule is extremely useful for inferring the
    probability of a diagnosis (the cause) when the
    probability of the symptom given that cause is
    known.

5
Another Exercise
A doctor knows that the disease meningitis causes
the patient to have a stiff neck 50% of the time.
The doctor also knows that the probability that a
patient has meningitis is 1/50,000, and the
probability that any patient has a stiff neck is
1/20. Find the probability that a patient with a
stiff neck has meningitis.
6
Bayes' rule and its use: An Exercise
A doctor knows that the disease meningitis causes
the patient to have a stiff neck 50% of the time.
The doctor also knows that the probability that a
patient has meningitis is 1/50,000, and the
probability that any patient has a stiff neck is
1/20. Find the probability that a patient with a
stiff neck has meningitis.
P(M | S) = P(S | M) P(M) / P(S) = 0.5 × 0.00002 / 0.05 = 0.0002
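A quick check of this arithmetic in Python (variable names are illustrative):

# Bayes' rule: P(M | S) = P(S | M) P(M) / P(S)
p_s_given_m = 0.5          # P(S | M): stiff neck given meningitis
p_m = 1 / 50_000           # P(M): prior probability of meningitis
p_s = 1 / 20               # P(S): prior probability of a stiff neck
print(p_s_given_m * p_m / p_s)   # prints roughly 0.0002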
7
Definition
  • A Bayesian network is a directed acyclic graph
    which consists of
  • A set of random variables which makes up the
    nodes of the network.
  • A set of directed links (arrows) connecting pairs
    of nodes. If there is an arrow from node X to
    node Y, X is said to be a parent of Y.
  • Each node Xi has a conditional probability
    distribution P(Xi | Parents(Xi)) that quantifies
    the effect of the parents on the node (see the
    sketch below).

8
Definition
  • Intuitions
  • A Bayesian network models our incomplete
    understanding of the causal relationships in an
    application domain.
  • A node represents some state of affairs or event.
  • A link from X to Y means that X has a direct
    influence on Y.

9-11
(Figure slides; no transcript)
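The untranscribed figure slides presumably show the burglary/earthquake alarm network used in the computation below. A minimal Python sketch of how such a network (nodes, links, and conditional probability tables) could be represented; only the probability entries that appear on the later slides are filled in:

# Each node maps to its list of parents (the directed links of the DAG).
parents = {
    "Burglary":   [],
    "Earthquake": [],
    "Alarm":      ["Burglary", "Earthquake"],
    "JohnCalls":  ["Alarm"],
    "MaryCalls":  ["Alarm"],
}

# Each node also carries P(node | parents), keyed by the parents' values.
# Only the entries used on the later slides are listed here.
cpt = {
    "Burglary":   {(): 0.001},               # P(b) = 1 - 0.999
    "Earthquake": {(): 0.002},               # P(e) = 1 - 0.998
    "Alarm":      {(False, False): 0.001},   # P(a | ¬b, ¬e)
    "JohnCalls":  {(True,): 0.90},           # P(j | a)
    "MaryCalls":  {(True,): 0.70},           # P(m | a)
}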
12
The probabilities associated with the nodes
reflect our representation of the causal
relationships.
13
A Bayesian network provides a complete
description of the domain in the sense that one can
compute the probability of any state of the world
(represented as a particular assignment to each
variable).
Example What is the probability that the alarm
has sounded, but neither burglary nor an
earthquake has occurred, and both John and Mary
call?
P(j, m, a, ¬b, ¬e) = ?
14
A Bayesian network provides a complete
description of the domain in the sense that one can
compute the probability of any state of the world
(represented as a particular assignment to each
variable).
Example What is the probability that the alarm
has sounded, but neither burglary nor an
earthquake has occurred, and both John and Mary
call?
P(j, m, a, ¬b, ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
= 0.90 × 0.70 × 0.001 × 0.999 × 0.998
≈ 0.00062
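The same product in Python, with the numbers from the slide:

p_j_given_a   = 0.90     # P(j | a)
p_m_given_a   = 0.70     # P(m | a)
p_a_given_nbe = 0.001    # P(a | ¬b, ¬e)
p_not_b       = 0.999    # P(¬b)
p_not_e       = 0.998    # P(¬e)

p = p_j_given_a * p_m_given_a * p_a_given_nbe * p_not_b * p_not_e
print(p)   # 0.000628..., reported on the slide as 0.00062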
16
In general
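The general rule (presumably what the slide showed here) is the chain rule for Bayesian networks: for variables X1, ..., Xn,
P(x1, ..., xn) = P(x1 | Parents(X1)) × ... × P(xn | Parents(Xn)),
i.e. the joint probability of any complete assignment is the product of each node's conditional probability given its parents. The burglary computation above is this product for one particular assignment of the five variables.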
17
Another Example (Charniak, 1991)
Suppose when I go home at night, I want to know
if my family is home before I try the doors.
(Perhaps the most convenient door to enter is
double locked when nobody is home.) Now, often
when my wife leaves the house, she turns on an
outdoor light. However, she sometimes turns on
this light if she is expecting a guest. Also, we
have a dog. When nobody is home, the dog is put
in the back yard. The same is true if the dog has
bowel troubles. Finally, if the dog is in the
backyard, I will probably hear her barking (or
what I think is her barking), but sometimes I can
be confused by other dogs barking.
18
Another Example (Charniak, 1991)
We may use this diagram to predict what will
happen (if my family goes out, the dog goes out)
or to infer causes from observed effects (if the
light is on and the dog is out, then my family is
probably out).
19
Another Example (Charniak, 1991)
  • The important thing to note about this example is
    that the causal connections are not absolute.
  • Often, my family will have left without putting
    out the dog or turning on a light.
  • Sometimes we can use these diagrams anyway, but
    in such cases, it is hard to know what to infer
    when not all the evidence points the same way.
    Should I assume the family is out if the light is
    on, but I do not hear the dog? What if I hear the
    dog, but the light is out?
  • If we knew the relevant probabilities, such as
    P(family-out | light-on, ¬hear-bark), then we
    would be all set. However, typically, such
    numbers are not available for all possible
    combinations of circumstances.
  • Bayesian networks allow us to calculate them from
    a small set of probabilities, relating only
    neighboring nodes (see the next slide and the
    sketch that follows it).

20
Another Example (Charniak, 1991)
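The figure on this slide (not transcribed) would give the network and its local probabilities. A minimal Python sketch of the kind of calculation it enables, inference by enumeration over the full joint, using purely illustrative CPT values in place of the ones on the figure:

from itertools import product

# Structure: family-out -> light-on; family-out, bowel-problem -> dog-out;
# dog-out -> hear-bark.  All numbers below are illustrative placeholders.
P_fo = 0.15                                     # P(family-out)
P_bp = 0.01                                     # P(bowel-problem)
P_lo = {True: 0.60, False: 0.05}                # P(light-on | family-out)
P_do = {(True, True): 0.99, (True, False): 0.90,
        (False, True): 0.97, (False, False): 0.30}  # P(dog-out | fo, bp)
P_hb = {True: 0.70, False: 0.01}                # P(hear-bark | dog-out)

def joint(fo, bp, lo, do, hb):
    # Full joint = product of each node's probability given its parents.
    p = P_fo if fo else 1 - P_fo
    p *= P_bp if bp else 1 - P_bp
    p *= P_lo[fo] if lo else 1 - P_lo[fo]
    p *= P_do[(fo, bp)] if do else 1 - P_do[(fo, bp)]
    p *= P_hb[do] if hb else 1 - P_hb[do]
    return p

# P(family-out | light-on, ¬hear-bark): sum out the hidden variables.
num = sum(joint(True, bp, True, do, False)
          for bp, do in product((True, False), repeat=2))
den = sum(joint(fo, bp, True, do, False)
          for fo, bp, do in product((True, False), repeat=3))
print(num / den)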