Title: Bayesian Inference
1 Bayesian Inference
- Artificial Intelligence
- CMSC 25000
- February 26, 2002
2 Agenda
- Motivation
- Reasoning with uncertainty
- Medical Informatics
- Probability and Bayes Rule
- Bayesian Networks
- Noisy-OR
- Decision Trees and Rationality
- Conclusions
3 Motivation
- Uncertainty in medical diagnosis
- Diseases produce symptoms
- In diagnosis, observed symptoms -> disease ID
- Uncertainties
- Symptoms may not occur
- Symptoms may not be reported
- Diagnostic tests not perfect
- False positive, false negative
- How do we estimate confidence?
4 Motivation II
- Uncertainty in medical decision-making
- Physicians, patients must decide on treatments
- Treatments may not be successful
- Treatments may have unpleasant side effects
- Choosing treatments
- Weigh risks of adverse outcomes
- People are BAD at reasoning intuitively about probabilities
- Provide systematic analysis
5 Probabilities Model Uncertainty
- The World - Features
- Random variables
- Feature values
- States of the world
- Assignments of values to variables
- Exponential in # of variables
- 2^n possible states
6 Probabilities of World States
- Joint probability of assignments
- States are distinct and exhaustive
- Typically care about SUBSET of assignments
- aka Circumstance
- Exponential in # of don't cares
7 A Simpler World
- 2^n world states: maximum entropy
- Know nothing about the world
- Many variables independent
- P(strep, ebola) = P(strep) P(ebola)
- Conditionally independent
- Depend on same factors but not on each other
- P(fever, cough | flu) = P(fever | flu) P(cough | flu)  (numeric sketch below)
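A minimal numeric sketch of the independence assumptions above; all probability values here are invented for illustration, not taken from the lecture.

```python
# Independent variables: the joint is just the product of marginals.
p_strep, p_ebola = 0.02, 1e-6
p_strep_and_ebola = p_strep * p_ebola               # P(strep, ebola)

# Conditionally independent symptoms given the disease:
#   P(fever, cough | flu) = P(fever | flu) * P(cough | flu)
p_fever_given_flu, p_cough_given_flu = 0.8, 0.7
p_fever_cough_given_flu = p_fever_given_flu * p_cough_given_flu

print(p_strep_and_ebola, p_fever_cough_given_flu)   # 2e-08 0.56
```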
8 Probabilistic Diagnosis
- Question
- How likely is a patient to have a disease if they have the symptoms?
- Probabilistic Model: Bayes Rule
- P(D|S) = P(S|D) P(D) / P(S)
- Where
- P(S|D): Probability of symptom given disease
- P(D): Prior probability of having disease
- P(S): Prior probability of having symptom
(worked numerically in the sketch below)
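A small sketch of diagnostic Bayes rule as stated above, expanding P(S) = P(S|D)P(D) + P(S|¬D)P(¬D); the disease prevalence and symptom rates are hypothetical numbers chosen only to show the calculation.

```python
# P(D|S) = P(S|D) P(D) / P(S), with P(S) expanded over D and not-D.

def posterior(p_s_given_d, p_d, p_s_given_not_d):
    """P(disease | symptom) for a single binary disease/symptom pair."""
    p_s = p_s_given_d * p_d + p_s_given_not_d * (1.0 - p_d)
    return p_s_given_d * p_d / p_s

# Symptom seen in 90% of disease cases, 5% of healthy people; 1% prior.
print(posterior(0.90, 0.01, 0.05))   # ~0.154: still unlikely despite the symptom
```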
9 Modeling (In)dependence
- Bayesian network
- Nodes: Variables
- Arcs: Child depends on parent(s)
- No arcs: independent (0 incoming: only a priori)
- Parents of X: nodes with arcs into X
- For each X, need P(X | parents(X))
10 Simple Bayesian Network
Need P(A), P(B|A), P(C|A), P(D|B,C), P(E|C)
Truth table sizes: 2, 2·2, 2·2, 2·2·2, 2·2
A only a priori; B depends on A; C depends on A; D depends on B,C; E depends on C
[Diagram: A -> B, A -> C, B -> D, C -> D, C -> E]
(joint factorization sketched below)
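A sketch of how the structure above lets the full joint be assembled from the five small tables; the table values are invented placeholders, only the factorization mirrors the slide.

```python
# Joint factorizes as P(A,B,C,D,E) = P(A) P(B|A) P(C|A) P(D|B,C) P(E|C).

def bern(p_true, value):
    return p_true if value else 1.0 - p_true

p_a = 0.3                                                  # P(A=t)
p_b_a = {True: 0.8, False: 0.1}                            # P(B=t | A)
p_c_a = {True: 0.5, False: 0.2}                            # P(C=t | A)
p_d_bc = {(True, True): 0.9, (True, False): 0.6,
          (False, True): 0.4, (False, False): 0.05}        # P(D=t | B,C)
p_e_c = {True: 0.7, False: 0.1}                            # P(E=t | C)

def joint(a, b, c, d, e):
    return (bern(p_a, a) * bern(p_b_a[a], b) * bern(p_c_a[a], c)
            * bern(p_d_bc[(b, c)], d) * bern(p_e_c[c], e))

# 1 + 2 + 2 + 4 + 2 = 11 numbers specify all 2^5 = 32 joint entries.
print(joint(True, True, False, True, False))
```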
11 Simplifying with Noisy-OR
- How many computations?
- p parents, k values per variable
- (k-1)·k^p entries
- Very expensive! 10 binary parents: 2^10 = 1024
- Reduce computation by simplifying model
- Treat each parent as possible independent cause
- Only 11 computations
- 10 causal probabilities + 1 leak probability
- Some other cause
12 Noisy-OR Example
[Diagram: A -> B]
P(B|A):
        b      ¬b
  a    0.6    0.4
  ¬a   0.5    0.5
Noisy-OR parameterization (causal strength ca, leak l):
  Pn(b|a)   = 1 - (1-ca)(1-l)
  Pn(¬b|a)  = (1-ca)(1-l)
  Pn(b|¬a)  = 1 - (1-l) = l = 0.5
  Pn(¬b|¬a) = (1-l)
Fitting ca from the table:
  Pn(b|a) = 1 - (1-ca)(1-l) = 0.6
  (1-ca)(1-l) = 0.4, so (1-ca) = 0.4/(1-l) = 0.4/0.5 = 0.8 and ca = 0.2
(reproduced in the sketch below)
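A sketch of the single-parent noisy-OR fit above, assuming the reconstructed slide values P(b|a) = 0.6 and leak l = 0.5; the helper name noisy_or_prob is illustrative.

```python
# Noisy-OR with one parent: P(b|a) = 1 - (1-ca)(1-l), P(b|~a) = l.

def noisy_or_prob(active_causes, leak):
    """P(effect = true) when the listed causal strengths are active."""
    q = 1.0 - leak
    for c in active_causes:
        q *= 1.0 - c
    return 1.0 - q

l = 0.5                                        # leak = P(b | ~a)
ca = 1.0 - (1.0 - 0.6) / (1.0 - l)             # 1 - 0.4/0.5 = 0.2
print(ca)                                      # 0.2
print(noisy_or_prob([ca], l))                  # reproduces P(b|a) = 0.6
```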
13 Noisy-OR Example II
Full model needs P(c|a,b), P(c|a,¬b), P(c|¬a,b), P(c|¬a,¬b) (plus their negations)
Noisy-OR needs only ca, cb, l
[Diagram: A -> C <- B]
Assume P(a) = 0.1, P(b) = 0.05, P(c|¬a,¬b) = 0.3 (so l = 0.3), ca = 0.5, P(c|b) = 0.7
Noisy-OR parameterization:
  Pn(c|a,b)   = 1 - (1-ca)(1-cb)(1-l)
  Pn(c|¬a,b)  = 1 - (1-cb)(1-l)
  Pn(c|a,¬b)  = 1 - (1-ca)(1-l)
  Pn(c|¬a,¬b) = 1 - (1-l) = l = 0.3
Solving for cb from P(c|b), marginalizing over A:
  Pn(c|b) = Pn(c|a,b) P(a) + Pn(c|¬a,b) P(¬a)
  1 - 0.7 = (1-ca)(1-cb)(1-l)(0.1) + (1-cb)(1-l)(0.9)
      0.3 = 0.5(1-cb)(0.07) + (1-cb)(0.7)(0.9)
          = 0.035(1-cb) + 0.63(1-cb) = 0.665(1-cb)
       cb ≈ 0.55
(numeric check in the sketch below)
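A numeric check of the cb derivation above, under the same assumptions (ca = 0.5, l = 0.3, P(a) = 0.1, observed P(c|b) = 0.7).

```python
ca, l, p_a, p_c_given_b = 0.5, 0.3, 0.1, 0.7

# 1 - P(c|b) = (1-ca)(1-cb)(1-l) P(a) + (1-cb)(1-l) P(~a) = coeff * (1-cb)
coeff = (1 - ca) * (1 - l) * p_a + (1 - l) * (1 - p_a)   # 0.035 + 0.63 = 0.665
cb = 1 - (1 - p_c_given_b) / coeff
print(round(cb, 3))                                      # ~0.549

# Sanity check: recompute P(c|b) from the fitted noisy-OR parameters.
p_c_ab = 1 - (1 - ca) * (1 - cb) * (1 - l)
p_c_nab = 1 - (1 - cb) * (1 - l)
print(round(p_c_ab * p_a + p_c_nab * (1 - p_a), 3))      # 0.7
```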
14 Graph Models
- Bipartite graphs
- E.g. medical reasoning
- Generally, diseases cause symptoms (not reverse)
[Diagram: bipartite graph with disease nodes d1-d4 pointing to symptom nodes s1-s6]
15 Topologies
- Generally more complex
- Polytree: one path between any two nodes
- General Bayes Nets
- Graphs with undirected cycles
- No directed cycles - can't be its own cause
- Issue: Automatic net acquisition
- Update probabilities by observing data
- Learn topology: use statistical evidence of independence, heuristic search to find most probable structure
16 Decision Making
- Design model of rational decision making
- Maximize expected value among alternatives
- Uncertainty from
- Outcomes of actions
- Choices taken
- To maximize outcome
- Select maximum over choices
- Weighted average value of chance outcomes
17 Gangrene Example
First decision: Medicine vs. Amputate foot
  Medicine:      Full Recovery 0.7 -> 1000;  Worse 0.25 -> second decision;  Die 0.05 -> 0
  Amputate foot: Live 0.99 -> 850;  Die 0.01 -> 0
Second decision (if worse after medicine): Medicine vs. Amputate leg
  Medicine:      Live 0.6 -> 995;  Die 0.4 -> 0
  Amputate leg:  Live 0.98 -> 700;  Die 0.02 -> 0
(expected values folded back in the sketch below)
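A sketch of folding back the decision tree, assuming the branch/value pairings in the reconstruction above: average over chance nodes, maximize over decision nodes, as described on the previous slide.

```python
def expected(outcomes):
    """Expected value of a chance node given (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# Second decision (only reached if the patient is worse after medicine).
second_best = max(expected([(0.6, 995), (0.4, 0)]),      # medicine: 597.0
                  expected([(0.98, 700), (0.02, 0)]))    # amputate leg: 686.0

# First decision.
medicine = expected([(0.7, 1000), (0.25, second_best), (0.05, 0)])   # 871.5
amputate_foot = expected([(0.99, 850), (0.01, 0)])                   # 841.5

print(medicine, amputate_foot)   # medicine first has the higher expected value
```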
18 Decision Tree Issues
- Problem 1: Tree size
- k activities: 2^k orders
- Solution 1: Hill-climbing
- Choose best apparent choice after one step
- Use entropy reduction
- Problem 2: Utility values
- Difficult to estimate; sensitivity; duration
- Change value depending on phrasing of question
- Solution 2c: Model effect of outcome over lifetime
19 Conclusion
- Reasoning with uncertainty
- Many real systems uncertain - e.g. medical diagnosis
- Bayes Nets
- Model (in)dependence relations in reasoning
- Noisy-OR simplifies model/computation
- Assumes causes independent
- Decision Trees
- Model rational decision making
- Maximize outcome: max over choices, average over chance outcomes
20 Holmes Example (Pearl)
Holmes is worried that his house will be burgled. For the time period of
interest, there is a 10^-4 a priori chance of this happening, and Holmes has
installed a burglar alarm to try to forestall this event. The alarm is 95%
reliable in sounding when a burglary happens, but also has a false positive
rate of 1%. Holmes' neighbor, Watson, is 90% sure to call Holmes at his
office if the alarm sounds, but he is also a bit of a practical joker and,
knowing Holmes' concern, might (30%) call even if the alarm is silent.
Holmes' other neighbor, Mrs. Gibbons, is a well-known lush and often
befuddled, but Holmes believes that she is four times more likely to call
him if there is an alarm than not.
21 Holmes Example Model
There are four binary random variables:
  B: whether Holmes' house has been burgled
  A: whether his alarm sounded
  W: whether Watson called
  G: whether Gibbons called
[Diagram: B -> A, A -> W, A -> G]
22 Holmes Example Tables
P(B):
          B=t       B=f
        0.0001    0.9999

P(A|B):     A=t     A=f
  B=t      0.95    0.05
  B=f      0.01    0.99

P(W|A):     W=t     W=f
  A=t      0.90    0.10
  A=f      0.30    0.70

P(G|A):     G=t     G=f
  A=t      0.40    0.60
  A=f      0.10    0.90
(used by the enumeration sketch below)
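A sketch of exact inference by enumeration over the Holmes network (B -> A, A -> W, A -> G) using the tables above; the query function name is illustrative.

```python
p_b = {True: 0.0001, False: 0.9999}        # P(B)
p_a_given_b = {True: 0.95, False: 0.01}    # P(A=t | B)
p_w_given_a = {True: 0.90, False: 0.30}    # P(W=t | A)
p_g_given_a = {True: 0.40, False: 0.10}    # P(G=t | A)

def bern(p_true, value):
    return p_true if value else 1.0 - p_true

def joint(b, a, w, g):
    return (p_b[b] * bern(p_a_given_b[b], a)
            * bern(p_w_given_a[a], w) * bern(p_g_given_a[a], g))

def posterior_burglary(w, g=None):
    """P(B=true | W=w [, G=g]), marginalizing over unobserved variables."""
    num = den = 0.0
    for b in (True, False):
        for a in (True, False):
            for gv in (True, False) if g is None else (g,):
                p = joint(b, a, w, gv)
                den += p
                if b:
                    num += p
    return num / den

print(posterior_burglary(w=True))          # ~2.8e-4: Watson alone barely moves the prior
print(posterior_burglary(w=True, g=True))  # ~1e-3: higher if Gibbons also calls
```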