1. Imprecise Probabilities and Their Role in General Intelligence: A Pragmatic Approach to Calculating Weight of Evidence Combining Imprecise Probabilities and Confidence Intervals
Dr. Matthew Iklé, Department of Mathematics and Computer Science, Adams State College
2. Probability Theory
- A principled foundation for artificial general intelligence
- BUT constraints are placed by:
- The need to operate within realistic computational resources
- The current, incomplete state of probabilistic mathematics
- THUS probability theory requires augmentation with heuristic approaches to be pragmatic for general intelligence
3. Probabilistic Logic Networks (PLN)
- A logical inference system
- Combines rigorous probabilistic formulas with heuristic rules
- Reasoning based on uncertain knowledge and/or reasoning leading to uncertain conclusions
- Ability to encompass within logic things such as induction, abduction, analogy and speculation, and reasoning about time and causality
- Effectively propagates uncertainties through complex inferences involving quantifiers, higher-order functions, etc.
- Designed for integration with a general-purpose cognition process (in the Novamente AI system)
4. Probabilistic Logic Networks (PLN)
- A rich set of inference rules
- Deduction, Bayes' rule, unification, intensional/extensional inference, belief revision, ...
- Each rule comes with uncertain truth-value formulas, calculating the truth value of the conclusion from the truth values of the premises (a hedged sketch of one such formula follows this slide)
- Inference is controlled by highly flexible forward- and backward-chaining processes able to take feedback from external processes and thus behave adaptively
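As an illustration of such a truth-value formula, the Python sketch below implements the independence-based deduction strength formula commonly cited in the PLN literature; the exact formula used in this talk is not shown on the slides, so the function and its example values should be read as assumptions.

    def deduction_strength(s_ab, s_bc, s_b, s_c):
        # Estimate sAC = P(C|A) from sAB = P(B|A), sBC = P(C|B) and the term
        # probabilities sB = P(B), sC = P(C), assuming independence outside B:
        #   sAC = sAB*sBC + (1 - sAB)*(sC - sB*sBC)/(1 - sB)
        return s_ab * s_bc + (1.0 - s_ab) * (s_c - s_b * s_bc) / (1.0 - s_b)

    # Illustrative premise strengths only
    print(deduction_strength(s_ab=0.8, s_bc=0.9, s_b=0.3, s_c=0.4))  # ~0.757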
5. Belief Revision
- One simple, but critical, rule within PLN and other uncertain inference systems
- Allows the combination of two different estimates of the truth value of the same proposition, to form a composite estimate
- Is awkwardly handled within standard probabilistic approaches
- Different estimates may come from different external sources, OR from different internal inference trails
6. Belief Revision: A Heuristic Rule
- <s, d> -- <strength, weight of evidence>
- Count n and confidence d are related by n = k·d/(1 - d) and d = n/(n + k); assume k = 10 (a worked sketch follows this slide)
- eat(cat, mouse) <.8, .7> // data source 1
- eat(cat, mouse) <.2, .4> // data source 2
- eat(cat, mouse) <.58, .75> // sources 1 and 2 combined
- Strength: (.8 × .7 + .2 × .4)/(.7 + .4) ≈ .58
- d1 = .7 --> N1 ≈ 23
- d2 = .4 --> N2 ≈ 7
- N = N1 + N2 = 30 (assuming no dependence)
- d = N/(N + k) = 30/40 = .75
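A minimal Python sketch of this revision heuristic, using the slide's conversions d = n/(n + k) and n = k·d/(1 - d) with k = 10; the function names are illustrative and not taken from any actual PLN implementation.

    K = 10  # default evidence parameter, from the slide's "assume k = 10"

    def count_from_confidence(d, k=K):
        # Invert d = n/(n + k):  n = k*d/(1 - d)
        return k * d / (1.0 - d)

    def confidence_from_count(n, k=K):
        # d = n/(n + k)
        return n / (n + k)

    def revise(s1, d1, s2, d2, k=K):
        # Strength: confidence-weighted average of the two strengths
        s = (s1 * d1 + s2 * d2) / (d1 + d2)
        # Count: add the evidence counts, assuming the two sources are independent
        n = count_from_confidence(d1, k) + count_from_confidence(d2, k)
        return s, confidence_from_count(n, k)

    # eat(cat, mouse): <.8, .7> from source 1 and <.2, .4> from source 2
    print(revise(0.8, 0.7, 0.2, 0.4))  # ~(0.58, 0.75), matching the slide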
7. Weight of Evidence
- What is it?
- Why is it important? (e.g., for belief revision)
- One approach: let interval width represent weight of evidence
- [.2, .8] means less evidence than [.4, .6]
- Pei Wang's NARS system
- Imprecise probabilities
- Heuristic approaches (Izabela Freire Goertzel)
8. Imprecise Probabilities
- The foundation of one approach to weight-of-evidence calculations within PLN
- Peter Walley's Imprecise Beta-Binomial (IBB) theory, developed in his seminal 1991 work, Statistical Reasoning with Imprecise Probabilities
- Uses a parametrized envelope of (Beta-distribution) priors rather than assuming a single prior (see the sketch below)
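To make the envelope-of-priors idea concrete, here is a small Python sketch of the posterior interval produced by an imprecise beta model, assuming Walley's usual parametrization Beta(s·t, s·(1 - t)) with t ranging over (0, 1); the learning parameter s = 2 and the example counts are assumptions chosen only for illustration.

    def imprecise_beta_interval(successes, trials, s=2.0):
        # Posterior mean under prior Beta(s*t, s*(1-t)) is (successes + s*t)/(trials + s);
        # letting t range over (0, 1) gives the lower/upper envelope.
        lower = successes / (trials + s)          # limit as t -> 0
        upper = (successes + s) / (trials + s)    # limit as t -> 1
        return lower, upper

    # e.g., 5 blue gerbils observed among 10 gerbils of known color
    print(imprecise_beta_interval(5, 10))  # ~(0.417, 0.583) with s = 2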
9. Imprecise Probabilities
- Advantages of imprecise probabilities in general:
- Avoid the weakness of the traditional approach to statistics, with its reliance on often unmotivated assumptions regarding the functional forms of probability distributions
- More natural and consistent with uncertain and incomplete information
- Standard Bayesian methods offer no generally viable way to assess or reason about second-order uncertainties or weight of evidence (eloquently pointed out by Pei Wang)
10. Imprecise Probabilities Are Not (Quite) the Entire Answer
- Disadvantages of imprecise probabilities:
- Overly conservative
- Profess ignorance rather than giving guidance for practical decision-making
- Even with significant information to the contrary, imprecise probability intervals rapidly expand to [0, 1]
11. The PLN Approach
- A hybrid of imprecise probabilities and traditional confidence intervals
- Walley's key ideas provide a solid foundation
- Natural generalization of Walley's parametrized distributions
- All distributions replaced by envelopes of distributions
12. Three Basic Stages
- To calculate the weight of evidence associated with the conclusion of an uncertain inference rule (e.g., deduction, Bayes' rule, ...):
- Translate premise strength (probability) values s, count (weight-of-evidence) values n, and standard confidence levels b into initial intervals [L, U]
- Calculate the final [L, U] interval using the inference rule and Monte Carlo methods (see next slide)
- Translate the final [L, U] interval back to get final strength, count, and confidence-level values
13. Monte Carlo Methods
[Diagram: at the AI-engine level, premise (strength, count, confidence level) triples pass through a translation layer to initial probability intervals at the imprecise-probability level; Monte Carlo evaluation of the inference rule yields a final probability interval, which the translation layer converts back into a (strength, count, confidence level) triple. A hedged code sketch of this pipeline follows.]
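The following Python sketch strings the three stages together under simplifying assumptions: stage 1 uses a plain Beta credible interval in place of the paper's parametrized envelope of priors, stage 2 samples premise values uniformly from their intervals, and stage 3 maps interval width back to a count via the k = 10 heuristic from the belief-revision slide. It is meant only to make the pipeline concrete, not to reproduce the actual PLN formulas.

    import numpy as np

    rng = np.random.default_rng(0)
    K = 10  # evidence parameter, as on the belief-revision slide

    def to_interval(s, n, b, samples=100_000):
        # Stage 1 (sketch): (strength, count, confidence level b) -> [L, U].
        # Treat s*n as observed successes out of n trials and take the
        # central b-credible interval of a Beta posterior.
        x = s * n
        draws = rng.beta(x + 1.0, n - x + 1.0, samples)
        lo, hi = np.quantile(draws, [(1 - b) / 2, (1 + b) / 2])
        return lo, hi

    def monte_carlo_rule(rule, intervals, b, samples=100_000):
        # Stage 2 (sketch): sample premise probabilities from their intervals,
        # push them through the inference rule, take a central b-quantile interval.
        premise_draws = [rng.uniform(lo, hi, samples) for lo, hi in intervals]
        results = np.clip(rule(*premise_draws), 0.0, 1.0)
        lo, hi = np.quantile(results, [(1 - b) / 2, (1 + b) / 2])
        return lo, hi

    def from_interval(lo, hi, b):
        # Stage 3 (sketch): [L, U] -> (strength, count, confidence level).
        s = (lo + hi) / 2.0                    # midpoint as strength
        d = max(1.0 - (hi - lo), 0.0)          # narrower interval = more evidence
        n = K * d / (1.0 - d) if d < 1.0 else float("inf")
        return s, n, b

    # Toy two-premise rule standing in for a real PLN inference rule
    rule = lambda p1, p2: p1 * p2
    premises = [to_interval(0.8, 20, 0.9), to_interval(0.6, 15, 0.9)]
    print(from_interval(*monte_carlo_rule(rule, premises, 0.9), 0.9))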
14. An Example
- Suppose we have:
- 100 gerbils of unknown color,
- 10 gerbils of known color, 5 of which are blue,
- and 100 rats of known color, 10 of which are blue.
- We wish to estimate the probability of a randomly chosen blue rodent being a gerbil, using Bayes' rule (a rough point estimate is sketched below):
- P(gerbil | blue) = ?
15. Experimental Results: PLN Approach
16. Experimental Results: Bayesian (Standard Confidence Interval) Approach and Walley's Approach
17. The PLN Approach
- Advantages of the PLN hybrid method:
- Introduction of traditional Bayesian confidence intervals at each stage provides an easily configurable way to control the expansion of the probability intervals
- Both Walley's IBB theory and standard Bayesian inference follow from the PLN approach as special cases
- Allows for the modeling of all probabilities by any family of distributions
- Allows for considerably more flexibility in accounting for known and unknown quantities
18. Conclusions
- The PLN hybrid method:
- Combines the solid philosophical underpinnings of imprecise probability theory with the practicality of standard Bayesian methods
- Provides the ability to adjust interval widths based on confidence levels
- Interoperates smoothly with non-probabilistic heuristic methods
19. QUESTIONS?