1
Artificial Intelligence: A Modern Approach
Uncertainty
2005-2006 Winter Break Seminar
  • Hyoseok Yoon
  • GIST U-VR Lab.
  • Gwangju 500-712
  • http://uvr.gist.ac.kr

2
Outline
  • Acting Under Uncertainty
  • Basic Probability Notation
  • The Axioms of Probability
  • Bayes Rule and Its Use
  • Where Do Probabilities Come From?
  • Relations to Interpreter

3
Acting Under Uncertainty
  • Problem with the logical-agent approach:
    agents almost never have access to the whole
    truth about their environment, so there will be
    cases for which the agent cannot find a
    categorical answer
  • Qualification problem: many rules about the
    domain will be incomplete, because there are too
    many conditions to be explicitly enumerated, or
    because some of the conditions are unknown
  • The right thing to do, the rational decision,
    therefore depends on both the relative importance
    of the various goals and the likelihood that, and
    degree to which, they will be achieved

4
Handling uncertain knowledge
  • Using first-order logic, we would have to add an
    almost unlimited list of possible causes
  • Trying to use first-order logic to cope with a
    domain like medical diagnosis thus fails for
    three main reasons:
  • Laziness: it is too much work to list the
    complete set of antecedents or consequents needed
    to ensure an exceptionless rule, and too hard to
    use the enormous rules that result
  • Theoretical ignorance: medical science has no
    complete theory for the domain
  • Practical ignorance: even if we know all the
    rules, we may be uncertain about a particular
    patient because all the necessary tests have not
    been, or cannot be, run

5
Handling uncertain knowledge (contd)
  • The agent's knowledge can at best provide only a
    degree of belief in the relevant sentences
  • Probability provides a way of summarizing the
    uncertainty that comes from our laziness and
    ignorance
  • Evidence: the percepts the agent has received to
    date
  • An assignment of probability to a proposition is
    analogous to saying whether or not a given
    logical sentence (or its negation) is entailed by
    the knowledge base
  • As the agent receives new percepts, its
    probability assessments are updated to reflect
    the new evidence.
  • Before the evidence is obtained, we talk about
    prior or unconditional probability
  • After the evidence is obtained, we talk about
    posterior or conditional probability

6
Uncertainty and rational decisions
  • An agent must first have preferences between the
    different possible outcomes of the various plans
  • Use utility theory to represent and reason with
    preferences. The term utility is used here in the
    sense of the quality of being useful
  • Utility theory says that every state has a degree
    of usefulness, or utility, to an agent, and that
    the agent will prefer states with higher utility
  • Preferences, as expressed by utilities, are
    combined with probabilities in the general theory
    of rational decisions called decision theory
  • Decision theory = probability theory + utility
    theory
  • The fundamental idea of decision theory:
  • An agent is rational if and only if it chooses
    the action that yields the highest expected
    utility, averaged over all the possible outcomes
    of the action: the principle of Maximum Expected
    Utility (MEU)
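The MEU principle can be sketched in a few lines of Python; the actions, probabilities, and utilities below are made-up illustrative numbers, not anything given on the slide:

```python
# MEU sketch: pick the action whose probability-weighted utility is highest.
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

def meu_action(actions):
    """Return the action name with the highest expected utility."""
    return max(actions, key=lambda name: expected_utility(actions[name]))

# Hypothetical decision: treat a patient now, or wait for more tests.
actions = {
    "treat_now": [(0.7, 10), (0.3, -5)],  # EU = 0.7*10 + 0.3*(-5) = 5.5
    "wait":      [(0.4, 8),  (0.6, 0)],   # EU = 0.4*8            = 3.2
}
best = meu_action(actions)  # "treat_now"
```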

7
Design for a decision-theoretic agent
  • The structure of an agent that uses decision
    theory to select actions is identical, at an
    abstract level, to that of the logical agent

8
Basic Probability Notation
  • Prior probability
  • The notation P(A) is used for the unconditional
    or prior probability that the proposition A is
    true
  • Ex) P(Cavity) = 0.1 means that in the absence
    of any other information, the agent will assign a
    probability of 0.1 (a 10% chance) to the event of
    the patient having a cavity
  • Propositions can also include equalities
    involving so-called random variables
  • Ex) P(Weather = Sunny) = 0.7
  • Each random variable X has a domain of possible
    values that it can take on; we usually deal with
    discrete sets of values
  • Use an expression such as P(Weather) to denote a
    vector of values for the probabilities of each
    individual state of the weather
  • Probability distribution for the random variable:
  • P(Weather) = <0.7, 0.2, 0.08, 0.02>
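The P(Weather) vector can be represented as a table from states to probabilities; the state names other than Sunny are assumptions, since the slide gives only the numbers:

```python
# Prior distribution for Weather; state names besides Sunny are assumed.
P_weather = {"Sunny": 0.7, "Rain": 0.2, "Cloudy": 0.08, "Snow": 0.02}

# A distribution assigns each state a value in [0, 1], and the values sum to 1.
assert all(0.0 <= p <= 1.0 for p in P_weather.values())
assert abs(sum(P_weather.values()) - 1.0) < 1e-9

p_sunny = P_weather["Sunny"]  # P(Weather = Sunny) = 0.7
```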

9
Basic Probability Notation (contd)
  • Conditional or posterior probability
  • Notation P(A|B) is used and read as "the
    probability of A given that all we know is B"
  • The defining relation is P(A|B) = P(A ∧ B) / P(B),
    whenever P(B) > 0
  • P(Cavity|Toothache) = 0.8 indicates that if a
    patient is observed to have a toothache, and no
    other information is yet available, then the
    probability of the patient having a cavity will
    be 0.8
  • In general, if we are interested in the
    probability of a proposition A, and we have
    accumulated evidence B, then the quantity we must
    calculate is P(A|B)
  • When a conditional probability is not available
    directly in the knowledge base, we must resort to
    probabilistic inference
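A conditional probability can be computed directly from a joint distribution as P(A|B) = P(A ∧ B) / P(B). The joint numbers below are illustrative (chosen so the result matches the slide's 0.8), not values stated on the slide:

```python
# Conditional probability from a small joint distribution over
# (cavity-state, toothache-state). Numbers are illustrative.
joint = {
    ("cavity", "toothache"): 0.04,
    ("cavity", "no_toothache"): 0.06,
    ("no_cavity", "toothache"): 0.01,
    ("no_cavity", "no_toothache"): 0.89,
}

def p(pred):
    """Probability of an event: sum the atomic events that satisfy pred."""
    return sum(pr for ev, pr in joint.items() if pred(ev))

p_toothache = p(lambda ev: ev[1] == "toothache")                 # 0.05
p_cavity_and_toothache = p(lambda ev: ev == ("cavity", "toothache"))
p_cavity_given_toothache = p_cavity_and_toothache / p_toothache  # 0.8
```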

10
The axioms of probability
  • It is normal to use a small set of axioms that
    constrain the probability assignments that an
    agent can make to a set of propositions
  • 1. All probabilities are between 0 and 1:
  • 0 ≤ P(A) ≤ 1
  • 2. Necessarily true (i.e., valid) propositions
    have probability 1, and necessarily false (i.e.,
    unsatisfiable) propositions have probability 0:
  • P(True) = 1, P(False) = 0
  • 3. The probability of a disjunction is given by
  • P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
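The disjunction axiom can be checked on a tiny model where A, B, and A ∧ B are all computed from one set of atomic events; the four probabilities are made up for illustration:

```python
# Verify axiom 3 (inclusion-exclusion) on made-up atomic-event probabilities.
atomic = {
    ("a", "b"): 0.2,
    ("a", "not_b"): 0.3,
    ("not_a", "b"): 0.1,
    ("not_a", "not_b"): 0.4,
}
assert abs(sum(atomic.values()) - 1.0) < 1e-9

p_a = atomic[("a", "b")] + atomic[("a", "not_b")]          # 0.5
p_b = atomic[("a", "b")] + atomic[("not_a", "b")]          # 0.3
p_a_and_b = atomic[("a", "b")]                             # 0.2
p_a_or_b = p_a + p_b - p_a_and_b                           # axiom 3
# Cross-check against a direct sum over the atomic events in A ∨ B:
assert abs(p_a_or_b - (0.2 + 0.3 + 0.1)) < 1e-9
```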

11
Why the axioms of probability are reasonable
  • One argument for the axioms of probability was
    first stated in 1931 by Bruno de Finetti
  • The key to de Finetti's argument is the
    connection between degrees of belief and actions,
    framed as a game between two agents
  • If Agent 1 expresses a set of degrees of belief
    that violate the axioms of probability theory,
    then there is a betting strategy for Agent 2 that
    guarantees that Agent 1 will lose money

12
The joint probability distribution
  • Completely specifies an agent's probability
    assignments to all propositions in the domain
  • A probabilistic model of a domain consists of a
    set of random variables X1, ..., Xn that can take
    on particular values with certain probabilities
  • An atomic event is an assignment of particular
    values to all the variables
  • i.e., a complete specification of the state of
    the domain
  • Used to compute any probabilistic statement we
    care to know about the domain, by expressing the
    statement as a disjunction of atomic events and
    adding up their probabilities
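The procedure described above, expressing a query as a disjunction of atomic events and summing their probabilities, can be sketched over two Boolean variables; the joint numbers are illustrative, not given on the slide:

```python
# Full joint distribution over (Cavity, Toothache); numbers are illustrative.
joint = {
    (True, True): 0.04,    # cavity and toothache
    (True, False): 0.06,
    (False, True): 0.01,
    (False, False): 0.89,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9

# P(Cavity ∨ Toothache): sum the atomic events where either holds.
p_cavity_or_toothache = sum(
    pr for (cavity, toothache), pr in joint.items() if cavity or toothache
)  # 0.04 + 0.06 + 0.01 = 0.11
```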

13
Bayes Rule and Its Use
  • Applying Bayes rule: the simple case
  • The disease meningitis causes the patient to
    have a stiff neck 50% of the time
  • Unconditional facts: the prior probability of a
    patient having meningitis is 1/50,000
  • The prior probability of any patient having a
    stiff neck is 1/20
  • Let S be the proposition that the patient has a
    stiff neck
  • Let M be the proposition that the patient has
    meningitis
  • Bayes rule then gives
  • P(M|S) = P(S|M) P(M) / P(S)
    = 0.5 × (1/50000) / (1/20) = 0.0002
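The same computation with the slide's numbers, written out:

```python
# Bayes rule, simple case: meningitis (M) and stiff neck (S).
p_s_given_m = 0.5    # P(S|M): meningitis causes a stiff neck 50% of the time
p_m = 1 / 50000      # P(M): prior probability of meningitis
p_s = 1 / 20         # P(S): prior probability of a stiff neck

# P(M|S) = P(S|M) P(M) / P(S)
p_m_given_s = p_s_given_m * p_m / p_s  # 0.0002
```

Even though meningitis explains the stiff neck half the time, the posterior stays tiny because the prior P(M) is so small.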

14
Normalization
  • Avoid direct assessment of the prior P(S) by
    considering an exhaustive set of cases
  • This process is called normalization, because it
    treats 1/P(S) as a normalizing constant that
    allows the conditional terms to sum to 1
  • In general:
  • P(Y|X) = α P(X|Y) P(Y)
  • where α is the normalization constant needed to
    make the entries in the table P(Y|X) sum to 1
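A sketch of normalization for the meningitis example: compute the unnormalized terms for M and ¬M and divide by their sum, so P(S) is never needed directly. P(S|¬M) below is an assumed value for illustration; the slide does not give it:

```python
# Normalization over the exhaustive cases M and not-M.
p_s_given_m = 0.5
p_m = 1 / 50000
p_s_given_not_m = 0.05   # assumption for illustration, not from the slide
p_not_m = 1 - p_m

term_m = p_s_given_m * p_m              # unnormalized P(M|S)
term_not_m = p_s_given_not_m * p_not_m  # unnormalized P(not-M|S)
alpha = 1 / (term_m + term_not_m)       # plays the role of 1/P(S)

p_m_given_s = alpha * term_m
p_not_m_given_s = alpha * term_not_m
assert abs(p_m_given_s + p_not_m_given_s - 1.0) < 1e-9
```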

15
Using Bayes rule: Combining evidence
  • The process of Bayesian updating incorporates
    evidence one piece at a time, modifying the
    previously held belief in the unknown variable
  • Beginning with Toothache, we have
  • P(Cavity|Toothache) = α P(Toothache|Cavity) P(Cavity)
  • When Catch is observed, we can apply Bayes rule
    with Toothache as the constant conditioning
    context:
  • P(Cavity|Toothache ∧ Catch)
    = α P(Catch|Toothache ∧ Cavity) P(Cavity|Toothache)
  • Thus, in Bayesian updating, as each new piece of
    evidence is observed, the belief in the unknown
    variable is multiplied by a factor that depends
    on the new evidence
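Sequential updating can be sketched as multiply-then-renormalize; the priors and likelihoods below are made-up numbers, assuming Toothache and Catch are conditionally independent given Cavity:

```python
# Bayesian updating: each piece of evidence multiplies the belief in each
# hypothesis by a likelihood factor, then the result is renormalized.
def update(belief, likelihoods):
    """belief, likelihoods: dicts over the hypotheses; returns the posterior."""
    unnorm = {h: belief[h] * likelihoods[h] for h in belief}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

belief = {"cavity": 0.1, "no_cavity": 0.9}                   # prior (made up)
belief = update(belief, {"cavity": 0.8, "no_cavity": 0.05})  # observe Toothache
belief = update(belief, {"cavity": 0.9, "no_cavity": 0.2})   # observe Catch
```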

16
Conditional independence
  • In the cavity example, the cavity is the direct
    cause of both the toothache and the probe
    catching in the tooth
  • Mathematically,
  • P(Toothache ∧ Catch | Cavity)
    = P(Toothache|Cavity) P(Catch|Cavity)
  • To say that X and Y are independent given Z, we
    write
  • P(X|Y,Z) = P(X|Z)
  • Bayes rule for multiple evidence is then
  • P(Z|X,Y) = α P(X|Z) P(Y|Z) P(Z)
  • where α is a normalization constant such that the
    entries in P(Z|X,Y) sum to 1
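Combining both pieces of evidence in one step under conditional independence can be sketched as follows; all numbers are illustrative, not from the slide:

```python
# Bayes rule for multiple evidence: P(Z|X,Y) = α P(X|Z) P(Y|Z) P(Z),
# assuming X (toothache) and Y (catch) are independent given Z (cavity).
p_z = {"cavity": 0.1, "no_cavity": 0.9}                 # prior (made up)
p_toothache_given_z = {"cavity": 0.8, "no_cavity": 0.05}
p_catch_given_z = {"cavity": 0.9, "no_cavity": 0.2}

unnorm = {
    z: p_toothache_given_z[z] * p_catch_given_z[z] * p_z[z]
    for z in p_z
}
alpha = 1 / sum(unnorm.values())
posterior = {z: alpha * u for z, u in unnorm.items()}
assert abs(sum(posterior.values()) - 1.0) < 1e-9
```

With these numbers the one-step combination gives the same posterior as updating on the two pieces of evidence sequentially, which is exactly what conditional independence guarantees.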

17
Where Do Probabilities Come From?
  • Source and status of probability numbers
  • Frequentist: the numbers can come from
    experiments
  • Ex) test 100 people and find 10 people with a
    cavity; the probability of a cavity is then about
    0.1
  • Objectivist: the probabilities are real aspects
    of the universe
  • Subjectivist: a way of characterizing an agent's
    beliefs, rather than having any external physical
    significance
  • Ex) allow a doctor or analyst to make up numbers

18
Summary
  • Uncertainty arises because of both laziness and
    ignorance. It is inescapable in complex, dynamic,
    or inaccessible worlds
  • Uncertainty means that many of the
    simplifications that are possible with deductive
    inference are no longer valid
  • Probabilities express the agent's inability to
    reach a definite decision regarding the truth of
    a sentence, and summarize the agent's beliefs
  • Basic probability statements include prior
    probabilities and conditional probabilities over
    simple and complex propositions

19
Summary (contd)
  • The axioms of probability specify constraints on
    reasonable assignments of probabilities to
    propositions. An agent that violates the axioms
    will behave irrationally in some circumstances
  • The joint probability distribution specifies the
    probability of each complete assignment of values
    to random variables. It is usually far too large
    to create or use
  • Bayes rule allows unknown probabilities to be
    computed from known, stable ones
  • In the general case, combining many pieces of
    evidence may require assessing a large number of
    conditional probabilities
  • Conditional independence brought about by direct
    causal relationships in the domain allows
    Bayesian updating to work effectively even with
    multiple pieces of evidence

20
Relations to interpreter
  • The interpreter has only a limited view of the
    environment
  • Through input context
  • Source context may not map categorically to
    target context
  • Different properties, characteristics,
    structures, ontologies
  • Bayesian approach in SOCAM
  • Support for using a probability-annotated
    context ontology (OWL) and BNs has been built
    into SOCAM. The context ontology with additional
    dependency markups is created and stored in a
    context database. SOCAM can translate this
    ontology into a BN
  • Prob(Status(John, Sleeping)) = 0.8

T. Gu, H. K. Pung, D. Q. Zhang, "A Bayesian
Approach for Dealing with Uncertain Contexts," in
Proc. of the 2nd Int'l Conf. on Pervasive Computing
(Pervasive 2004), Advances in Pervasive Computing,
vol. 176, Austria, Apr. 2004.
21
Dealing with uncertainty
  • T. Gu, H. K. Pung, D. Q. Zhang, "A Bayesian
    Approach for Dealing with Uncertain Contexts,"
    Proc. of the 2nd Int'l Conf. on Pervasive
    Computing (Pervasive 2004), Apr. 2004
  • B. A. Truong, Y.-K. Lee, S.-Y. Lee, "Modeling
    and Reasoning about Uncertainty in Context-Aware
    Systems," IEEE Int'l Conf. on e-Business
    Engineering (ICEBE 2005), pp. 102-109, 2005
  • A. Ranganathan, J. Al-Muhtadi, R. H. Campbell,
    "Reasoning about Uncertain Contexts in Pervasive
    Computing Environments," IEEE Pervasive
    Computing, vol. 3, no. 2, pp. 62-70, Apr.-June
    2004

22
Q&A
  • Discussions & More information
  • Hyoseok Yoon
  • GIST U-VR Lab, Gwangju 500-712, S. Korea
  • Tel. (062) 970-2279
  • Fax. (062) 970-2204
  • E-mail: hyoon@gist.ac.kr
  • Web: http://uvr.gist.ac.kr

Thank You!