010.141 Engineering Mathematics II, Lecture 19: Propositional Reasoning

1
010.141 Engineering Mathematics II
Lecture 19: Propositional Reasoning
  • Bob McKay
  • School of Computer Science and Engineering
  • College of Engineering
  • Seoul National University

2
Outline
  • The Resolution Method
  • Conjunctive Normal Form
  • Resolution and the Empty Clause
  • Horn Clauses and Modus Ponens
  • Backward and Forward Chaining
  • Satisfiability and NP Completeness
  • Efficient Heuristics
  • Phase Changes
  • Limitations of Propositional Calculus

3
References
  • Shoenfield isn't much use for this
  • Written before Robinson published the resolution
    method (1965)
  • Russell, S. & Norvig, P., 'Artificial Intelligence:
    A Modern Approach', Prentice Hall
  • The library has edition 1, call number 006.3
    R917a
  • To buy, edition 2 (2003), ISBN 0-13-790395-2
  • Either is fine
  • Nilsson, N.J., 'Artificial Intelligence: A New
    Synthesis', Morgan Kaufmann, 1998, ISBN 1-55860-535-5
  • Library call no 006.3 N599a
  • More detail than you'll ever need
  • Leitsch, A., 'The Resolution Calculus', Springer,
    1997, ISBN 3-540-61882-1
  • Library call number 511.3 L537r

4
Proof methods
  • Proof methods divide (roughly) into two kinds
  • Application of inference rules
  • Legitimate (sound) generation of new sentences
    from old
  • Like our method from last lecture
  • A proof is a sequence of inference rule applications
  • We could use inference rules as steps in a
    standard search algorithm
  • Effective computer searches usually transform
    sentences into a normal form which is simpler for
    automated searching
  • Model checking
  • truth table enumeration (always exponential in n)
  • improved backtracking, e.g., Davis-Putnam-
    Logemann-Loveland (DPLL)
  • heuristic search in model space (sound but
    incomplete)
  • e.g., min-conflicts-like hill-climbing
    algorithms

5
Proof methods for automated search
  • Our proof method from last lecture isn't very
    good for automated search
  • In essence, guessing the right combination of
    rules to use is a hard problem (even if humans do
    it well)
  • It's even harder for the predicate calculus (next
    lecture)
  • We'll look at an alternative, resolution, which
    works well for automated search in both the
    propositional and predicate calculus
  • It requires sentences to be converted to a normal
    form
  • Conjunctive Normal Form (CNF)

6
Conjunctive Normal Form (CNF)
  • An atom means one of the variables of our
    language
  • (A, B, C, …)
  • A literal means either an atom or its negation
  • (A, ¬B, C, …)
  • A clause means a disjunction (or) of literals
  • (A ∨ B ∨ C, …)
  • A CNF sentence means a conjunction (and) of
    clauses
  • (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)

7
Converting to CNF
  • Every propositional sentence can be
    (systematically) converted to CNF
  • First, replace all equivalences using
    biconditional elimination
  • (A ↔ B) ≡ (A → B) ∧ (B → A)
  • Next, replace all implications using implication
    elimination
  • (A → B) ≡ (¬A ∨ B)
  • Next, move all negations inwards towards atoms
    using De Morgan's laws
  • ¬(A ∧ B) ≡ (¬A ∨ ¬B)
  • ¬(A ∨ B) ≡ (¬A ∧ ¬B)
  • And remove all multiple negations
  • ¬¬A ≡ A
  • Now, always move conjunctions to the right using
    commutativity
  • (A ∧ B) ≡ (B ∧ A)
  • But at the same time, move disjunctions inside
    conjunctions when possible, using distributivity
  • (A ∨ (B ∧ C)) ≡ (A ∨ B) ∧ (A ∨ C)
  • (A small Python sketch of these steps follows)
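Below is a minimal Python sketch of these conversion steps, not the lecture's own code: formulas are nested tuples tagged 'atom', 'not', 'and', 'or', 'implies' or 'iff' (an assumed representation), and each pass implements one of the rewriting steps above.

```python
# A sketch of CNF conversion over a tiny tuple-based formula representation.
# The representation and helper names are illustrative assumptions.

def eliminate_iff_implies(f):
    """Biconditional and implication elimination, applied bottom-up."""
    op = f[0]
    if op == 'atom':
        return f
    if op == 'not':
        return ('not', eliminate_iff_implies(f[1]))
    a, b = eliminate_iff_implies(f[1]), eliminate_iff_implies(f[2])
    if op == 'iff':                     # (A <-> B) == (~A | B) & (~B | A)
        return ('and', ('or', ('not', a), b), ('or', ('not', b), a))
    if op == 'implies':                 # (A -> B) == (~A | B)
        return ('or', ('not', a), b)
    return (op, a, b)                   # 'and' / 'or'

def push_negations(f):
    """Move negations inwards (De Morgan) and remove double negations."""
    if f[0] == 'atom':
        return f
    if f[0] == 'not':
        g = f[1]
        if g[0] == 'atom':
            return f                    # a negative literal: leave it
        if g[0] == 'not':
            return push_negations(g[1])          # ~~A == A
        dual = 'or' if g[0] == 'and' else 'and'  # De Morgan's laws
        return (dual, push_negations(('not', g[1])),
                      push_negations(('not', g[2])))
    return (f[0], push_negations(f[1]), push_negations(f[2]))

def distribute(f):
    """Distribute disjunction over conjunction: A | (B & C) == (A|B) & (A|C)."""
    if f[0] == 'or':
        a, b = distribute(f[1]), distribute(f[2])
        if a[0] == 'and':
            return ('and', distribute(('or', a[1], b)), distribute(('or', a[2], b)))
        if b[0] == 'and':
            return ('and', distribute(('or', a, b[1])), distribute(('or', a, b[2])))
        return ('or', a, b)
    if f[0] == 'and':
        return ('and', distribute(f[1]), distribute(f[2]))
    return f

def to_cnf(f):
    return distribute(push_negations(eliminate_iff_implies(f)))

# Example: the lecture's (A /\ (A -> B)) -> B
A, B = ('atom', 'A'), ('atom', 'B')
print(to_cnf(('implies', ('and', A, ('implies', A, B)), B)))
```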

8
Converting to CNF
  • The algorithm for converting to CNF finishes in a
    finite amount of time
  • If you wish, you can calculate the maximum time
    it will take to convert a formula of n symbols
  • The sentence it generates is logically equivalent
    to the original sentence
  • I.e., it has the same models
  • In particular
  • If one of the sentences is valid, so is the other
  • If one of the sentences is unsatisfiable, so is
    the other

9
Representing in Clausal Form
  • A sentence in CNF consists of a conjunction of
    disjunctions of literals
  • Remember that we don't care about the order of
    conjunctions or disjunctions, or about
    repetitions
  • So we can just represent CNF as a set of sets
  • {{A, ¬B}, {B, ¬C, ¬D}}
  • Instead of
  • (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)
  • We just have to remember that the outer set means
    conjunction, the inner one disjunction
  • This is known as clausal form
  • It's easier to explain the resolution method in
    clausal form (a Python encoding follows)
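As a concrete illustration (our own encoding, not the lecture's), clausal form maps naturally onto Python sets: a literal is a (symbol, sign) pair, a clause is a frozenset of literals, and a CNF sentence is a set of clauses.

```python
# (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D) in clausal form; sets automatically ignore the
# order and repetition of clauses and literals.
clauses = {
    frozenset({('A', True), ('B', False)}),                # {A, ¬B}
    frozenset({('B', True), ('C', False), ('D', False)}),  # {B, ¬C, ¬D}
}
```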

10
The Resolution Method
  • Resolution is the basic process used in the
    resolution method
  • There are a number of different (equivalent) ways
    of explaining it
  • We'll use a different (slightly more general) one
    from R & N
  • Suppose we want to prove A is valid
  • One way of doing this is to prove that ¬A is
    unsatisfiable
  • So we start by converting ¬A into clausal form
  • Let's take last lecture's
  • (A ∧ (A → B)) → B
  • As an example

11
Clausal Form Conversion
  • ¬((A ∧ (A → B)) → B)
  • ¬(¬(A ∧ (¬A ∨ B)) ∨ B)
  • (¬¬(A ∧ (¬A ∨ B)) ∧ ¬B)
  • (A ∧ (¬A ∨ B)) ∧ ¬B
  • {{A}, {¬A, B}, {¬B}}

12
The Resolution Method
  • The resolution method uses an operator called
    resolution
  • Resolution takes two clauses which have
    complementary literals
  • The same literal with opposite signs
  • {X1, X2, …, Xn, Y} and {Z1, Z2, …, Zm, ¬Y}
  • And creates their resolvent by joining them
    together, deleting the complementary literals
  • {X1, X2, …, Xn, Z1, Z2, …, Zm}
  • You can add the resolvent to the original clause
    set without changing its validity
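A minimal sketch of this resolution step on the clausal encoding above (the function name resolve is ours): find a complementary pair, then union what is left of the two clauses.

```python
def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of (symbol, sign) literals)."""
    resolvents = set()
    for (sym, sign) in c1:
        if (sym, not sign) in c2:                  # complementary literals
            resolvents.add((c1 - {(sym, sign)}) | (c2 - {(sym, not sign)}))
    return resolvents

# {A} resolved with {¬A, B} gives {B}
print(resolve(frozenset({('A', True)}),
              frozenset({('A', False), ('B', True)})))
```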

13
The Resolution Method (Cont)
  • Sometimes, the resolvent is bigger than the
    original clauses
  • But sometimes, it's also smaller
  • Notice that as clauses get smaller, they are more
    difficult to satisfy
  • A is harder to satisfy than (A ∨ B ∨ C)
  • In the limit, it is consistent to regard the
    empty clause as unsatisfiable (i.e., false)
  • (this requires proof, but it's not hard)
  • So a clause set which contains the empty clause
    is a conjunction of clauses, one of which is
    unsatisfiable
  • So the clause set is also unsatisfiable
  • The resolution method uses repeated resolutions
    to try to derive the empty clause (a sketch of the
    loop follows)
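A minimal sketch of that refutation loop, reusing resolve() from the earlier sketch: saturate the clause set with resolvents until the empty clause appears or nothing new can be added.

```python
def unsatisfiable(clauses):
    """True iff repeated resolution derives the empty clause."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                if c1 == c2:
                    continue
                for r in resolve(c1, c2):
                    if not r:              # empty clause derived
                        return True
                    new.add(r)
        if new <= clauses:                 # fixed point reached: satisfiable
            return False
        clauses |= new
```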

14
Resolution Example
  • Take our example
  • ¬((A ∧ (A → B)) → B)
  • Which we converted to
  • {{A}, {¬A, B}, {¬B}}
  • We can resolve the first two clauses
  • {A} resolved with {¬A, B} gives {B}
  • Adding this to our clause set gives
  • {{A}, {¬A, B}, {¬B}, {B}}
  • And resolving the last two clauses
  • {¬B} and {B} gives {}, the empty clause

15
Resolution Example (continued)
  • So our new clause set is
  • {{A}, {¬A, B}, {¬B}, {B}, {}}
  • Since {} is unsatisfiable, so is the whole clause
    set
  • So ¬((A ∧ (A → B)) → B) is unsatisfiable
  • Which means ((A ∧ (A → B)) → B) is valid
  • Notice that there were fewer choices to make than
    in our previous proof
  • There's only one rule to choose (resolution)
  • Our only choice is which (of possibly many)
    resolutions to choose
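Running the refutation sketch from above on the lecture's clause set reproduces this proof:

```python
# Clausal form of ¬((A ∧ (A → B)) → B)
kb = {frozenset({('A', True)}),                # {A}
      frozenset({('A', False), ('B', True)}),  # {¬A, B}
      frozenset({('B', False)})}               # {¬B}
print(unsatisfiable(kb))   # True, so (A ∧ (A → B)) → B is valid
```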

16
Properties of the Resolution Method
  • For the Propositional calculus, the resolution
    method is
  • Sound
  • It will never give incorrect answers
  • Complete
  • For any unsatisfiable clause set, the empty
    clause can be derived in finite time
  • In the worst case, this takes exponential time
  • Algorithmic
  • There is an algorithm to guarantee this
  • Relatively efficient
  • Proofs can be quite effective
  • Highly efficient for some sorts of formulas

17
Horn Clauses
  • A Horn Clause is a clause which has at most one
    positive literal
  • There are three cases
  • 1) No positive literal (generally not used much)
  • {¬X1, ¬X2, …, ¬Xn}, i.e. (¬X1 ∨ ¬X2 ∨ … ∨ ¬Xn)
  • 2) No negative literal
  • {X}, i.e. (X)
  • A fact
  • 3) General Case
  • {¬X1, ¬X2, …, ¬Xn, Y}, i.e. (¬X1 ∨ ¬X2 ∨ … ∨
    ¬Xn ∨ Y)
  • (X1 ∧ X2 ∧ … ∧ Xn) → Y
  • A rule

18
Horn Clauses and Resolution
  • Let's look at the sub-cases
  • 1) and 1) - Can't resolve (no positive literals)
  • 1) and 2) - only one way to do it (not very
    useful)
  • (¬X1 ∨ ¬X2 ∨ … ∨ ¬Xn), X1
  • _________________________
  • (¬X2 ∨ … ∨ ¬Xn)
  • 1) and 3) - again only one choice (not very
    useful)
  • (¬X1 ∨ ¬X2 ∨ … ∨ ¬Xn), (Y1 ∧ Y2 ∧ … ∧ Yn) →
    X1
  • _________________________
  • (¬X2 ∨ … ∨ ¬Xn ∨ ¬Y1 ∨ ¬Y2 ∨ … ∨ ¬Yn)

19
Horn Clauses and Resolution
  • 2) and 2) - Can't resolve (no negative literals)
  • 2) and 3) - only one way to do it
  • (Generalised Modus Ponens)
  • X1, (X1 ∧ X2 ∧ … ∧ Xn) → Y
  • _________________________
  • (X2 ∧ … ∧ Xn) → Y
  • 3) and 3) - Now there are two choices, let's take
    one
  • (X1 ∧ X2 ∧ … ∧ Xn) → Y1, (Y1 ∧ Y2 ∧ … ∧ Yn) → Z
  • _________________________
  • (X1 ∧ X2 ∧ … ∧ Xn ∧ Y2 ∧ … ∧ Yn) → Z
  • In fact, we only need generalised modus ponens
    for Horn clause reasoning
  • Runs in Linear Time

20
Forward and Backward Chaining
  • Modus ponens gives us very efficient (short)
    proofs for Horn Clauses
  • But how do we find an efficient proof
    efficiently?
  • There are two main approaches
  • Start from what we know, and work forwards
  • Forward Chaining
  • Start from what we want to prove, and work
    backwards
  • Backward Chaining
  • Of course, it's also possible to combine the two
  • Mixed Chaining

21
Forward chaining
  • Idea: fire any rule whose premises are satisfied
    in the KB,
  • add its conclusion to the KB, until the query is
    found
  • Sound and complete for Horn clauses (a sketch
    follows)
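A minimal sketch of forward chaining for Horn clauses, under an assumed representation (not the lecture's code): each rule is a (premises, conclusion) pair of symbol names, and the facts are the symbols known to be true.

```python
from collections import deque

def forward_chain(rules, facts, query):
    """Fire rules whose premises are all satisfied until the query appears."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}  # unmet premises
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p not in inferred:
            inferred.add(p)
            for i, (prem, concl) in enumerate(rules):
                if p in prem:
                    count[i] -= 1
                    if count[i] == 0:      # all premises satisfied: fire the rule
                        agenda.append(concl)
    return False

# Example: from facts A, B and rules A∧B→L, L∧B→M, M→Q, the query Q follows.
rules = [(['A', 'B'], 'L'), (['L', 'B'], 'M'), (['M'], 'Q')]
print(forward_chain(rules, {'A', 'B'}, 'Q'))   # True
```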

22-29
Forward chaining example
(No Transcript)
30
Sketch of completeness proof
  • To prove: forward chaining derives every atomic
    sentence that is entailed by the KB
  • Forward chaining must reach a final state, a
    fixed point where no new atomic sentences can be
    derived
  • Because there are only finitely many sentences
  • Consider the final state as a model m, assigning
    true/false to symbols
  • Every clause in the original KB is true in m
  • a1 ∧ … ∧ ak → b is true, because if all the ai
    are inferred, then b is inferred too
  • Hence m is a model of KB
  • If KB ⊨ q, then q is true in every model of KB,
    including m
  • So q is assigned true at the fixed point, i.e.,
    forward chaining derives q

31
Backward chaining
  • Idea: work backwards from the query q, keeping a
    stack of open queries
  • to prove q by Backward Chaining, add q to the
    goal stack, then recursively
  • Pop the top of the goal stack, q
  • If q is already known, do nothing, it's proven
  • Otherwise, find a rule (q1 ∧ q2 ∧ … ∧ qn) → q
  • Push q1, q2, …, qn onto the goal stack
  • Until the goal stack is empty (the goal is
    proven!)
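A minimal sketch of backward chaining over the same rule representation; it uses the recursion stack in place of an explicit goal stack, and carries a set of open goals for the loop check discussed on the next slide.

```python
def backward_chain(rules, facts, query, open_goals=frozenset()):
    """Prove query by finding a rule concluding it and proving its premises."""
    if query in facts:
        return True
    if query in open_goals:                    # loop: this goal is already open
        return False
    for prem, concl in rules:
        if concl == query and all(backward_chain(rules, facts, p,
                                                 open_goals | {query})
                                  for p in prem):
            return True
    return False

# Same rules and facts as the forward-chaining sketch
rules = [(['A', 'B'], 'L'), (['L', 'B'], 'M'), (['M'], 'Q')]
print(backward_chain(rules, {'A', 'B'}, 'Q'))   # True
```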

32
Backward chaining
  • Efficiency considerations
  • Avoid loops
  • check if new subgoal is already on the goal
    stack
  • Avoid repeated work: check if new subgoal
  • has already been proved true
  • has already failed

33-42
Backward chaining example
(No Transcript)
43
Forward vs. backward chaining
  • Forward Chaining is data-driven
  • Suitable for automatic, unconscious processing,
  • object recognition
  • routine decisions
  • May do lots of work that is irrelevant to the
    goal
  • Particularly frustrating for human users
  • Backward Chaining is goal-driven
  • appropriate for problem-solving,
  • Where are my keys?
  • How do I get into a PhD program?
  • Complexity can be much less than linear in size
    of KB
  • Appears more purposeful to human users

44
Efficient propositional inference
  • Recall that we discussed resolution in terms of
    unsatisfiability
  • Is this sentence unsatisfiable?
  • We can turn this question on its head
  • Is this sentence satisfiable?
  • How hard is this problem?
  • Actually, a very deep question

45
How hard is it to compute satisfiability?
  • It can be solved in polynomial (linear) time by
    guessing
  • A problem which guessing can solve in polynomial
    time is known as an NP problem
  • Satisfiability can be shown to be as hard as any
    other NP problem
  • A problem which can be solved in polynomial time
    without guessing is known as a P problem
  • It is almost universally believed that P ≠ NP
  • (in fact, that satisfiability requires
    exponential time in the worst case)
  • But it has never been proven

46
Efficient propositional inference
  • There are two main families of efficient
    algorithms for propositional inference
  • Complete backtracking search algorithms
  • DPLL algorithm (Davis, Putnam, Logemann,
    Loveland)
  • Incomplete local search algorithms
  • WalkSAT algorithm
  • Both are efficient for most cases, but can be
    very slow on the worst cases

47
The DPLL algorithm
  • Uses heuristics to remove alternatives in truth
    table enumeration
  • Early termination
  • A clause is true if any literal is true
  • A sentence is false if any clause is false
  • Pure symbol heuristic
  • A pure symbol is a symbol that always appears
    with the same "sign" in all clauses.
  • In the clauses (A ∨ ¬B), (¬B ∨ ¬C), (C ∨ A), A
    and B are pure, C is impure
  • Make any pure symbol literal true
  • Unit clause heuristic
  • A unit clause is a clause with only one literal
  • Make the only literal in a unit clause true
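A minimal sketch of DPLL with these three heuristics, over the frozenset clause encoding used earlier; this is an illustrative reconstruction, not the pseudocode figure on the next slide.

```python
def dpll(clauses, symbols, model):
    """Satisfiability of a clause set; model maps symbols to True/False."""
    open_clauses, all_true = [], True
    for c in clauses:
        vals = [model[s] == sign for (s, sign) in c if s in model]
        if any(vals):
            continue                     # clause already true
        if len(vals) == len(c):
            return False                 # every literal false: early termination
        all_true = False
        open_clauses.append(c)
    if all_true:
        return True                      # every clause true: early termination
    # Pure symbol heuristic: a symbol with only one sign in the open clauses
    signs = {}
    for c in open_clauses:
        for (s, sign) in c:
            if s not in model:
                signs.setdefault(s, set()).add(sign)
    for s, seen in signs.items():
        if len(seen) == 1:
            return dpll(clauses, symbols, {**model, s: next(iter(seen))})
    # Unit clause heuristic: a clause with a single unassigned literal left
    for c in open_clauses:
        free = [(s, sign) for (s, sign) in c if s not in model]
        if len(free) == 1:
            s, sign = free[0]
            return dpll(clauses, symbols, {**model, s: sign})
    # Otherwise, branch on the first unassigned symbol
    s = next(sym for sym in symbols if sym not in model)
    return (dpll(clauses, symbols, {**model, s: True}) or
            dpll(clauses, symbols, {**model, s: False}))

# Is (A ∨ ¬B) ∧ (B ∨ ¬C) satisfiable?
cnf = {frozenset({('A', True), ('B', False)}),
       frozenset({('B', True), ('C', False)})}
print(dpll(cnf, ['A', 'B', 'C'], {}))    # True
```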

48
The DPLL algorithm
49
The WalkSAT algorithm
  • An incomplete, local search algorithm
  • Evaluation function
  • Minimise the number of unsatisfied clauses
  • Balance between greediness and randomness
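A minimal sketch of WalkSAT over the same clause encoding (parameter names are ours): with probability p flip a random symbol from a random unsatisfied clause, otherwise flip the symbol in that clause which minimises the number of unsatisfied clauses.

```python
import random

def walksat(clauses, p=0.5, max_flips=10000):
    """Local search for a satisfying model; returns a model or None on failure."""
    symbols = {s for c in clauses for (s, _) in c}
    model = {s: random.choice([True, False]) for s in symbols}

    def unsatisfied(m):
        return [c for c in clauses if not any(m[s] == sign for (s, sign) in c)]

    for _ in range(max_flips):
        unsat = unsatisfied(model)
        if not unsat:
            return model                                  # all clauses satisfied
        clause = random.choice(unsat)
        if random.random() < p:                           # random walk step
            sym = random.choice([s for (s, _) in clause])
        else:                                             # greedy step
            sym = min((s for (s, _) in clause),
                      key=lambda s: len(unsatisfied({**model, s: not model[s]})))
        model[sym] = not model[sym]
    return None        # gave up: the sentence may or may not be satisfiable

# e.g. walksat(cnf) on the small CNF from the DPLL sketch above
```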

50
The WalkSAT algorithm
51
Hard satisfiability problems
  • Consider random 3-CNF sentences, e.g.
  • (¬D ∨ ¬B ∨ C) ∧ (B ∨ ¬A ∨ ¬C) ∧ (¬C ∨ ¬B ∨ E) ∧
    (E ∨ ¬D ∨ B) ∧ (B ∨ E ∨ ¬C)
  • m = number of clauses
  • n = number of symbols
  • If there are many clauses per symbol, it is
    highly unlikely that some combination of truth
    assignments will satisfy all clauses
  • If there are only a few clauses per symbol, it's
    easy to find truth assignments satisfying them
    all
  • The hardest problems cluster near m/n ≈ 4.3
    (the critical point)
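A minimal sketch of a random 3-CNF generator in the clause encoding above (the helper name is ours), which can be fed to the DPLL or WalkSAT sketches to observe this phase transition:

```python
import random

def random_3cnf(n, m):
    """Roughly m random 3-literal clauses over n symbols (duplicates collapse)."""
    symbols = [f'X{i}' for i in range(n)]
    return {frozenset((s, random.choice([True, False]))
                      for s in random.sample(symbols, 3))
            for _ in range(m)}

# Near m/n ≈ 4.3 (e.g. n = 50, m = 215) the instances tend to be hardest.
```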

52
Are most sentences satisfiable?
53
Hard satisfiability problems
  • Median runtime for 100 satisfiable random 3-CNF
    sentences, n = 50

54
Propositional Logic and the wumpus world
  • A wumpus-world description using propositional
    logic
  • ¬P1,1
  • ¬W1,1
  • Bx,y ↔ (Px,y+1 ∨ Px,y-1 ∨ Px+1,y ∨ Px-1,y)
  • Sx,y ↔ (Wx,y+1 ∨ Wx,y-1 ∨ Wx+1,y ∨ Wx-1,y)
  • W1,1 ∨ W1,2 ∨ … ∨ W4,4
  • ¬W1,1 ∨ ¬W1,2
  • ¬W1,1 ∨ ¬W1,3
  • ⇒ 64 distinct proposition symbols, 155 sentences
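A minimal sketch (plain strings, our own helper) of how the per-square breeze "physics" sentences above can be generated for a 4x4 grid, which makes the proliferation discussed on the following slides concrete:

```python
def breeze_axioms(size=4):
    """One 'Bx,y <-> (disjunction of neighbouring pits)' sentence per square."""
    axioms = []
    for x in range(1, size + 1):
        for y in range(1, size + 1):
            neighbours = [(x + dx, y + dy)
                          for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 1 <= x + dx <= size and 1 <= y + dy <= size]
            pits = ' v '.join(f'P{i},{j}' for i, j in neighbours)
            axioms.append(f'B{x},{y} <-> ({pits})')
    return axioms

print(len(breeze_axioms()))   # 16: one breeze sentence per square, before stench etc.
```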

55
(No Transcript)
56
Expressiveness limitations of propositional logic
  • The KB has to contain "physics" sentences for
    every single square
  • Even for our simple case, there are 16 squares
  • The KB must contain 16 times as much logic as we
    might initially expect

57
Expressiveness limitations (2)
  • Even worse, some propositions change with time
  • Those which describe where the agent is
  • We need time-stamped variables for every possible
    time and location
  • Every statement involving the agent has to be
    repeated for each possible time
  • For every time t and every location x,y:
    Lx,y(t) ∧ FacingRight(t) ∧ Forward(t) → Lx+1,y(t+1)
  • In general, we may not even know how many times
    we might need
  • Though we can probably make a decent guess for
    wumpus

58
Summary
  • The Resolution Method
  • Conjunctive Normal Form
  • Resolution and the Empty Clause
  • Horn Clauses and Modus Ponens
  • Backward and Forward Chaining
  • Satisfiability and NP Completeness
  • Efficient Heuristics
  • Phase Changes
  • Limitations of Propositional Calculus
