Generating Bayesian Networks from Probability Logic Knowledge Bases - PowerPoint PPT Presentation
1
Generating Bayesian Networks from Probability
Logic Knowledge Bases
  • Peter Haddawy

Kim Kangil
2
Contents
  • Introduction
  • Representation Language
  • Bayesian Knowledge Bases
  • Network Generation Algorithm
  • Conclusion

3
Introduction
  • Problems with Bayesian networks
  • Propositional model
  • Static representation
  • Approach
  • Use a knowledge base of probability sentences

4
Representation Language
  • Network fragment
  • g: the consequent
  • f: the antecedent
  • x denotes a subset of the random variables X
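The fragment structure above can be sketched as a small data type: each fragment pairs a consequent g with its antecedents f and a conditional probability. A minimal illustration (the class and field names are my own, not the paper's notation):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Fragment:
    """One network fragment: P(consequent | antecedents) = prob.

    'consequent' plays the role of g and 'antecedents' the role of f
    from the slides; the names are illustrative, not from the paper.
    """
    consequent: str
    antecedents: tuple
    prob: float

# Example rule: P(fever | flu) = 0.9  (invented numbers)
r = Fragment("fever", ("flu",), 0.9)
print(r.consequent, r.antecedents, r.prob)
```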

5
Contd
  • Rule

6
Contd
  • Definition for each conditional probability
    sentence

7
Contd
  • Constraints for isomorphic mapping to Bayesian
    network

8
Bayesian Knowledge Bases
  • We first define some concepts needed to represent
    the conditional probabilities completely
  • Definition 1: path, directed edge, converging
    arrows, direct predecessor, direct successor,
    root, leaf

9
Contd
  • Definition 2: d-separation
  • A path is blocked by a set Z of ground terms when
    it contains a ground term W such that either
  • W has converging arrows and neither W nor its
    descendants are in Z, or
  • W does not have converging arrows and W is in Z
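The two blocking conditions above can be checked directly. This is an illustrative sketch of d-separation over a DAG given as a parent-to-children dict, not the paper's code; it enumerates simple paths, so it is only suitable for small graphs:

```python
def descendants(G, node):
    """All nodes reachable from 'node' via directed edges.
    G maps each parent to a list of its children."""
    seen, stack = set(), [node]
    while stack:
        for c in G.get(stack.pop(), []):
            if c not in seen:
                seen.add(c)
                stack.append(c)
    return seen

def undirected_paths(G, x, y):
    """All simple paths between x and y in the DAG's skeleton."""
    nbrs = {}
    for p, cs in G.items():
        for c in cs:
            nbrs.setdefault(p, set()).add(c)
            nbrs.setdefault(c, set()).add(p)
    paths, stack = [], [(x, [x])]
    while stack:
        node, path = stack.pop()
        if node == y:
            paths.append(path)
            continue
        for n in nbrs.get(node, ()):
            if n not in path:
                stack.append((n, path + [n]))
    return paths

def path_blocked(G, path, Z):
    """A path is blocked by Z if some interior term W satisfies
    one of the two conditions from the slide."""
    for i in range(1, len(path) - 1):
        a, w, b = path[i - 1], path[i], path[i + 1]
        converging = w in G.get(a, []) and w in G.get(b, [])
        if converging:
            # W has converging arrows, and neither W nor any
            # descendant of W is in Z
            if w not in Z and not (descendants(G, w) & Z):
                return True
        elif w in Z:
            # W does not have converging arrows and W is in Z
            return True
    return False

def d_separated(G, x, y, Z):
    """x and y are d-separated by Z iff every path is blocked."""
    return all(path_blocked(G, p, set(Z))
               for p in undirected_paths(G, x, y))

# Collider example: burglary -> alarm <- quake
G = {"burglary": ["alarm"], "quake": ["alarm"]}
print(d_separated(G, "burglary", "quake", set()))      # True
print(d_separated(G, "burglary", "quake", {"alarm"}))  # False
```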

10
Contd
  • Definition 3: Bayesian knowledge base

11
Contd
  • Lemma
  • A ground term in a Bayesian knowledge base is
    independent of all ground terms that are not its
    successors, given its direct predecessors
  • Four cases in the proof:
  • The path is a direct link from G to F
  • The path is a direct link from F to G
  • The path must pass through one of G's direct
    predecessors
  • The path must pass through one of G's direct
    successors

12
  • Theorem 1
  • A Bayesian knowledge base is a complete
    specification of a joint probability distribution
    over the ground terms contained in any non-empty
    set of ground instances of its rules in which
    every ground term is the consequent of some rule
    instance.
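Theorem 1 says the rule instances determine a full joint distribution. A tiny sketch of why: multiplying P(node | parents) over every ground term yields a function that sums to one over all assignments. The node names and CPT values below are invented for illustration, not from the paper:

```python
from itertools import product

# Each node lists its parents; each CPT entry gives
# P(node = True | parent values)  -- illustrative numbers.
parents = {"flu": (), "fever": ("flu",)}
cpt = {
    ("flu", ()): 0.1,
    ("fever", (True,)): 0.9,
    ("fever", (False,)): 0.2,
}

def prob(node, value, assignment):
    """P(node = value | its parents' values in 'assignment')."""
    pa_vals = tuple(assignment[p] for p in parents[node])
    p = cpt[(node, pa_vals)]
    return p if value else 1.0 - p

def joint(assignment):
    """Product of P(node | parents) over every ground term."""
    result = 1.0
    for node, value in assignment.items():
        result *= prob(node, value, assignment)
    return result

# The factorization is a complete specification: it sums to 1.
total = sum(joint(dict(zip(parents, vals)))
            for vals in product([True, False], repeat=len(parents)))
print(round(total, 10))  # 1.0
```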

13
Network Generation Algorithm
  • Backward chaining
  • Compute P(Q|E) for query Q and evidence E
  • Theorem 2
  • The probability computed from the Bayesian
    knowledge base equals the probability computed
    from the generated network
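The backward-chaining idea can be sketched as follows: starting from the query and evidence terms, pull in only the rules whose consequents are needed, so irrelevant variables never enter the generated network. The rule table and term names here are illustrative, not the paper's exact algorithm:

```python
def build_network(rules, query, evidence):
    """Backward-chain from the query and evidence terms.

    'rules' maps a consequent term to its antecedent terms
    (an illustrative stand-in for a Bayesian knowledge base).
    Returns the nodes and directed edges of the generated network.
    """
    needed = [query, *evidence]
    nodes, edges = set(), set()
    while needed:
        term = needed.pop()
        if term in nodes:
            continue
        nodes.add(term)
        for antecedent in rules.get(term, ()):
            edges.add((antecedent, term))
            needed.append(antecedent)
    return nodes, edges

rules = {"alarm": ("burglary", "quake"),
         "call": ("alarm",),
         "report": ("quake",)}
nodes, edges = build_network(rules, "call", ["quake"])
print(sorted(nodes))  # 'report' is never pulled in
```

Note how the irrelevant term "report" is excluded, matching the conclusion that variables not needed for the query and evidence can be ignored.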

14
Examples
15
(No Transcript)
16
Conclusion
  • From a Bayesian knowledge base we can generate a
    network that exactly matches the corresponding
    Bayesian network.
  • We do not need to consider variables that are not
    relevant to the query and evidence.