Title: Knowledge Representation and Reasoning
1. Knowledge Representation and Reasoning
- Stuart C. Shapiro
- Professor, CSE
- Director, SNePS Research Group
- Member, Center for Cognitive Science
2. Introduction
3. Long-Term Goal
- Theory and implementation of a natural-language-competent computerized cognitive agent,
- and supporting research in:
  - Artificial Intelligence
  - Cognitive Science
  - Computational Linguistics.
4. Research Areas
- Knowledge Representation and Reasoning
- Cognitive Robotics
- Natural-Language Understanding
- Natural-Language Generation.
5. Goal
- A computational cognitive agent that can:
  - Understand and communicate in English
  - Discuss specific, generic, and rule-like information
  - Reason
  - Discuss acts and plans
  - Sense
  - Act
  - Remember and report what it has sensed and done.
6. Cassie
- A computational cognitive agent
- Embodied in hardware or software-simulated
- Based on SNePS and GLAIR.
7. GLAIR Architecture
GLAIR: Grounded Layered Architecture with Integrated Reasoning.
- Knowledge Level: SNePS
- Perceptuo-Motor Level: NL
- Sensory-Actuator Level: Vision, Sonar, Motion, Proprioception
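The three layers form a delegation chain: the knowledge level decides which act to perform, the perceptuo-motor level grounds that act in routines, and the sensory-actuator level drives the devices. Below is a minimal Python sketch of that organization; every class, method, and routine name is an illustrative assumption, not GLAIR or SNePS code.

```python
# Minimal sketch of GLAIR's three-layer organization. All names here
# are illustrative assumptions, not actual GLAIR or SNePS code.

class SensoryActuatorLayer:
    """SAL: drives sensors and effectors (vision, sonar, motion)."""
    def execute(self, command: str) -> None:
        print(f"[SAL] executing low-level command: {command}")

class PerceptuoMotorLayer:
    """PML: grounds knowledge-level symbols in perception/action routines."""
    def __init__(self, sal: SensoryActuatorLayer):
        self.sal = sal
        # Map each primitive act to a sequence of low-level commands.
        self.routines = {"goto": ["plan-path", "drive"],
                         "find": ["scan", "fixate"]}

    def perform(self, act: str) -> None:
        for command in self.routines[act]:
            self.sal.execute(command)

class KnowledgeLayer:
    """KL: SNePS-style reasoning decides WHICH act to perform, not how."""
    def __init__(self, pml: PerceptuoMotorLayer):
        self.pml = pml

    def intend(self, act: str) -> None:
        print(f"[KL] intending act: {act}")
        self.pml.perform(act)   # acts bottom out in grounded PML routines

agent = KnowledgeLayer(PerceptuoMotorLayer(SensoryActuatorLayer()))
agent.intend("goto")   # KL intention unfolds into SAL commands
```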
8. SNePS
- Knowledge Representation and Reasoning
  - Propositions as terms
- SNIP: SNePS Inference Package
  - Specialized connectives and quantifiers
- SNeBR: SNePS Belief Revision
- SNeRE: SNePS Rational Engine
- Interface Languages
  - SNePSUL: Lisp-like
  - SNePSLOG: logic-like
  - GATN for fragments of English.
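"Propositions as terms" means a proposition is itself a first-class term, so other propositions (beliefs about beliefs, sources, credibilities) can take it as an argument without asserting it. Below is a minimal sketch of the idea in Python rather than SNePSUL/SNePSLOG; the representation is an illustrative assumption.

```python
# Sketch: "propositions as terms" -- a proposition is just another term,
# so it can be an argument of other propositions. Names are illustrative,
# not SNePS syntax.

from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Term:
    functor: str
    args: Tuple["Term", ...] = ()

def prop(functor: str, *args: Term) -> Term:
    return Term(functor, args)

clyde = Term("Clyde")
gray_clyde = prop("Gray", clyde)             # proposition: Gray(Clyde)
stu = Term("Stu")
belief = prop("Believes", stu, gray_clyde)   # proposition ABOUT a proposition

believed = {belief}   # the agent's currently asserted beliefs

# Because Gray(Clyde) is a term, we can mention it without asserting it:
print(gray_clyde in believed)   # False: not itself believed
print(belief in believed)       # True: Believes(Stu, Gray(Clyde)) is believed
```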
9. Example Cassie's Worlds
10. BlocksWorld
11. FEVAHR
12. FEVAHRWorld Simulation
13. UXO Remediation
[Figure: simulated field with corner flags, a drop-off zone, a safe zone, a recharging station, a battery meter, UXO and non-UXO objects, and Cassie.]
14. Crystal Space Environment
15. Sample Research Issues: Complex Categories
16. Complex Categories 1
- Noun Phrases: <Det> (N | Adj)* N
- Understanding of the modification must be left to reasoning.
- Example: orange juice seat
- Representation must be left vague.
17. Complex Categories 2
- Kevin went to the orange juice seat.
- I understand that Kevin went to the orange juice seat.
- Did Kevin go to a seat?
- Yes, Kevin went to the orange juice seat.
18. Complex Categories 3
- Pat is an excellent teacher.
- I understand that Pat is an excellent teacher.
- Is Pat a teacher?
- Yes, Pat is a teacher.
- Lucy is a former teacher.
- I understand that Lucy is a former teacher.
19. Complex Categories 4
- 'former' is a negative adjective.
- I understand that 'former' is a negative adjective.
- Is Lucy a teacher?
- No, Lucy is not a teacher.
Also note the representation and use of knowledge about words.
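The dialogues in slides 18-19 turn on lexical knowledge: 'excellent' preserves membership in the head-noun category, while the negative adjective 'former' denies it. A minimal sketch of that inference follows; the adjective lexicon is an illustrative assumption, not the SNePS knowledge base.

```python
# Sketch of the lexical inference in slides 18-19: "excellent teacher"
# entails "teacher", while the negative adjective "former" licenses the
# opposite answer. The lexicon below is an illustrative assumption.

NEGATIVE_ADJECTIVES = {"former", "fake"}   # category-denying modifiers

def is_a(adjective: str, noun: str) -> bool:
    """Given 'X is a(n) <adjective> <noun>': is X a <noun>?"""
    # A negative adjective denies membership in the head-noun category;
    # other adjectives (e.g. "excellent") leave it intact.
    return adjective not in NEGATIVE_ADJECTIVES

print(is_a("excellent", "teacher"))  # True:  Pat is a teacher
print(is_a("former", "teacher"))     # False: Lucy is not a teacher
```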
20. Sample Research Issues: Indexicals
21. Representation and Use of Indexicals
- Words whose meanings are determined by occasion of use
- E.g., I, you, now, then, here, there
- Deictic Center: <I, YOU, NOW>
  - I: SNePS term representing Cassie
  - YOU: the person Cassie is talking with
  - NOW: the current time.
22. Analysis of Indexicals (in input)
- First-person pronouns → YOU
- Second-person pronouns → I
- 'here' → the location of YOU
- Present/past relative to NOW.
23. Generation of Indexicals
- I → first-person pronouns
- YOU → second-person pronouns
- NOW is used to determine tense and aspect.
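Slides 21-23 describe a symmetric mapping through the deictic center: on input the speaker's 'I' resolves to YOU, while on output Cassie renders her own term as 'I' and the interlocutor's as 'you'. A minimal sketch, with the data structure and function names assumed for illustration:

```python
# Sketch of indexical analysis and generation against a deictic center
# <I, YOU, NOW> (slides 21-23). The DeicticCenter structure and function
# names are illustrative assumptions, not the SNePS implementation.

from dataclasses import dataclass

@dataclass
class DeicticCenter:
    I: str      # term for Cassie herself
    YOU: str    # term for the current interlocutor
    NOW: str    # term for the current time

def analyze(pronoun: str, dc: DeicticCenter) -> str:
    """Resolve an input indexical: the speaker's 'I' is Cassie's YOU."""
    return {"i": dc.YOU, "me": dc.YOU, "you": dc.I}[pronoun.lower()]

def generate(term: str, dc: DeicticCenter) -> str:
    """Express a term as an output indexical where possible."""
    if term == dc.I:
        return "I"
    if term == dc.YOU:
        return "you"
    return term            # otherwise fall back to the name itself

dc = DeicticCenter(I="Cassie", YOU="Stu", NOW="t17")
print(analyze("I", dc))         # Stu: the user's 'I' denotes the user
print(generate("Cassie", dc))   # I:   Cassie refers to herself
dc.YOU = "Bill"                 # after the command "Talk to Bill."
print(generate("Bill", dc))     # you: Bill is now the addressee
```

This mirrors the dialogues on the following slides, where the referent of 'you' shifts from Stu to Bill after "Talk to Bill."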
24. Use of Indexicals 1
Come here.
25. Use of Indexicals 2
Come here.
I came to you, Stu. I am near you.
26. Use of Indexicals 3
Who am I?
Your name is Stu and you are a person.
Who have you talked to?
I am talking to you.
Talk to Bill.
I am talking to you, Bill.
Come here.
27. Use of Indexicals 4
Come here.
I found you. I am looking at you.
28. Use of Indexicals 5
Come here.
I found you. I am looking at you.
I came to you. I am near you.
29. Use of Indexicals 6
Who am I?
Your name is Bill and you are a person.
Who are you?
I am the FEVAHR and my name is Cassie.
Who have you talked to?
I talked to Stu and I am talking to you.
30. Current Research Issues: Distinguishing Perceptually Indistinguishable Objects (Ph.D. Dissertation, John F. Santore)
31.
- Some robots in a suite of rooms.
32.
- Are these the same two robots?
- Why do you think so, or why not?
33. Next Steps
- How do people do this?
- Currently doing protocol experiments
- Getting Cassie to do it.
34. Current Research Issues: Belief Revision in a Deductively Open Belief Space (Ph.D. Dissertation, Frances L. Johnson)
35. Belief Revision in a Deductively Open Belief Space
- Beliefs in a knowledge base must be able to be changed (belief revision):
  - Add and remove beliefs
  - Detect and correct errors, conflicts, and inconsistencies
- BUT:
  - Guaranteeing consistency is an idealization
  - Real-world systems are not ideal
36. Belief Revision in a DOBS: Ideal Theories vs. Real World
- Ideal belief revision theories assume:
  - No reasoning limits (time or storage)
  - All derivable beliefs are acquirable (deductive closure)
  - All belief credibilities are known and fixed
- Real world:
  - Reasoning takes time, and storage space is finite
  - Some implicit beliefs might be currently inaccessible
  - Source and belief credibilities can change
37. Belief Revision in a DOBS: A Real-World KR System
- Must recognize its limitations:
  - Some knowledge remains implicit
  - Inconsistencies might be missed
  - A source might turn out to be unreliable
  - Revision choices might prove poor in hindsight, after further deduction or knowledge acquisition
- Must repair itself:
  - Catch and correct poor revision choices
38. Belief Revision in a DOBS: Theory Example (Reconsideration)
- Ranking 1 is more credible than Ranking 2.
- College A is better than College B. (Source: Ranking 1)
- College B is better than College A. (Source: Ranking 2)
- Ranking 1 was flawed, so Ranking 2 is now more credible than Ranking 1.
- Need to reconsider!
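Reconsideration means a revision decision is itself revisited when credibilities change. The sketch below keeps whichever of two contradictory beliefs has the currently more credible source; this selection policy is an illustrative assumption, not the dissertation's algorithm.

```python
# Sketch of reconsideration on the college-ranking example above. When
# source credibilities change, the previously kept belief is retracted
# and the previously discarded one is reinstated.

beliefs = {
    "A_better_than_B": "Ranking1",   # belief -> its source
    "B_better_than_A": "Ranking2",
}
conflict = ("A_better_than_B", "B_better_than_A")  # directly contradictory

def revise(credibility: dict) -> str:
    """Return which of the two conflicting beliefs is currently kept."""
    p, q = conflict
    return p if credibility[beliefs[p]] > credibility[beliefs[q]] else q

# Initially Ranking 1 is more credible: keep "College A is better".
print(revise({"Ranking1": 2, "Ranking2": 1}))   # A_better_than_B

# Ranking 1 turns out to be flawed, so the credibilities flip;
# reconsideration reverses the earlier revision choice.
print(revise({"Ranking1": 1, "Ranking2": 2}))   # B_better_than_A
```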
39. Next Steps
- Implement reconsideration
- Develop benchmarks for implemented KRR systems.
40. Current Research Issues: Default Reasoning by Preferential Ordering of Beliefs (M.S. Thesis, Bharat Bhushan)
41. Small Knowledge Base
- Birds have wings.
- Birds fly.
- Penguins are birds.
- Penguins don't fly.
42. KB Using Default Logic
- ∀x(Bird(x) → Has(x, wings))
- ∀x(Penguin(x) → ¬Flies(x))
43. KB Using Preferential Ordering
- ∀x(Bird(x) → Has(x, wings))
- ∀x(Penguin(x) → ¬Flies(x))
- Precludes(∀x(Penguin(x) → ¬Flies(x)), ∀x(Bird(x) → Flies(x)))
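Under preferential ordering, both rules fire for a penguin, but the Precludes assertion says the penguin rule defeats the bird rule, so only the non-flying conclusion survives. A minimal sketch of that conflict resolution follows; the rule encoding and the winner-takes-all policy are illustrative assumptions.

```python
# Sketch of default reasoning by preferential ordering (slides 41-43):
# when two applicable rules yield contradictory conclusions, a Precludes
# ordering says which rule wins.

RULES = {
    "birds_fly":      lambda f: "flies"     if "bird"    in f else None,
    "penguins_nofly": lambda f: "not-flies" if "penguin" in f else None,
}
# Precludes(penguins_nofly, birds_fly): the penguin rule defeats the bird rule.
PRECLUDES = {("penguins_nofly", "birds_fly")}

def conclude(facts: set) -> set:
    fired = {name: r(facts) for name, r in RULES.items() if r(facts)}
    # Drop any conclusion whose rule is precluded by another fired rule.
    kept = {n: c for n, c in fired.items()
            if not any((m, n) in PRECLUDES for m in fired)}
    return set(kept.values())

print(conclude({"bird"}))             # {'flies'}
print(conclude({"bird", "penguin"}))  # {'not-flies'}: birds_fly is precluded
```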
44. Next Steps
- Finish theory and implementation.
45. Current Research Issues: Representation and Reasoning with Arbitrary Objects (Stuart C. Shapiro)
46. Classical Representation
- Clyde is gray.
  - Gray(Clyde)
- All elephants are gray.
  - ∀x(Elephant(x) → Gray(x))
- Some elephants are albino.
  - ∃x(Elephant(x) ∧ Albino(x))
- Why the difference?
47. Representation Using Arbitrary and Indefinite Objects
- Clyde is gray.
  - Gray(Clyde)
- Elephants are gray.
  - Gray(any x Elephant(x))
- Some elephants are albino.
  - Albino(some x Elephant(x))
48. Subsumption Among Arbitrary and Indefinite Objects
(any x Elephant(x))
(any x Albino(x) Elephant(x))
(some x Albino(x) Elephant(x))
(some x Elephant(x))
(Each term subsumes the one below it.)
If x subsumes y, then P(x) → P(y).
49. Example (Runs in SNePS 3)
Hungry(any x Elephant(x)
             Eats(x, any y Tall(y) Grass(y) On(y, Savanna)))
→
Hungry(any u Albino(u) Elephant(u)
             Eats(u, any v Grass(v) On(v, Savanna)))
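Subsumption licenses the inference above: the more restricted arbitrary term is subsumed by the less restricted one, so a property of the general term transfers to the specific term. Below is a minimal sketch of the flat case, representing an arbitrary term by its restriction set; the representation is an illustrative assumption, and nested terms like those in the SNePS 3 example need more care because polarity matters inside restrictions.

```python
# Sketch of subsumption among arbitrary objects (slide 48): an arbitrary
# term with restriction set R subsumes one with restriction set S when
# R is a subset of S (fewer restrictions = more general). Handles only
# the flat, unnested case.

def arbitrary(*restrictions: str) -> frozenset:
    return frozenset(restrictions)

def subsumes(x: frozenset, y: frozenset) -> bool:
    """x subsumes y iff every restriction on x also restricts y."""
    return x <= y

any_elephant        = arbitrary("Elephant")
any_albino_elephant = arbitrary("Albino", "Elephant")

print(subsumes(any_elephant, any_albino_elephant))  # True
# Hence, by "if x subsumes y then P(x) -> P(y)":
# Gray(any x Elephant(x)) lets us infer Gray(any x Albino(x) Elephant(x)).
print(subsumes(any_albino_elephant, any_elephant))  # False
```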
50. Next Steps
- Finish theory and implementation of arbitrary and indefinite objects.
- Extend to other generalized quantifiers
  - Such as most, many, few, no, both, 3 of, ...
51. For More Information
- Shapiro: http://www.cse.buffalo.edu/~shapiro/
- SNePS Research Group: http://www.cse.buffalo.edu/sneps/