Transcript and Presenter's Notes

Title: Computational Discovery of Communicable Knowledge


1
A Unified Cognitive Architecture for Embodied
Agents
Pat Langley School of Computing and
Informatics Arizona State University Tempe,
Arizona USA
Thanks to D. Choi, T. Konik, N. Li, D. Shapiro,
and D. Stracuzzi for their contributions. This
talk reports research partly funded by grants
from DARPA IPTO, which is not responsible for its
contents.
2
Cognitive Architectures
  • A cognitive architecture (Newell, 1990) is the infrastructure for an intelligent system that is constant across domains:
  • the memories that store domain-specific content
  • the system's representation and organization of knowledge
  • the mechanisms that use this knowledge in performance
  • the processes that learn this knowledge from experience

An architecture typically comes with a
programming language that eases construction of
knowledge-based systems. Research in this area
incorporates many ideas from psychology about the
nature of human thinking.
3
The ICARUS Architecture
ICARUS (Langley, 2006) is a computational theory
of the human cognitive architecture that posits
  • Short-term memories are distinct from long-term
    stores
  • Memories contain modular elements cast as
    symbolic structures
  • Long-term structures are accessed through pattern
    matching
  • Cognition occurs in retrieval/selection/action
    cycles
  • Learning involves monotonic addition of elements
    to memory
  • Learning is incremental and interleaved with
    performance

It shares these assumptions with other cognitive
architectures like Soar (Laird et al., 1987) and
ACT-R (Anderson, 1993).
4
Distinctive Features of ICARUS
However, ICARUS also makes assumptions that
distinguish it from these architectures
  • Cognition is grounded in perception and action
  • Categories and skills are separate cognitive
    entities
  • Short-term elements are instances of long-term
    structures
  • Inference and execution are more basic than
    problem solving
  • Skill/concept hierarchies are learned in a
    cumulative manner

Some of these tenets also appear in Bonasso et al.'s (2003) 3T, Freed's (1998) APEX, and Sun et al.'s (2001) CLARION.
5
Cascaded Integration in ICARUS
Like other unified cognitive architectures,
ICARUS incorporates a number of distinct modules.

  • learning
  • problem solving
  • skill execution
  • conceptual inference
ICARUS adopts a cascaded approach to integration
in which lower-level modules produce results for
higher-level ones.
6
Goals for ICARUS
  • Our main objectives in developing ICARUS are to
    produce
  • a computational theory of higher-level cognition
    in humans
  • that is qualitatively consistent with results
    from psychology
  • that exhibits as many distinct cognitive
    functions as possible

Although quantitative fits to specific results
are desirable, they can distract from achieving
broad theoretical coverage.
7
An ICARUS Agent for Urban Driving
  • Consider driving a vehicle in a city, which
    requires
  • selecting routes
  • obeying traffic lights
  • avoiding collisions
  • being polite to others
  • finding addresses
  • staying in the lane
  • parking safely
  • stopping for pedestrians
  • following other vehicles
  • delivering packages
  • These tasks range from low-level execution to
    high-level reasoning.

8
ICARUS Concepts for In-City Driving
((in-rightmost-lane ?self ?clane)
 percepts  ((self ?self) (segment ?seg) (line ?clane segment ?seg))
 relations ((driving-well-in-segment ?self ?seg ?clane)
            (last-lane ?clane)
            (not (lane-to-right ?clane ?anylane))))

((driving-well-in-segment ?self ?seg ?lane)
 percepts  ((self ?self) (segment ?seg) (line ?lane segment ?seg))
 relations ((in-segment ?self ?seg)
            (in-lane ?self ?lane)
            (aligned-with-lane-in-segment ?self ?seg ?lane)
            (centered-in-lane ?self ?seg ?lane)
            (steering-wheel-straight ?self)))

((in-lane ?self ?lane)
 percepts ((self ?self segment ?seg) (line ?lane segment ?seg dist ?dist))
 tests    ((gt ?dist -10) (lt ?dist 0)))
9
Structure and Use of Conceptual Memory
ICARUS organizes conceptual memory in a
hierarchical manner.
Conceptual inference occurs from the bottom up,
starting from percepts to produce high-level
beliefs about the current state.
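To make the bottom-up flow concrete, here is a minimal propositional sketch in Python. It is not the ICARUS matcher, which unifies relational patterns containing variables; the percept and concept names are invented for the driving example.

# Toy bottom-up inference: a concept holds when all of its parts hold.
CONCEPTS = {
    # concept name -> percepts and/or lower-level concepts it requires
    "in-lane":           ["self-detected", "lane-detected", "offset-in-range"],
    "aligned-with-lane": ["self-detected", "heading-matches-lane"],
    "driving-well":      ["in-lane", "aligned-with-lane"],
}

def infer_beliefs(percepts):
    """Derive higher-level beliefs from percepts, lower concepts first."""
    beliefs = set(percepts)
    changed = True
    while changed:                      # iterate until no new beliefs appear
        changed = False
        for concept, parts in CONCEPTS.items():
            if concept not in beliefs and all(p in beliefs for p in parts):
                beliefs.add(concept)
                changed = True
    return beliefs

if __name__ == "__main__":
    percepts = {"self-detected", "lane-detected", "offset-in-range",
                "heading-matches-lane"}
    print(sorted(infer_beliefs(percepts) - percepts))
    # -> ['aligned-with-lane', 'driving-well', 'in-lane']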
10
Representing Short-Term Beliefs/Goals
(current-street me A)               (current-segment me g550)
(lane-to-right g599 g601)           (first-lane g599)
(last-lane g599)                    (last-lane g601)
(at-speed-for-u-turn me)            (slow-for-right-turn me)
(steering-wheel-not-straight me)    (centered-in-lane me g550 g599)
(in-lane me g599)                   (in-segment me g550)
(on-right-side-in-segment me)       (intersection-behind g550 g522)
(building-on-left g288)             (building-on-left g425)
(building-on-left g427)             (building-on-left g429)
(building-on-left g431)             (building-on-left g433)
(building-on-right g287)            (building-on-right g279)
(increasing-direction me)           (buildings-on-right g287 g279)
11
ICARUS Skills for In-City Driving
((in-rightmost-lane ?self ?line)
 percepts ((self ?self) (line ?line))
 start    ((last-lane ?line))
 subgoals ((driving-well-in-segment ?self ?seg ?line)))

((driving-well-in-segment ?self ?seg ?line)
 percepts ((segment ?seg) (line ?line) (self ?self))
 start    ((steering-wheel-straight ?self))
 subgoals ((in-segment ?self ?seg)
           (centered-in-lane ?self ?seg ?line)
           (aligned-with-lane-in-segment ?self ?seg ?line)
           (steering-wheel-straight ?self)))

((in-segment ?self ?endsg)
 percepts ((self ?self speed ?speed)
           (intersection ?int cross ?cross)
           (segment ?endsg street ?cross angle ?angle))
 start    ((in-intersection-for-right-turn ?self ?int))
 actions  ((?steer 1)))
12
ICARUS Skills Build on Concepts
ICARUS stores skills in a hierarchical manner
that links to concepts.
[Diagram: a skill hierarchy grounded in the concept hierarchy.]
Each concept is defined in terms of other concepts and/or percepts. Each skill is defined in terms of other skills, concepts, and percepts.
13
Skill Execution in ICARUS
Skill execution occurs from the top down,
starting from goals to find applicable paths
through the skill hierarchy.
This process repeats on each cycle to give
teleoreactive control (Nilsson, 1994) with a bias
toward persistence of initiated skills.
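The loop below is a rough Python sketch of that idea under strong simplifications: goals are propositional, each goal indexes at most one skill clause, and the skill names are invented. It is not Langley's implementation, but it shows how a goal is traced down to a primitive action on each cycle, preferring the path chosen on the previous cycle.

# Toy teleoreactive execution over a two-level skill hierarchy.
SKILLS = {
    "driving-well":      {"subgoals": ["in-lane", "aligned-with-lane"]},
    "in-lane":           {"action": "steer-toward-lane-center"},
    "aligned-with-lane": {"action": "straighten-wheel"},
}

def choose_path(goal, beliefs, previous=None):
    """Return a goal-to-action path, reusing last cycle's path when it still applies."""
    if goal in beliefs:
        return None                                   # goal already satisfied
    skill = SKILLS.get(goal)
    if skill is None:
        return None                                   # impasse: no skill indexed by this goal
    if "action" in skill:
        return [goal, skill["action"]]                # primitive skill: execute its action
    subgoals = list(skill["subgoals"])
    carry = previous[1:] if previous and previous[0] == goal else None
    if carry and carry[0] in subgoals:                # persistence bias toward the old path
        subgoals.remove(carry[0])
        subgoals.insert(0, carry[0])
    for sub in subgoals:
        path = choose_path(sub, beliefs, carry)
        if path:
            return [goal] + path
    return None

if __name__ == "__main__":
    print(choose_path("driving-well", beliefs={"aligned-with-lane"}))
    # -> ['driving-well', 'in-lane', 'steer-toward-lane-center']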
14
Execution and Problem Solving in ICARUS
[Flowchart: a Problem goes to Reactive Execution over the Skill Hierarchy and Primitive Skills; if no impasse arises, this yields an Executed Plan; if an impasse arises, control passes to Problem Solving.]
Problem solving involves means-ends analysis that
chains backward over skills and concept
definitions, executing skills whenever they
become applicable.
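The fragment below is a heavily simplified, propositional Python sketch of that loop; it is not the ICARUS problem solver, which chains over relational skill and concept definitions, and the operators and literals are made up for the blocks world.

# Toy means-ends analysis: chain backward from a goal through a skill that
# achieves it, and execute the skill as soon as its start conditions hold.
SKILLS = [
    # (name, start conditions, effects) -- hypothetical operators
    ("unstack-B-from-A", {"clear B", "hand-empty", "on B A"}, {"clear A", "holding B"}),
    ("putdown-B",        {"holding B"},                        {"hand-empty", "clear B"}),
]

def means_ends(goal, state, depth=5):
    """Return the skills executed to achieve goal from state (no backtracking)."""
    if goal in state or depth == 0:
        return []
    for name, start, effects in SKILLS:
        if goal in effects:                      # this skill would achieve the goal
            plan = []
            for condition in start - state:      # chain backward over unmet conditions
                plan += means_ends(condition, state, depth - 1)
            if start <= state:                   # applicable now: execute it
                state |= effects
                return plan + [name]
    return []

if __name__ == "__main__":
    state = {"on B A", "clear B", "hand-empty"}
    print(means_ends("clear A", state))          # -> ['unstack-B-from-A']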
15
ICARUS Learns Skills from Problem Solving
[Flowchart: the same execution/problem-solving loop as on the previous slide, with Problem Solving now also feeding a Skill Learning module.]
16
Learning from Problem Solutions
ICARUS incorporates a mechanism for learning new
skills that
  • operates whenever problem solving overcomes an
    impasse
  • incorporates only information available from the
    goal stack
  • generalizes beyond the specific objects concerned
  • depends on whether chaining involved skills or
    concepts
  • supports cumulative learning and within-problem
    transfer

This skill creation process is fully interleaved
with means-ends analysis and execution. Learned
skills carry out forward execution in the
environment rather than backward chaining in the
mind.
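A hedged Python sketch of this composition step, with all names invented: the achieved goal becomes the clause head, the subgoals solved along the way become its subgoals, and constants are replaced by variables so the clause generalizes beyond the objects involved. The real mechanism works over ICARUS's relational structures and the goal stack; this only illustrates the shape of the result.

def generalize(literal, bindings):
    """Replace object constants with variables (?x, ?y, ?z suffice for this toy)."""
    predicate, *args = literal.split()
    variables = [bindings.setdefault(a, "?" + chr(ord("x") + len(bindings)))
                 for a in args]
    return " ".join([predicate] + variables)

def learn_skill(goal, subgoals_in_order, start_conditions):
    """Compose a new hierarchical skill clause from a solved subproblem."""
    bindings = {}
    return {
        "head":     generalize(goal, bindings),
        "start":    [generalize(s, bindings) for s in start_conditions],
        "subgoals": [generalize(s, bindings) for s in subgoals_in_order],
    }

if __name__ == "__main__":
    print(learn_skill(goal="clear A",
                      subgoals_in_order=["unstackable B A", "unstacked B A"],
                      start_conditions=["on B A", "hand-empty"]))
    # -> {'head': 'clear ?x', 'start': ['on ?y ?x', 'hand-empty'],
    #     'subgoals': ['unstackable ?y ?x', 'unstacked ?y ?x']}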
17
ICARUS Memories and Processes
[Architecture diagram linking ICARUS's memories (Perceptual Buffer, Short-Term Belief Memory, Long-Term Conceptual Memory, Short-Term Goal Memory, Long-Term Skill Memory, Motor Buffer) with its processes (Perception, Conceptual Inference, Skill Retrieval and Selection, Skill Execution, Problem Solving, Skill Learning) and the Environment.]
18
An ICARUS Agent for Urban Combat
19
ICARUS Summary
ICARUS is a unified theory of the cognitive
architecture that
  • includes hierarchical memories for concepts and
    skills
  • interleaves conceptual inference with reactive
    execution
  • resorts to problem solving when it lacks routine
    skills
  • learns such skills from successful resolution of
    impasses.

We have developed ICARUS agents for a variety of
simulated physical environments, including urban
driving. However, it has a number of limitations
that we must address to improve its coverage of
human intelligence.
20
Challenge 1: Arbitrary Behaviors
ICARUS indexes skills by the goals they achieve; this aids in
  • Retrieving relevant candidate skills for
    execution
  • Determining when skill execution should terminate
  • Constructing new skills from successful solutions

But these goals can describe only instantaneous states of the environment, which limits ICARUS's representational power. For example, it cannot encode skills for complex dance steps that end where they start, or the notion of a round trip.
21
Incorporating Temporal Constraints
To support richer skills, we are extending ICARUS
to include
  • Concepts that indicate temporal relations which
    must hold among their subconcepts
  • Skills that use these temporally-defined concepts
    as their goals and subgoals
  • A belief memory that includes episodic traces of
    when each concept instance began and ended

We are also augmenting its inference, execution,
and learning modules to take advantage of these
temporal structures.
22
The Concept of a Round Trip
  • Any round trip from A to B involves
  • First being located at place A
  • Then being located at place B
  • Then being located at place A again
  • We can specify this concept in the new formalism
    as

((round-trip ?self ?a ?b)
 percepts    ((self ?self) (location ?a) (location ?b))
 relations   ((at ?self ?a) ?start1 ?end1
              (at ?self ?b) ?start2 ?end2
              (at ?self ?a) ?start3 ?end3)
 constraints ((< ?end1 ?start2) (< ?end2 ?start3)))
23
Episodes and Skills for Round Trips
  • The inference module automatically adds episodic
    traces like

(at me loc1) 307 398
(home loc1) 200
(in-transit me loc1 loc2) 399 422
(office loc2) 220
(at me loc2) 422 536
(at me loc1) 558
  • The execution module compares these to extended
    skills like

((round-trip ?self ?a ?b)
 percepts    ((self ?self) (location ?a) (location ?b))
 start       ((at ?self ?a) ?start1 ?end1)
 subgoals    ((at ?self ?b) ?start2 ?end2
              (at ?self ?a) ?start3 ?end3)
 constraints ((< ?end1 ?start2) (< ?end2 ?start3)))
  • This checks their heads and uses constraints to
    order subgoals.
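A toy Python sketch of this kind of check (not the ICARUS execution module): the episode list mirrors the trace above, and the test is specialized to the round-trip pattern rather than general constraint matching. An end time of None marks an episode that is still open.

EPISODES = [
    # (belief, start, end)
    ("at me loc1", 307, 398),
    ("home loc1", 200, None),
    ("in-transit me loc1 loc2", 399, 422),
    ("office loc2", 220, None),
    ("at me loc2", 422, 536),
    ("at me loc1", 558, None),
]

def intervals(belief):
    """All (start, end) intervals over which the belief held."""
    return [(s, e) for b, s, e in EPISODES if b == belief]

def round_trip(agent, a, b):
    """True if the agent was at a, later at b, and later back at a."""
    for s1, e1 in intervals(f"at {agent} {a}"):
        for s2, e2 in intervals(f"at {agent} {b}"):
            if e1 is not None and e1 <= s2:          # stay at a ends before stay at b
                for s3, e3 in intervals(f"at {agent} {a}"):
                    if e2 is not None and e2 <= s3:  # stay at b ends before return to a
                        return True
    return False

if __name__ == "__main__":
    print(round_trip("me", "loc1", "loc2"))          # -> True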

24
Challenge 2: Robust Learning
ICARUS currently acquires new hierarchical skill
clauses by
  • Solving novel problems through means-ends
    analysis
  • Analyzing the steps used to achieve each subgoal
  • Storing one skill clause for each solved
    subproblem

However, this mechanism has two important
limitations
  • It can create skills with overly general start
    conditions
  • It depends on a hand-crafted hierarchy of concepts

We hypothesize that a revised mechanism which
also learns new concepts can address both of
these problems.
25
Forming New Concepts
To support better skill learning, we are
extending ICARUS to
  • Create new conceptual predicates and associated
    definitions for start conditions and effects of
    acquired skills
  • That are functionally motivated but structurally
    defined
  • That extend the concept hierarchy to support
    future problem solving and skill learning

Learned concepts for skills' preconditions serve as perceptual chunks that access responses which achieve the agent's goals.
26
Learning Concepts in the Blocks World
When the problem solver achieves a goal, it learns both a new skill and two concepts, one for its preconditions and one for its effects. The system uses a mechanism similar to that in composition (Neves & Anderson, 1981) to determine the conditions for each one. ICARUS uses the same predicate in two clauses if the achieved goals are the same and if the initially true subconcepts are the same (for concept chaining) or the utilized skills are the same (for skill chaining).
[Diagram: means-ends trace for clearing block A, with nodes (clear A), (unstacked B A), (unstackable B A), (clear B), (hand-empty), (on B A), and a parallel branch (clear C), (unstacked D C), (unstackable D C).]
This produces disjunctive and recursive concepts.
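A hedged Python sketch of the concept-creation step described above, with invented names and a propositional simplification: the start concept collects the subconcepts that were true when work on the goal began, and the effects concept collects what became true once it was achieved. The actual ICARUS routine operates on relational clauses and variable bindings.

def define_concepts(goal, beliefs_before, beliefs_after):
    """Return (start concept, effects concept) for a newly achieved goal."""
    predicate = goal.split()[0]
    start = {
        "head":      "sc" + predicate,                # e.g. scclear for the goal clear
        "relations": sorted(beliefs_before),          # initially true subconcepts
    }
    effects = {
        "head":      "ec" + predicate,                # hypothetical effects concept
        "relations": sorted(beliefs_after - beliefs_before),
    }
    return start, effects

if __name__ == "__main__":
    before = {"on B A", "hand-empty", "clear B"}
    after  = {"clear A", "holding B", "clear B"}
    start, effects = define_concepts("clear A", before, after)
    print(start)    # {'head': 'scclear', 'relations': ['clear B', 'hand-empty', 'on B A']}
    print(effects)  # {'head': 'ecclear', 'relations': ['clear A', 'holding B']}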
27
Learning Concepts in the Blocks World
ICARUS solves novel problems in a top-down
manner, using means-ends analysis to chain
backward from goals. But it acquires concepts
from the bottom up, just as it learns skills.
Here it defines the base case for the start
concept associated with the skill for making a
block clear.
[Diagram: the same means-ends trace, now annotated with the base case for the start concept:]
((scclear ?C)
 percepts  ((block ?C) (block ?D))
 relations ((unstackable ?D ?C)))
28
Learning Concepts in the Blocks World
This process continues upward as the architecture
achieves higher-level goals. Here ICARUS defines
the recursive case for the start concept
associated with the skill for making a block
clear.
[Diagram: the means-ends trace, carrying the base-case clause from the previous slide and now adding the recursive clause for the start concept:]
((scclear ?B)
 percepts  ((block ?B) (block ?C))
 relations ((scunstackable ?C ?B)))
29
Learning Concepts in the Blocks World
Skills acquired with these learned concepts appear to be more accurate than those created with ICARUS's old mechanism.
[Diagram: the means-ends trace with the scclear clauses from the previous slides, now joined by a definition for the precondition concept scunstackable:]
((scunstackable ?B ?A)
 percepts  ((block ?B) (block ?A))
 relations ((on ?B ?A) (hand-empty) (scclear ?B)))
30
Learning Concepts in the Blocks World
[Diagram: the complete means-ends trace, with each learned clause attached to the node it covers; at the top, the goal (clear A) is covered by:]
((scclear ?A)
 percepts  ((block ?A) (block ?B))
 relations ((scunstackable ?B ?A)))
(The scunstackable clause and the other scclear clauses from the previous slides annotate the remaining nodes.)
31
Benefits of Concept Learning (Free Cell)
32
Benefits of Concept Learning (Logistics)
33
Challenge 3: Reasoning about Others
ICARUS is designed to model intelligent behavior
in embodied agents, but our work to date has
treated them in isolation.
  • The framework can deal with other independent
    agents, but only by viewing them as other objects
    in the environment.

But people can reason more deeply about the goals
and actions of others, then use their inferences
to make decisions.
  • Adding this ability to ICARUS will require
    knowledge, but it may also demand extensions to
    the architecture.

34
An Urban Driving Example
  • You are driving in a city behind another vehicle
    when a dog suddenly runs across the road ahead of
    it.
  • You do not want to hit the dog, but you are in no danger of that, and you guess that the other driver shares this goal.
  • You reason that, if you were in his situation,
    you would swerve or step on the brakes to avoid
    hitting the dog.
  • This leads you to predict that the other car may
    soon slow down very rapidly.
  • Since you have another goal, to avoid collisions, you slow down in case that event happens.

35
Social Cognition in ICARUS
For ICARUS to handle social cognition of this
sort, it must
  • Imagine itself in another agent's physical/social situation
  • Infer the other agent's goals either by default reasoning or based on its behavior
  • Carry out mental simulation of the other agent's plausible actions and their effects on the world
  • Take high-probability trajectories into account in selecting which actions to execute itself.

Each of these abilities requires changes to the architecture of ICARUS, not just its knowledge base.
36
Architectural Extensions
In response, we are planning a number of changes
to ICARUS
  • Add abductive reasoning that makes plausible inferences about goals via a relational cascaded Bayesian classifier
  • Extend the problem solver to support
    forward-chaining search via mental simulation
    using repeated lookahead
  • Revise skill execution to consider probability of
    future events using the desirability of likely
    trajectories

These extensions will let ICARUS agents reason about other agents and use the results to influence their own behavior.
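One way to picture the third extension is the small Python sketch below; the probabilities, outcomes, and utilities are all assumed for the urban-driving example on the earlier slide, not values used by ICARUS.

# Choose an action by mental simulation: weight the other driver's plausible
# reactions by probability and pick the action whose trajectory has the
# highest expected desirability.
OTHER_DRIVER = {"brakes-hard": 0.7, "keeps-going": 0.3}     # assumed probabilities

DESIRABILITY = {                                            # assumed utilities
    ("slow-down", "brakes-hard"):  1.0,   # safe following distance maintained
    ("slow-down", "keeps-going"):  0.8,   # mild delay, no harm
    ("keep-speed", "brakes-hard"): -5.0,  # likely rear-end collision
    ("keep-speed", "keeps-going"): 1.0,   # nothing happens
}

def expected_desirability(my_action):
    return sum(p * DESIRABILITY[(my_action, theirs)]
               for theirs, p in OTHER_DRIVER.items())

def choose_action(actions=("slow-down", "keep-speed")):
    return max(actions, key=expected_desirability)

if __name__ == "__main__":
    for action in ("slow-down", "keep-speed"):
        print(action, round(expected_desirability(action), 2))
    print("chosen:", choose_action())                       # -> chosen: slow-down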
37
Automating Social Cognition
Although humans can reason explicitly about other agents' likely actions, they gradually compile responses and automate them. The ICARUS skill
learning module should achieve this effect by
  • Treating goals achieved via anticipation as
    solved impasses
  • Analyzing steps that led to this solution to
    learn new skills
  • Using these skills to automate behavior when the
    agent finds itself in a similar situation.

Over time, the agent will behave in socially
relevant ways with no need for explicit reasoning
or mental simulation.
38
Concluding Remarks
ICARUS is a unified theory of cognition that
exhibits important human abilities but that also
has limitations. However, our recent work has
extended the architecture to
  • Represent concepts and skills with temporal
    relations and use them to execute arbitrary
    behaviors
  • Acquire new predicates that extend the concept
    hierarchy and enable better skill learning
  • Reason about other agents' situations and goals, predict their behavior, and select appropriate responses.

These extensions bring ICARUS a few steps closer
to a broad-coverage theory of higher-level
cognition.
39
End of Presentation