Intelligent Computer-Aided Instruction: A Survey Organized Around System Components - PowerPoint PPT Presentation

1
Intelligent Computer-Aided Instruction: A Survey
Organized Around System Components
  • Author: Jeff W. Rickel, 1989
  • Speaker: Amy Davis
  • CSCE 976 (Advanced AI)
  • April 29th, 2002

2
Outline of Presentation
  • Why ICAI?
  • Overview of main systems and technologies
    discussed in this paper
  • Contributions of seminal systems to various
    components of ICAI systems

3
ICAI: Better than CAI
  • First came CAI
  • Fully specified presentations
  • All questions and their answers
  • Strict flow of control
  • "electronic page-turning"
  • Then the need for intelligence was recognized
  • Rich domain knowledge (and representation)
  • Ability to use knowledge in unspecified ways
  • Individualized instruction for each student

4
ICAI Representative of AI
  • No commercial ICAI systems exist (as of 1989)
  • ICAI is an active research topic in AI
  • ICAI employs many AI techniques
  • Requires reasoning from rich knowledge
    representations
  • Models the user
  • Needs communication and information structures
  • Needs common-sense reasoning

5
ICAI systems (I)
  • WEST (R. R. Burton and J. S. Brown, 1982)
  • Conquer the West with mathematical expressions
    that evaluate to the number of spaces you want to
    move.
  • SCHOLAR (Jaime Carbonell, 1970)
  • Learn geography by holding natural-language
    dialog with the computer.

6
ICAI systems (II)
  • WHY (Stevens and Collins, 1977)
  • Understand when and why rainfall happens by
    holding a discussion with the computer.
  • SOPHIE (Sleeman and Brown, 1982)
  • Learn by example how to troubleshoot electronic
    circuits.

7
ICAI systems (III)
  • STEAMER (Hollan, Hutchins and Weitzman, 1984)
  • Manipulate controls of a steam propulsion system
    to gain an understanding of how each control
    affects the system.
  • RBT: Recovery Boiler Tutor (Woolf, 1986)
  • Solve problems in real time on a simulated
    boiler.

8
ICAI systems (IV)
  • WUMPUS (Goldstein, 1978)
  • Hunt the Wumpus using mathematical and logical
    skills
  • GUIDON (Clancey, 1979), a tutor built on the
    MYCIN expert system
  • Find the likely bacterial cause for the symptoms
    provided.

9
ICAI Goals
  • More effective computer-based tutors
  • More economical computer-based tutors
  • Reflect current state of AI research

10
Components of ICAI systems
  1. Learning Scenarios
  2. Forms of Knowledge Representation
  3. Student modeling
  4. Student diagnosis
  5. Pedagogical knowledge
  6. Discourse management
  7. Automatic problem generation
  8. User Interfaces

11
ICAI Learning Scenarios
  • Goal: Involve more senses
  • Retain information longer
  • Make student an active participant
  • Methods:
  • Coaching
  • Socratic
  • Mixed-Initiative Dialogue
  • Articulate Expert
  • Simulation
  • Discovery Learning

12
Learning Scenarios: Coaching
  • Only give advice when needed
  • Coach looks over the student's shoulder
  • Offers timely but unobtrusive advice
  • Exposes key knowledge when the student's
    performance plateaus
  • Like MS Help
  • Common in gaming environments (e.g., WEST)
  • Determine if the student is using correct skills
  • Determine when the student needs guidance

13
Learning Scenarios: Mixed-Initiative Dialogue
  • Hold conversation with student
  • Student responds to computer questions
  • OR
  • Student initiates a line of questioning and
    computer answers
  • SCHOLAR
  • More reactive to student
  • Allows student initiative

14
Learning Scenarios: Socratic
  • Education cannot be attained through passive
    exercises such as reading or listening, but
    through actual problem solving
  • Ask thought-probing questions
  • Require use of new knowledge
  • Point out gaps in knowledge
  • Expose misconceptions
  • WHY tutor

15
Learning Scenarios: Articulate Expert
  • SOPHIE
  • Teach by example
  • Solve problems with the student watching
  • Explain reasons for decisions
  • Demonstrate troubleshooting tactics
  • Then have the student solve problems
  • Occasionally provide guidance
  • Force the student to give a rationale for choices
  • Students should ask "Why am I doing this action?"

16
Learning Scenarios: Interactive, Inspectable
Simulation
  • Provide a simulation of a domain
  • Allow exploration of actions
  • See the effects of actions
  • No fear of real-world consequences
  • Potential to carry into real-life situations
  • STEAMER, RBT

17
Learning Scenarios: Discovery-Based Learning
  • Opposite of CAI
  • Student explores
  • Micro-world emulation
  • Discover rules and knowledge
  • Full student control driven by curiosity
  • Prepares student for scientific inquiry, real
    life research, creative thinking
  • Outside scope of this paper

18
Learning Scenarios: Summary
  • Determines the look and feel of the tutoring
    system
  • Based on the student-tutor balance of control
  • Requires support from the knowledge base of the
    system

19
ICAI Domain Knowledge Representation
  • CAI: poor knowledge of the domain
  • Canned presentations
  • Canned questions
  • Canned answers
  • ICAI: more knowledge → fewer limitations
  • Supports understanding
  • Allows flexibility in teaching
  • Knowledge is key to intelligent behavior
  • The way knowledge is stored dictates how it can
    be used

20
Domain Knowledge
  • No general form is suitable for all knowledge
  • Challenge:
  • Determine the types of knowledge required
  • Find suitable representations
  • Support teaching particular subjects
  • Forms examined:
  • Rule-Based
  • Script
  • Semantic Network
  • Simulation
  • Condition/Action Rules

21
Domain Knowledge: Rule-Based KR
  • Generally a failure
  • Miss low-level detail
  • Miss relations necessary for learning and
    tutoring
  • No analogies, multiple views
  • No levels of explanation
  • Need to know how rules fit together
  • MYCIN, GUIDON
  • Need a knowledge perspective to communicate
    knowledge to the student

22
Domain Knowledge: Scripts
  • WHY
  • Nodes → processes, events
  • Edges → relations between nodes
  • X enables Y
  • X causes Y
  • Script → a partially-ordered sequence of
    processes and events linked by temporal or causal
    connections.
  • Hierarchy of scripts: lower levels describe
    causal relationships within higher levels.
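The script structure above can be sketched as a small directed graph. This is a minimal illustration, not code from the WHY system: the rainfall events and relation labels are invented, and a topological sort recovers an order in which every cause or enabler precedes its effect.

```python
# A WHY-style script sketched as a graph: nodes are processes/events,
# edges are "enables"/"causes" relations. Event names are illustrative.
from collections import defaultdict, deque

edges = [
    ("evaporation", "causes", "moist air"),
    ("moist air", "enables", "cloud formation"),
    ("cooling", "causes", "cloud formation"),
    ("cloud formation", "causes", "rainfall"),
]

def topo_order(edges):
    """Order events so every cause/enabler precedes its effect."""
    succ, indeg, nodes = defaultdict(list), defaultdict(int), set()
    for a, _, b in edges:
        succ[a].append(b)
        indeg[b] += 1
        nodes |= {a, b}
    queue = deque(sorted(n for n in nodes if indeg[n] == 0))
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in succ[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    return order

order = topo_order(edges)
```

A tutor could walk this order front to back, asking the student about each causal link in turn.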

23
Domain Knowledge: Semantic Networks
  • Highly structured database
  • Stores concepts and facts
  • Stores connections along many dimensions
  • Embeds linguistic information
  • Avoids storing redundant information through use
    of many connections
  • Use the database to generate questions
  • Common in other disciplines of AI
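A minimal sketch of such a network, in the SCHOLAR spirit but with invented geography facts: inheritance over `isa` links stores the shared fact "countries have capitals" only once, and the same structure drives question generation.

```python
# A toy semantic network: concepts linked by "isa" and attribute
# edges. Facts shared by a class live on the class node, not on
# every instance. All concept and attribute names are illustrative.
network = {
    "country":   {"isa": None,      "has": "capital"},
    "Argentina": {"isa": "country", "capital": "Buenos Aires"},
    "Peru":      {"isa": "country", "capital": "Lima"},
}

def lookup(concept, attribute):
    """Find an attribute, climbing 'isa' links if it is inherited."""
    while concept is not None:
        node = network[concept]
        if attribute in node:
            return node[attribute]
        concept = node["isa"]
    return None

def make_question(concept):
    # Ask about whatever attribute the concept's class says it has.
    attr = lookup(concept, "has")
    return f"What is the {attr} of {concept}?"

q = make_question("Peru")
```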

24
Domain Knowledge: Simulation
  • STEAMER
  • Mathematically simulate the steam propulsion
    system
  • Tie graphics to the simulation
  • SOPHIE
  • Propagates constraints to explain what causes a
    behavior

25
Domain Knowledge: Condition/Action Rules
  • Popular in AI
  • Model of human intelligence (?)
  • Recognize a condition, initiate an action
  • Attractive because rules are modular
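The recognize-act cycle behind condition/action rules can be sketched in a few lines. The medical-flavored facts and rules below are invented for illustration, not taken from MYCIN.

```python
# A minimal recognize-act cycle: each rule tests working memory and,
# if its condition matches, adds a new fact. Modularity shows here:
# rules can be added or removed independently.
rules = [
    (lambda m: "fever" in m and "infection" not in m,
     "infection"),
    (lambda m: "infection" in m and "prescribe antibiotic" not in m,
     "prescribe antibiotic"),
]

def run(memory):
    """Fire matching rules until no rule changes working memory."""
    memory = set(memory)
    changed = True
    while changed:
        changed = False
        for condition, action in rules:
            if condition(memory):
                memory.add(action)
                changed = True
    return memory

result = run({"fever"})
```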

26
Domain Knowledge: Summary
  • One representation doesn't work for everything.
  • Often need multiple representations within one
    problem (e.g., WHY)
  • The choice must be determined by how the
    knowledge will be used

27
ICAI Student Modeling
  • Goal: Know what the student knows
  • CAI: keeps a tally of correct and incorrect
    answers
  • Little adaptation to the student
  • Methods:
  • Overlay modeling (Goldstein, 1977)
  • Buggy modeling (R. R. Burton, 1982)

28
Student Modeling: Overlay
  • Represent student knowledge as some function of
    the teacher's knowledge.
  • Allows comparison between what the student knows
    and what the student should know.
  • WEST, SCHOLAR, WUMPUS
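Reduced to its core, an overlay model is a comparison against the expert's skill set. A sketch with invented skill names:

```python
# Overlay modeling as a set relation: the student model is the subset
# of the expert's skills the student has demonstrated so far; the
# difference is what to teach next. Skill names are illustrative.
expert_skills  = {"addition", "subtraction", "parentheses", "exponents"}
student_skills = {"addition", "subtraction"}   # observed so far

def skills_to_teach(expert, student):
    # The overlay model supports exactly this comparison.
    return expert - student

gap = skills_to_teach(expert_skills, student_skills)
```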

29
Student Modeling: Buggy Modeling
  • Include both buggy and correct rules which the
    student may be following
  • Allows the student's errors to be understood
  • May require enumeration of all possible errors!
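A sketch of buggy modeling for column subtraction, pairing the correct procedure with one classic "smaller-from-larger" bug. The real BUGGY catalogue is far larger; this handles only a simple two-digit case.

```python
# Buggy modeling: run the student's answer against a library of
# correct and buggy procedures and report which one reproduces it.
def correct_sub(a, b):
    """Correct two-digit subtraction (assumes a >= b)."""
    return a - b

def smaller_from_larger(a, b):
    """Bug: in each column, subtract the smaller digit from the
    larger one, ignoring borrowing entirely."""
    tens = abs(a // 10 - b // 10)
    ones = abs(a % 10 - b % 10)
    return tens * 10 + ones

def diagnose(a, b, student_answer):
    for name, proc in [("correct", correct_sub),
                       ("smaller-from-larger", smaller_from_larger)]:
        if proc(a, b) == student_answer:
            return name
    return "unrecognized"

# 52 - 27: correct answer 25; the bug yields 35.
verdict = diagnose(52, 27, 35)
```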

30
Student Modeling: Summary
  • Student modeling is still very open-ended
  • A full discussion is beyond the scope of the paper
  • Allows the computer to find the reasons behind
    student errors: student diagnosis.

31
ICAI Student Diagnosis
  • Goal: Allow the student to make mistakes, then
    capitalize on them for better learning.
  • Methods:
  • Differential modeling
  • Direct interpretation
  • Plan recognition (buggy model)
  • Error taxonomy

32
Student Diagnosis: Differential Modeling
  • Like overlay modeling: view a student error as a
    shortcoming detected by comparison with the
    tutor's knowledge.
  • WEST
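Differential modeling can be sketched as computing the expert's move for the same situation and comparing it with the student's. Everything below is a simplified stand-in for WEST: the real coach also weighs board strategy, not just move size.

```python
# WEST-style differential modeling: from three spinner numbers, find
# the expression values the expert could reach with +, -, * (each
# number used once), then compare against the student's move.
from itertools import permutations

def reachable_values(a, b, c):
    ops = [lambda x, y: x + y, lambda x, y: x - y, lambda x, y: x * y]
    values = set()
    for x, y, z in permutations((a, b, c)):
        for f in ops:
            for g in ops:
                values.add(g(f(x, y), z))
    return values

def coach(a, b, c, student_value):
    best = max(reachable_values(a, b, c))
    if student_value < best:
        return f"A larger move ({best}) was possible."
    return "Good move!"

advice = coach(1, 2, 3, 5)
```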

33
Student Diagnosis: Direct Interpretation
  • Remove constraints from the question until the
    student's answer becomes valid
  • Example: What is the capital of Texas?
  • Madison
  • Madison is the capital of Wisconsin.
  • Reasons through a semantic net
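The constraint-relaxation idea can be sketched with the slide's capital-of-Texas example over a two-entry net; the real system reasons through a full semantic network.

```python
# Direct interpretation: if the student's answer fails the full
# question, drop constraints one at a time until the answer fits
# something, then report the mismatch.
capitals = {"Texas": "Austin", "Wisconsin": "Madison"}

def interpret(state_asked, answer):
    if capitals.get(state_asked) == answer:
        return "Correct."
    # Relax the 'which state' constraint: is it a capital of anything?
    for state, capital in capitals.items():
        if capital == answer:
            return f"{answer} is the capital of {state}."
    return f"{answer} is not a capital."

reply = interpret("Texas", "Madison")
```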

34
Student Diagnosis: Plan Recognition
  • Buggy model: try to find a path in the model
    (correct or incorrect) leading to the student's
    answer
  • Plan recognition: finding the goals that
    underlie student actions
  • Similar to language parsing

35
Student Diagnosis: Error Taxonomy
  • Classify errors into types
  • Example categories:
  • Missing information
  • Lack of concept
  • Misfiled fact
  • Overgeneralization
  • SCHOLAR

36
Student Diagnosis: Summary
  • Student diagnosis is not the goal; teaching is
  • Most diagnosis can be made easier by asking a few
    more questions
  • Allowing student to discover own errors is more
    effective (Socratic)
  • A little meaningful feedback goes a long way

37
ICAI Pedagogical Knowledge
  • Teachers need to know more than just their
    subject; they need to know how to teach.
  • Main problems:
  • Lesson planning
  • Dealing with student errors
  • Production rules

38
Pedagogy: Lesson Planning
  • Develop strategies for ordering topics
  • Decide how to present material
  • Decide balance of control between tutor and
    student

39
Pedagogy: Dealing with Student Errors
  • Two big decisions:
  • Decide when to interrupt the student
  • Decide what to say
  • Common strategies:
  • Trap student into discovering error
  • Allow student to see consequences of actions
  • Redirect the student
  • Affirm correct choices

40
Pedagogy: Summary
  • Just knowing the problem domain isn't enough
  • Effective teachers have teaching "common sense"
  • Effective teachers respond to students

41
ICAI Discourse Management
  • Goal: Flexibility in the tutorial discourse
  • CAI: hard-coded syllabus, sometimes with
    alternate paths
  • Methods:
  • Reactive
  • Incremental knowledge-building
  • Context dependent
  • Hierarchical planning

42
Discourse Management: Reactive
  • Allow responses and misconceptions of student to
    drive the dialog
  • SCHOLAR, WHY
  • Have a few initial goals (WHY), and modify them
    as session proceeds

43
Discourse Management: Incremental Knowledge-Building
  • Add on to the student's current knowledge
  • Further develop a strong base
  • Explore new topics
  • WUMPUS

44
Discourse Management: Context Dependent
  • Use context to disambiguate questions and find
    answers
  • Context: the position, progress, and current
    task of the student
  • Object Oriented Tutoring incorporates this into a
    subject object

45
Discourse Management: Hierarchical Planning
  • PhD dissertation of Beverly Woolf, 1984
  • Top-down refinement of goals
  • Domain independent

46
Discourse Management: Summary
  • Discourse management requires knowledge
  • The knowledge needed goes beyond the subject area
  • Authors differ on how much flexibility is best.

47
ICAI Problem Generation
  • CAI: canned problems, canned answers
  • Hard on the course author
  • No adaptation to the student
  • Limited meaningful feedback
  • Generative CAI: programs generate new problems
  • Methods:
  • Problem-generation trees
  • Slot filling

48
Problem Generation: Trees
  • Concept tree
  • The student is at a level in the tree
  • The tree determines what to include in the
    question
  • Use a context-free grammar to form the actual
    question
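The tree-plus-grammar idea can be sketched as follows: a concept chosen from the tree fixes the content, and a tiny context-free grammar chooses the surface form of the question. The grammar and concept name are invented for illustration.

```python
# Question generation with a toy context-free grammar: nonterminals
# expand recursively; the CONCEPT slot is filled by the node the
# concept tree selected for this student.
import random

grammar = {
    "QUESTION": [["ASK", "CONCEPT", "?"]],
    "ASK":      [["What is"], ["Can you name"]],
}

def expand(symbol, concept, rng):
    if symbol == "CONCEPT":
        return concept
    if symbol not in grammar:
        return symbol                      # terminal word
    production = rng.choice(grammar[symbol])
    return " ".join(expand(s, concept, rng) for s in production)

rng = random.Random(0)
question = expand("QUESTION", "the capital of Peru", rng)
question = question.replace(" ?", "?")     # tidy the punctuation
```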

49
Problem Generation: Slot Filling
  • Choose a kind of problem
  • Examples: fill-in-the-blank, multiple choice
  • Fill the problem's slots with information from
    the semantic net
  • Requires rich knowledge base
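Slot filling can be sketched as template instantiation from semantic-net facts; the facts and template below are invented examples.

```python
# Slot filling: pick a problem template, then fill its slots from
# facts in a small semantic net, keeping the answer for grading.
facts = [("Lima", "capital-of", "Peru"),
         ("Austin", "capital-of", "Texas")]

template = "Fill in the blank: ____ is the capital of {place}."

def make_problem(fact):
    value, relation, place = fact
    assert relation == "capital-of"        # only one template here
    return template.format(place=place), value   # (question, answer)

question, answer = make_problem(facts[0])
```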

50
Problem Generation: Summary
  • Tree-like structures are used for generating
    problems
  • Problems that are generated must also be solved

51
ICAI User Interface
  • Tutoring systems should engage many senses
  • Communication methods:
  • Graphics
  • Canned Text
  • Text generation

52
User Interface: Graphics
  • Graphics allow representation of concepts
    difficult to explain in words
  • Graphics allow user to more fully feel part of
    the environment
  • STEAMER

53
User Interface: Canned Text
  • Most communication in tutoring is in English
  • Store text phrases at many levels, select
    appropriate statements as needed.
  • Still more flexible than CAI
  • Few systems do much else
  • Also use canned sentence fragments to make
    complete sentences.

54
User Interface: Text Generation
  • SCHOLAR
  • Includes knowledge for NLP
  • Chooses a style of question, fills in key words
    from semantic net
  • No canned text

55
User Interface: Summary
  • The whole tutoring system is really one big user
    interface
  • Input of information is more difficult
  • Most systems use graphics or menus and don't
    attempt to parse natural language.
  • Natural language is the Achilles' heel of
    tutoring systems.

56
Summary
  • ICAI systems require:
  • Learning scenario that is appropriate to domain
    knowledge
  • Student Models, Pedagogical knowledge, and
    Discourse knowledge are necessary
  • Wrap it all in a sensory-stimulating interface

  • Nature of domain knowledge
  • Types of misconceptions
  • Knowledge Representation
57
Questions and Comments?