Online Assessment for Individualized Distributed Learning Applications
Transcript and Presenter's Notes

Title: Online Assessment for Individualized Distributed Learning Applications


1
Online Assessment for Individualized Distributed
Learning Applications
  • Greg Chung

UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Annual CRESST Conference
Los Angeles, CA
September 9, 2004
2
Overview of Talk
  • Distributed learning (DL) context
  • Elements of a DL system
  • Research examples
  • Current work

3
Distributed Learning Definition
  • The distribution via technology of training,
    education, and information that resides at one
    location to any number of learners who may be
    separated by time and space and who may interact
    with other parties (peers, instructor, system)
    synchronously or asynchronously.

4
Characteristics
  • Learner-centric
  • Autonomous learner
  • Asynchronous communication modes
  • Varying degree of instructor support

5
Typical Vision Statement
  • Provide quality instruction to the right
    people, at the right time, and at the right
    place.

6
Implications of DL
  • Operational
      • Anytime, anywhere learning implies anytime, anywhere assessment
      • Online, rapid scoring, immediate feedback to learner, actionable information
      • Individualized
  • Research
      • Examine ways of extracting useful information about learners in an online
        context

7
Instruction-Assessment Loop
(Diagram: a cycle linking Instruction, Assessment, and Decision)
8
Elements of a DL System
  • Framework to guide what information to extract
    from the online environment
  • Method to synthesize disparate information types
  • Automated reasoning support for interpreting
    knowledge and performance observations

9
CRESST Assessment Model
  • Content Knowledge
  • Communication
  • Problem Solving
  • Learning
  • Self-Regulation
  • Collaboration
10
Data Fusion Strategy
  • Inferential (construct level)
      • used the generate-and-test problem-solving strategy
      • used productive learning strategies
      • understood the fundamentals of rifle marksmanship
  • Descriptive (indicator level)
      • adjusted bicycle pump design
      • performed (virtual) blood test correctly
  • Event (clickstream level)
      • clicked on button 32
      • selected test item 2
      • spent 20 sec on help page 3
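A toy illustration of how the bottom (event) level might be rolled up into the descriptive (indicator) level; the event records and indicator definitions below are invented for the example, not taken from the studies. The inferential level would sit on top of such indicators, as in the Bayesian-network examples later in the talk.

```python
# Hypothetical clickstream records, mirroring the example events on this slide.
clickstream = [
    {"event": "clicked_button", "target": "button_32"},
    {"event": "selected_item", "target": "test_item_2"},
    {"event": "viewed_page", "target": "help_page_3", "seconds": 20.0},
]


def to_indicators(events):
    """Descriptive level: simple counts and durations, no inference yet."""
    return {
        "items_selected": sum(1 for e in events if e["event"] == "selected_item"),
        "help_seconds": sum(e.get("seconds", 0.0)
                            for e in events if e["event"] == "viewed_page"),
    }


print(to_indicators(clickstream))
# {'items_selected': 1, 'help_seconds': 20.0}
```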

11
Research Examples
  • Elements of DL system tested in several studies
  • Pump simulation design task
      • Tested whether the generate-and-test problem-solving strategy could be
        measured using simple aggregation of clickstream data
  • Problem-solving task (IMMEX)
      • Tested whether moment-to-moment learning processes could be measured from
        clickstream data (data fused with Bayesian networks)

12
Research Examples
  • Elements of DL system tested in several studies (continued)
  • Knowledge of rifle marksmanship
      • Tested individualized instruction based on measures of knowledge
      • Data fused with Bayesian networks

13
Research Example 1: Pump Design Task
  • Can the generate-and-test problem solving
    strategy be measured using clickstream data?
  • Novel GUI to support measurement

14
Generate-and-Test Processes
15
  • Information events: click and hold the mouse button to view information
  • Design events: change the dimensions of the pump; run the pump simulation
  • Solve-problem event: commit to a design solution
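A hedged sketch of how these sensor events could be reduced to a simple generate-and-test measure; the event labels and the counting rule are assumptions for illustration, not the study's actual scoring code. A "cycle" is counted whenever a design change is followed by a simulation run before the learner commits to a solution.

```python
def count_generate_and_test_cycles(events):
    """events: ordered labels such as 'info', 'design', 'run', 'solve'."""
    cycles, pending_design = 0, False
    for e in events:
        if e == "design":                    # learner generates a candidate design
            pending_design = True
        elif e == "run" and pending_design:  # learner tests that design
            cycles += 1
            pending_design = False
        elif e == "solve":                   # learner commits to a final solution
            break
    return cycles


log = ["info", "design", "run", "design", "design", "run", "solve"]
print(count_generate_and_test_cycles(log))  # 2
```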
16
Example 1 Results
17
(Figure: panels comparing Theory with Online Behavior)
18
Example 1 Conclusion
  • Findings consistent with generate-and-test
    problem solving strategy
  • Sequence of events was an important
    characteristic of the data
  • Simple test of data fusion strategy
  • Insertion of software sensors driven by cognitive
    demands of task
  • Low-value clicks transformed into meaningful
    information

19
Research Example 2: Problem-Solving Task
  • Research question: To what extent can learning processes be modeled solely
    from clickstream (i.e., behavioral) data?
  • More complex test of data fusion strategy in a different domain
  • Use Bayesian networks to depict dependencies between cognitive processes and
    online behavior

20
(Screenshot of the problem-solving task interface: test procedures and parents)
21
Behavioral Indicator Example
  • Construct: Understands a test procedure
      • Indicator: Not testing for a parent that could have been eliminated with a
        prior test
      • Indicator: Successive reduction in the number of parents tested across tests
  • Construct: Successful learning
      • Indicator: test → library access of test → test
      • Indicator: library access of test → test → library access of test
      • Indicator: 5 seconds or more spent on library access of test → test
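A minimal sketch (assumed log format) of how such sequence indicators could be detected: scanning the ordered event list for patterns like test → library access of test → test is low-complexity and low-inference, which is what makes it easy to program in software.

```python
def count_pattern(events, pattern):
    """Count non-overlapping occurrences of `pattern` as a contiguous subsequence."""
    count, i = 0, 0
    while i + len(pattern) <= len(events):
        if events[i:i + len(pattern)] == pattern:
            count += 1
            i += len(pattern)
        else:
            i += 1
    return count


# Hypothetical event log: 'test' = ran a test, 'library' = accessed the library
# entry for that test.
log = ["library", "test", "library", "test", "test", "library", "test"]
print(count_pattern(log, ["test", "library", "test"]))     # 2
print(count_pattern(log, ["library", "test", "library"]))  # 1
```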

22
Bayesian Network
(Diagram: Bayesian network linking inferred processes to behavioral indicators)
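A hand-rolled, two-node illustration of the kind of dependency such a network encodes (all probabilities below are invented): an inferred process is a parent of a behavioral indicator, and observing the indicator updates belief in the process via Bayes' rule. The networks in the studies have many more nodes and would typically be built and queried with a Bayesian-network library.

```python
# Two-node example: inferred process -> behavioral indicator.
p_process = 0.5        # prior P(learner understands the test procedure)
p_ind_given_yes = 0.8  # P(indicator observed | understands)
p_ind_given_no = 0.3   # P(indicator observed | does not understand)

# Bayes' rule: P(understands | indicator observed)
evidence = p_ind_given_yes * p_process + p_ind_given_no * (1 - p_process)
posterior = p_ind_given_yes * p_process / evidence
print(round(posterior, 3))  # 0.727
```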
23
Example 2 Results
  • Overall, similar pattern of results between BN
    and think-aloud measures with respect to
  • Task performance measures
  • High vs. low performers
  • Scientific reasoning

24
Example 2 Conclusion
  • More complex test of data fusion strategy
  • Descriptive measures derived from clickstream
    data
  • Low complexity, low inference -- easy to program
    in software
  • Inferences drawn from Bayesian network at level
    that is meaningful for instruction or assessment
    purposes
  • Low-value clicks transformed into meaningful
    information

25
Research Example 3: Knowledge of Rifle Marksmanship
  • How can information from assessments be used to
    deliver individualized instructional
    recommendations in a distributed learning (DL)
    context?

26
Linking Assessment and Instruction
(Diagram: item-level scores feed a Bayesian network model of knowledge
dependencies; the probability of knowing each topic, together with content
organized by an ontology of the marksmanship domain, drives a recommender that
delivers individualized feedback and content)
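A hedged sketch of the recommender step in this diagram; the mastery threshold, topic names, and content identifiers are assumptions for illustration. Topic-level probabilities from the Bayesian network are compared against a cutoff, and instructional content is served for topics the learner probably does not yet know.

```python
MASTERY_THRESHOLD = 0.7   # hypothetical cutoff for "probably knows this topic"

content_library = {       # ontology topic -> instructional content id (invented)
    "aiming": "lesson_aiming_01",
    "breath_control": "lesson_breath_01",
    "trigger_control": "lesson_trigger_01",
}


def recommend(topic_probabilities):
    """Return content for every topic whose probability of mastery is low."""
    return [content_library[t]
            for t, p in topic_probabilities.items()
            if p < MASTERY_THRESHOLD and t in content_library]


print(recommend({"aiming": 0.9, "breath_control": 0.55, "trigger_control": 0.62}))
# ['lesson_breath_01', 'lesson_trigger_01']
```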
27
Example of Feedback and Content Delivery
28
Example 3 Results
  • BN probabilities increased for concepts that had instructional content served
  • BN probabilities did not change for concepts that did not have instructional
    content served
  • BN probabilities corresponded with Marines' self-ratings of their level of
    knowledge (80% agreement)

29
Current Work
  • Circuit analysis
      • Validating the technique for use in an Electrical Engineering gateway course
  • Rifle marksmanship
      • Integrated test of the general approach
      • Compare DL system, coach, and control conditions on shooting performance

30
(No Transcript)
31
Summary and Conclusion
  • Distributed learning systems are likely to increase in education and training
    contexts (K-16, military, business)
  • The cognitive demands underlying performance tasks provide strong guidance for
    developing online measures
  • Extracting useful information from online behavior appears promising, but more
    research is needed

32
(No Transcript)
33
Backup
34
Some 2000-01 Numbers
  • 56% of all postsecondary institutions offered distance education courses
      • 90% of public 2-year
      • 89% of public 4-year
          • 48% degree granting (undergrad)
      • 40% of private 4-year
          • 33% degree granting (undergrad)

Source: NCES (2004), Indicator 32
35
Review Process
  • Reviewed 62 commercial and academic Web-based products
  • Data sources: online searches, existing reviews, and online learning trade
    publications
  • Criteria for inclusion in analyses
      • System claimed to have Web-based testing capability
      • Broad criteria intended to maximize coverage of products

36
Product/Vendor List
Anlon, BKM-elearning, Blackboard, Centra, Class Act (Darascott's), Click2learn,
Computer Adaptive Technology, Convene (IZIOPro), CyberWISE, Docent, eCollege,
Edusystem, eno.com, e-path BuildKit Aud Managekit Eval, First Class,
Generation21, iAuthor, IMS Assessment Designer, Infosource (content authoring
tool), Interwise Millennium (enterprise communication platform), Intralearn,
Jones e-education, Kenexa, Knowledge Planet, Learning Manager, Learning Space,
Learnlinc/Testlinc, Librix performance management (Maritz), Macromedia
(Authorware 6), Mentorware, Microsoft LRN Toolkit, MKLesson, NCS Pearson, Open
Learning Agency of Australia, Pedagogue Testing (Formal Systems), People
Sciences, PeopleSoft, Performance Assessment Network, Pinnacle, Plateau 4
Learning Management System, Platte Canyon, Prometheus, Quelsys, QuestionMark
Perception, RapidExam 2.0, Risc, Saba, Sage, Smartforce, Technomedia, TEDS
Learning on Demand, THINQ Training Server, TopClass, Trainersoft 7 Professional,
TRIADS, Tutorial Gateway, Ucompass Educator, Vcampus, Virtual-U, WBTmanager,
WebCT
37
2002 Review
  • Current Web-based systems provide tools for end-users to assemble, administer,
    and score tests containing mostly conventional item formats
  • Little support for how to develop quality tests or how to use test information
  • Little support for performance assessments
  • Little support for diagnostic information
  • Weak support for linking instruction to test results

38
Results
BN topic            Reasoning   Knowledge Map   Prior Knowledge   Shot Group   Position   Qual. Score
Overall knowledge   .28         .08             .76               .27          .32        .22 (p < .10)
Aiming              .35         .06             .68               .24          .38        .20
Breath control      .24         .08             .66               .48          .17        .16
Trigger control     .36         .20             .50               .30          .30        .40
Position            .17         .14             .59               .17          .36        .32
N = 53