Peer Evaluation Module for WebCAT - PowerPoint PPT Presentation
1
Peer Evaluation Module for Web-CAT
  • Morgan Pittkin

2
Problem Summary
  • Peer evaluation provides useful feedback for
    students
  • Web-CAT does not currently support peer
    evaluation
  • Current interface designs for peer evaluation
    fall short in usability

3
Solution Summary
  • Examine current designs for similar systems
  • Develop design for peer evaluation module for
    Web-CAT

4
Outline
  • Current System and Designs
  • Summary of Main Issues
  • Architecture
  • Interface
  • Future Work

5
Current Related Systems
  • Moodle Peer Evaluation Module
  • Web-CAT Grading System
  • A system implemented to test peer evaluation
    effectiveness:
    http://doi.acm.org/10.1145/1007996.1008030

6
Main Issues
  • Rubric for Grading
  • Reviewer Assignment
  • Anonymity
  • Interface for Performing Reviews

7
Review Interface
  • Focus on code, guided by rubric
  • Balance between guided review and reviewer
    freedom
  • Integrating rubric with inline comments

8
Data Design
  • Based on current Web-CAT assignment model
  • Stresses reuse

9
Data Model
10
Interface Design
  • Rubric Creator
  • Reviewer-Student Assignment
  • Perform Review

11
Rubric Creator
  • Essential for this type of review
  • Students may lack necessary knowledge to perform
    quality reviews on their own
  • Helps ensure relatively consistent grading
  • For an example, see the Moodle quiz creator
    module: http://docs.moodle.org/en/Quizzes
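
A rubric like the one described above could be modeled as a small data structure of criteria. The following Python sketch is purely illustrative; the class and field names are hypothetical and not Web-CAT's actual model:

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """One rubric criterion the reviewer marks (hypothetical fields)."""
    title: str
    instructions: str
    max_points: int

@dataclass
class Rubric:
    assignment: str
    criteria: list = field(default_factory=list)

    def total_points(self):
        # Maximum score a reviewer can award across all criteria
        return sum(c.max_points for c in self.criteria)

rubric = Rubric("Project 1", [
    Criterion("Style", "Is the code readable and consistently formatted?", 10),
    Criterion("Correctness", "Does the code meet the specification?", 20),
])
print(rubric.total_points())  # 30
```

Storing per-criterion instructions alongside the point value is what lets the review screen expand a criterion in place, as shown in the prototype slides later.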

12
Reviewer-Student Assignment
  • Maps reviewer to one or more assignments for
    review.
  • Instructor can download a spreadsheet, edit it
    manually, and re-upload it
  • Flexibility for instructor
  • Reuse of reviewer mappings
  • System might provide random assignments for n
    reviews per student
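
The random-assignment option in the last bullet could be done with a shuffled ring, which guarantees no self-reviews and exactly n reviews per submission. This is a sketch of one possible scheme, not the module's specified algorithm:

```python
import random

def assign_reviewers(students, n, seed=None):
    """Assign each student n peers' submissions to review.
    Shuffling the roster and walking a ring ensures nobody reviews
    themselves and every submission receives exactly n reviews.
    (Illustrative sketch; function name is hypothetical.)"""
    if n >= len(students):
        raise ValueError("need more students than reviews per submission")
    order = list(students)
    random.Random(seed).shuffle(order)  # seed allows reproducible mappings
    size = len(order)
    mapping = {}
    for i, reviewer in enumerate(order):
        # Review the next n students around the ring
        mapping[reviewer] = [order[(i + k) % size] for k in range(1, n + 1)]
    return mapping

pairs = assign_reviewers(["ann", "bob", "cara", "dev"], 2, seed=1)
```

A mapping produced this way could also be exported as the spreadsheet mentioned above, letting the instructor hand-tune it before re-uploading.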

13
Reviewer-Student Assignment
14
Anonymity
  • Lack of anonymity can cause friction over
    negative comments
  • Most students prefer full anonymity
  • We suggest double-blind anonymity
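
One way to implement double-blind reviews is to show each party a stable but opaque pseudonym instead of a real identity. The sketch below is a hypothetical scheme, not Web-CAT's actual mechanism:

```python
import hashlib

def pseudonym(user_id, assignment_id, secret):
    """Derive a stable opaque label so reviewer and author never see
    each other's real identity (double-blind). The secret stays on
    the server; the same inputs always yield the same label, so
    comments by one reviewer remain linkable within an assignment.
    (Hypothetical scheme.)"""
    digest = hashlib.sha256(f"{secret}:{assignment_id}:{user_id}".encode()).hexdigest()
    return f"Reviewer-{digest[:8]}"
```

Because the label depends on the assignment, a student's pseudonym changes between assignments, which limits cross-assignment de-anonymization.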

15
Review Interface
  • Most important part of this project
  • Must balance guided review based on rubric with
    reviewer freedom

16
Review Interface
  • Previous designs
  • Completely separate the code view from the
    rubric (Moodle Peer Evaluation Module)
  • Rubric is central and guides the review process
  • In much of the recent literature, the balance
    problem is mentioned

17
Review Interface
  • Research shows reviewers focus on code rather
    than rubric
  • Forcing reviewers to follow a rubric is too
    restrictive
  • Rubric should act as a tool rather than strict
    guide

18
Review Interface
  • Solution: Present code for markup, and make
    rubric criteria easily accessible when questions
    arise
  • Reviewers can fill out the rubric when desired,
    probably at the end of the process
  • Use criteria as a guide for making comments
  • Important for the reviewer to read the rubric
    before commenting on code
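
The "rubric as a tool, not a strict guide" idea above suggests inline comments whose link to a criterion is optional. A minimal sketch of that data shape (field names are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InlineComment:
    """A reviewer comment anchored to a source line. The criterion
    link is optional: reviewers may comment freely and associate a
    rubric criterion only when it applies. (Hypothetical fields.)"""
    file: str
    line: int
    text: str
    criterion: Optional[str] = None  # None means a free-form comment

def comments_for_criterion(comments, criterion):
    """Collect the comments tied to one criterion, so the reviewer
    can revisit them when deciding a mark at the end of the review."""
    return [c for c in comments if c.criterion == criterion]
```

Passing `None` as the criterion retrieves the free-form comments, which keeps unguided remarks first-class rather than forcing every note into the rubric.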

19
Prototype
Rubric Criteria
Inline Comment Area
20
Prototype
Reviewer clicks in code to make inline comments,
optionally associating them with a rubric
criterion
21
Prototype
Reviewer clicks on a criterion link to expand it.
This gives the instructions for that criterion,
and allows the reviewer to mark it.
22
Prototype
By clicking the show/hide links, the reviewer can
hide the details of the criterion, making it less
obtrusive as they view the code and comments to
determine a mark for it.
23
Alternate Design
  • In our initial prototype, the layout had the
    criteria on the side, becoming visible with
    mouseovers
  • One criterion would be shown at the top of the
    page
  • Testing both interfaces should reveal which one
    is more effective

24
Alternate Design
25
Notes about Interface
  • Comments should be color-coded to match criteria
  • Tabbed interface makes it easier for reviewer to
    switch between files
  • Reviewers should be required to provide summary
    comments explaining why a particular rating was
    assigned
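
The color-coding mentioned above only needs a stable criterion-to-color mapping. A small sketch of that idea, with an arbitrary palette chosen for illustration:

```python
from itertools import cycle

# Arbitrary example palette; any sufficiently distinct colors work
PALETTE = ["#e6194b", "#3cb44b", "#4363d8", "#f58231"]

def criterion_colors(criteria):
    """Assign each rubric criterion a color so inline comments can be
    color-coded to match it. Cycling the palette keeps the function
    total even with more criteria than colors. (Illustrative sketch.)"""
    return dict(zip(criteria, cycle(PALETTE)))
```

With the mapping computed once per rubric, both the comment markers in the code pane and the criterion list can render from the same colors.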

26
Interface
  • Provides balance between rubric-guided and
    comment-focused reviews
  • Allows flexibility in performing reviews while
    still providing guidance by way of a rubric

27
Future Work
  • Implement module for Web-CAT
  • Test both interface designs to see which is more
    effective
  • Add feature that allows reviewer to easily
    navigate through comments when deciding a rating
    to give
  • Currently the view for a student looking at a
    review of his/her assignment would look much
    like the review screen
  • The same comment navigation could serve that
    purpose as well

28
Future Work
  • Add facility for instructor to modify anonymity
    settings
  • Currently system designed for double-blind
    reviews
  • This would probably be used most of the time
  • However, it may be useful to have option for
    instructor to change anonymity settings

29
Future Work
  • May want to implement/test a feature that allows
    the reviewer to assign a severity rating to the
    comment
  • Rating would indicate how positively or
    negatively the comment reflects on the code in
    question
  • Might allow a default rating to be applied to
    the criteria, which could be modified by the
    reviewer later
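
The default-with-override behavior described above can be captured in a few lines. This sketch assumes a hypothetical 1-5 severity scale and per-criterion defaults, neither of which is specified in the design:

```python
# Hypothetical per-criterion default severities on a 1-5 scale
DEFAULT_SEVERITY = {"Style": 1, "Correctness": 4}

def comment_severity(criterion, override=None):
    """Resolve a comment's severity: an explicit reviewer override
    wins, otherwise the criterion's default applies, otherwise a
    neutral middle value. (Illustrative sketch.)"""
    if override is not None:
        return override
    return DEFAULT_SEVERITY.get(criterion, 3)
```

Letting the reviewer override a sensible default keeps the common case cheap while preserving the flexibility the slide asks for.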

30
Fin