Title: Peer Evaluation Module for Web-CAT
Peer Evaluation Module for Web-CAT
Problem Summary
- Peer evaluation provides useful feedback for students
- Web-CAT does not currently support peer evaluation
- Current interface designs for peer evaluation fall short in usability
Solution Summary
- Examine current designs for similar systems
- Develop design for peer evaluation module for Web-CAT
Outline
- Current System and Designs
- Summary of Main Issues
- Architecture
- Interface
- Future Work
Current Related Systems
- Moodle Peer Evaluation Module
- Web-CAT Grading System
- http://doi.acm.org/10.1145/1007996.1008030: a system implemented to test peer evaluation effectiveness
Main Issues
- Rubric for Grading
- Reviewer Assignment
- Anonymity
- Interface for Performing Reviews
Review Interface
- Focus on code, guided by rubric
- Balance between guided review and reviewer freedom
- Integrating rubric with inline comments
Data Design
- Based on current Web-CAT assignment model
- Stresses reuse
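The reuse idea can be sketched as follows. This is an illustrative Python sketch, not Web-CAT's actual schema; all class and field names here are hypothetical, and the point is only that the new review entities reference the existing submission model rather than duplicating it.

```python
# Hypothetical sketch of the data design: peer-review entities wrap the
# existing Web-CAT submission concept instead of copying its fields.
# All names are illustrative, not Web-CAT's real schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Submission:                 # stands in for the existing Web-CAT entity
    student_id: str
    assignment_id: str

@dataclass
class InlineComment:              # new: a comment anchored to a code line
    line: int
    text: str
    criterion_id: Optional[str] = None  # optional link to a rubric criterion

@dataclass
class Review:                     # new: references a Submission, reusing it
    reviewer_id: str
    submission: Submission
    comments: list[InlineComment] = field(default_factory=list)
```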
Data Model
Interface Design
- Rubric Creator
- Reviewer-Student Assignment
- Perform Review
Rubric Creator
- Essential for this type of review
- Students may lack the necessary knowledge to perform quality reviews on their own
- Helps ensure relatively consistent grading
- For an example, see the Moodle quiz creator module: http://docs.moodle.org/en/Quizzes
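As a rough illustration of what a rubric creator might store (the Moodle quiz creator linked above is the analogous UI; all names below are ours, not from any existing system):

```python
# Hypothetical sketch of the data a rubric creator would produce.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    title: str
    instructions: str       # shown to the reviewer when the criterion expands
    max_points: int

@dataclass
class Rubric:
    name: str
    criteria: list[Criterion] = field(default_factory=list)

    def total_points(self) -> int:
        return sum(c.max_points for c in self.criteria)

# An instructor building a rubric might end up with something like:
rubric = Rubric("Code Quality")
rubric.criteria.append(Criterion("Naming", "Are identifiers descriptive?", 5))
rubric.criteria.append(Criterion("Testing", "Is the code adequately tested?", 10))
```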
Reviewer-Student Assignment
- Maps reviewer to one or more assignments for review
- Download spreadsheet, edit manually, and re-upload
- Flexibility for instructor
- Reuse of reviewer mappings
- System might provide random assignments for n reviews per student
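The random n-reviews-per-student option could be implemented with a shuffled circular shift, which also guarantees that every submission receives exactly n reviews and that nobody reviews their own work. This is a sketch; the function and parameter names are ours:

```python
import random

def assign_reviewers(students, n, seed=None):
    """Assign each student n peers to review, never themselves.

    Shuffle once, then have student i review the next n students in the
    ring; every student therefore also *receives* exactly n reviews.
    """
    if not 0 < n < len(students):
        raise ValueError("need 0 < n < number of students")
    order = list(students)
    random.Random(seed).shuffle(order)
    k = len(order)
    return {order[i]: [order[(i + j) % k] for j in range(1, n + 1)]
            for i in range(k)}
```

The resulting mapping could then be exported to the spreadsheet mentioned above for manual editing by the instructor.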
Reviewer-Student Assignment
Anonymity
- Lack of anonymity can cause friction over negative comments
- Most students prefer full anonymity
- We suggest double-blind anonymity
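One simple way to realize double-blind reviews is opaque per-assignment aliases that only the instructor's table can resolve; reviewers and authors see only each other's aliases. A minimal sketch with hypothetical names:

```python
import secrets

def make_aliases(participants):
    """Map each real identity to an opaque alias.

    The alias is what the other party sees; the instructor keeps this
    table to resolve an alias back to a real identity when needed
    (e.g. to follow up on a dispute).
    """
    return {p: "reviewer-" + secrets.token_hex(4) for p in participants}
```

A review written by "alice" would then be displayed under `aliases["alice"]` rather than her name, and the same table hides authors from reviewers.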
Review Interface
- Most important part of this project
- Must balance guided review based on rubric with reviewer freedom
Review Interface
- Previous designs
  - Completely separate the two (Moodle Peer Evaluation Module)
  - Rubric is central and guides the review process
- Much of the recent literature mentions this balance problem
Review Interface
- Research shows reviewers focus on code rather than rubric
- Forcing reviewers to follow a rubric is too restrictive
- Rubric should act as a tool rather than a strict guide
Review Interface
- Solution: present code for markup, and make rubric criteria easily accessible when questions arise
- Reviewers can fill out the rubric when desired, probably at the end of the process
- Use criteria as a guide for making comments
- Important for reviewer to read rubric before commenting on code
Prototype
Rubric Criteria
Inline Comment Area
Prototype
Reviewer clicks in the code to make inline comments, optionally associating them with a rubric criterion.
Prototype
Reviewer clicks on a criterion link to expand it. This gives the instructions for that criterion and allows the reviewer to mark it.
Prototype
By clicking the show/hide links, the reviewer can hide the details of the criterion, making it less obtrusive while viewing the code and comments to determine a mark for it.
Alternate Design
- In our initial prototype, the layout had the criteria on the side, becoming visible with mouseovers
- One criterion would be shown at the top of the page
- Testing both interfaces should reveal which one is more effective
Alternate Design
Notes about Interface
- Comments should be color-coded to match criteria
- Tabbed interface makes it easier for reviewer to switch between files
- Reviewers should be required to provide summary comments explaining why a particular rating was assigned
Interface
- Provides balance between rubric-guided and comment-focused reviews
- Allows flexibility in performing reviews while still providing guidance by way of a rubric
Future Work
- Implement module for Web-CAT
- Test both interface designs to see which is more effective
- Add feature that allows reviewer to easily navigate through comments when deciding a rating to give
  - Currently the view for a student looking at a review of his/her assignment would look a lot like the review screen
  - This comment navigation could also be used for this purpose
Future Work
- Add facility for instructor to modify anonymity settings
  - Currently the system is designed for double-blind reviews
  - This would probably be used most of the time
  - However, it may be useful to have the option for the instructor to change anonymity settings
Future Work
- May want to implement/test a feature that allows the reviewer to assign a severity rating to each comment
- Rating would indicate how badly or well the comment reflects on the code in question
- Might allow a default rating to be applied to the criteria, which could be modified by the reviewer later
Fin