Title: Self and Peer Assessment Through DUO
Self and Peer Assessment Through DUO
- Hannah Whaley
- University of Dundee
Background
- University of Dundee
- Growing use of many forms of self and peer assessment
- Paper-based systems and ad hoc online approaches
- Re-developed a system to integrate with Blackboard
- New system
- User-centred design
- Involved academic staff from a range of subjects
- Partnered with Blackboard
- Fully integrated with Blackboard
- Released in v8.0, so available now
1. Understanding Self and Peer Assessment
2. Deciding Where and When
3. Designing Assessments
4. Running Assessments
1. Understanding Self and Peer Assessment
- Important to fully understand the concept
- Confusion over terminology
- Focus on the real use of the pedagogy
- Only then can you realise the full potential for learning

Associated terminology: criterion-based reference marking, peer marking, peer review, self assessment, peer reflection, marking rubrics, critical analysis, groupwork assessment.
In this context, self and peer assessment means individual work (not group work), assessed against fixed marking criteria, with an emphasis on reflection, analysis and evaluation.
- Process
- Academic designs the assessment
- Includes questions and marking criteria
- Creates the assessment in Blackboard
- Student completes the assessment
- Could be one or more questions
- Submits answers in Blackboard
- Student marks the assessment
- Returns to the assessment in Blackboard
- Is given a list of students to mark
- Academic moderates results
- Monitors the submission and marking phases
- Moderates results before releasing them to students
- Challenging process for both staff and students
- Students: reflect, be critical, be constructive, engage
- Academics: be creative, be precise, let go, moderate
- Sys Admin: understand, support and get excited
2. Deciding Where and When
- Formative or summative?
- Formative works particularly well
- Summative should include a moderation offer
- Replace an old assessment or add a new one?
- Updating old assessments works well
- Chance to add innovative new practice
- What's the purpose of the assessment?
- Add interaction, reduce marking load, extra practice, new skills
- Focus on purpose in assessment design
- How long should it run for?
- 2 weeks is standard; use the defaults that are given
- 1 week assessment, 1 week evaluating
- Supervise it in IT suites or not?
- Generally, can be completed entirely online
- Makes good use of a practical session
Self and Peer vs Traditional assessment workflow:
- Self and Peer: Creates Question, Prepares Answer, Creates Criteria, Marking Answers and Writing Feedback, Moderation, Reviews Feedback, Formal Marks, Exercise Review
- Traditional: Creates Question, Prepares Answer, Creates Criteria, Marking Answers and Writing Feedback, Reviews Feedback, Moderation, Formal Marks, Exercise Review
3. Designing Assessments
- Focus of assessment
- Learning objectives (primary and secondary)
- Discipline-specific context
- Flexibility within the tool for design
- Essay-style and exam-style assessments are catered for
- Submission options include text, HTML and links
- Anonymous or not; change the number of peers to mark
- Structure: an Assessment contains Questions, and each Question has one or more Criteria
  - Assessment
    - Question
      - Criteria
    - Question
      - Criteria
      - Criteria
      - Criteria
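The Assessment → Question → Criteria hierarchy could be sketched as plain data structures. This is an illustrative model only; the field names, defaults, and types below are assumptions, not Blackboard's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    text: str            # what the marker is asked to judge
    max_points: int = 5  # assumed points scale per criterion

@dataclass
class Question:
    prompt: str
    criteria: list[Criterion] = field(default_factory=list)

@dataclass
class Assessment:
    title: str
    anonymous: bool = True   # markers do not see authors' names
    peers_to_mark: int = 2   # configurable number of peers to mark
    questions: list[Question] = field(default_factory=list)

# A minimal exam-style assessment with one question and one criterion:
exam = Assessment(
    title="Practice exam questions",
    questions=[
        Question("Describe the process of osmosis.",
                 [Criterion("Covers the key points of the model answer")]),
    ],
)
```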
Example 1
- Subject: Life Sciences
- Motive: Reduce marking time
- Extra learning: Practice at exam questions
- Old or new: Created from old tutorials
Design:
- Exam style: 30 questions, 1 criterion each
- Very specific criteria, with model answers
- Subject specific; text answers and file uploads
- Majority no feedback; marking 2 peers and self
Outcome:
- Only 1 exercise used per year previously; 4 hours moderating 4 exercises
- Students gain lots of practice
- Common mistakes, marking scales, model answers
Example 2
- Subject: Geography
- Motive: Innovative practical lab
- Extra learning: Understand their answers
- Old or new: New idea
Design:
- A bit of both: 2 questions, text answers
- Subjective and specific criteria
- Granular and expansive marks
- Deep learning; marking 3 peers and self
Outcome:
- Innovative way to introduce students to academic reading
- Promoting deep learning: synthesis and evaluation
- Students forced to give opinions and justify them
- Makes use of the flexibility of the system, combining two approaches
Example 3
- Subject: Law
- Motive: Improve assessment
- Extra learning: Give better feedback
- Old or new: Added online component
Design:
- Blended style; file upload
- Open criteria, with guidance
- Small workload; quick answers
- 3 markers; self reflection
- Emphasis on constructive feedback
Outcome:
- Blends the online component with existing teaching practices
- Enhances the face-to-face section, formalises feedback
- Students get better feedback, from a wider range
- Understand better and worse presentations clearly
4. Running Assessments
- Flexibility built into the system
- Timing of assessments
- Workload
- Publishing results
- Motivation
- Moderation
- Motivation
- Student understanding of the process
- Importance for their learning
- Assignment is complete only after both parts
- Marks can be withheld
- Actively encourage non-completers
- Email and remind them
- Sometimes: marks for marking
- Based on deviation from the average or a tutor mark
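One way "marks for marking" based on deviation could work is to score each marker by how far their mark falls from a reference (the average mark a submission received, or a tutor's mark). This is a sketch of one plausible scoring rule, not the rule the tool implements; the linear scale and the 0–100 mark range are assumptions:

```python
def marking_accuracy(peer_mark: float, reference: float,
                     max_mark: float = 100.0) -> float:
    """Score a marker by deviation from a reference mark.

    Returns 1.0 for an exact match, falling linearly to 0.0 as the
    deviation approaches the full mark range.
    """
    deviation = abs(peer_mark - reference) / max_mark
    return max(0.0, 1.0 - deviation)

# A marker who gave 62 when the class average was 70 (out of 100)
# deviates by 8 marks, so their accuracy score is 0.92.
score = marking_accuracy(62, 70)
```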
- Moderation
- The moderator can override any average grade
- 3 key phases, each with moderator oversight
- Submission
- Marking
- Results
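The default-plus-override behaviour described above can be summarised in a few lines: each result defaults to the average of the marks awarded by self and peers, and a moderator's override, when present, replaces it. A minimal sketch (the function and its signature are illustrative, not part of the tool):

```python
from statistics import mean
from typing import Optional

def final_grade(awarded_marks: list[float],
                override: Optional[float] = None) -> float:
    """Default to the average of the self and peer marks;
    a moderator's override, when present, wins."""
    if override is not None:
        return override
    return mean(awarded_marks)

grade = final_grade([55, 60, 65])        # average of the three marks
moderated = final_grade([55, 60, 65], 70.0)  # moderator override wins
```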
- Moderating Submissions
- Encourage
- Check submissions for problems
- Download submissions
- Moderating Evaluations
- Encourage
- Check for problems
- Download evaluation results
- Moderating Results
- Check for problems
- Finalise and publish to Grade Center
- Any grade can be overridden in the Grade Center
- Add feedback or grading notes
- Moderation styles
- On request (recommended)
- Highs and lows
- Unexpected
- Random sample (recommended)
- Understand the process
- It's student marking
- You can't get a single "correct" grade
- Accept the average and the learning
Challenges
- Fully understanding the potential and where it can be used
- Designing assessments to suit subjects
- Supporting good pedagogical use
- Time-consuming to create good assessments
- Teaching students about effective feedback and reflection
- Research into best practice
Common Mistakes
- Not understanding it
- Poor criteria
- Overloading students
- Obsessive moderating
- Not using the preview
- Changing questions and criteria
- Changing dates back and forth
- Changing enrolments
Benefits
- Re-usable resource; shareable good practice
- Moved from paper-based and ad hoc systems
- Promoting really deep learning
- Comprehension, application, synthesis, evaluation
- Students learn soft skills
- Giving effective feedback, analysing, criticising
- Students gain learning skills
- Assessment criteria, marking, answering questions
- Students can place their work
- See work better and worse than their own; monitor their own learning
Some Ideas
- First drafts
- Review resources
- Portfolio submission
- Video
- Past Papers
- Research
Conclusions
- Experiences gained using the system for 2 years
- Flexible, robust and expandable pedagogy
- Challenge in creating challenging assessments
- Benefit from experience of moderating
- Not always easy
- May not get right first time
- Inspired, motivated, ideas forming?
Contact
- Hannah Whaley, University of Dundee, Scotland
- h.whaley_at_dundee.ac.uk