1
DEVELOPING EFFECTIVE COURSE ASSESSMENT FOR
DIFFERENT OUTCOMES
  • EAC Training Modules
  • For OBE Implementations

2
WHY AM I HERE ???
Cycle for Continuous Improvement
3
Learning Outcomes
  • At the end of this module, participants should be
    able to:
  • Apply the concepts of assessment and evaluation in
    the OBE implementation of a course
  • Plan the assessment and evaluation of learning
    outcomes within their courses

How do I measure these outcomes? How effective is
my delivery method? How do I improve? Or how good
is good?
4
Table of Contents
  • Glossary of terms
  • OBE and Philosophy of Assessment
  • Assessment and Evaluation
  • CQI
  • Performance criteria/indicators
  • Exercises
  • Assessment Tips

5
Glossary
6
(No Transcript)
7
Assessment is
  • the formative and/or summative determination, for
    a specific purpose, of the student's competence in
    demonstrating a specific outcome
  • the processes that identify, collect, use and
    prepare data that can be used to evaluate
    achievement.

8
Formative Assessment
  • Collecting information according to preset
    criteria to supply feedback on how learning can be
    improved
  • is intended to inform students how to improve
    their learning
  • provides feedback to students on their learning
    achievements

9
Summative Assessment
  • Judging the worth, according to preset criteria,
    of the student's demonstration of competence in
    attaining the outcomes
  • used to sum up a person's achievement, e.g. a
    written examination.
  • Reliability is essential, as summative results are
    used numerically to classify students and compare
    them with each other

10
(No Transcript)
11
(No Transcript)
12
(No Transcript)
13
Main Issues in Assessment
14
The BIG Picture
15
(No Transcript)
16
OBE addresses the following key questions
  • What do you want the students to have or be able
    to do?
  • How can you best help students achieve it?
  • How will you know what they have achieved?
  • How do you close the loop?

17
Main Issues
Assessment must be in line with the desired
learning outcomes.
If one changes the method of teaching but keeps
the assessment unchanged, one is very likely to
fail. Getting the assessment right is vitally
important.
"If you want to change student learning then
change the methods of assessment" (Brown et al.,
1997, p. 9).
18
Main Issues
A crucial aspect of a successful teaching and
learning system is student assessment.
For many students, bad teaching is painfully
bearable; bad assessment is unavoidable.
Always remember our roles as gatekeepers as well
as coaches!
19
Main Issues
  • Assessment shapes learning: if you want to change
    learning, then change the assessment method
  • Match the assessment tasks to the learning
    outcomes
  • Match the criteria to the task and learning
    outcomes
  • Keep the criteria simple
  • Be fair, reliable and valid in your marking
  • Provide meaningful, timely feedback
20
Taxonomy of Learning Domains
21
(Diagram: taxonomy levels from lower order through
intermediate to higher order)
22
(Diagram: taxonomy levels from lower order through
intermediate to higher order)
23
(Diagram: taxonomy levels from lower order through
intermediate to higher order)
24
Effective Assessment
25
Course Assessment
  • must be an open process (transparent)
  • should be valid
  • needs to be reliable
  • needs to be fair
  • should be an integral component of course design
  • should promote change

26
The fundamentals of effective assessment
  • Assessment should help students to learn.
  • Assessment must be consistent with the objectives
    of the course and what is taught and learnt.
  • Variety in types of assessment allows a range of
    different learning outcomes to be assessed. It
    also keeps students interested.
  • Students need to understand clearly what is
    expected of them in assessed tasks.


27
The fundamentals of effective assessment
  • Criteria for assessment should be detailed,
    transparent and justifiable.
  • Students need specific and timely feedback on
    their work - not just a grade.
  • Too much assessment is unnecessary and may be
    counter-productive.
  • Assessment should be undertaken with an awareness
    that an assessor may be called upon to justify a
    student's result.


28
The fundamentals of effective assessment
  • The best starting point for countering plagiarism
    is in the design of the assessment tasks.
  • Group assessment needs to be carefully planned
    and structured.
  • When planning and wording assignments or
    questions, it is vital to mentally check their
    appropriateness to all students in the class,
    whatever their cultural differences.
  • Systematic analysis of students' performance on
    assessed tasks can help identify areas of the
    curriculum which need improvement (see the sketch
    below).
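A minimal sketch of such an analysis, assuming
hypothetical question-to-outcome tags and score data
(none of the names or numbers below come from the
slides):

from statistics import mean

# Map each assessed item to the course outcome (CO) it targets.
item_outcome = {"Q1": "CO1", "Q2": "CO1", "Q3": "CO2", "Q4": "CO3"}

# Each student's score per item, as a fraction of full marks.
scores = {
    "Q1": [0.8, 0.6, 0.9],
    "Q2": [0.7, 0.5, 0.6],
    "Q3": [0.4, 0.3, 0.5],   # CO2 looks weak in this made-up data
    "Q4": [0.9, 0.8, 0.7],
}

# Pool scores by outcome and flag outcomes with weak average performance.
by_outcome = {}
for item, outcome in item_outcome.items():
    by_outcome.setdefault(outcome, []).extend(scores[item])

for outcome, values in sorted(by_outcome.items()):
    average = mean(values)
    flag = "  <-- review this part of the curriculum" if average < 0.5 else ""
    print(f"{outcome}: average {average:.2f}{flag}")

Running it prints one line per outcome and marks any
outcome whose pooled average falls below 0.5.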

29
Some weaknesses
  • the tasks do not match the stated outcomes
  • the criteria do not match the tasks or outcome
  • the criteria are not known to students
  • students do not understand the criteria
  • overuse of one mode of assessment, such as written
    examinations or essays
  • overload of students and staff
  • insufficient time for students to do the
    assignments

30
Some weaknesses
  • too many assignments with the same deadline
  • insufficient time for staff to mark the
    assignments or examinations
  • absence of well defined criteria so consistency
    is difficult to achieve
  • inadequate or superficial feedback provided to
    students
  • wide variations in marking between modules and
    assessors
  • variations in assessment demands of different
    modules
  • Marking by rough estimation ("timbang kati") or
    not assessing at all

31
Six serious flaws in current assessment practice
  • the criteria used for awarding degree classes are
    not consistent between subjects, within subjects,
    between institutions or within institutions
  • the frames of reference which lecturers bring to
    assessment are systematically biased
  • lecturers have little idea of how others set and
    mark assignments and are usually untrained in
    assessment

32
Six serious flaws in current assessment practice
  • few lecturers understand the technical design
    factors which can affect assessment outcomes
  • new forms of assessment, e.g. continuous
    assessment, are as prone to distortion as formal
    examinations
  • the approach to assessment remains conservative
    through ignorance.

Atkins et al. (1993), pp. 26-27
33
Designing assessments
  • 1. What are the outcomes to be assessed?
  • 2. What are the capabilities/skills (implicit or
    explicit) in the outcomes?
  • 3. Is the method of assessment chosen consonant
    with the outcomes and skills?
  • 4. Is the method relatively efficient in terms of
    student time and staff time?

34
Designing assessments
  • 5. What alternatives are there? What are their
    advantages and disadvantages?
  • 6. Does the specific assessment task match the
    outcomes and skills?
  • 7. Are the marking schemes or criteria
    appropriate?

35
Assessment Plan
36
ASSESSMENT PLAN
  • Make sure each criterion that has been stated in
    the objectives is clearly defined.
  • Also include the frequency of assessment
    activities in the planning.
  • Continuous Quality Improvement (CQI) can be
    demonstrated through well-documented reports and
    data.

37
ASSESSMENT PLAN
  • Language of assessment (terms, definitions and
    other terminology)
  • Assessment questions (What questions are you
    trying to answer?)
  • Develop a systematic assessment procedure
  • Measurable performance criteria (see the sketch
    below)
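One way to make such a plan concrete is to record,
for each course outcome, the method used, a
measurable performance criterion (target) and the
assessment frequency. The sketch below is only an
illustration; the outcome names, targets and
frequencies are assumptions, not taken from the
slides:

# Minimal sketch of an assessment plan: each entry records the method,
# a measurable performance criterion (target) and the assessment frequency.
# All outcome names, targets and frequencies are hypothetical examples.
assessment_plan = {
    "CO1: apply engineering principles": {
        "method": "written examination",
        "criterion": "at least 70% of students score 50% or more on "
                     "exam questions mapped to CO1",
        "frequency": "every semester",
    },
    "CO2: communicate effectively": {
        "method": "oral presentation assessed with a rubric",
        "criterion": "at least 75% of students rated 3 or above on a "
                     "5-point rubric",
        "frequency": "once per academic year",
    },
}

for outcome, plan in assessment_plan.items():
    print(outcome)
    for field, value in plan.items():
        print(f"  {field}: {value}")

Keeping the plan in a structured form like this also
makes the well-documented CQI evidence mentioned on
the previous slide easier to produce.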

38
ASSESSMENT PLAN
  • When most students seem to be ready to
    demonstrate mastery, assess their learning
  • This assessment should take into account the
    context in which outcomes should be demonstrated.
  • Intended outcomes provide benchmarks against
    which student achievement can be judged.

39
ASSESSMENT PLAN
  • We must use assessment methods that are valid,
    reliable and fair.
  • It is not enough to focus assessment only on
    subject-specific outcomes while ignoring the
    long-term purpose of the programme of study.
  • In OBE, assessment should always contribute to
    the goal of improving students' learning.

40
ASSESSMENT PLAN
  • The more realistic assessment procedures are, the
    clearer picture we will have of what the students
    are learning.
  • In OBE, assessment should always contribute to
    the goal of improving students' learning.

41
BOTTOM LINES
  • All assessment options have advantages and
    disadvantages
  • The ideal methods are those that best fit program
    needs, provide satisfactory validity and are
    affordable in time, effort and money
  • It is crucial to use a multi-method/multi-source
    approach to maximise validity and reduce the bias
    of any single approach

42
VALIDITY
  • Relevance: the assessment option measures the
    educational outcome as directly as possible
  • Accuracy: the option measures the educational
    outcome as precisely as possible
  • Utility: the option provides formative and
    summative results with clear implications for
    educational program evaluation and improvement

43
ASSESSMENT METHOD TRUISMS
  • There will always be more than one way to
    measure any learning outcome
  • No single method is good for measuring a wide
    variety of student abilities
  • There is generally an inverse relationship
    between the quality of measurement methods and
    their expediency
  • It is important to pilot test to see if a
    method is appropriate for your course

44
Case Study 1
45
(No Transcript)
46
(No Transcript)
47
Public Speaking Evaluation Sheet
48
(No Transcript)
49
Assessment Tips With Gloria Rogers
50
Assessment Tips With Gloria Rogers
  • 1. You cannot do everything
  • 2. One size does not fit all
  • 3. More data are not always better
  • 4. Pick your battles
  • 5. Take advantage of local resources
  • 6. Go for the early win
  • 7. Decouple from faculty evaluation

51
Assessment Tips With Gloria Rogers
  • Be sure that you have a clear vision of why you
    are collecting specific data.
  • How many data points are enough to provide
    adequate evidence of outcome achievement? It is
    not always true that more data are better.
  • How often? Data do not need to be collected from
    every student on every outcome every year (see
    the sketch after this list).
  • How used? Once a strategy has been implemented
    with an efficient process for data collection,
    how are the data going to be used? Although this
    decision should drive the data collection
    process, far too often it comes after the data
    have been amassed. Rule of thumb: if the use of
    the data is not known, don't collect it.
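As an illustration of the point about frequency, one
common approach is to rotate outcomes across years
and sample a subset of students. A minimal sketch;
the rotation schedule, outcome names and sample size
are assumptions, not from the slides:

import random

# Minimal sketch: rotate outcomes across years and sample students,
# rather than assessing every student on every outcome every year.
# The rotation schedule, outcome names and sample size are hypothetical.
rotation = {2024: ["CO1", "CO2"], 2025: ["CO3", "CO4"], 2026: ["CO5", "CO6"]}
students = [f"S{i:03d}" for i in range(1, 121)]   # 120 enrolled students
sample_size = 30                                   # assess a subset only

year = 2024
sampled = random.sample(students, sample_size)
print(f"In {year}, assess outcomes {rotation[year]} "
      f"using work from {len(sampled)} sampled students")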

52
Assessment Tips With Gloria Rogers
  • We are already assessing students in courses, so
    why can't we just use student grades as an
    indication of what our students know or can do?
  • Grades represent the extent to which a student
    has successfully met the faculty member's
    requirements and expectations for a course.
  • Because many factors contribute to an assigned
    grade, it is almost impossible to make inferences
    about what a student knows or can do by only
    looking at the grades for a course.

53
Assessment Tips With Gloria Rogers
  • For program assessment, a numeric score that is
    directly linked to students' performance on a
    specific performance criterion can be used as
    evidence of program learning outcomes.
  • For example, for the outcome "Students have an
    understanding of ethical responsibility",
  • one of the performance criteria could be
    "Students will demonstrate the ability to
    evaluate the ethical dimensions of a problem in
    their engineering discipline" (see the sketch
    below).
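A minimal sketch of how such rubric scores could be
summarised as programme-level evidence: the snippet
below computes the percentage of students meeting a
hypothetical target (a score of 3 or more on a
5-point rubric). The threshold and the data are
assumptions, not from the slides:

# Minimal sketch: percentage of students meeting a performance criterion.
# The rubric scores (1-5) and the target threshold are hypothetical.
rubric_scores = [4, 3, 5, 2, 3, 4, 1, 3, 4, 5]   # ethics-evaluation rubric
threshold = 3                                     # "meets expectations" level

met = sum(score >= threshold for score in rubric_scores)
attainment = 100 * met / len(rubric_scores)
print(f"{attainment:.0f}% of students met the performance criterion")

The focus stays on the knowledge or skill the scores
represent, not on the scores themselves, which is the
point made on the next slide.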

54

Assessment Tips With Gloria Rogers
  • The measure used to assess those outcomes should
    be used consistently, should reflect specific
    student knowledge or skills, and should be
    directly linked to specific performance criteria.
  • It is important to remember that the focus is not
    a score or grade, but the student knowledge or
    skill that is represented by that score or grade.

55
REFERENCES
  • Suskie, Linda. Questionnaire Survey Research.
    Association for Institutional Research.
    http://airweb.org/publications/airpublications.htm
  • Rogers, Gloria. Model for Reporting EC2000
    Criteria 2 & 3. http://www.rose-hulman.edu/irpa
  • Killen, R. (2000). Outcomes-based education:
    Principles and possibilities. Unpublished
    manuscript, University of Newcastle, Australia.

56
BEM/IEM wish to thank everyone who contributed to
this material. Special thanks are due to
  • Abdul Wahab
  • Azlan
  • Jailani
  • Mazlan
  • Megat Johari
  • Mohd. Saleh
  • Shahrin
  • Wan Hamidon