1
Assessment, Feedback and Evaluation
Vinod Patel John Morrissey
2
Learning outcomes
  • By the end of this session you will be able to:
  • Define assessment, feedback and evaluation
  • Discuss how these are related and how they differ
  • Discuss the application of each in clinical
    education.
  • Begin to apply them in practice

3
Lesson Plan
  • Definitions
  • Assessment: theory and practice
  • Tea break
  • Feedback
  • Evaluation: theory and practice
  • Questions and close

4
Definitions?
  • Assessment?
  • Feedback
  • Evaluation?

5
Assessment definition
The processes and instruments applied to measure
the learner's achievements, normally after they
have worked through a learning programme of one
sort or another.
Mohanna K et al (2004) Teaching Made Easy: a
manual for health professionals
6
Feedback definition
Specific information about the comparison
between a trainee's observed performance and a
standard, given with the intent to improve the
trainee's performance.
Van de Ridder JM et al (2008) Med Educ 42(2): 189
7
Evaluation definition
A systematic approach to the collection,
analysis and interpretation of information about
any aspect of the conceptualisation, design,
implementation and utility of educational
programmes.
Mohanna K et al (2004) Teaching Made Easy: a
manual for health professionals
8
Part 1
Assessment
9
In this section
  • Purposes of assessment
  • Miller's pyramid
  • The utility function

10
Why assess?
11
Why assess? (1 of 2)
  • To inform students of strengths and
    weaknesses.
  • To ensure adequate progress has been made
    before students move to the next level.
  • To provide certification of a standard of
    performance.

12
Why assess? (2 of 2)
  • To indicate to students which parts of the
    curriculum are considered important.
  • To select for a course or career.
  • To motivate students in their studies.
  • To measure the effectiveness of teaching and
    to identify weaknesses in the curriculum.

13
Summative
Formative
14
Clinical Education Assessment Methods
  • Written Assessments
  • Observed clinical practice
  • Others
  • Vivas
  • Portfolios

15
How a skill is acquired
  • Cognitive phase
  • Fixative phase
  • Practice
  • Feedback
  • Autonomous phase

Fitts P & Posner M (1967) Human Performance
16
Miller's pyramid (top to bottom):
Does
Shows how
Knows how
Knows
Miller GE (1990) Acad Med 65 (Suppl): S63
17
(No Transcript)
18
(No Transcript)
19
Miller's pyramid mapped to assessment methods:
Does – clinical work observed (ACAT, CbD, mini-CEX)
Shows how – OSLER, OSCE
Knows how – short answer / clinical reasoning
Knows – written exams, MCQ
Miller GE (1990) Acad Med 65 (Suppl): S63
20
Question
How can we tell whether these tests are any good
or not?
Answer
We do the maths.
21
Utility function
U = w_r·R × w_v·V × w_e·E × w_a·A × w_c·C
  • U = Utility
  • R = Reliability
  • V = Validity
  • E = Educational impact
  • A = Acceptability
  • C = Cost
  • w_x = Weight given to criterion x

Van der Vleuten CPM (1996) Advances in Health
Sciences Education 1: 41-67.
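To make the formula concrete, here is a minimal sketch in Python. The presentation gives no numbers, so the weights and the 0-1 criterion scores below are invented for illustration, and equal weights are assumed.

```python
# Minimal sketch of van der Vleuten's utility function:
#   U = w_r*R x w_v*V x w_e*E x w_a*A x w_c*C
# All weights and criterion scores are illustrative assumptions.

def utility(scores, weights):
    """Multiply each criterion score by its weight and take the product."""
    u = 1.0
    for criterion, value in scores.items():
        u *= weights[criterion] * value
    return u

weights = {"R": 1.0, "V": 1.0, "E": 1.0, "A": 1.0, "C": 1.0}  # assumed equal
osce = {"R": 0.8, "V": 0.7, "E": 0.6, "A": 0.8, "C": 0.4}     # hypothetical
mcq = {"R": 0.9, "V": 0.5, "E": 0.4, "A": 0.9, "C": 0.9}      # hypothetical

print(f"OSCE utility: {utility(osce, weights):.3f}")  # 0.108
print(f"MCQ utility:  {utility(mcq, weights):.3f}")   # 0.146
```

Changing the weights changes which format scores higher; that explicit trade-off among criteria is the point of the model.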
22
The Assessment Pentagram
Validity
Reliability
Acceptability
Feasibility
Educational Impact
23
Validity and reliability
  • Validity: the extent to which the competence
    that the test claims to measure is actually
    being measured.
  • Reliability: the extent to which a test
    yields reproducible results.

Schuwirth & van der Vleuten (2006) How to design
a useful test: the principles of assessment
24
Validity: another definition
The degree to which empirical evidence and
theoretical rationales support the adequacy and
appropriateness of inferences and actions based
on test scores or other modes of assessment.
Messick (1994) Educational Researcher 23: 13
25
Some causes of low validity
  • Vague or misleading instructions to
    candidates.
  • Inappropriate or overcomplicated wording.
  • Too few test items.
  • Insufficient time.
  • Inappropriate content.
  • Items too easy or too difficult.

McAleer (2005) Choosing Assessment Instruments
26
Some causes of low reliability
  • Inadequate sampling.
  • Lack of objectivity in scoring.
  • Environmental factors.
  • Processing errors.
  • Classification errors.
  • Generalisation errors.
  • Examiner bias.

McAleer (2005) Choosing Assessment Instruments
27
Types of validity
  • Face
  • Predictive
  • Concurrent
  • Content
  • Construct

28
The examination fairly and accurately assessed
my ability
29
The examination fairly and accurately assessed
the candidate's ability
30
Problem
Appearances can be deceptive.
31
Types of reliability
  • Test-retest
  • Equivalent forms
  • Split-half
  • Inter-rater and intra-rater (see the sketch
    below)

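The last of these can be made concrete with a small calculation. Below is a minimal sketch of inter-rater reliability using Cohen's kappa; the slide does not name a specific statistic, and the examiner ratings are made up.

```python
# Minimal sketch: Cohen's kappa as one common measure of inter-rater
# reliability. The pass/fail ratings below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two examiners grading the same ten OSCE stations (hypothetical data).
a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "pass"]
print(f"Cohen's kappa: {cohens_kappa(a, b):.2f}")  # 0.47 here
```

A kappa near 1 means the examiners agree far more often than chance would predict; a kappa near 0 means their agreement is no better than chance.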
32
(No Transcript)
33
(No Transcript)
34
The Assessment Pentagram
Validity
Reliability
Acceptability
Feasibility
Educational Impact
35
Utility function
U = w_r·R × w_v·V × w_e·E × w_a·A × w_c·C
  • U = Utility
  • R = Reliability
  • V = Validity
  • E = Educational impact
  • A = Acceptability
  • C = Cost
  • w_x = Weight given to criterion x

Van der Vleuten CPM (1996) Advances in Health
Sciences Education 1: 41-67.
36
Miller's pyramid (top to bottom):
Does
Shows how
Knows how
Knows
Miller GE (1990) Acad Med 65 (Suppl): S63
37
Miller's pyramid mapped to assessment methods:
Does – clinical work observed (ACAT, CbD, mini-CEX)
Shows how – OSLER, OSCE
Knows how – short answer / clinical reasoning
Knows – written exams, MCQ
Miller GE (1990) Acad Med 65 (Suppl): S63
38
FY Workplace Assessment
  • Mini-CEX (from the USA): Mini Clinical
    Evaluation Exercise
  • DOPS (developed by the RCP): Direct Observation
    of Procedural Skills
  • CbD (based on GMC performance procedures):
    Case-based Discussion
  • MSF (from industry): Multi-Source Feedback

Carr (2006) Postgrad Med J 82: 576
39
Practical Exercise
40
Educational interventions
  • How will we assess?
  • How will we feedback?
  • How will we evaluate?

41
Educational interventions
  • Communication skills for cancer specialists
  • 2nd year medical speciality training
  • Medical humanities SSM for medical students
  • Masters-level pharmacology module
  • Procedural skills for medical students
  • Clinical Officers (ETATMBA)

42
Communication skills
Learning outcome: To improve communication skills of HCPs working with individuals with cancer, e.g. with respect to breaking bad news, discussion of management plans, end of life care
Duration: 2 days
Students: Mostly specialist cancer nurses, including Macmillan nurses; also consultants and trainee medics. N = 30.
Teaching & learning: Mainly consultations with simulated patients
43
Speciality training
Learning outcome: To ensure trainees have reached the appropriate stage in the acquisition of the knowledge, skills and attitudes necessary for independent medical practice
Duration: 1 year
Students: Second-year GP trainees. N = 50.
Teaching & learning: Clinical apprenticeship, protected training days
44
Medical humanities SSM
Learning outcome: By a reading of Middlemarch by George Eliot, to enhance students' ability to reflect on medical practice and to enter imaginatively into the lives of others
Duration: 90-minute sessions weekly for 10 weeks
Students: Second-year medical students. N = 20.
Teaching & learning: Small group teaching and discussion
45
M-level pharmacology module
Learning outcome: To enhance knowledge and understanding of pharmacotherapies used in diabetes and its complications, and to develop the ability to apply this knowledge in clinical practice
Duration: 200 hours, i.e. 20 CATS points
Students: Mostly DSNs, a few GPs and endocrinology trainees, some from overseas. N = 20.
Teaching & learning: 20 hours directed learning (small group teaching and discussion) and 180 hours self-directed learning
46
Procedural skills
Learning outcome: To ensure newly qualified doctors are competent in all the bedside and near-patient procedures listed in Tomorrow's Doctors
Duration: 4 years
Students: Medical students. N = 200.
Teaching & learning: Small group teaching sessions distributed across three hospitals
47
Clinical Officers
Learning outcome: To..
Duration: x
Students: y.
Teaching & learning: z.
48
The ideal assessment instrument
  • Totally valid.
  • Perfectly reliable.
  • Entirely feasible.
  • Wholly acceptable.
  • Huge educational impact.

49
Part 2
Feedback
50
Feedback definition
Specific information about the comparison
between a trainee's observed performance and a
standard, given with the intent to improve the
trainee's performance.
Van de Ridder JM et al (2008) Med Educ 42(2): 189
54
In this section
  • Importance of feedback
  • How to give feedback: models
  • How to improve feedback

55
Experiential learning
56
Feedback
  • Its value is self-evident: experiential learning
    cannot take place without it
  • It is often infrequent, untimely, unhelpful or
    incomplete
  • It is often not acted upon to improve performance

57
How to give feedback
  • Establish appropriate interpersonal climate
  • Use appropriate location
  • Establish mutually agreed goals
  • Elicit the learner's thoughts and feelings
  • Reflect on observed behaviours
  • Be non-judgmental
  • Relate feedback to specific behaviours
  • Offer right amount of feedback
  • Offer suggestions for improvement

Hewson MG & Little ML (1998) J Gen Int Med
13(2): 111
58
Practical Exercise
59
Some methods of feedback
  • Pendleton's rules
  • ALOBA
  • SCOPME model
  • Chicago model

60
Pendleton's Rules
  • Clarification of matters of fact
  • Trainee identifies what went well
  • Trainer identifies what went well
  • Trainee discusses what they did not do well and
    how to improve
  • Trainer identifies areas for improvement
  • Agreement on areas for improvement and
    formulation of an action plan

Pendleton D et al (1984) in The Consultation: an
Approach to Learning and Teaching
61
Difficulties with Pendleton?
  • The strict format may inhibit spontaneous
    discussion.
  • Unhelpful polarisation between good points and
    bad points.
  • Opening comments may seem predictable, insincere
    and merely a prelude to criticism.

Carr (2006) Postgrad Med J 82: 576
62
ALOBA (Agenda-Led Outcome-Based Analysis) 1 of 2
  • Start with the learner's agenda.
  • Look at the outcomes the learner and the patient
    are trying to achieve.
  • Encourage self-assessment and self-problem
    solving first.
  • Involve the whole group in problem-solving.
  • Use descriptive feedback to encourage a
    non-judgemental approach.
  • Provide balanced feedback.

Kurtz et al (1998) Teaching and Learning
Communication Skills in Medicine
63
ALOBA 2 of 2
  • Make offers and suggestions; generate
    alternatives.
  • Be well-intentioned, valuing and supportive.
  • Rehearse suggestions.
  • Value the interview as a gift of raw material for
    the group.
  • Opportunistically introduce theory, research
    evidence and wider discussion.
  • Structure and summarise learning to reach a
    constructive end-point.

Kurtz et al (1998) Teaching and Learning
Communication Skills in Medicine
64
SCOPME (Standing Committee on Postgraduate
Medical Education)
  • Listen
  • Reflect back
  • Support
  • Counsel
  • Treat information in confidence
  • Inform without censuring  

65
Chicago
  • Review aims and objectives of the job at the
    start.
  • Give interim feedback of a positive nature.
  • Ask the learner to give a self-assessment of
    their progress.
  • Give feedback on behaviours rather than
    personality.
  • Give specific examples to illustrate your views.
  • Suggest specific strategies to the learner to
    improve performance.

66
Improving feedback
  • Recognise that we all need feedback to learn and
    improve
  • Ask for feedback yourself and model this process
    for learners
  • Inform learners that you expect them to ask for
    feedback
  • Make feedback a routine activity
  • Discuss the need for feedback with colleagues

Sargeant J & Mann K in Cantillon P & Wood D (eds)
ABC of Learning and Teaching in Medicine, 2nd edn
(2010)
67
Part 3
Evaluation
68
Evaluation definition
A systematic approach to the collection,
analysis and interpretation of information about
any aspect of the conceptualisation, design,
implementation and utility of educational
programmes.
Mohanna K et al (2004) Teaching Made Easy: a
manual for health professionals
69
In this section
  • Purposes of evaluation
  • Data sources
  • Kirkpatrick's hierarchy

70
The Audit Cycle
Ask question(s) → Review literature → Set criteria
& standards → Design audit → Collect data →
Analyse data → Feed back findings → Action plan →
Re-audit → Review standards (and the cycle
repeats)
Wakley G & Chambers R (2005) Clinical Audit in
Primary Care
71
Why evaluate?
  • To ensure teaching is meeting students' needs.
  • To identify areas where teaching can be
    improved.
  • To inform the allocation of faculty resources.
  • To provide feedback and encouragement to
    teachers.
  • To support applications for promotion by
    teachers.
  • To identify and articulate what is valued by
    medical schools.
  • To facilitate development of the curriculum.

Morrison (2003) Br Med J 326: 385
72
Evaluation
  • Scale: micro ↔ macro
  • Formative ↔ summative
  • Internal ↔ external
  • Can you evaluate an assessment?

73
Evaluation data sources
  • Student ratings
  • Peer ratings
  • Self-ratings
  • Assessment scores
  • Expert ratings
  • Student interviews
  • Exit ratings
  • Employer ratings
  • Video recordings
  • Administrator ratings
  • Teacher scholarship
  • Teacher awards
  • Teaching portfolios
Based on Berk RA (2006) Thirteen Strategies to
Measure College Teaching
74
(No Transcript)
75
(No Transcript)
76
Teaching Observation
  • A method of evaluating teaching
  • Different models and purposes
  • Three stages: pre-observation, observation,
    post-observation
  • Form (instrument) for recording the information,
    observation and feedback
  • Siddiqui ZS, Jonas-Dwyer D & Carr SE (2007)
    Twelve tips for peer observation of teaching.
    Medical Teacher 29: 297-300

77
Teaching Observation: purposes
  • Evaluation: authority / summative
  • Developmental: expert / formative
  • Peer review: collaborative / mutual learning

78
(No Transcript)
79
Evaluation triangulation
Self
Peers
Students
80
Practical Exercise
81
Kirkpatrick's Hierarchy
Levels, from bottom to top: Reaction → Learning →
Behaviour → Results
Moving up the levels:
  • Complexity of behaviour increases
  • Time elapsed increases
  • Reliability of measures decreases
  • Confounding factors increase

Hutchinson (1999) Br Med J 318: 1267
82
What's the point?
83
(No Transcript)
84
Issues with Kirkpatrick
  • Is it a hierarchy?
  • Omissions: learning objectives?
  • Linkages between levels?
  • "Kirkpatrick plus"

Tamkin P et al (2002) Kirkpatrick and Beyond, IES
85
Conclusions
  • There is no perfect assessment instrument
  • Feedback to students is essential to experiential
    learning
  • The ultimate purpose of evaluation is to improve
    clinical outcomes