Title: Title slide
Assessment, Feedback and Evaluation
Vinod Patel & John Morrissey
Learning outcomes
- By the end of this session you will be able to:
- Define assessment, feedback and evaluation
- Discuss how these are related and how they differ
- Discuss the application of each in clinical education
- Begin to apply them in practice
Lesson Plan
- Definitions
- Assessment: theory & practice
- Tea break
- Feedback
- Evaluation: theory & practice
- Questions and close
Definitions?
- Assessment?
- Feedback
- Evaluation?
Assessment: definition
"The processes and instruments applied to measure the learner's achievements, normally after they have worked through a learning programme of one sort or another."
Mohanna K et al (2004) Teaching Made Easy: a Manual for Health Professionals
Feedback: definition
"Specific information about the comparison between a trainee's observed performance and a standard, given with the intent to improve the trainee's performance."
Van de Ridder JM et al (2008) Med Educ 42(2): 189
Evaluation: definition
"A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes."
Mohanna K et al (2004) Teaching Made Easy: a Manual for Health Professionals
Part 1
Assessment
In this section
- Purposes of assessment
- Miller's pyramid
- The utility function
Why assess?
Why assess? (1 of 2)
- To inform students of strengths and weaknesses.
- To ensure adequate progress has been made before students move to the next level.
- To provide certification of a standard of performance.
Why assess? (2 of 2)
- To indicate to students which parts of the curriculum are considered important.
- To select for a course or career.
- To motivate students in their studies.
- To measure the effectiveness of teaching and
to identify weaknesses in the curriculum.
Summative
Formative
Clinical Education: Assessment Methods
- Written Assessments
- Observed clinical practice
- Others
- Vivas
- Portfolios
How a skill is acquired
- Cognitive phase
- Fixative phase
- Practice
- Feedback
- Autonomous phase
Fitts P & Posner M (1967) Human Performance
Miller's pyramid (apex to base):
Does
Shows how
Knows how
Knows
Miller GE (1990) Acad Med 65(Suppl): S63
Miller's pyramid mapped to assessment methods:
- Does: clinical work observed (ACAT, CbD, mini-CEX)
- Shows how: OSLER, OSCE
- Knows how: short answer / reasoning
- Knows: written exams, MCQ
Miller GE (1990) Acad Med 65(Suppl): S63
Question
How can we tell whether these tests are any good or not?
Answer
We do the maths.
Utility function
U = wr·R × wv·V × we·E × wa·A × wc·C
- U = utility
- R = reliability
- V = validity
- E = educational impact
- A = acceptability
- C = cost
- w = the weight attached to each criterion
Van der Vleuten CPM (1996) Advances in Health
Sciences Education 1: 41-67
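To make the arithmetic concrete, here is a minimal Python sketch of the utility function, an illustration added for this write-up rather than anything from Van der Vleuten's paper; every score and weight below is invented:

# Van der Vleuten's utility function: U = wr·R × wv·V × we·E × wa·A × wc·C
# Each criterion is scored 0-1 and carries a weight chosen by the assessor.
# All scores and weights here are invented, for illustration only.
criteria = {
    # name: (score 0-1, weight)
    "reliability":        (0.8, 0.9),
    "validity":           (0.7, 1.0),
    "educational_impact": (0.6, 0.7),
    "acceptability":      (0.9, 0.5),
    "cost":               (0.5, 0.4),
}

utility = 1.0
for name, (score, weight) in criteria.items():
    utility *= weight * score  # one w·X factor per criterion

print(f"U = {utility:.4f}")

Because the factors multiply rather than add, a near-zero score on any one criterion collapses the whole product: a highly valid and reliable test that candidates find unacceptable, or that cannot feasibly be afforded, still has low overall utility.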
The Assessment Pentagram
Validity
Reliability
Acceptability
Feasibility
Educational Impact
Validity & reliability
- Validity: the extent to which the competence that the test claims to measure is actually being measured.
- Reliability: the extent to which a test yields reproducible results.
Schuwirth & van der Vleuten (2006) How to Design a Useful Test: the Principles of Assessment
Validity: another definition
"The degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores or other modes of assessment."
Messick (1994) Educational Researcher 23(2): 13
Some causes of low validity
- Vague or misleading instructions to candidates.
- Inappropriate or overcomplicated wording.
- Too few test items.
- Insufficient time.
- Inappropriate content.
- Items too easy or too difficult.
McAleer (2005) Choosing Assessment Instruments
Some causes of low reliability
- Inadequate sampling.
- Lack of objectivity in scoring.
- Environmental factors.
- Processing errors.
- Classification errors.
- Generalisation errors.
- Examiner bias.
McAleer (2005) Choosing Assessment Instruments
Types of validity
- Face
- Predictive
- Concurrent
- Content
- Construct
"The examination fairly and accurately assessed my ability."
"The examination fairly and accurately assessed the candidate's ability."
Problem
Appearances can be deceptive.
Types of reliability
- Test-retest
- Equivalent forms
- Split-half
- Interrater and intrarater
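As a hedged illustration, not part of the original slides, two of these estimates can be computed directly from candidates' scores; the figures below are invented:

# Sketch: estimating test-retest and split-half reliability from
# invented candidate scores. Requires Python 3.10+ for
# statistics.correlation (Pearson's r).
from statistics import correlation

# Test-retest: the same candidates sit the same test twice;
# reliability is the correlation between the two sittings.
sitting_1 = [55, 62, 71, 48, 80, 66, 59]
sitting_2 = [58, 60, 75, 50, 78, 64, 61]
test_retest = correlation(sitting_1, sitting_2)

# Split-half: correlate scores on the odd- and even-numbered items,
# then apply the Spearman-Brown correction to estimate the
# reliability of the full-length test.
odd_items  = [28, 30, 35, 25, 41, 33, 29]
even_items = [27, 32, 36, 23, 39, 31, 30]
r_half = correlation(odd_items, even_items)
split_half = 2 * r_half / (1 + r_half)  # Spearman-Brown formula

print(f"test-retest r = {test_retest:.2f}")
print(f"split-half r  = {split_half:.2f}")

Interrater reliability would be estimated analogously by correlating two examiners' marks for the same candidates (or, more robustly, with a kappa or generalisability analysis).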
The Assessment Pentagram
Validity
Reliability
Acceptability
Feasibility
Educational Impact
Utility function
U = wr·R × wv·V × we·E × wa·A × wc·C
- U = utility
- R = reliability
- V = validity
- E = educational impact
- A = acceptability
- C = cost
- w = the weight attached to each criterion
Van der Vleuten CPM (1996) Advances in Health
Sciences Education 1: 41-67
Miller's pyramid (apex to base):
Does
Shows how
Knows how
Knows
Miller GE (1990) Acad Med 65(Suppl): S63
Miller's pyramid mapped to assessment methods:
- Does: clinical work observed (ACAT, CbD, mini-CEX)
- Shows how: OSLER, OSCE
- Knows how: short answer / reasoning
- Knows: written exams, MCQ
Miller GE (1990) Acad Med 65(Suppl): S63
FY Workplace Assessment
- Mini-CEX (from the USA): Mini Clinical Evaluation Exercise
- DOPS (developed by the RCP): Direct Observation of Procedural Skills
- CbD (based on GMC performance procedures): Case-based Discussion
- MSF (from industry): Multi-Source Feedback
Carr (2006) Postgrad Med J 82: 576
Practical Exercise
Educational interventions
- How will we assess?
- How will we give feedback?
- How will we evaluate?
Educational interventions
- Communication skills for cancer specialists
- 2nd-year medical speciality training
- Medical humanities SSM for medical students
- Masters-level pharmacology module
- Procedural skills for medical students
- Clinical Officers (ETATMBA)
Communication skills
Learning outcome: To improve the communication skills of HCPs working with individuals with cancer, e.g. with respect to breaking bad news, discussion of management plans, end-of-life care
Duration: 2 days
Students: Mostly specialist cancer nurses, including Macmillan nurses; also consultants and trainee medics. N = 30.
Teaching & learning: Mainly consultations with simulated patients
Speciality training
Learning outcome: To ensure trainees have reached the appropriate stage in acquiring the knowledge, skills and attitudes necessary for independent medical practice
Duration: 1 year
Students: Second-year GP trainees. N = 50.
Teaching & learning: Clinical apprenticeship, protected training days
Medical humanities SSM
Learning outcome: Through a reading of Middlemarch by George Eliot, to enhance students' ability to reflect on medical practice and to enter imaginatively into the lives of others
Duration: 90-minute sessions weekly for 10 weeks
Students: Second-year medical students. N = 20.
Teaching & learning: Small-group teaching and discussion
M-level pharmacology module
Learning outcome: To enhance knowledge and understanding of the pharmacotherapies used in diabetes and its complications, and to develop the ability to apply this knowledge in clinical practice
Duration: 200 hours, i.e. 20 CATS points
Students: Mostly DSNs, a few GPs and endocrinology trainees, some from overseas. N = 20.
Teaching & learning: 20 hours directed learning (small-group teaching and discussion) and 180 hours self-directed learning
Procedural skills
Learning outcome: To ensure newly qualified doctors are competent in all the bedside and near-patient procedures listed in Tomorrow's Doctors
Duration: 4 years
Students: Medical students. N = 200.
Teaching & learning: Small-group teaching sessions distributed across three hospitals
Clinical Officers
Learning outcome: To..
Duration: x
Students: y.
Teaching & learning: z.
The ideal assessment instrument
- Totally valid.
- Perfectly reliable.
- Entirely feasible.
- Wholly acceptable.
- Huge educational impact.
Part 2
Feedback
Feedback: definition
"Specific information about the comparison between a trainee's observed performance and a standard, given with the intent to improve the trainee's performance."
Van de Ridder JM et al (2008) Med Educ 42(2): 189
In this section
- Importance of feedback
- How to give feedback: models
- How to improve feedback
Experiential learning
Feedback
- Its value is self-evident: experiential learning cannot take place without it
- It is often infrequent, untimely, unhelpful or incomplete
- It is often not acted upon to improve performance
How to give feedback
- Establish an appropriate interpersonal climate
- Use an appropriate location
- Establish mutually agreed goals
- Elicit the learner's thoughts and feelings
- Reflect on observed behaviours
- Be non-judgemental
- Relate feedback to specific behaviours
- Offer the right amount of feedback
- Offer suggestions for improvement
Hewson MG & Little ML (1998) J Gen Intern Med 13(2): 111
Practical Exercise
Some methods of feedback
- Pendleton's rules
- ALOBA
- SCOPME model
- Chicago model
Pendleton's Rules
- Clarification of matters of fact
- Trainee identifies what went well
- Trainer identifies what went well
- Trainee discusses what they did not do well and how to improve
- Trainer identifies areas for improvement
- Agreement on areas for improvement and formulation of an action plan
Pendleton D et al (1984) The Consultation: an Approach to Learning and Teaching
Difficulties with Pendleton?
- The strict format may inhibit spontaneous discussion.
- Unhelpful polarisation between good points and bad points.
- Opening comments may seem predictable, insincere and merely a prelude to criticism.
Carr (2006) Postgrad Med J 82: 576
ALOBA (1 of 2)
- Start with the learner's agenda.
- Look at the outcomes the learner and the patient are trying to achieve.
- Encourage self-assessment and self-problem-solving first.
- Involve the whole group in problem-solving.
- Use descriptive feedback to encourage a non-judgemental approach.
- Provide balanced feedback.
Kurtz et al (1998) Teaching and Learning Communication Skills in Medicine
ALOBA (2 of 2)
- Make offers and suggestions; generate alternatives.
- Be well-intentioned, valuing and supportive.
- Rehearse suggestions.
- Value the interview as a gift of raw material for the group.
- Opportunistically introduce theory, research evidence and wider discussion.
- Structure and summarise learning to reach a constructive end-point.
Kurtz et al (1998) Teaching and Learning Communication Skills in Medicine
SCOPME
- Listen
- Reflect back
- Support
- Counsel
- Treat information in confidence
- Inform without censuring
Chicago
- Review the aims and objectives of the job at the start.
- Give interim feedback of a positive nature.
- Ask the learner to give a self-assessment of their progress.
- Give feedback on behaviours rather than personality.
- Give specific examples to illustrate your views.
- Suggest specific strategies to the learner to improve performance.
Improving feedback
- Recognise that we all need feedback to learn and improve
- Ask for feedback yourself and model this process for learners
- Inform learners that you expect them to ask for feedback
- Make feedback a routine activity
- Discuss the need for feedback with colleagues
Sargeant J & Mann K in Cantillon P & Wood D (eds) ABC of Learning and Teaching in Medicine, 2nd edn (2010)
Part 3
Evaluation
Evaluation: definition
"A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of educational programmes."
Mohanna K et al (2004) Teaching Made Easy: a Manual for Health Professionals
In this section
- Purposes of evaluation
- Data sources
- Kirkpatrick's hierarchy
The Audit Cycle
Ask question(s) → review the literature → set criteria & standards → design the audit → collect data → analyse data → feed back findings → action plan → re-audit → review standards
Wakley G & Chambers R (2005) Clinical Audit in Primary Care
Why evaluate?
- To ensure teaching is meeting students' needs.
- To identify areas where teaching can be improved.
- To inform the allocation of faculty resources.
- To provide feedback and encouragement to teachers.
- To support applications for promotion by teachers.
- To identify and articulate what is valued by medical schools.
- To facilitate development of the curriculum.
Morrison (2003) Br Med J 326: 385
Evaluation
- Scale: micro to macro
- Formative vs summative
- Internal vs external
- Can you evaluate an assessment?
Evaluation: data sources
- Student ratings
- Peer ratings
- Self-ratings
- Assessment scores
- Expert ratings
- Student interviews
- Exit ratings
- Employer ratings
- Video recordings
- Administrator ratings
- Teacher scholarship
- Teacher awards
- Teaching portfolios
Based on Berk RA (2006) Thirteen Strategies to Measure College Teaching
Teaching Observation
- A method of evaluating teaching
- Different models and purposes
- Three stages: pre-observation, observation, post-observation
- A form (instrument) for recording the information, observation and feedback
Siddiqui ZS, Jonas-Dwyer D & Carr SE (2007) Twelve tips for peer observation of teaching. Medical Teacher 29: 297-300
Teaching Observation: purposes
- Evaluation: authority / summative
- Developmental: expert / formative
- Peer review: collaborative / mutual learning
Evaluation: triangulation
Self
Peers
Students
Practical Exercise
Kirkpatrick's Hierarchy
Levels, from base to apex: Reaction → Learning → Behaviour → Results
Factors that shift as you move up the levels: complexity of behaviour, time elapsed, availability of reliable measures, confounding factors
Hutchinson (1999) Br Med J 318: 1267
What's the point?
Issues with Kirkpatrick
- Is it a hierarchy?
- Omissions: learning objectives?
- Linkages between levels?
- "Kirkpatrick plus"
Tamkin P et al (2002) Kirkpatrick and Beyond, IES
Conclusions
- There is no perfect assessment instrument
- Feedback to students is essential to experiential learning
- The ultimate purpose of evaluation is to improve clinical outcomes