Title: Assessment of Knowledge and Performance
Slide 1: Assessment of Knowledge and Performance
- John Littlefield, PhD
- University of Texas Health Science Center at San Antonio
Slide 2: Goals
- 1. Clarify 2 distinct uses for assessments of knowledge and performance
- 2. Define 3 aspects of validity for all knowledge and performance assessment methods
- 3. Compare and contrast 3 techniques for assessing clinical knowledge and performance
- 4. Identify poorly written multiple choice test items and write a key features test item
- 5. Describe 3 options for scoring OSCE performance
- 6. Describe 3 elements of a clinical performance assessment system
- 7. Critique a clinical performance assessment system that you use
Slide 3: Agenda
- Exercise: Warm-up for assessing clinical knowledge and performance
- Presentation: Quality assurance when assessing clinical knowledge and performance
- Exercise: Take, then critique, a multiple choice test
- Presentation: Key features test items
- Exercise: Write several key features test items
- Presentation: Widening the lens on SP assessment
- Exercise: Strengths and weaknesses of a clinical performance assessment system that you use
- Presentation: Improving clinical performance assessment systems
- Exercise: Critique your clinical performance assessment system
Slide 4: Recall a student/resident whose clinical performance made you uneasy
- 1. Was the student/resident aware of your concern? Yes / No
- 2. What action did you take?
  - a. Talk with faculty colleagues about your concerns? Yes / No
  - b. Write a candid performance assessment and send it to the clerkship/residency director? Yes / No
- 3. Did any administrative action occur related to your concern? Yes / No
- 4. Do you think the performance assessments in your clerkship/residency files reflect candid faculty performance appraisals? Yes / No
Slide 5: What concerns do you have about clinical knowledge and performance assessment?
- Smart but not professional
- Doesn't have technical skills
- Heterogeneity of evaluator skills (fairness / accuracy)
- How to motivate evaluators
- Options for remediation
- How to validate the exam
- Are oral exams really worth it?
- How many evaluations are needed before making a decision?
Slide 6: Uses for Assessment: Formative vs. Summative
- Purpose: Formative = feedback for learning; Summative = certification/grading
- Breadth of scope: Formative = narrow, focused on specific objectives; Summative = broad, focused on general goals
- Scoring: Formative = explicit feedback; Summative = overall performance
- Learner affective response: Formative = little anxiety; Summative = moderate to high anxiety
- Target audience: Formative = learner; Summative = society
Slide 7: Validity of Knowledge and Performance Assessments
- 1. Content: Does the assessment method measure a representative cross-section of student/resident competencies?
- 2. Reliability of scores: Does the student/resident perform at about the same level across 5 to 7 different patients/case problems? Does the student receive similar ratings from different faculty?
- 3. Latent process: Does the context surrounding the assessment evoke the domain of cognitive processing used by a clinician?
Lissitz RW, Samuelsen K. A suggested change in terminology and emphasis regarding validity and education. Educational Researcher, 36(8), 437-48, 2007.
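The reliability question in aspect 2 can be made concrete with a small computation. Below is a minimal sketch (not from the original deck) that estimates how consistently a group of examinees scores across several case problems using Cronbach's alpha; the 5-examinee-by-6-case score matrix is invented for illustration.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability of scores (rows = examinees, cols = cases)."""
    k = scores.shape[1]                         # number of case problems
    case_vars = scores.var(axis=0, ddof=1)      # variance of scores on each case
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
    return (k / (k - 1)) * (1 - case_vars.sum() / total_var)

# Hypothetical scores (0-1 scale) for 5 examinees on 6 case problems
scores = np.array([
    [0.90, 0.80, 0.85, 0.90, 0.75, 0.80],
    [0.60, 0.50, 0.55, 0.65, 0.50, 0.60],
    [0.70, 0.75, 0.70, 0.80, 0.65, 0.70],
    [0.40, 0.45, 0.50, 0.40, 0.35, 0.45],
    [0.80, 0.85, 0.80, 0.75, 0.90, 0.85],
])
print(f"Reliability across case problems: {cronbach_alpha(scores):.2f}")
```

A low alpha would suggest that more case problems, or more consistent scoring, are needed before making high-stakes decisions.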
Slide 8: Content of a Clinical Performance Assessment
- Which clinical competencies are addressed by the performance assessment?
- How thoroughly will you assess each competency?
- How will you score performance?
Slide 9: Reliability of Physician Performance Scores on Multiple Patients
Slide 10: Latent Process Aspect of Validity: Four Levels of Performance Assessment
- Does (global rating)
- Shows How (OSCE)
- Knows How (oral examination)
- Knows (multiple-choice examination)
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine, 65(9 Suppl), 1990, S63-7.
Slide 11: Compare and Contrast Three Assessment Techniques (Multiple choice exam, OSCE, Global ratings)
- 1. Content
- 2. Reliability: 5 to 7 case problems; agreement among raters
- 3. Latent process: M.C.E. adequate, OSCE good, global ratings excellent
(The original slide presents these criteria as a table with one column per technique.)
Slide 12: Interim Summary of Session
- Session thus far:
  - Two uses of knowledge and performance assessments: formative and summative
  - Validity of all knowledge and performance assessment techniques
  - Compare and contrast 3 assessment techniques
- Coming up:
  - Take and critique a 14-item multiple choice exam
  - Presentation on key features items
Slide 13: How are Multiple Choice Items Selected for an Exam?
Slide 14: Sample Exam Blueprint based on Clinical Problems
Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad. Med. 70(3), 1995.
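The blueprint itself appears as a figure in the original deck. As a stand-in, here is a minimal sketch (all problem names and item counts invented) of a blueprint that maps clinical problems to item counts per decision-making skill and checks that the allocation matches the planned exam length.

```python
# Hypothetical blueprint: clinical problems (rows) x skills assessed (columns).
blueprint = {
    "Third-trimester bleeding": {"diagnosis": 2, "management": 2, "investigation": 1},
    "Chest pain":               {"diagnosis": 2, "management": 2, "investigation": 1},
    "Acute abdominal pain":     {"diagnosis": 2, "management": 1, "investigation": 1},
}

PLANNED_EXAM_LENGTH = 14  # e.g., the 14-item exam used later in this session

total_items = sum(sum(skills.values()) for skills in blueprint.values())
assert total_items == PLANNED_EXAM_LENGTH, (
    f"Blueprint allocates {total_items} items; exam calls for {PLANNED_EXAM_LENGTH}"
)

for problem, skills in blueprint.items():
    print(f"{problem}: {sum(skills.values())} items {skills}")
```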
Slide 15: Key Features of a Clinical Problem [1]
- Definition: the critical steps that must be taken to identify and manage a patient's problem. A key feature:
  - focuses on a step in which examinees are likely to make an error
  - is a difficult aspect of identifying and managing the problem
- Example: For a pregnant woman experiencing third-trimester bleeding with no abdominal pain, the physician should:
  - generate placenta previa as the leading diagnosis
  - avoid performing a pelvic examination (may cause bleeding)
  - avoid discharging from clinic or emergency room
  - order coagulation tests and cross-match
1. Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad. Med. 70(3), 1995.
Slide 16: Test Items based on a Clinical Problem and its Key Features
Slide 17: Scoring the Placenta Previa Clinical Problem
- Key Feature 1: To receive one point, the examinee must list placenta previa or one of the following synonyms: marginal placenta or low placental insertion
- Key Features 2-4: Receive 1/3 point for listing each of the following: 1. avoid performing a pelvic exam, 2. avoid discharging from clinic, 3. order coagulation tests and cross-match
- Total score for problem: Add the scores for items 1 and 2 and divide by 2 (range 0 to 1); a worked example follows below
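As a worked illustration of this scoring rule, the sketch below (not the authors' implementation; the accepted answers are taken from the slide above) awards one point for the leading diagnosis or a synonym, 1/3 point per correct management action, and averages the two item scores.

```python
DIAGNOSIS_SYNONYMS = {"placenta previa", "marginal placenta", "low placental insertion"}
MANAGEMENT_ACTIONS = {
    "avoid pelvic exam",
    "avoid discharge",
    "order coagulation tests and cross-match",
}

def score_placenta_previa(diagnoses: set[str], actions: set[str]) -> float:
    """Score one examinee's answers to the placenta previa problem (range 0-1)."""
    # Item 1: one point if any accepted synonym for the diagnosis is listed.
    item1 = 1.0 if diagnoses & DIAGNOSIS_SYNONYMS else 0.0
    # Item 2 (key features 2-4): 1/3 point for each correct management action.
    item2 = len(actions & MANAGEMENT_ACTIONS) / 3
    # Total: average of the two item scores.
    return (item1 + item2) / 2

# An examinee who names the diagnosis and lists two of three actions scores 5/6.
print(score_placenta_previa({"placenta previa"}, {"avoid pelvic exam", "avoid discharge"}))
```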
Slide 18: Steps to Develop Key Features Problems [1]
- 1. Assemble a problem-writing group
- 2. Select a problem and define its key features
  - Usually chosen from an exam blueprint
  - Think of several real cases of the problem in practice
  - What are the essential steps in resolving this problem (must be done)?
  - Typical decisions or actions: elicit history, interpret symptoms, make diagnosis
  - Define qualifiers such as urgency or decision-making priority
- 3. Select a case scenario
- 4. Select question formats
- 5. Specify the number of required answers
- 6. Prepare a scoring key
- 7. Pilot test the problems
1. Farmer EA, Page G. A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education, 39(12), 1188-94, 2005.
Slide 19: Interim Summary of Session
- Session thus far:
  - Two uses of knowledge and performance assessments: formative and summative
  - Validity of all assessment techniques
  - Compare and contrast three assessment techniques
  - Take and critique a 14-item multiple choice exam
  - Write a key features item
- Coming up:
  - Scoring performance on an SP exam
Slide 20: Schematic Diagram of a 9-Station OSCE
(Diagram: examinees begin at Start, rotate through stations 1 through 9 in sequence, and finish at End.)
Slide 21: OSCE Stations: Standardized Patient or Simulation
- http://www.med.unc.edu/oed/sp/welcome.htm
- http://www.simulution.com/product
Slide 22: Scoring OSCE Performance
- Traditional scoring of SP assessments focuses on numerical data, typically from checklists
- Checklist scoring may not accurately assess the quality of clinical performance of residents and expert clinicians [1]
- Dimensions of the SP exam [2]:
  - basic science knowledge (organize the information)
  - physical exam skills (memory of routines)
  - establishing a human connection
  - role of the student (appear knowledgeable)
  - existential dimension of the human encounter (balance one's own beliefs with the patient's)
- Clinical competence is a mixture of knowledge and feeling, information processing and intuition
1. Hodges B, et al. OSCE checklists do not capture increasing levels of expertise. Acad. Med. 74(10), 1999, 1129-34.
2. Rose M, Wilkerson L. Widening the lens on SP assessment: what the encounter can reveal about development of clinical competence. Acad. Med. 76(8), 2001, 856-59.
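The session goal of describing three options for scoring OSCE performance can be sketched in code. The sketch below is not from the deck; the 0.5 weighting and the 1-5 global rating scale are invented assumptions. It contrasts a checklist percentage, a rescaled global rating, and a weighted blend of the two.

```python
from dataclasses import dataclass

@dataclass
class StationResult:
    checklist_done: int   # checklist items the examinee completed
    checklist_total: int  # checklist items on the station's form
    global_rating: int    # examiner's overall rating, 1 (poor) to 5 (excellent)

def checklist_score(r: StationResult) -> float:
    """Option 1: proportion of checklist items completed."""
    return r.checklist_done / r.checklist_total

def global_score(r: StationResult) -> float:
    """Option 2: global rating rescaled to 0-1."""
    return (r.global_rating - 1) / 4

def combined_score(r: StationResult, weight: float = 0.5) -> float:
    """Option 3: weighted blend of checklist and global rating (weight is invented)."""
    return weight * checklist_score(r) + (1 - weight) * global_score(r)

r = StationResult(checklist_done=14, checklist_total=20, global_rating=4)
print(f"checklist={checklist_score(r):.2f} global={global_score(r):.2f} "
      f"combined={combined_score(r):.2f}")
```

Narrative comments would accompany any of these numeric options; they capture the dimensions of the encounter that checklists miss.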
Slide 23: Interim Summary of Session
- Session thus far:
  - Two uses of knowledge and performance assessments: formative and summative
  - Validity of all assessment techniques
  - Compare and contrast three assessment techniques
  - Take and critique a 14-item multiple choice exam
  - Write a key features test item
  - Use global ratings and narrative comments when scoring OSCE performance
- Coming up:
  - Improving clinical performance assessment systems
Slide 24: Bubble Diagram of a Resident Performance Assessment System
Slide 25: Diagnostic Checklist for a Clinical Performance Assessment System
Slide 26: Three-Year Study to Improve the Quality of Resident Performance Assessment Data
- 1. What median percentage of each resident's rotations returned one or more completed forms?
- 2. How precise were the scores marked on the returned forms?
- 3. What median percentage of each resident's rotations returned one or more forms with behaviorally-specific written comments?
Littlefield, DaRosa, Paukert, et al. Improving resident performance assessment data: numeric precision and narrative specificity. Acad. Med. 80(5), 489-95, 2005.
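To make metrics 1 and 3 concrete, here is a minimal sketch (invented data; not the study's code) that computes, for each resident, the percentage of rotations returning at least one completed form, then takes the median across residents.

```python
from statistics import median

# Hypothetical data: each resident's rotations mapped to completed-form counts.
residents = {
    "resident_a": {"medicine": 2, "surgery": 0, "peds": 1, "ob_gyn": 1},
    "resident_b": {"medicine": 1, "surgery": 1, "peds": 0, "ob_gyn": 0},
    "resident_c": {"medicine": 3, "surgery": 2, "peds": 1, "ob_gyn": 2},
}

def pct_rotations_with_forms(rotations: dict[str, int]) -> float:
    """Percentage of a resident's rotations that returned >= 1 completed form."""
    returned = sum(1 for count in rotations.values() if count >= 1)
    return 100 * returned / len(rotations)

per_resident = [pct_rotations_with_forms(r) for r in residents.values()]
print(f"Median percentage across residents: {median(per_resident):.0f}%")  # 75%
```

Metric 3 would use the same computation with the forms filtered to those containing behaviorally-specific written comments.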
Slide 27: Results of the Study
Slide 28: Making Evaluation Decisions
- Complete knowledge (God)
- Partial knowledge (Us)
Papadakis MA, et al. Disciplinary action by medical boards and prior behavior in medical school. NEJM, 353(25), 2673-82, 2005.
Slide 29: Goals: Assessment of Knowledge and Performance
- 1. Clarify 2 distinct uses for assessments of knowledge and performance
- 2. Define 3 aspects of validity for all knowledge and performance assessment methods
- 3. Compare and contrast 3 techniques for assessing clinical knowledge and performance
- 4. Identify poorly written multiple choice test items and write a key features test item
- 5. Describe 3 options for scoring OSCE performance
- 6. Describe 3 elements of a clinical performance assessment system
- 7. Critique a clinical performance assessment system that you use