1
OSCE
  • Kathy Boursicot
  • Train the Trainer Assessment Workshop
  • October 29, 2003

Hong Kong International Consortium
2
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

3
What is an OSCE?
  • Objective
  • Structured
  • Clinical
  • Examination

Harden RM and Gleeson FA. Assessment of clinical
competence using an objective structured
clinical examination (OSCE). Medical
Education, 1979, Vol 13: 41-54
4
OSCE test design
5
Varieties of OSCEs
Patient-based
Clinical task
Written task
6
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

7
Simple model of competence
Does
Shows how
Knows how
Knows
8
Testing formats
[Diagram: testing formats mapped to the competence pyramid]
  • Professional practice
  • Behaviour, attitudes, skills → OSCEs
  • Cognition, knowledge → EMQs, SEQs, MCQs
9
OSCE - Objective
  • All the candidates are presented with the same
    test
  • Specific skill modalities are tested at each
    station
  • History taking
  • Explanation
  • Clinical examination
  • Procedures

10
OSCE - Structured
  • The marking scheme for each station is structured
  • Structured interaction between examiner and
    student

11
OSCE Clinical Examination
  • Test of performance of clinical skills
  • Candidates have to demonstrate their skills, not
    just describe the theory

12
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

13
Characteristics of assessment instruments
  • Utility
  • Reliability
  • Validity
  • Educational impact
  • Acceptability
  • Feasibility

Van der Vleuten C. The assessment of
professional competence: developments, research
and practical implications. Advances in Health
Sciences Education, 1996, Vol 1: 41-67
14
Test characteristics
  • Reliability of a test / measure
  • reproducibility of scores across raters,
    questions, cases, occasions
  • capability to differentiate consistently between
    good and poor students

15
Sampling
[Diagram: test items sampled from the domain of interest]
16
Reliability
  • Competencies are highly domain-specific
  • Broad sampling is required to obtain adequate
    reliability
  • across content, i.e., range of cases/situations
  • across other potential factors that cause error
    variance, i.e.,
  • testing time, number of cases, examiners,
    patients, settings, facilities

17
Test characteristics
  • Validity of a test / measure
  • The content is deemed appropriate by the relevant
    experts
  • The test measures the characteristic (e.g.
    knowledge, skills) that it is intended to measure
  • The performance of a particular task predicts
    future performance

18
Test characteristics
  • Validity of a test / measure

19
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

20
Advantages of using OSCEs in clinical assessment
  • Careful specification of content → Validity
  • Observation of a wider sample of activities →
    Reliability
  • Structured interaction between examiner and student
  • Structured marking schedule
  • Each student has to perform the same tasks →
    Acceptability

21
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

22
OSCE Station Writing
23
How to start
  • Decide what tasks you
  • want to
  • can
  • should
  • test in an OSCE format
  • OSCEs test performance, not knowledge

24
Constructive alignment
  • Need to know the learning objectives of your
    course / programme
  • Map these across
  • Subject areas
  • Knowledge areas
  • Skill areas

25
Blueprinting
  • Content of the assessment should align with the
    learning objectives of the course
  • Blueprinting
  • allows mapping of test items to specific learning
    outcomes
  • ensures adequate sampling across subject area and
    skill domains

26
OSCE blueprint systems-based
Columns (skill domains): Hx taking (incl. diag) | Phys exam (incl. diag) | Procedures | Counseling/Education | Ordering investigs
Rows (systems): CVS, Endocrine, Gastro, H&N, Haem/LN, Musculoskeletal, etc.
27
OSCE blueprint discipline-based
Columns (skill domains): Hx taking (incl. diag) | Phys exam (incl. diag) | Procedures | Counseling/Education | Ordering investigs
Rows (disciplines): Anaes & CC, Clin Pharm, Comm Health, Emergency med, Family med, Musculoskeletal, etc.
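A blueprint grid like those above is easy to audit mechanically. The sketch below (hypothetical station names, and a trimmed set of rows and columns) flags blueprint cells that no planned station samples.

```python
# Hypothetical blueprint check: map each planned station to a
# (system, skill) cell and flag cells with no station at all.
systems = ["CVS", "Endocrine", "Gastro", "H&N"]
skills = ["Hx taking", "Phys exam", "Procedures", "Counselling"]

stations = {
    "Chest pain history":       ("CVS", "Hx taking"),
    "Thyroid examination":      ("Endocrine", "Phys exam"),
    "NG tube insertion":        ("Gastro", "Procedures"),
    "Explain diabetes therapy": ("Endocrine", "Counselling"),
}

covered = set(stations.values())
gaps = [(sy, sk) for sy in systems for sk in skills
        if (sy, sk) not in covered]
print(len(gaps), "uncovered blueprint cells")
```

In practice not every cell must be filled in a single exam, but listing the gaps makes under-sampled skill domains visible at a glance.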
28
Key features of success in designing OSCEs
  • Feasibility
  • Congruence

29
Feasibility
  • Is it a reasonable task to expect the candidates
    to perform?
  • Can the task be examined at an OSCE station?
  • Can the task be performed in the time allowed?

30
Feasibility
  • Is it a reasonable task to expect the candidates
    to perform? Is it authentic?
  • Can the task be examined at an OSCE station?
  • Match clinical situations as closely as possible
  • Some tasks may require simulated patients
  • Some tasks may require manikins
  • Some tasks simply cannot be examined in this
    format

31
Feasibility
  • Can task be performed in time allowed?
  • Pilot the stations to see if they are feasible
  • Check equipment /helpers/practicalities

32
Congruence
  • Is it testing what you want it to test?
  • Station construct: describe what the station is
    testing

33
Congruence
  • Ensure that all parts of station coordinate
  • Candidate instructions
  • Marking schedule
  • Examiner instructions
  • Simulated patient instructions
  • Equipment

34
Station construct
  • This station tests the candidate's ability to ...

35
Candidate instructions
  • State the circumstances, e.g. outpatient clinic,
    ward, A&E, GP surgery
  • Specify the task required of the candidate e.g.
    take a history, perform a neurological
    examination of the legs, explain a diagnosis
  • Specify tasks NOT required
  • Instruct on summing up e.g. tell the patient,
    tell the examiner

36
Examiner instructions
  • Copy of candidate instructions
  • Specific instructions appropriate to the task
  • e.g., do not prompt, explicit prompts, managing
    equipment

37
Simulated patient instructions
  • Give as much detail as possible so they can be
    consistent
  • try to leave as little as possible for them to ad
    lib!
  • Give enough information to enable them to answer
    questions consistently
  • Be specific about affect in each role
  • Specify patient demographics
  • i.e., gender, age, ethnicity, social class, etc.

38
Marking schedule
  • Ensure marks are allocated for tasks the
    candidates are asked to perform
  • Decide relative importance of diagnosis vs
    process (history taking, examination)
  • Separate checklist for process skills

39
Equipment
  • Be detailed
  • Think of
  • Chairs, table / couch / bench
  • Manikins - specify
  • Medical equipment
  • Stethoscope, ophthalmoscope, sphyg, suturing
    materials, etc

40
Designing stations
  • Use your blueprint
  • Be clear what you are testing: define the
    construct
  • Check for congruence
  • Pilot for feasibility

41
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

42
Training observers
  • Understand the principles of OSCEs
  • Enhance inter-rater consistency

43
Techniques
  • Examiners must train together
  • Videos
  • live stations
  • Discussion of marking inconsistencies
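One way to make marking inconsistencies concrete during training is to quantify agreement. The sketch below (hypothetical ratings) computes Cohen's kappa, a chance-corrected index of agreement between two examiners' Pass/Borderline/Fail ratings of the same performances.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two examiners' ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of identical ratings
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's category frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical Pass/Borderline/Fail ratings for 10 candidates
a = ["P", "P", "B", "F", "P", "B", "P", "F", "B", "P"]
b = ["P", "P", "B", "F", "B", "B", "P", "F", "P", "P"]
print(round(cohen_kappa(a, b), 2))
```

Low kappa on a video-marking exercise is a useful trigger for the "discussion of marking inconsistencies" above.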

44
Training observers
  • General training
  • Station-specific training

45
OSCE
  • Format
  • Purpose
  • Advantages
  • Writing principles
  • Training observers
  • Scoring considerations

46
Scoring considerations
  • Global vs checklist scoring
  • Weighting
  • Standard setting

47
Checklist scoring
  • Advantages
  • Helps examiner know what the station setters are
    looking for
  • Helps the examiner be objective
  • Facilitates the use of non-expert examiners
  • Disadvantages
  • Can just reward process/thoroughness
  • May not sufficiently reward the excellent
    candidate
  • Ignores the examiner's expertise

48
Global scoring
  • Advantages
  • Utilises the expertise of the examiners
  • They are in a position to make a (global)
    judgement about the performance
  • Disadvantages
  • Examiners have to be expert examiners i.e.
    trained
  • Examiners must be familiar with expected
    standards for the level of the test

49
Weighting
  • In a checklist, some items may be weighted more
    than others
  • More complicated scoring system
  • Makes no difference to very good and very bad
    candidates
  • Can enhance discrimination at the cut score
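Weighting only changes the scoring arithmetic slightly; a minimal sketch, with hypothetical checklist items and weights:

```python
# Hypothetical weighted checklist: each item carries a weight, so
# critical items count for more than routine ones.
# (item, weight, performed-by-candidate)
items = [
    ("Washes hands",              1, True),
    ("Takes focused history",     2, True),
    ("Elicits key physical sign", 3, True),
    ("States correct diagnosis",  3, False),
    ("Explains management",       2, True),
]

score = sum(w for _, w, done in items if done)
maximum = sum(w for _, w, _ in items)
print(f"{score}/{maximum}")
```

Here missing the single high-weight diagnosis item costs as much as missing several routine process items, which is exactly how weighting sharpens discrimination near the cut score.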

50
Standard setting
  • No perfect method!
  • Should be criterion-referenced method
  • e.g. Angoff, Ebel, etc.
  • But
  • are these suitable for performance based tests?

51
Performance-based standard setting methods
  • Borderline group method
  • Contrasting group method
  • Regression based standard method

Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L,
van der Vleuten C. Comparison of a rational and
an empirical standard setting procedure for an
OSCE. Medical Education, 2003, Vol 37, Issue 2:
132
52
Borderline method
[Diagram: examiners complete the station checklist and also give each candidate a global Pass / Borderline / Fail rating; the passing score is derived from the distribution of checklist scores of the candidates rated Borderline, set against the overall test score distribution]
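In the borderline group method the passing score is typically the mean (or median) checklist score of the candidates the examiners rated Borderline. A minimal sketch with hypothetical data:

```python
from statistics import mean

def borderline_group_cut(checklist_scores, global_ratings):
    """Passing score = mean checklist score of the candidates
    examiners globally rated Borderline."""
    borderline = [s for s, g in zip(checklist_scores, global_ratings)
                  if g == "Borderline"]
    return mean(borderline)

# Hypothetical checklist scores and global ratings for 8 candidates
scores  = [22, 14, 17, 25, 15, 19, 12, 16]
ratings = ["Pass", "Borderline", "Pass", "Pass",
           "Borderline", "Pass", "Fail", "Borderline"]
print(borderline_group_cut(scores, ratings))
```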
53
Contrasting groups method
[Diagram: alongside the checklist, examiners make a global Pass / Fail judgement on each candidate; the checklist score distributions of the Pass and Fail groups are contrasted, and the passing score is set where the two distributions intersect]
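In the contrasting groups method the passing score is placed where the Pass and Fail distributions overlap least. One simple way to choose it, sketched here with hypothetical data, is to sweep candidate cut scores and minimise total misclassification.

```python
def contrasting_groups_cut(pass_scores, fail_scores):
    """Pick the cut score that minimises misclassification:
    Fail-group candidates at or above the cut, plus
    Pass-group candidates below it."""
    best_cut, best_errors = None, None
    for cut in range(min(fail_scores), max(pass_scores) + 2):
        errors = (sum(s >= cut for s in fail_scores)
                  + sum(s < cut for s in pass_scores))
        if best_errors is None or errors < best_errors:
            best_cut, best_errors = cut, errors
    return best_cut

# Hypothetical checklist scores grouped by global judgement
pass_scores = [18, 20, 17, 22, 19, 16]
fail_scores = [10, 12, 14, 11, 15]
print(contrasting_groups_cut(pass_scores, fail_scores))
```

In a real exam the two distributions usually overlap, and the exam board may weight the two error types differently (e.g. penalise false passes more heavily).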
54
Regression based standard
[Diagram: each candidate's checklist score is plotted against the examiner's overall rating (1 Clear fail, 2 Borderline, 3 Clear pass, 4 Very good pass, 5 Excellent pass); a regression line is fitted through the points, and the passing score is the predicted checklist score at the Borderline rating]
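The regression-based standard fits checklist scores against the overall global rating and reads the passing score off the fitted line at the Borderline rating (2 on the 1-5 scale). A minimal least-squares sketch with hypothetical data:

```python
def regression_cut(ratings, scores, borderline_rating=2):
    """Least-squares line of checklist score on overall rating;
    the passing score is the predicted score at Borderline."""
    n = len(ratings)
    mx = sum(ratings) / n
    my = sum(scores) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(ratings, scores))
             / sum((x - mx) ** 2 for x in ratings))
    intercept = my - slope * mx
    return intercept + slope * borderline_rating

# Hypothetical data: overall ratings (1-5) and checklist scores
ratings = [1, 2, 2, 3, 3, 4, 5]
scores  = [8, 12, 13, 16, 17, 21, 25]
print(round(regression_cut(ratings, scores), 1))
```

Because every candidate-station observation feeds the regression, the cut score draws on the full sample rather than only the borderline subgroup.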
55
Borderline/contrasting/regression based methods
  • Panel = examiners
  • Reliable cut-off score based on large sample of
    judgments (no. of stations x no. of candidates)
  • Credible based on expert judgment in direct
    observation
  • Passing score not known in advance (as with all
    examinee-centred methods)
  • Judgments not independent of checklist scoring