Title: nMRCGP in a nutshell
1. nMRCGP in a nutshell
- Ramesh Mehay
- Programme Director (Bradford VTS)
Originally written 2007, updated Jan 2009
2. Aims and objectives
- Aims
- Increase our understanding of nMRCGP
- Help us feel more prepared for the assessments
- And therefore feel better!
- Objectives
- Provide an overview of nMRCGP
- Share understanding
- Share concerns (and address them?)
- Practise COT
3. Session plan
- Overview of the MRCGP and its components
- Share fears and concerns
- Practise some COTs in groups
- Possibly some modelling
- (IS2 practise some CBDs)
4. Background to nMRCGP
- nMRCGP replaces both the old MRCGP and Summative Assessment (SA)
- Based on the new GP curriculum
- The new curriculum was developed by reviewing the literature, very extensive consultation with doctors and patients, etc.
- All components of nMRCGP are mapped to the competencies in the curriculum
- GP training is now overseen by PMETB, like all other medical specialties (the JCPTGP is dead)
5. A programme of assessment
6. Components
- AKT (Applied Knowledge Test)
- machine-marked test, 3x/year, at various venues
- CSA (Clinical Skills Assessment)
- OSCE-type exam, 3x/year, Croydon
- WPBA (Workplace Based Assessment)
- recorded in an e-portfolio held by the GP trainee throughout the 3 years
7. Clinical Skills Assessment (CSA)
- Integrative assessment with 3 domains:
- Data gathering, technical and assessment skills
- Clinical management
- Interpersonal skills
- 13 stations, 10 mins each, balanced selection of cases
- clear pass, marginal pass, marginal fail, clear fail, serious concerns
- significant failure rate
- take it early enough to have time to retake
8. Workplace Based Assessment (WPBA)
- Workplace assessment: the assessment of actual working practices, undertaken in the working environment
9. Overview of WPBA
- What the trainee actually does
- Competencies demonstrated when ready
- Assessment of developmental progression - guides decisions about future learning
- Recorded in an electronic portfolio
- The process is learner led - the trainee has to ensure their e-portfolio covers the e-curriculum
10. WPBA compulsory components
- Case Based Discussion (CBD)
- Consultation Observation Tool (COT) or Mini-Clinical Evaluation Exercise (Mini-CEX)
- Multi-Source Feedback (MSF)
- Patient Satisfaction Questionnaire (PSQ)
- Direct Observation of Procedural Skills (DOPS)
11. WPBA local subunits
- OOH work booklet
- Clinical Supervisor's Report (CSR)
- Naturally Occurring Evidence (NOE)
- Significant Event Review (SER)
- Referrals analysis
- Audit
- (Case Review, Personal Learning, Complaints)
12. Who makes judgements?
- The Trainer/Clinical Supervisor, as (s)he does the assessments
- The Educational Supervisor, as (s)he reviews the whole thing with the trainee
- ARCP panels, who review the whole thing when a trainee is moving up an ST grade
13. Case Based Discussion (CBD)
- A structured interview designed to explore professional judgement in clinical cases
- Professional judgement: the ability to make holistic, balanced and justifiable decisions in situations of complexity and uncertainty
- Attributes tested:
- Application of medical knowledge
- Application of ethical frameworks
- Ability to prioritise, consider implications and justify decisions
- Recognising complexity and uncertainty
14. CBD Competency areas
- CBD looks at 10 of the 12 competencies
- Practising holistically
- Data gathering and interpretation
- Making decisions/diagnoses
- Clinical management
- Managing medical complexity
- Primary Care Administration (IMT)
- Working with colleagues
- Community orientation
- Maintaining an ethical approach
- Fitness to practise
- (Not assessed by CBD: communication skills and maintaining performance/learning/teaching)
15. CBD - the process
- The trainee selects 3 cases and gives the material to the trainer 1 week in advance
- Need a balance of cases and contexts
- The trainer selects 2, and plans structured questions in advance
- A 1-hour session covers 2 cases
- 20 mins per case, 10 mins feedback
- The trainer records evidence and judges the level of performance
- (insufficient evidence / needs development / competent / excellent)
- Need to do a MINIMUM of 6 per post
- All 6 before the ES meeting! (really, within 4 months)
16. Key Points on CBD
- It is a STRUCTURED oral interview
- On what the trainee actually did
- And why they did that
- And if they considered anything else at the time
- So don't ask 'what if' questions like you do in Random Case Analysis
- Stick to the here and now of the case
- Use the question maker framework on www.bradfordvts.co.uk (click nMRCGP, then click CBD)
17. CBD - What's the Experience So Far?
- Trainees
- Initially anxious, but less stressful than the current SA
- Valued the feedback
- Found it realistic
- Some concern re relationship with trainer
- Trainers
- Time consuming, need extra protected time
- Helpful structure
- May be more helpful for difficult trainees
- Concern re relationship with trainees
18. Consultation Observation Tool (COT)
- A single consultation per session
- Trainee and trainer view it together
- The trainer assesses the consultation on a 4-point rating scale (similar to the old MRCGP/SA)
- No rule about consultation length
- Ideally, at least one consultation is assessed by someone other than the trainer
- Ideally, a wide range of contexts is covered, including at least one child, one older person and one mental health problem
19. Why Workplace Based Assessment?
- What was wrong with the old MRCGP or Summative Assessment?
22. Miller's Pyramid or Prism of Clinical Competence
23. What is Authentic Performance?
- "Testing should be as close as possible to the situation in which one attacks the problem."
- "Ill-structured problems are not found in simulated and/or standardized tests."
- "The variation inherent in professional practice will always elude capture by a set of rules."
- Wiggins, Assessing Student Performance: Exploring the Purpose and Limits of Testing, Jossey-Bass, Inc., 1993
24. Relationship between tools and competency areas
25. Good Assessment Instruments have:
- Reliability (R)
- Validity (V)
- Educational impact (E)
- Acceptability (A)
- Cost (C)
- (Mnemonic: CARVE)
- Van der Vleuten, "The assessment of professional competence: developments, research and practical implications", Adv Health Sci Educ 1 (1996)
26. Why WPBA?
- High validity - authenticity
- High educational impact
- Reliability depends on how many you do; there is also some built-in triangulation
- Reconnects assessment with learning and the workplace
- Assessment over the entire training envelope
- Cost effective - and now accepted!
27. And it gives continuous feedback
- "a process of monitoring students' progress through an area of learning so that decisions can be made about the best way to facilitate future learning"
28. The Problem With WPBA
- Inter-observer variation
- Intra-observer variation
- Case specificity
29. Requirements of a high-stakes performance assessment
- Specification
- Calibration
- Moderation
- Training
- Verification and audit
- (Baker, O'Neil and Linn, 1991)
30. Rough Guide to the Rating Scale
- Excellent: smooth and efficient. Able to use knowledge, judgment and skills to adjust management appropriately to the specific patient and operative procedure.
- Competent: lacks smoothness and efficiency, but is able to use knowledge, judgment and skills to adjust management appropriately to the specific patient and operative procedure.
- NEEDS FURTHER DEVELOPMENT:
- Beginner: lacks smoothness and efficiency. Able to manage the case but exhibits limited use of personal judgment and responsiveness to the specifics of the patient and operative procedure. Requires some limited coaching or attending intervention.
- Novice: can only manage the case with extensive coaching and attending intervention.