Transcript and Presenter's Notes

Title: nMRCGP


1
nMRCGP
  • Dr Richard Jones
  • Associate Director Oxford PGMDE
  • nMRCGP Assessor/Trainer

  • Dr Nicki Williams
  • nMRCGP Assessor / WPBA trainer
2
What does it replace?
  • Some would say
  • MRCGP
  • Multiple Choice Paper MCP
  • Written paper
  • Video/Simulated Surgery
  • Oral examination
  • Actually it replaces the Summative Assessment (SA)
    licensing exam and is now itself the licensing
    exam, a fundamental difference

3
nMRCGP - Composition
  • AKT replaces the MCP
  • CSA replaces the Simulated Surgery
  • WPBA, which is a log of learning with emphasis on
    linking education to learning needs rather than
    just a set of assessments
  • The log of learning (Training Record) is recorded
    on a centrally held e-portfolio, which contains
    embedded tools
  • Case-based discussion (CbD) replaces the Oral
  • Consultation observation tool (COT) replaces the
    Video
  • Multi-source feedback (MSF)
  • Patient satisfaction questionnaire (PSQ)
  • Only the latter two have no current formal
    equivalent in MRCGP

4
The salient point is
  • There is nothing mystical about the new
    assessment.
  • Much of the methodology is similar to what you
    are familiar with.
  • There has essentially been some re-badging and
    re-organisation.

5
Some of the theory
  • Where would we be without Miller's pyramid? No
    assessment presentation would be complete without
    it!
  • Don't intend to buck the trend.

6
Miller's pyramid of clinical competence
  • Does - WPBA
  • Shows how - CSA, WPBA
  • Knows how - AKT, WPBA
  • Knows - AKT
7
So how does everything fit together?
8
PMETB
  • Postgraduate Medical Education and Training Board
  • The RCGP is licensed to deliver the assessment for
    General Practice according to a set of assessment
    principles.
  • Details are on the PMETB website and, along with
    the RCGP curriculum statements, can also be
    accessed via the RCGP website.

9
Groups within the nMRCGP assessment process
  • Applied Knowledge Test
  • Timed multiple choice question paper, written or
    computer based. A good test of knowledge.
  • Clinical Skills Assessment
  • Clinical consulting skills examination, based on
    cases from general practice, with role players as
    patients and experienced MRCGP assessors. This
    assessment is able to provide a pre-determined,
    standardised level of challenge to candidates.
  • Workplace Based Assessment
  • Portfolio based, with formative and summative
    assessment and a variety of testing formats. Able
    to test the doctor in his/her place of work, doing
    what he/she actually does. Some external
    validation included.

10
AKT
  • Pearson VUE centres nationally.
  • 1 day, AM & PM sittings, x3/year, likely
    Oct/Feb/May.
  • Once the question bank is large enough it may be
    available more often, or even all year round.

11
AKT
  • MCP
  • 65% clinical medicine
  • 15% administration
  • 20% research and statistics
  • AKT
  • 80% clinical medicine
  • 10% administration
  • 10% research and statistics

12
CSA
13
Why a Clinical Skills Assessment?
  • A criticism of the current MRCGP is that there is
    no clinical consulting skills component
  • Provides external validation / triangulation with
    the other two testing methods used
  • Using simulated patients is a valid and reliable
    method for testing clinical skills, so long as
    quality assurance of case production, role player
    and assessor training is carried out
  • Able to offer a standardised, pre-determined
    level of challenge to candidates and to vary this
    level of challenge as needed by the assessment
    requirements

14
Running a Clinical Skills Assessment
  • Tries to replicate the assessment in a fair and
    standardised way for 3,000 - 4,000 candidates per
    year
  • Currently unable to set up a reliable bank of real
    patients with stable physical signs for a CSA of
    this size; this limits the type of cases we can
    offer, as role players don't generally have signs

15
Definition of the purpose of the CSA
  • An assessment of a doctor's ability to integrate
    and apply appropriate clinical, professional,
    communication and practical skills in general
    practice
  • An integrative skills assessment: it tests a
    doctor's ability to gather information and apply a
    learned understanding of disease processes and
    person-centred care appropriately in a
    standardised context, making evidence-based
    decisions and communicating effectively with
    patients and colleagues.

16
CSA Blueprint derived from the Curriculum
17
Case Selection Blueprint
18
How does the CSA differ from the Simulated
Surgery?
  • NOT just a test of communication skills in a
    clinical setting
  • Based on the nMRCGP blueprint, and samples across
    this blueprint.
  • It will be taken by many more candidates (3,000 -
    4,000 per year versus 300 - 400 per year)
  • Looks at integrative clinical skills in primary
    care settings

19
How does the CSA differ from the Simulated
Surgery? continued
  • Includes assessment of clinical and practical
    skills
  • But much of the experience gained from designing
    and running the Simulated Surgery has been
    invaluable in the development of the CSA.

20
What is the CSA likely to look like?
  • Candidate stays in the surgery; patient and
    examiner move around the circuit
  • Will use multiple circuits (3)
  • Will take place for a number of weeks, several
    times a year, probably Oct, Feb & May, starting
    Oct 2007!
  • Temporary assessment centre to be used initially,
    based in Croydon
  • Dedicated assessment centre within new College
    build planned within next 3-5 years

21
What is the CSA likely to look like? continued
  • Will consist mostly of simulated patient cases.
  • 13 stations, probably each of 10 minutes
  • Marks collected by Opscan techniques
  • Some triangulation with Workplace Based
    Assessment competencies
  • Stations picked from intended learning outcomes
    across the nMRCGP blueprint with clear derivation

22
Three domains for each case
23
The Marking Schedule
  • Each case is marked in these 3 domains:
  • Data gathering, examination and clinical
    assessment skills
  • Clinical management skills
  • Interpersonal skills
  • All domains have equal weighting
  • 4 grades awardable:
  • Clear pass
  • Marginal pass
  • Marginal fail
  • Clear fail

24
The Marking Schedule continued
  • Assessor uses word pictures to help decide grade
    for each domain, then uses this information to
    make a judgement on the grade for the case
    overall (4 decisions)
  • Feedback to candidates
  • Serious concerns box

25
CSA Resource
  • CSA DVD available from the Wessex Faculty RCGP:
    contact Carol White on (01264) 355013 (personal)
    or 355005 (office), or cwhite@rcgp.org.uk
  • Produced by Mark Coombe and Mei Ling Denney
    (assessor trainers) with help from other CSA
    assessors

26
How the CSA is aiming to meet PMETB assessment
criteria
  • 1. This methodology is judged to be the best way
    to test Clinical Skills in general practice
    currently.
  • 2. Cases are based on the nMRCGP curriculum.
  • 3. The assessment methodology chosen is fit for
    purpose: validated and reliable, both elsewhere
    and in our main pilot.
  • 4. Standard setting will be transparent and in
    the public domain with wide consultation.
  • 5. Feedback will be given to all candidates.

27
How the CSA is aiming to meet PMETB assessment
criteria continued
  • PMETB ASSESSMENT CRITERIA
  • 6. Recruitment of assessors will be based on
    ability to rank order, mark reliably, and
    knowledge.
  • 7. Lay input has been consistently sought.
  • 8. Documentation will be accessible nationally
    through the College website and publication in
    peer reviewed journals and GP rag mags.
  • 9. Resources? Continually under review

28
WPBA
29
WPBA definition
  • The evaluation of a doctor's progress in their
    performance over time, in those areas of
    professional practice best tested in the
    workplace.
  • (Replaces, and is significantly better than, the
    current Trainer's Report of SA)

30
Theoretical base
  • Finally (!) offers the opportunity to re-couple
    teaching, learning and assessment.
  • It is authentic: the assessment gets as close as
    possible to the real situations in which doctors
    work.
  • Assessment of performance in the workplace
    provides us with the only route into many aspects
    of professionalism (competencies).

31
So what does it all mean?
32
WPBA will
  • Provide feedback on areas of strength and
    development needs
  • Identify trainees in difficulty
  • Drive learning in important areas of competency
  • Determine fitness to progress onto the next stage
    of the trainee's career

33
WPBA: what does it look like?
  • Each GPStR owns an e-portfolio, covering 3 years
    of speciality training.
  • A key component of the e-portfolio is the
    Training Record (a log of learning)
  • It also contains other sections, e.g.
  • Skills log (DOPS)
  • Record of achievement in CSA, AKT etc.

34
The Training Record: functions
  • Coverage of the RCGP curriculum (non-assessed
    items like tutorial records)
  • Multiple sampling from multiple perspectives
  • Progression across the twelve competency areas,
    recorded at 6-monthly evidenced staging reviews

35
Training Record: 6-monthly reviews
  • Review all the information gathered, tagged into
    competency areas
  • Judge progress against competency areas
  • Provide developmental feedback
  • (linking learning to assessment to teaching)

36
Tools for evidence
  • Naturally occurring evidence
  • Usual tool box
  • Specified (embedded/complementary) tools
  • case based discussion (CbD)
  • consultation observation (COT)/mini-CEX
  • multi-source feedback (MSF)
  • patient satisfaction questionnaire (PSQ)
  • NB: these are not pass/fail assessments; they
    gather evidence

37
Complementary tools: CbD
  • Structured form of problem-case analysis
  • Start with records
  • Find an area of uncertainty
  • Follow decision-making through in depth
  • Categorise collected evidence into competencies
  • Grade
  • Feedback to GPStR; agree learning plan

38
Complementary tools: COT
  • Method of reviewing a consultation, live or on
    video/DVD
  • Performance criteria almost identical to the
    current MRCGP, so nothing new!

39
Complementary tools: MSF (360 degree)
  • At 30 and 34 months
  • Assessment of clinical ability and professional
    behaviour
  • 5 Clinical and 5 non-clinical raters in primary
    care
  • 10 questions
  • Web based. Processed centrally
  • Results to the e-portfolio

40
Complementary tools: PSQ
  • At end of training (only assessed in the primary
    care setting)
  • Measures consultation and relational empathy
    (CARE)
  • 30 consecutive consultations in GP setting
  • Needs skill of trainer in giving feedback
  • Centrally processed
  • Results to e-portfolio

41
Where to find the evidence
42
Staged Reviews
  • Specialty Training Year 1
  • Prior to 6 month (6m) review: 3 x COT or
    mini-CEX, 3 x CbD, 1 x MSF (5 clinicians only),
    DOPS if in secondary care, clinical supervisor's
    reports if in secondary care
  • Prior to 12m review: 3 x COT or mini-CEX, 3 x
    CbD, 1 x MSF (5 clinicians only), 1 x PSQ if in
    primary care, DOPS if in secondary care, clinical
    supervisor's reports if in secondary care
  • Specialty Training Year 2
  • Prior to 18m review: 3 x COT or mini-CEX, 3 x
    CbD, PSQ if not completed in ST1, DOPS if in
    secondary care, clinical supervisor's reports if
    in secondary care
  • Prior to 24m review (primary care): 3 x COT, 3 x
    CbD, PSQ if not completed in ST1
  • Specialty Training Year 3 (primary care)
  • Prior to 30m review: 6 x CbD, 6 x COT, 1 x MSF
  • Prior to 34m review: 6 x CbD, 6 x COT, 1 x MSF,
    1 x PSQ
  • Notes
  • 1. Throughout the training, mini-CEX and COT
    assessments will be used interchangeably, the
    former being adopted in the secondary care
    setting, the latter in primary care.
  • 2. DOPS assessments will only need to be carried
    out until the mandatory practical skills have
    been assessed as satisfactory.
  • 3. Patient satisfaction will only be assessed in
    the primary care setting.
  • 4. Multi-source feedback will involve clinical
    raters only when in secondary care, and both
    clinical and non-clinical raters when in primary
    care.

43
Staged reviews: how many of which tool?
  • Specialty Training Year 3 (primary care)
  • Prior to 30m review: 6 x CbD, 6 x COT, 1 x MSF
  • Prior to 34m review: 6 x CbD, 6 x COT, 1 x MSF,
    1 x PSQ
  • NB: DOPS assessments will only need to be carried
    out until the mandatory practical skills have
    been assessed as satisfactory.

44
nMRCGP
  • Any questions?

45
(No Transcript)
46
CbD Exercise
  • Read through the competency areas
  • Watch the DVD
  • Jot down evidence in competency areas
  • In each competency area, do you have enough
    evidence to make a judgement?
  • Needs further development? Competent? Excellent?

47
CbD Exercise
  • Small groups
  • Identify a decision that the GPStR made
  • Try to formulate questions that might test that
    decision in depth
  • Use of the word "why?" gets straight into
    decision-making and its justification
  • What competency areas are also covered, despite
    the depth of that discussion? Please identify
    them

48
CbD Exercise
  • Small groups
  • Consider a competency area that was either:
  • Not considered
  • Insufficient evidence
  • Consider what questions you might have asked to
    gather that evidence

49
CSA Exercise
  • Remind yourselves of the domains for marking
  • Watch the DVD
  • Score each domain separately without conferring
  • Make a global judgement based on independent and
    safe practice
  • Be prepared to defend it!

50
CSA
  • How can we best prepare our GPStRs for this new
    assessment?

51
Grade descriptors
  • CP (Clear pass): The candidate demonstrates a high
    level of competence, with a justifiable clinical
    approach that is fluent, appropriately focussed
    and technically proficient. The candidate shows
    sensitivity, actively shares and may empower the
    patient.

52
Grade descriptors
  • MP (Marginal pass): The candidate demonstrates an
    adequate level of competence, displaying a
    clinical approach that may not be fluent but is
    justifiable, coherent and technically proficient.
    The candidate shows sensitivity and actively
    engages the participation of the patient.

53
Grade descriptors
  • MF (Marginal fail): The candidate fails to
    demonstrate adequate competence, with a clinical
    approach that may be hesitant, unsystematic and at
    times inconsistent with accepted practice.
    Technical proficiency may be of concern. The
    patient is treated with sensitivity and respect,
    but the doctor may not sufficiently facilitate or
    respond to the patient's contribution.

54
Grade descriptors
  • CF (Clear fail): The candidate clearly fails to
    demonstrate competence, with clinical management
    that is incompatible with accepted practice and a
    problem-solving approach that may be arbitrary and
    technically incompetent. The patient may not be
    treated with adequate attention, sensitivity or
    respect for their contribution.

55
Grade descriptors
  • Excellent: Using a patient-centred clinical method
    that may empower and motivate the patient, the
    candidate's clinical approach and interpersonal
    skills show fluency, sophistication and
    time-efficiency. Performance is not necessarily
    perfect, but is as good as could be achieved under
    exam conditions.
  • Serious concerns: The candidate's performance
    demonstrates serious deficiencies in thinking and
    behaviour that may place patients at risk of harm
    from decisions and actions that the doctor takes,
    or fails to take.