Title: Simulation For Emergency Medicine
1. Simulation For Emergency Medicine
- CORD Academic Assembly, March 4th, 2006
2. Presenters
- Steve McLaughlin, MD, EM Program Director and Medical Director, BATCAVE Simulation Center, University of New Mexico
- Mary Jo Wagner, MD, EM Program Director, Synergy Medical Education Alliance, Michigan State University
3. Objectives
- Describe the current state of education and research in simulation.
- List the various simulators, mannequins and models available for emergency medicine training.
- Discuss the strengths and weaknesses of each simulation modality.
- List some of the best practice examples for using simulation in EM residencies.
4. Outline
- Introduction
- Spectrum of Simulation Equipment
- Best Practice Examples
- Hands-on Practice
5. Introduction
- Simulation is the act of mimicking a real object, event or process.
- Simulation is a person, device or set of conditions which presents evaluation problems authentically. The student responds to the problems as they would under natural circumstances.
6. Introduction
- Characteristics:
- Cues and consequences are like reality
- Situations can be complex
- Fidelity (exactness of duplication) is not perfect
- Feedback to users' questions, decisions, and actions
7. Introduction
- History:
- 1928: Edwin Link develops the first flight simulator, the Link Trainer
- 1960: Laerdal introduces the first Resusci-Annie
- 1968: Harvey cardiology simulator
- 1970: First power plant simulators
- 1973: First computer-aided modeling of physiology
- 1975: Standardized patients and OSCEs introduced
- 1988: First full-body, computerized mannequin at Stanford
- 1989: Anesthesia Crisis Resource Management (ACRM); focus on patient safety and the education movement at this time
- 1990: Term "virtual reality" introduced; screen-based simulators introduced
8. Link Trainer
9. Harvey
10. Introduction
- History (continued):
- 1991-1993: a total of 30 articles on high-fidelity simulation
- 1993: First national/international simulation meetings (MMVR)
- 1994: Boston Center for Medical Simulation
- 1997: MIST VR task trainer
- 1998: AAMC MSOP
- Late 1990s: Introduction of simulation into specialties like EM
- 1999: US IOM To Err Is Human report
- 2000-2001: Current generation of full-body mannequins introduced by METI and Laerdal
- 2000-2003: a total of 385 articles on high-fidelity simulation
- 2005: Society for Medical Simulation
- 2006: Simulation in Healthcare journal launched
11. Introduction
- Why is this a valuable tool? Or is it?
- Learners can learn without risk to patients.
- Learning can be focused without regard to patient care needs, safety, etc.
- Opportunity to repeat a lesson or skill to mastery.
- Specific learning opportunities are guaranteed.
- Learning can be done at convenient times.
- Performance can be observed and recorded.
12. Introduction
- Why is simulation important in medical education?
- Problems with clinical teaching
- New technologies for diagnosis/treatment
- Assessing professional competence
- Medical errors and patient safety
- Deliberate practice
13. Introduction
- CORD Consensus Conference:
- Simulation is a useful tool for assessing competence, especially patient care, interpersonal (IP) skills and systems-based practice (SBP).
- There is a lack of evidence to support the use of simulation for high-stakes assessment.
- Definitions of competence and tools to evaluate performance must be developed and tested.
- Scenarios and evaluation tools should be standardized.
14. Introduction
- ACGME Toolbox of Assessment Methods
- Simulation is rated the best or second-best tool for assessing:
- Medical procedures
- Ability to develop and carry out patient management plans
- Investigative/analytical thinking
- Knowledge/application of basic sciences
- Ethically sound practice
15. Introduction
- LCME Requirements
- Allows simulated patients to count for student exposure to particular cases.
- RRC Requirements
- Allows simulated procedures to count for program/individual totals.
- Very helpful for rare procedures.
16. Introduction
- Simulation is one tool (new, expensive and exciting) in our educational repertoire.
- (Similar to lecture, case discussion, skill lab, MCQ, SP, etc.)
17. Outline
- Introduction
- Spectrum of Simulation Equipment
- Best Practice Examples
- Hands-on Practice
18. Available Simulation Equipment
- Standardized Patients
- Improvised Technology
- Screen Based Simulation
- Task Trainers
- Low/Mid/High Fidelity Mannequins
- Virtual Reality
19. Evaluating Simulators
- Usability
- Validity
- Face, Content, Construct, Concurrent, Predictive
- Transfer
- Efficiency
- Cost
- Evidence
20. Standardized Patients
- Individuals trained to portray a specific illness or behavior in a realistic and consistent manner for the purposes of teaching or assessment.
- Used in the classroom setting, or unannounced in the clinical setting.
- Especially useful to teach and assess communication and professionalism competencies in a standardized method.
21. Standardized Patients
- Initially started in the 1980s
- Now: Association of Standardized Patient Educators
- http://www.aspeducators.org/sp_info.htm
- Required clinical skills testing for all students: USMLE Step 2 CS exam
- (Image: University of South Florida standardized patient)
22. Standardized Patients
- Strengths:
- Can consistently reproduce a clinical scenario for standardized testing of learners
- Ability to assess rare conditions not otherwise reliably seen
- Patients trained to provide objective, accurate feedback
- Can be used in real settings (arrive at the office/ED as a real patient for a realistic environment)
23. Standardized Patients
- Weaknesses:
- Little research on effectiveness
- Most studies are from preclinical medical school education
- Few studies done with residents or practitioners, and nearly all have small numbers (15-50)
- Cost: paying for the time to train standardized patients
- Quality of experience heavily dependent upon the training of the patient and the scenarios developed
24. Standardized Patients
Audience Comments
25. Improvised Technology
- Models made of easily available items
- Closely mimic human tissue
- Allow for a near replica of actual procedural steps
- Generally used for instruction of procedures
- Commonly used examples:
- Slab of ribs to teach insertion of chest tubes
- Pigs' feet or head for suturing practice
- Other examples in the literature:
- Jello for a vascular model
- Lasagna for split-skin graft harvesting
26. Animal Models
27. Improvised Technology: Educational Theory
- Cognitive process for learning a procedure:
- Understanding of indications, contraindications and complications
- Knowledge of equipment used for the procedure
- Step-by-step knowledge of the technical procedure
- Identifying anatomical landmarks and tissue clues
- E.g., the "pop" when entering the dura or peritoneal cavity
28. Improvised Technology: Educational Theory
- Improvised technology is useful to teach:
- Knowledge of equipment
- Step-by-step knowledge of the procedure
- Some tissue clues
- Less useful for:
- Anatomical landmarks
- The greatest predictor of procedural competency was the ability to sequentially order procedural steps.
- Chapman DM, et al. Open thoracotomy procedural competency. Ann Emerg Med 1996; 28: 641.
29. Improvised Technology
- Strengths:
- Cheap!!!
- Easily made available at all sites
- Easy to duplicate for repetitive use or numerous users
- Minimal instructor education needed
- Ability to create models otherwise not available
- E.g., resuscitative thoracotomy
30. Improvised Technology
- Weaknesses:
- Almost no research on effectiveness
- Less real-life experience; the stress factor is removed
- Often does not duplicate the most difficult aspect of a procedure (e.g., the obese patient)
- Static devices, therefore useful only for specific procedures, not actively changing clinical scenarios
31. Examples: Vascular model
- A: Sock (skin); B: Film canister (for support); C: Foam curler (connective tissue); D: Straw (vessel)
32. Examples: DPL model
- A: Fine fabric (peritoneum); B: Foam (connective tissue); C: Shower curtain (skin); D: PVC pipe (intestines); E: Umbilicus marking
33. Examples: Lumbar puncture model
- A: Box (spinous process); B: Film canister (lateral masses); b: Lid of film canister; C: Foam curler (connective tissue); D: Dural "pop" from packing bubbles; Not seen: pillow (muscular layer)
34. Examples: Thoracotomy model
- A: Shower curtain (skin); B: Foam (connective tissue); C: Laundry basket (rib cage); D: Clips; E: Packing air bag (lungs); F: Ice cube tray (spine); G: Plastic bag (pericardium), with tape (phrenic nerve); H: Covered football (heart), with hole; I: Tubing (esophagus), with NG in place; J: Tubing (aorta)
35. Examples: Thoracotomy model
36. Improvised Technology
Audience Comments
37. Screen-Based Simulation
- Desktop computer
- Strengths: low cost, distance learning, variety of cases, improving realism, self-guided
- Weaknesses: procedural skills, teamwork skills
38. Screen-Based Simulation
- Laerdal MicroSim
- www.Anesoft.com
- ACLS
- Critical Care
- Anesthesia
- Sedation
- Neonatal
39. Screen-Based Simulation
Audience Comments
40. Task Trainers
- Devices designed to simulate a specific task or procedure.
- Examples:
- Lap simulator
- Bronch simulator
- TraumaMan
- Artificial knee
41. Task Trainers
42. Task Trainers
- Strengths:
- High fidelity, good research on efficacy, may have self-guided teaching, metrics available
- Weaknesses:
- Poor haptics on most machines, expensive, focus on a single task, not integrated into complete patient care
43. Task Trainers
Audience Comments
44. Low Fidelity Mannequins
- Features:
- Static airways
- +/- rhythm generation
- No/minimal programmed responses
- Strengths: low cost, reliable, easy to use, portable
- Weaknesses: limited features, less interactive, instructor required
45. Low Fidelity Mannequins
46. Mid Fidelity Mannequins
- Relatively new class of mannequins, often used for ACLS training.
- Features:
- Active airways: ETT, LMA, Combitube
- Breathing, pulses, rhythms
- Basic procedures: pacing, defibrillation
- Some automated responses and programmed scenarios
47. Mid Fidelity Mannequins
- Strengths:
- Active airways, somewhat interactive, moderate cost, moderate portability
- Weaknesses:
- Semiskilled instructor required, limited advanced procedures (lines, chest tubes)
48. High Fidelity Mannequins
- Mannequin with electrical and pneumatic functions driven by a computer.
- Adult, child and newborn models
- Features:
- Dynamic airways, reactive pupils
- Heart sounds, lung sounds, chest movement
- Pulses, rhythms, vital signs
- Abdominal sounds, voice
- CO2 exhalation, cardiac output, invasive pressures
- Bleeding, salivation, lacrimation
49. High Fidelity Mannequins
- Procedures
- O2, BVM, Oral/nasal airway, ETT, LMA, Cric
- Pericardiocentesis, PIV
- Defibrillation, Pacing, CPR
- Needle or open thoracentesis
- TOF, Internal gas analysis
- Foley placement
- Reacts to medications
50. Features
51. Laerdal vs. METI
- Laerdal:
- Instructor-programmed physiology changes
- Windows
- Terrific airway
- Reliability
- Ease of use
- Cost: $35-45K
- METI:
- Physiology modeled to respond to interventions
- Macintosh
- Drug recognition
- Gas analyzer
- Two cost levels:
- ECS: $45K
- HPS: >$150K
55. High Fidelity Mannequins
- Strengths:
- Many dynamic responses, preprogrammed scenarios, widest variety of procedures, most immersive.
- Weaknesses:
- Cost, procedures are not very realistic, reliability, lack of portability, significant instructor training required.
56. Mannequins
Audience Comments
57. Virtual Reality
- Advanced form of human-computer interaction
- Allows humans to work in the computer's world
- Environment understandable to us
- Four necessary components:
- Software
- Hardware
- Input devices
- Output devices
58. Input and Output Devices
59. Virtual Reality
- Types of VR applicable to medicine
- Immersive VR
- Desktop VR
- Pseudo-VR
- Augmented reality
60. Immersive VR
61. Desktop VR
62. Pseudo-VR
63. Augmented Reality
64. Virtual Reality
Audience Comments
65. Outline
- Introduction
- Spectrum of Simulation Equipment
- Best Practice Examples
- Hands-on Practice
66. Research
- Rapidly expanding body of literature since 2000.
- First issue of Simulation in Healthcare: January 2006.
- Many articles at the "look at what we did" level, with data that says everyone thought it was nifty.
- Focus on best practices in teaching/learning and assessment using simulation.
67. Best Teaching Practices
- Screen-based teaching with feedback is better than self-study.
- Schwid, H. A., G. A. Rooke, et al. (2001). "Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator." Teaching and Learning in Medicine 13(2): 92-6.
- We measured the effectiveness of screen-based simulator training with debriefing on the response to simulated anesthetic critical incidents.
- The intervention group handled 10 anesthetic emergencies using the screen-based anesthesia simulator program and received written feedback on their management, whereas the traditional (control) group was asked to study a handout covering the same 10 emergencies.
- All residents were then evaluated on their management of 4 standardized scenarios in a mannequin-based simulator using a quantitative scoring system.
- Residents who managed anesthetic problems using a screen-based anesthesia simulator handled the emergencies in a mannequin-based anesthesia simulator better than residents who were asked to study a handout covering the same problems.
68. Best Teaching Practices
- Comparing simulation to other teaching modalities demonstrates some slight advantages.
- Lee, S. K., M. Pardo, et al. "Trauma assessment training with a patient simulator: a prospective, randomized study." Journal of Trauma: Injury, Infection, and Critical Care 55(4): 651-7, 2003 Oct.
- Interns (n = 60) attended a basic trauma course, and were then randomized to trauma assessment practice sessions with either the patient simulator (n = 30) or a moulage patient (n = 30). After practice sessions, interns were randomized a second time to an individual trauma assessment test on either the simulator or the moulage patient.
- Within randomized groups, mean trauma assessment test scores for all simulator-trained interns were higher when compared with all moulage-trained interns.
- Use of a patient simulator to introduce trauma assessment training is feasible and compares favorably to training in a moulage setting.
69. Best Teaching Practices
- Simulation can be an effective replacement for live practice for some skills.
- Hall, R. E., J. R. Plant, et al. (2005). "Human Patient Simulation Is Effective for Teaching Paramedic Students Endotracheal Intubation." Acad Emerg Med 12(9): 850-855.
- Paramedic students (n = 36) with no prior ETI training received identical didactic and mannequin teaching. After randomization, students were trained for ten hours on a patient simulator (SIM) or with 15 intubations on human subjects in the OR. All students then underwent a formalized test of 15 intubations in the OR.
- When tested in the OR, paramedic students who were trained in ETI on a simulator were as effective as students who trained on human subjects.
70. Best Teaching Practices
- Learner-centered teaching with simulation.
- Gordon, J. A. and J. Pawlowski (2002). "Education on-demand: the development of a simulator-based medical education service." Academic Medicine 77(7): 751-2.
- Using the simulator, we wanted to create a medical education service, like any other clinical teaching service, but designed exclusively to help students fill in the gaps in their own education, on demand. We hoped to mitigate the inherent variability of standard clinical teaching, and to augment areas of deficiency.
- Upon arriving at the skills lab for their appointments, students would proceed to interview, evaluate, and treat the mannequin-simulator as if it were a real patient, using the instructor for assistance as needed. All students participated in an educational debriefing after each session.
- Customized, realistic clinical correlates are now readily available for students and teachers, allowing reliable access to "the good teaching case."
71. Best Teaching Practices
- Cheap may be as good as expensive.
- Keyser, E. J., A. M. Derossis, et al. (2000). "A simplified simulator for the training and evaluation of laparoscopic skills." Surg Endosc 14(2): 149-53.
- The purpose of this study was to compare a simplified mirrored-box simulator to the video-laparoscopic cart system.
- 22 surgical residents performed seven structured tasks in both simulators in random order. Scores reflected precision and speed.
- There were no significant differences in mean raw scores between the simulators for six of the seven tasks.
- A mirrored-box simulator was shown to provide a reasonable reflection of relative performance of laparoscopic skills.
72. Best Teaching Practices
- Team behavior can be affected by focused simulation experiences.
- Shapiro, M. J., J. C. Morey, et al. (2004). "Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum?" Quality and Safety in Health Care 13(6): 417-21.
- ED staff who had recently received didactic training in the Emergency Team Coordination Course (ETCC) also received an 8-hour intensive experience in an ED simulator in which three scenarios of graduated difficulty were encountered. A comparison group, also ETCC trained, was assigned to work together in the ED for one 8-hour shift.
- Experimental and comparison teams were observed in the ED before and after the intervention.
- The experimental team showed a trend towards improvement in the quality of team behavior (p = 0.07); the comparison group showed no change in team behavior during the two observation periods (p = 0.55).
73. Best Teaching Practices
- Innovative use of two new technologies helps to engage learners in a large group setting.
- Vozenilek, J., E. Wang, et al. (2005). "Simulation-based Morbidity and Mortality Conference: New Technologies Augmenting Traditional Case-based Presentations." Acad Emerg Med, j.aem.2005.08.015.
- Two separate technologies were enlisted: a METI high-fidelity patient simulator to re-create the case in a more lifelike fashion, and an audience response system to collect clinical impressions throughout the case presentation and survey data at the end of the presentation.
- The re-creation of the patient encounter, with all relevant physical findings displayed in high fidelity, with relevant laboratory data, nursing notes, and imaging as it occurred in the actual case, provides a more engaging format for the resident-learner.
74. Best Teaching Practices
- Orientation
- Introduction to session
- Expectations
- What is real/what is not
- Self assessment
- Debriefing
- Evaluation
75. How To Best Use Simulation
- Provide feedback
- Give opportunities for repetitive practice
- Integrate simulation into overall curriculum
- Provide increasing levels of difficulty
- Provide clinical variation in scenarios
- Control environment
- Provide individual and team learning
- Define outcomes and benchmarks
76. Best Teaching Practices
Audience Comments
77. Best Assessment Practices
- Simulation has some data to support its use as an assessment modality.
- Schwid, H. A., G. A. Rooke, et al. (2002). "Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study." Anesthesiology 97(6): 1434-44.
- 99 anesthesia residents consented to be videotaped during their management of four simulated scenarios on MedSim or METI mannequin-based anesthesia simulators.
- Construct-related validity of mannequin-based simulator assessment was supported by an overall improvement in simulator scores from CB and CA-1 to CA-2 and CA-3 levels of training.
- Criterion-related validity was supported by moderate correlation of simulator scores with departmental faculty evaluations (0.37-0.41, P < 0.01), ABA written in-training scores (0.44-0.49, P < 0.01), and departmental mock oral board scores (0.44-0.47, P < 0.01).
- Reliability of the simulator assessment was demonstrated by very good internal consistency (alpha = 0.71-0.76) and excellent interrater reliability (correlation 0.94-0.96, P < 0.01; kappa = 0.81-0.90).
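The internal-consistency statistic cited in the slide above (Cronbach's alpha) can be computed directly from raw checklist scores. A minimal sketch in plain Python, using the standard formula; the checklist data below are hypothetical, not from the study:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: list of items, each a list of scores,
    one score per examinee, in the same order for every item.
    """
    k = len(item_scores)        # number of items
    n = len(item_scores[0])     # number of examinees

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    # Total score per examinee across all items
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    item_var_sum = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Hypothetical checklist: 3 items scored 1-5 for 4 examinees
scores = [
    [2, 4, 3, 5],
    [1, 4, 3, 5],
    [2, 5, 3, 4],
]
print(round(cronbach_alpha(scores), 2))
```

Values near 1.0 mean the items rise and fall together across examinees; the 0.71-0.76 range reported above is conventionally read as acceptable-to-good consistency.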
78. Best Assessment Practices
- Task trainers appear to be a valid method for assessing procedural competence.
- Adrales, G. L., A. E. Park, et al. (2003). "A valid method of laparoscopic simulation training and competence assessment." Journal of Surgical Research 114(2): 156-62.
- Subjects (N = 27) of varying levels of surgical experience performed three laparoscopic simulations, representing appendectomy (LA), cholecystectomy (LC), and inguinal herniorrhaphy (LH).
- Years of experience directly correlated with the skills ratings (all P < 0.001) and with the competence ratings across the three procedures (P < 0.01). Experience inversely correlated with the time for each procedure (P < 0.01) and the technical error total across the three models (P < 0.05).
79. Best Assessment Practices
- Multiple simulated encounters are needed to accurately assess resident abilities.
- Boulet, J. R., D. Murray, et al. (2003). "Reliability and validity of a simulation-based acute care skills assessment for medical students and residents." Anesthesiology 99(6): 1270-80.
- The authors developed and tested 10 simulated acute care situations that clinical faculty at a major medical school expect graduating physicians to be able to recognize and treat at the conclusion of training. Forty medical students and residents participated in the evaluation of the exercises.
- The reliability of the simulation scores was moderate and was most strongly influenced by the choice and number of simulated encounters.
- However, multiple simulated encounters, covering a broad domain, are needed to effectively and accurately estimate student/resident abilities in acute care settings.
80. Best Assessment Practices
- Checklist scoring of videotaped performance can have a high degree of inter-rater reliability.
- Devitt, J. H., M. M. Kurrek, et al. (1997). "Testing the raters: inter-rater reliability of standardized anaesthesia simulator performance." Can J Anaesth 44(9): 924-8.
- We sought to determine if observers witnessing the same event in an anaesthesia simulator would agree on their rating of anaesthetist performance.
- Two one-hour clinical scenarios were developed, each containing five anaesthetic problems.
- Videotape recordings were generated through role-playing, with the two scenarios each recorded three times, resulting in a total of 30 events to be evaluated. Two clinical anaesthetists reviewed and scored each of the 30 problems independently.
- The raters were in complete agreement on 29 of the 30 items. There was excellent inter-rater reliability (kappa = 0.96, P < 0.001).
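Agreement statistics like the kappa reported above correct raw percent agreement for the agreement two raters would reach by chance. A minimal Cohen's kappa sketch in Python; the pass/fail ratings below are hypothetical, not the study's data:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)

    # Observed agreement: fraction of items rated identically
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement from each rater's marginal frequencies
    p_expected = sum(
        (rater_a.count(lab) / n) * (rater_b.count(lab) / n) for lab in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical pass/fail judgments by two raters on 10 recorded events
a = ["pass", "pass", "fail", "pass", "fail",
     "pass", "pass", "fail", "pass", "pass"]
b = ["pass", "pass", "fail", "pass", "fail",
     "pass", "fail", "fail", "pass", "pass"]
print(round(cohens_kappa(a, b), 2))
```

Here the raters agree on 9 of 10 items (90% raw agreement), but kappa comes out lower because some of that agreement is expected by chance; a kappa of 0.96, as in the study above, indicates almost perfect agreement.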
81. Best Assessment Practices
- Validation that simulator performance correlates with real practice.
- Fried, G. M., A. M. Derossis, et al. (1999). "Comparison of laparoscopic performance in vivo with performance measured in a laparoscopic simulator." Surg Endosc 13(11): 1077-81; discussion 1082.
- Twelve PGY3 residents were given a baseline evaluation in the simulator and in the animal model. They were then randomized to either five practice sessions in the simulator (group A) or no practice (group B). Each group was retested in the simulator and in the animal (final test).
- Performance in an in vitro laparoscopic simulator correlated significantly with performance in an in vivo animal model. Practice in the simulator resulted in improved performance in vivo.
82. Best Assessment Practices
- ABEM-type assessment tools measure performance equally well in oral or simulation environments.
- Gordon, J. A., D. N. Tancredi, et al. (2003). "Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study." Academic Medicine 78(10 Suppl).
- Twenty-three subjects were evaluated during five standardized encounters using a patient simulator.
- Performance in each 15-minute session was compared with performance on an identical number of oral objective structured clinical examination (OSCE) sessions used as controls.
- In this pilot, a standardized oral OSCE scoring system performed equally well in a simulator-based testing environment.
83. Best Assessment Practices
- There are many aspects of human knowledge, skills and attitudes to assess, and the correct tool must be used for each one.
- Kahn, M. J., W. W. Merrill, et al. (2001). "Residency program director evaluations do not correlate with performance on a required 4th-year objective structured clinical examination." Teaching and Learning in Medicine 13(1): 9-12.
- We surveyed program directors about the performance of 50 graduates from our medical school chosen to represent the highest (OSCEHI) and lowest (OSCELO) 25 performers on our required 4th-year OSCE.
- OSCE scores did not correlate with Likert scores for any survey parameter studied (r < .23, p > .13 for all comparisons). Similarly, program director evaluations did not correlate with class rank or USMLE scores (r < .26, p > .09 for all comparisons).
- We concluded that program director evaluations of resident performance do not appear to correlate with objective tests of either clinical skills or knowledge taken during medical school.
84. Best Assessment Practices
- Softer competencies like professionalism can be assessed with the aid of simulation technology.
- Gisondi, M. A., R. Smith-Coggins, et al. (2004). "Assessment of Resident Professionalism Using High-fidelity Simulation of Ethical Dilemmas." Acad Emerg Med 11(9): 931-937.
- Each resident subject participated in a simulated critical patient encounter during an Emergency Medicine Crisis Resource Management course. An ethical dilemma was introduced before the end of each simulated encounter. Resident responses to that dilemma were compared with a checklist of expected actions.
- It was observed that senior residents (second and third year) performed more checklist items than did first-year residents (p < 0.028 for each senior class).
- Residents performed a critical action with 100% uniformity across training years in only one ethical scenario ("Practicing Procedures on the Recently Dead"). Residents performed the fewest critical actions and overall checklist items for the "Patient Confidentiality" case.
- Although limited by small sample size, the application of this performance-assessment tool showed the ability to discriminate between experienced and inexperienced EMRs with respect to a variety of aspects of professional competency.
85. Best Assessment Practices
- The scoring/evaluation system chosen to assess simulated performance is critical.
- Regehr, G., R. Freeman, et al. (1999). "Assessing the generalizability of OSCE measures across content domains." Academic Medicine 74(12): 1320-2.
- Students' scores from three OSCEs at one institution were correlated to determine the generalizability of the scoring systems across course domains.
- Analysis revealed that while checklist scores showed quite low correlations across examinations from different domains (ranging from 0.14 to 0.25), global process scores showed quite reasonable correlations (ranging from 0.30 to 0.44).
- These data would seem to confirm the intuitions about each of these measures: the checklist scores are highly content-specific, while the global scores are evaluating a more broadly based set of skills.
86. Best Assessment Practices
- Learners are smart and will figure out the game.
- Jones, J. S., S. J. Hunt, et al. (1997). "Assessing bedside cardiologic examination skills using 'Harvey,' a cardiology patient simulator." Acad Emerg Med 4(10): 980-5.
- Objective: to assess the cardiovascular physical examination skills of emergency medicine (EM) housestaff and attending physicians.
- Prospective cohort assessment of EM housestaff and faculty performance on 3 valvular abnormality simulations conducted on the cardiology patient simulator, "Harvey."
- Forty-six EM housestaff (PGY1-3) and attending physicians were tested over a 2-month study period. Physician responses did not differ significantly among the different levels of postgraduate training.
- Housestaff and faculty had difficulty establishing a correct diagnosis for simulations of 3 common valvular heart diseases. However, accurate recognition of a few critical signs was associated with a correct diagnosis in each simulation.
87. Best Assessment Practices
- Determine what you want to assess.
- Design a simulation that provokes this performance.
- Observe/record the performance.
- Analyze the performance using some type of rubric: checklist, GAS, etc.
- Debriefing, feedback and teaching.
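The assess-analyze loop above is frequently implemented as a weighted checklist scored against the recorded performance. A minimal sketch in Python; the items, weights, and threshold are hypothetical illustrations, not from the talk:

```python
# Minimal weighted-checklist scorer for one simulated encounter.
# Items and weights below are hypothetical examples of critical actions.
CHECKLIST = {
    "verbalizes primary survey": 2,
    "orders cardiac monitor": 1,
    "recognizes tension pneumothorax": 3,
    "performs needle decompression": 3,
    "communicates with team": 1,
}

def score_encounter(observed_items):
    """Return (raw score, max score, fraction) for the actions observed."""
    max_score = sum(CHECKLIST.values())
    raw = sum(w for item, w in CHECKLIST.items() if item in observed_items)
    return raw, max_score, raw / max_score

# A reviewer marks which checklist actions appeared on the recording
raw, max_score, frac = score_encounter({
    "verbalizes primary survey",
    "recognizes tension pneumothorax",
    "performs needle decompression",
})
print(raw, max_score, round(frac, 2))
```

Weighting lets the rubric emphasize critical actions over routine ones; the fraction then feeds the debriefing conversation rather than serving as a pass/fail verdict on its own.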
88. Best Assessment Practices
Audience Comments
89. Outline
- Introduction
- Spectrum of Simulation Equipment
- Best Practice Examples
- Hands-on Practice
90. Summary
- Simulation is one tool (new, expensive and exciting) in our educational repertoire.
- (Similar to lecture, case discussion, skill lab, MCQ, SP, etc.)
91. Summary
- Provide feedback
- Give opportunities for repetitive practice
- Integrate simulation into the overall curriculum
- Provide increasing levels of difficulty
- Provide clinical variation in scenarios
- Control the environment
- Provide individual and team learning
- Define outcomes and benchmarks
- Determine what you want to assess.
- Design a simulation that provokes this performance.
- Observe/record the performance.
- Analyze the performance using some type of rubric: checklist, GAS, etc.
- Debriefing, feedback and teaching.
92. Outline
- Introduction
- Spectrum of Simulation Equipment
- Best Practice Examples
- Hands-on Practice
93. References
- Issenberg, McGaghie, Petrusa, Gordon and Scalese. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Medical Teacher 2005; 27: 10-28.
- Loyd GE, Lake CL, Greenberg RB. Practical Health Care Simulations. Philadelphia, PA: Elsevier-Mosby; 2004.
- Bond WF, Spillane L, for the CORD Core Competencies Simulation Group. The use of simulation for emergency medicine resident assessment. Acad Emerg Med 2002; 9: 1295-1299.
- ACGME Resources:
- www.acgme.org/Outcome/assess/Toolbox.pdf
- www.acgme.org/Outcome/assess/ToolTable.pdf
- Accessed on Feb 2nd, 2006.
94. Additional References
- 1. Glassman PA, Luck J, O'Gara EM, Peabody JW. Using standardized patients to measure quality: evidence from the literature and a prospective study. Joint Commission Journal on Quality Improvement. 2000; 26: 644-653.
- 2. Owen H, Plummer JL. Improving learning of a clinical skill: the first year's experience of teaching endotracheal intubation in a clinical simulation facility. Medical Education. 2002; 36: 635-642.
- 3. Pittini R, Oepkes D, Macrury K, Reznick R, Beyene J, Windrim R. Teaching invasive perinatal procedures: assessment of a high fidelity simulator-based curriculum. Ultrasound in Obstetrics and Gynecology. 2002; 19: 478-483.
- 4. Kurrek MM, Devitt JH, Cohen M. Cardiac arrest in the OR: how are our ACLS skills? Can J Anaesth. 1998; 45: 130-2.
- 5. Schwid HA, Rooke GA, Ross BK, Sivarajan M. Use of a computerized advanced cardiac life support simulator improves retention of advanced cardiac life support guidelines better than a textbook review. Critical Care Medicine. 1999; 27: 821-824.
- 6. Rosenblatt MA, Abrams KJ, New York State Society of Anesthesiologists, Committee on Continuing Medical Education and Remediation, Remediation Sub-Committee. The use of a human patient simulator in the evaluation of and development of a remedial prescription for an anesthesiologist with lapsed medical skills. Anesthesia and Analgesia. 2002; 94: 149-153, table of contents.
- 7. Gisondi MA, Smith-Coggins R, Harter PM, Soltysik RC, Yarnold PR. Assessment of Resident Professionalism Using High-fidelity Simulation of Ethical Dilemmas. Acad Emerg Med. 2004; 11: 931-937.
- 8. Schwid HA, Rooke GA, Michalowski P, Ross BK. Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator. Teaching and Learning in Medicine. 2001; 13: 92-96.
- 9. Gaba DM, Fish KJ, Howard SK. Crisis Management in Anesthesiology. New York: Churchill Livingstone; 1994.
- 10. Reznek M, Smith-Coggins R, Howard S, Kiran K, Harter P, Sowb Y, Gaba D, et al. Emergency Medicine Crisis Resource Management (EMCRM): Pilot Study of a Simulation-based Crisis Management Course for Emergency Medicine. Acad Emerg Med. 2003; 10: 386-389.
- 11. Small SD, Wuerz RC, Simon R, Shapiro N, Conn A, Setnik G. Demonstration of high-fidelity simulation team training for emergency medicine. Academic Emergency Medicine. 1999; 6: 312-323.
95Additional References
- 12. Shapiro MJ, Morey JC, Small SD, Langford V, Kaylor CJ, Jagminas L, Suner S, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Quality & Safety in Health Care. 2004;13:417-421.
- 13. Berkenstadt H, Ziv A, Barsuk D, Levine I, Cohen A, Vardi A. The use of advanced simulation in the training of anesthesiologists to treat chemical warfare casualties. Anesthesia & Analgesia. 2003;96:1739-1742.
- 14. Kyle RR, Via DK, Lowy RJ, Madsen JM, Marty AM, Mongan PD. A multidisciplinary approach to teach responses to weapons of mass destruction and terrorism using combined simulation modalities. Journal of Clinical Anesthesia. 2004;16:152-158.
- 15. Kobayashi L, Shapiro MJ, Suner S, Williams KA. Disaster medicine: the potential role of high fidelity medical simulation for mass casualty incident training. Medicine & Health, Rhode Island. 2003;86:196-200.
- 16. Kassirer JP, Kopelman RI. Learning Clinical Reasoning. First ed. Baltimore, MD: Williams and Wilkins; 1991.
- 17. Bond WF, DL, Kostenbader M, Worrilow CC. "Using Human Patient Simulation to Instruct Emergency Medicine Residents in Cognitive Forcing Strategies". Paper presented at Innovation in Emergency Medical Education Exhibit, SAEM Annual Meeting, Boston; 2003.
- 18. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Academic Emergency Medicine. 2002;9:1184-1204.
- 19. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine. 2003;78:775-780.
- 20. Boulet JR, Murray D, Kras J, Woodhouse J, McAllister J, Ziv A. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270-1280.
- 21. Bond WF, Spillane L. The use of simulation for emergency medicine resident assessment. Academic Emergency Medicine. 2002;9:1295-1299.
- 22. Byrne AJ, Greaves JD. Assessment instruments used during anaesthetic simulation: review of published studies. British Journal of Anaesthesia. 2001;86:445-450.
- 23. Gordon JA, Tancredi DN, Binder WD, Wilkerson WM, Shaffer DW. Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study. Academic Medicine. 2003;78.
96Additional References
- 24. LaMantia J, Rennie W, Risucci DA, Cydulka R, Spillane L, Graff L, Becher J, et al. Interobserver variability among faculty in evaluations of residents' clinical skills. Academic Emergency Medicine. 1999;6:38-44.
- 25. Morgan PJ, Cleave-Hogg D, McIlroy J, Devitt JH. Simulation technology: a comparison of experiential and visual learning for undergraduate medical students. Anesthesiology. 2002;96:10-16.
- 26. Schwid HA, Rooke GA, Carline J, Steadman RH, Murray WB, Olympio M, Tarver S, et al. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology. 2002;97:1434-1444.
- 27. Beaulieu MD, R. M., Hudon E, Saucier D, Remondin M, Favreau R. Using standardized patients to measure professional performance of physicians. International Journal for Quality in Health Care. 2003;15(3):251-259.
- 28. Chapman DM, Rhee KJ, Marx JA, Honigman B, Panacek EA, Martinez D, Brofeldt BT, Cavanaugh SH. Open thoracotomy procedural competency: validity study of teaching and assessment modalities. Annals of Emergency Medicine. 1996;28(6):641-647.
- 29. Cubison TCS, Clare T. Lasagne: a simple model to assess the practical skills of split-skin graft harvesting and meshing. British Journal of Plastic Surgery. 2002;55(8):703-704.
- 30. Davidson R, D. M., Rathe R, Pauly R, Watson RT. Using standardized patients as teachers: a concurrent controlled trial. Academic Medicine. 2001;76(8):840-843.
- 31. Hance J, Aggarwal R, et al. Objective assessment of technical skills in cardiac surgery. European Journal of Cardio-Thoracic Surgery. 2005;28(1):157-162.
- 32. Maran NJ, Glavin RJ. Low- to high-fidelity simulation: a continuum of medical education? Medical Education. 2003;37 Suppl 1:22-28.
- 33. Nikendei C, Zeuch A, et al. Role-playing for more realistic technical skills training. Medical Teacher. 2005;27(2):122-126.
- 34. Clauser BE, Clyman SG, et al. Are fully compensatory models appropriate for setting standards on performance assessments of clinical skills? Academic Medicine. 1996;71(1 Suppl):S90-S92.
- 35. Williams RG. Have standardized patient examinations stood the test of time and experience? Teaching & Learning in Medicine. 2004;16(2):215-222.