Getting Started in Medical Education Research - PowerPoint PPT Presentation

1
Getting Started in Medical Education Research
  • 2009
  • Ernie Yoder, MD, PhD (SJH)
  • Development team: Carol Hodgson, LuAnn Wilkerson,
    David Irby, Judy Shea, Clair Kuykendall, Larry
    Gruppen, Ernie Yoder

2
MERC
  • Medical Education Research Certificate
  • Provide knowledge necessary to understand purpose
    and process of medical education research
  • To develop more informed consumers of medical
    education research literature
  • To develop more effective collaborators in
    medical education research

3
MERC
  • Not intended, on its own, to develop independent
    medical education researchers

4
MERC Curriculum
  • 1. Formulate questions/design studies
    (Hypothesis-Directed Research session)
  • 2. Search and evaluate literature
  • 3. Basic statistics and data management
  • 4. Measuring educational outcomes: reliability and
    validity
  • 5. Institutional Review Boards and Ethics (plan to
    make on-line)

5
MERC Curriculum
  • 6. Qualitative research methods
  • 7. Program evaluation and evaluation research
  • 8. Questionnaire design and survey research
  • 9. Other new topics under discussion, such as
    writing the research report

6
Research Plan Goals
  • What do you wish to get from this session?
  • Attendees will be able to discuss steps in the
    research process
  • Moving from general issues to specific research
    questions
  • Using the literature to refine the question
  • Measuring the variables
  • Selecting the right research design

7
IRB goals
  • Participants will be able to
  • decide when an educational innovation requires
    IRB review
  • identify characteristics of educational research
    that are of concern to IRBs
  • decide when asking for exemption is a viable
    strategy

8
The Research Process
Define Research Question → Conduct Literature Review
  • No hole in the literature → Refine Question
  • Hole in the literature → Operationalize Variables →
    Design Study → Obtain IRB Approval → Collect &
    Analyze Data → Write and Report Results
9
Recognizing & Choosing Among Research Opportunities
  • What situations or problems tend to puzzle,
    fascinate, challenge, or interest you?
  • List as many ideas or research questions as you
    can, as quickly as possible
  • Identify which are
  • - Most interesting (I)
  • - Feasible (F)
  • - Fundable
  • - Best overall
  • Write/rewrite your best idea or research question

10
Research that Makes a Difference
  • Investigates important questions
  • Is ethical
  • Is connected to theory
  • Connects the study to prior research
  • Uses appropriate research design and analysis
    procedures
  • Disseminates results

11
Definitions
  • What is a variable?
  • What is the dependent variable?
  • What is the independent variable?
  • What is a sample?
  • What is a population?
  • What is measurement?
  • What is instrumentation?

12
The Research Question
  • Based on literature/theory
  • Includes sample description (e.g., 4th-year
    medical students)
  • Includes study design (e.g., relationship,
    difference between groups, etc.)
  • Includes the independent & dependent variables
  • Is measurable
  • Stated as a question or hypothesis

13
Example Research Question
  • Do first-year medical students who complete a
    student-run anatomy review course score higher on
    the anatomy final exam than students who do not
    complete the review course?
  • What is the independent variable?
  • What is the dependent variable?
  • What are possible control variables?
  • What is the sample?
  • What is your design?
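As an illustration, one common analysis for a two-group question like this is a t-test on the exam scores. The sketch below uses invented scores and computes only Welch's t statistic (a real study would also report a p-value and effect size):

```python
# Hypothetical illustration of the example question above: do students who
# completed the review course score higher on the anatomy final?
# All scores are invented.
from statistics import mean, stdev
import math

def welch_t(group_a, group_b):
    """Welch's t statistic for a difference in means (unequal variances)."""
    na, nb = len(group_a), len(group_b)
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2
    return (mean(group_a) - mean(group_b)) / math.sqrt(va / na + vb / nb)

completed = [82, 88, 75, 91, 85, 79]      # IV = completed review course
not_completed = [70, 74, 80, 68, 77, 72]  # IV = did not; DV = exam score

print(f"Welch t = {welch_t(completed, not_completed):.2f}")
```

Here the independent variable is course completion, the dependent variable is the exam score, and the t statistic summarizes the group difference relative to its sampling variability.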

14
A FINER Research Question
  • Feasible (Can sample, get data)
  • Interesting (To you)
  • Novel (New)
  • Ethical (Values)
  • Relevant (To others)

15
The Research Process
Define Research Question → Conduct Literature Review
  • No hole in the literature → Refine Question
  • Hole in the literature → Operationalize Variables →
    Design Study → Obtain IRB Approval → Collect &
    Analyze Data → Write and Report Results
16
Searching for Related Work
  • Details in later workshop (search/appraise lit)
  • Identify possible sources of information
  • Colleagues and librarians
  • Databases, PUBMED, PSYCLIT, Science Citation
    Index
  • Journals, chapters, books, publications
  • Read critically and summarize
  • Citations referenced
  • Sample size
  • Study design and limitations
  • Overall conclusions

17
The Research Process
Define Research Question → Conduct Literature Review
  • No hole in the literature → Refine Question
  • Hole in the literature → Operationalize Variables →
    Design Study → Obtain IRB Approval → Collect &
    Analyze Data → Write and Report Results
18
Elaborating and Focusing
  • At your table (in your group)
  • Each person share their best question with the
    group.
  • Critique each other's research questions
  • Choose one FINER question (enter on 1.)
  • Refine the group question based on critique
    (enter on 2.)
  • Write final version of question (enter on 3.)

19
Critique Your Question
  • Clearly stated?
  • Stated as a question?
  • Testable?
  • Defines variables to be studied?
  • Defines sample to be studied?
  • Describes the setting for the study?
  • Answer questions (4-13)

20
Sharing Questions: Group Critique of an Example
Which terms require clear definitions?
21
The Research Process
Define Research Question → Conduct Literature Review
  • No hole in the literature → Refine Question
  • Hole in the literature → Operationalize Variables →
    Design Study → Obtain IRB Approval → Collect &
    Analyze Data → Write and Report Results
22
Types of Research
  • Qualitative Research
  • Correlational Research
  • Experimental and Quasi-Experimental Research
  • Causal-Comparative Research

23
Qualitative Research: What is the phenomenon?
  • Use When
  • Focus on meaning and context
  • In-depth recording and triangulation
  • Inductively derived interpretation
  • Methods
  • Interview
  • Observation
  • Think aloud, stimulated recall
  • Chart review
  • Surveys

24
Correlational Research
  • Use When
  • Predictors can't be randomized
  • Subjects and/or treatments not controllable
  • Control groups not available
  • Methods
  • Surveys
  • Tests
  • Chart review
  • Archived data

All data are confidential
Example: predicting resident performance on the board
exam
25
Experimental and Quasi-Experimental Research
  • Use When
  • Temporal relationship
  • Feasible explanatory mechanism
  • No valid alternative explanation
  • Subjects and treatments controlled
  • Methods
  • Control over treatment and measurement
  • Randomization
  • Control Groups

Example: comparison of learning strategies for
reaching competency
26
Causal-Comparative
  • Focus is determining cause for or consequences of
    differences between groups of people (similar to
    prospective or retrospective cohort studies)
  • Can look at levels of exposure, comparing 2 or
    more groups
  • Does not give conclusive evidence
  • Identifies possible causes of variation (e.g., IQ,
    gender, exposures)

27
Operationalization and Measurement
  • Three basic questions
  • What do you measure?
  • How do you measure?
  • How well do you measure?

28
What do you measure?
  • Outcomes need to be aligned with the hypotheses
    and purposes of the study
  • Covariates: factors that likely co-vary with
    and/or may influence the responses
  • Multiple variables for underlying factor,
    dimension, or construct
  • Any feature that is relevant to the environment
  • Learning/achievement: presentation, practice,
    tutoring, previous experience, time on task,
    environment

29
How do you measure?
  • Operationalization is essential to the conduct of
    the study
  • Counting events (frequency of occurrence)
  • Measuring time and physical quantities
  • Externalizing internal (psychological) states,
    events, and processes (attitudes) and experience
  • Ratings, rankings, and similar question formats

30
Educational Measurement
  • Knowledge
  • Tests (MCQ, essay, oral)
  • Attitudes
  • Questionnaires, surveys
  • Behavior or performance (skills)
  • OSCEs, standardized patients, direct observation

31
How Well ...
  • Reliability = score accuracy or stability
  • Would the score be repeated if tested again?
  • Would the score be reproduced by different
    raters?
  • Validity = score meaning
  • Does the score measure what you intend to measure?
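One common way to quantify score stability is test-retest reliability, often reported as the correlation between two administrations of the same test. A minimal sketch with invented scores:

```python
# Minimal sketch of test-retest reliability: the Pearson correlation between
# scores from two administrations of the same test. All scores are invented.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

first = [70, 85, 78, 92, 60]    # first administration
second = [72, 83, 80, 90, 63]   # same students, retested

print(f"test-retest r = {pearson_r(first, second):.2f}")
```

A value near 1.0 suggests the scores would be repeated if tested again; note that high reliability alone does not establish validity.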

32
The Research Process
Define Research Question → Conduct Literature Review
  • No hole in the literature → Refine Question
  • Hole in the literature → Operationalize Variables →
    Design Study → Obtain IRB Approval → Collect &
    Analyze Data → Write and Report Results
33
What do you want to study?
  • Description of groups?
  • Differences between groups?
  • Relationships among groups?
  • Predictions?

34
Defining Your Sample
  • Target population (conclusions about)
  • Accessible population (may not be the same)
  • Intended sample
  • Inclusion and exclusion criteria (reasoned,
    supported by literature)
  • Availability
  • Time frame
  • Willingness to participate
  • Intended variables
  • Actual sample (selection/access)
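The narrowing from accessible population to intended sample can be made concrete with inclusion/exclusion criteria. An illustrative sketch (all records and criteria invented):

```python
# Illustrative sketch (all records invented): moving from an accessible
# population to the intended sample by applying inclusion/exclusion criteria.
population = [
    {"id": 1, "year": 4, "consented": True},
    {"id": 2, "year": 2, "consented": True},   # excluded: not a 4th-year
    {"id": 3, "year": 4, "consented": False},  # excluded: no consent
    {"id": 4, "year": 4, "consented": True},
]

def eligible(subject):
    """Inclusion: 4th-year students; exclusion: unwilling to participate."""
    return subject["year"] == 4 and subject["consented"]

sample = [s for s in population if eligible(s)]
print([s["id"] for s in sample])  # the intended sample
```

Documenting the criteria as explicit, testable rules like this also makes the selection process easy to report and to defend to an IRB.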

35
Research Designs: Quasi-Experimental
  • NON-RANDOMIZED (convenience)
  • One-shot case study
  • X-----O
  • One-group pretest-posttest
  • O-----X-----O
  • (O = observation, X = treatment/intervention)

36
Research Designs: Quasi-Experimental
  • NON-RANDOMIZED (convenience/cohorts)
  • Post-test only control group
  • X---------O
  • -----------O
  • Pre-test/Post-test Control Group
  • O--------X-------O
  • O-----------------O

37
Research Designs: Experimental (Randomized)
  • Post-test only control group (Randomized)
  • R X---------O
  • R -----------O
  • Pre-test/Post-test control group (Randomized)
  • R O--------X-------O
  • R O-----------------O
  • Solomon four-group design (Randomized)
  • (1) O------X------O
  • (2) O--------------O
  • (3) ------X------O
  • (4) --------------O
  • (R = randomization)

Factors out the effects of the pre-test
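The randomized designs above all begin with random assignment of the sample to arms. A minimal sketch (invented student IDs) of randomizing into treatment and control groups:

```python
# Sketch of random assignment for a randomized design: shuffle the sample,
# then split it into treatment and control arms. Student IDs are invented.
import random

def randomize(sample, seed=None):
    """Randomly split a sample into equal-sized treatment and control groups."""
    rng = random.Random(seed)
    shuffled = list(sample)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

students = [f"S{i:02d}" for i in range(1, 21)]
treatment, control = randomize(students, seed=42)
print(len(treatment), len(control))
```

Recording the seed makes the assignment reproducible, which is useful when documenting the procedure for reviewers.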
38
Threats to Internal Validity
  • History (of subjects differ)
  • Maturation (with experience)
  • Repeated measurement (adapt/learn)
  • Statistical regression (to the mean)
  • Selection (did randomization work?)
  • Loss of Subjects/mortality (not random)
  • Investigator bias (need for blinding)

Especially for long-term studies
39
External Validity
  • Is the sample representative of the population?
    Can the study be generalized to the population?
    (test, report table 1)
  • Are the conditions the same? For example,
    laboratory setting versus natural setting. (for
    generalizability, environ. effect)
  • Did the subjects act differently because they
    were subjects in the study (Hawthorne Effect,
    under observation)?

40
Creating a Study Design
[Diagram: Population → Sample → R (randomize) →
Treatment or Control → Measurement → Analysis]
Help in visualizing the design
41
Draw Your Design (Question 14)
42
Some Key Concepts
  • Hypothesis testing
  • Control
  • Randomness
  • Blinding and objectivity (train observers,
    measurers)

43
Planning Next Steps
  • Create a project plan: tasks & deadlines
  • Find collaborators, mentors, consultants
  • Search for funding
  • Protect time for research
  • Keep a research journal
  • Have fun

44
Is Your Study Research?
  • Research means "a systematic investigation,
    including research, development, testing, and
    evaluation, to develop or contribute to
    generalizable knowledge"
  • If you might publish the results, it's research
  • What about quality or evaluation studies?

45
The Research Process
Define Research Question → Conduct Literature Review
  • No hole in the literature → Refine Question
  • Hole in the literature → Operationalize Variables →
    Design Study → Obtain IRB Approval → Collect &
    Analyze Data → Write and Report Results
46
Institutional Review Board
  • Introduction

Development team: Carol Hodgson, Robin Harvan,
Larry Gruppen, Brian Mavis
47
The Purposes of Educational Research
  • Quality assurance of curricula
  • Evaluation of innovations
  • Cost-effectiveness assessments
  • Basic research into fundamental educational
    issues
  • Institutional research
  • Exemptions/Expedited Review

48
Certification in Human Subjects Research
  • http://ohsr.od.nih.gov/cbt/
  • http://www.research.umich.edu/training/peerrs.html

49
References
  • Bass, Dunn, Norton, Stewart, & Tudiver. (1993).
    Conducting Research in the Practice Setting.
    Newbury Park, CA: Sage.
  • Campbell & Stanley (1963). Experimental and
    Quasi-experimental Designs for Research. Dallas:
    Houghton Mifflin.
  • Glesne & Peshkin (1992). Becoming Qualitative
    Researchers: An Introduction. Longman.
  • Hulley & Cummings (1988). Designing Clinical
    Research. Baltimore: Williams & Wilkins.

50
References
  • Carney, PA, Nierenberg, DW, et al. (2004).
    Educational Epidemiology: Applying
    Population-Based Design and Analytic Approaches
    to Study Medical Education. JAMA, 292:1044-1050.
  • Miller, MD, Linn, RL, & Gronlund, NE. (2009).
    Measurement and Assessment in Teaching (10th ed.).
    Upper Saddle River, NJ: Merrill.