Title: Research Utilization: Moving Research to Practice
1. Research Utilization: Moving Research to Practice
- Laura Cohen, PhD
- Mobility RERC, Shepherd Center
- Stephen Sprigle, PhD
- Mobility RERC, CATEA, GA Tech
- ISS 2007
Funding for this program was provided by NIDRR
through the RERC for Wheeled Mobility
(H133E030035) and the Research Utilization
Support and Help (RUSH) Project (H133A031402).
2. Agenda
- Overview of issues related to Knowledge Translation (KT)
- Models of Training Evaluation
- Models of Research Utilization
- Mobility RERC KT Project
- Measures and Constructs Used
- Our Project Results
- Future Steps
- Discussion
3. Research to Practice
- Why is it important?
- Accountability
- Results
- Innovation
- Results shape development, practice, policy
- Who Cares?
- Patients and families
- Payors
- Policy Makers
4. What is Knowledge Translation (KT)?
- KT is both a process and a strategy that can lead to utilization of research findings and improved outcomes for consumers (Canadian Institutes of Health Research, 2004)
5. Dissemination Challenges
- Multiple challenges to disseminating innovations across clinical practice
- "Between the health care we have and the care we could have lies not just a gap but a chasm" (IOM, 2001)
- "In health care, invention is hard, but dissemination is harder" (Berwick, 2003)
6. Three things that influence adoption of innovation
- How the innovation is perceived
- Characteristics of people who do/don't adopt the innovation (early vs. late adopters)
- Contextual factors such as
- leadership
- management
- incentives
- communication
7. Questions
- Why is there a chasm between new knowledge and health care practice?
- Why don't clinicians readily incorporate the findings of clinical research into their daily practice?
- Is there a knowledge gap, or is there something more fundamental and complex involved?
- What can professions do to speed up the dissemination of innovations into clinical practice?
8. Clinical Practice (what is done) vs. Research (what is known)
9. What has already been done?
- Research on the effectiveness of education
- Review of 600 articles published in medical education research journals
- Only 4 studies measured patient outcomes
- The remainder were divided between
- measuring acquisition of knowledge
- satisfaction
- Same issue for rehab education
10. Looking at professional education
- How do we demonstrate that professional education is producing clinicians who deliver high-quality care?
- What is the effect of professional education on improving patient care?
- What is the potential for research using patient-centered clinical outcomes to measure the performance of professional education?
11. Current Model of Training
- Confounders
- Funding
- Workplace culture
- Access to technology
- Healthcare system
12. Kirkpatrick's Hierarchy: How to evaluate the effectiveness of training
- Four levels: Reaction, Learning, Behavior, Results
- Donald Kirkpatrick (1959)
13. Kirkpatrick's Hierarchy
- Criticisms
- Implies a hierarchy of value related to the levels
- Assumes that the levels are associated
- Implies a causal relationship
- Fails to account for confounders
- Strengths
- Simple
- Pragmatic model for thinking about training
14. Issues Affecting Evaluation
15. Reaction Level
- Little correlation between
- learner reactions and measures of learning, OR
- learner reactions and measures of changed behavior
- Satisfaction is not necessarily related to good learning, and sometimes discomfort is essential
- Mixed results may indicate that what is measured at the reaction level might be more informative about the value of training
16. Learning
- The literature encourages use of pre/post questionnaires to gauge learning
- Trainees might be able to repeat what they have learned but NOT be able to apply it
- Performance during training may not be a predictor of post-training performance
- Testing may not be appropriate for measuring attainment of skills
17. Behavioral Change
- Organizational factors
- Work culture, administrative support (top down)
- Other factors
- Perceived difficulty, perceived usefulness, job commitment
- Individual factors
- Flexibility to change, motivation to learn, curiosity, learning curve
- Evaluation of behavior change needs to account for these factors
18. Organizational Results
- Most difficult level of evaluation
- Implies training must be evaluated using hard outcome data
- Inherent difficulties
- Linking soft-skills training to hard results
- Time delays in measurement
- Hard measures miss much that is of value
19. What is happening now
- Training evaluation is becoming more common
- Predominant level of analysis: Level 1
- Few attempts at Levels 3 or 4
- Few companies with comprehensive training evaluation attempt to justify the ROI of training
20. Conclusion
- Kirkpatrick's model remains useful
- Frames where evaluation might be made
- Remember to
- Consider intervening factors affecting the strength of the links between levels
- Provide supports to help practitioners undertake meaningful evaluation of use to the organization
- Know what not to evaluate
- Keep it simple
21. How do you do training?
- Models of Research Utilization
- Best-Practice Knowledge Transfer
- Collaborative Support
- Knowledge Synthesis
- Technology Transfer
- Other models
- Includes evaluation to determine impact
22. Best-Practice Knowledge Transfer
- Specific to the setting
- Generalizes research findings to real life
- Uses conditions from the research protocol and transfers them to the clinical setting
- Effective when the transfer of skills and behaviors among service providers is the intended outcome
23. Collaborative Support
- Credible source of information
- Users' perceptions of information
- Probability of using information
- Networking: careful selection of partners
- Increases credibility with intended audiences and systems
- Resources: means to disseminate
- User-friendly format
- Effective for awareness, attitudes, and behaviors
24. Knowledge Synthesis
- Knowledge is not something that can simply be sent and received
- Requires understanding by
- Developers
- Users
- Affects short-term outcomes related to awareness, learning, and behaviors
25. Technology Transfer
- Idea → Prototype → Useful Product or Technology
- Activities/events that support movement from prototype to adoption
- Push/pull forces
- Usable and beneficial to the target group
- Affects outcomes in the areas of awareness, motivations, and decisions
26. Other Models
- Blend of models
- Show a theory-based approach for the target audience, system, or outcomes
- Strategies vary depending on
- characteristics of the research results
- target user and/or system
27. Effect of an educational research dissemination program on practice patterns for professionals recommending manual wheelchairs (MWCs)
- Objective
- To measure the utilization of rehabilitation research training by measuring short- and mid-term impacts on the knowledge, attitudes, and behaviors of clinicians
28. Training Program Intervention
- Program Design
- Needs assessment
- current research related to SM
- how to compare equipment from one manufacturer to another
- how to justify equipment
- Clinicians responsible had low exposure to MWC evaluations
- SADMERC identified cities in need of education/training
- 15 contact hours (1.5 CEUs)
- 5.5 hrs equipment labs
- 3 hrs case studies and group discussion
- 6 two-day training programs
29. Study Enrollment
- 160 enrolled, 139 completed
- 23 withdrew or changed groups (16 lost to follow-up)
- 2 changed groups (utilization to conference only)
- 1 changed from utilization to control
- Reason: lack of a post-conference WPR
- 48 in the utilization group (n = 38)
- 291 pre WPRs, 209 post WPRs
- 84 in the conference-only group
- 57 clinicians (n = 52)
- 27 suppliers (n = 23)
- 28 in the control group (n = 26)
30. Pre/Post Measures
- Reaction (Kirkpatrick's Level 1)
- Conference evaluation form
- Knowledge (Kirkpatrick's Level 2)
- Knowledge Questionnaire: 15 multiple-choice items
- Attitudes (Kirkpatrick's Level 3)
- Manual Wheelchair (MWC) Questionnaire
- Behaviors (Kirkpatrick's Level 3)
- Work Product Reviews (WPR)
- Feature tracking (Utilization Practices)
- Rubric scoring (Rationale)
31. MWC Questionnaire (Kirkpatrick's Level 3)
32. Work Product Review
- Reviewed letters of medical necessity (LOMN) and order forms
- Feature Match
- Surveyed the range of features specified
- Rubric
- Appraised clinical rationale using a rubric
- Domains
- Problem Identification
- Feature Match
- Solution Selection
- Overall Impression
- Reliability Testing
- Intrarater reliability (n = 1 rater, 10 random files, 1 month apart)
- Coefficient alpha = .93 for the rubric
- Coefficient alpha = .95 for feature match (see the sketch below)
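As a quick illustration of the reliability statistic reported above, here is a minimal Python sketch of coefficient (Cronbach's) alpha applied to two scoring occasions. The file scores are hypothetical placeholders, and the study's actual computation may have used different software or a different reliability index.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Coefficient (Cronbach's) alpha for an (n_files, n_occasions) score matrix."""
    k = scores.shape[1]                          # number of scoring occasions
    occasion_var = scores.var(axis=0, ddof=1)    # variance of each occasion's scores
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1.0 - occasion_var.sum() / total_var)

# Hypothetical rubric totals for 10 files scored twice, one month apart
first_pass = np.array([12, 15, 9, 14, 11, 16, 13, 10, 15, 12])
second_pass = np.array([13, 15, 9, 13, 11, 16, 14, 10, 14, 12])
print(round(cronbach_alpha(np.column_stack([first_pass, second_pass])), 2))
```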
33. Cohort
- Conference only: degree, profession, years of practice, years in SM
- Suppliers: more hours in SM, professional development hours
34-36. (No transcript)
37. Rubric Analysis
- 38 subjects
- Different numbers of pre (291) and post (209) WPRs
- Weighted totals used for analysis
- Paired-sample correlations between pre/post administrations were significant (from r = .655 to r = .842)
- Paired-sample t-tests, Bonferroni-corrected for multiple testing, revealed no significant change for any section between pre/post administrations (see the sketch below)
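A minimal sketch of the kind of pre/post analysis described above: paired-sample correlations and paired t-tests for each rubric section, with a Bonferroni-corrected threshold for the four comparisons. All scores below are randomly generated placeholders, not study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sections = ["Problem Identification", "Feature Match", "Solution Selection", "Overall Impression"]

# Hypothetical weighted section totals for 38 subjects (placeholders, not study data)
pre = {s: rng.normal(10, 2, 38) for s in sections}
post = {s: pre[s] + rng.normal(0, 1, 38) for s in sections}  # correlated post scores

alpha = 0.05 / len(sections)  # Bonferroni-corrected per-test threshold
for s in sections:
    r, _ = stats.pearsonr(pre[s], post[s])   # paired-sample correlation
    t, p = stats.ttest_rel(pre[s], post[s])  # paired-sample t-test
    print(f"{s}: r = {r:.3f}, t = {t:.2f}, p = {p:.4f}, significant = {p < alpha}")
```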
38. Discussion
- Pretest rubric scores were most predictive of posttest scores
- Positive relationship between posttest scores and experience
- Psychometric properties of the rubric
- Good intrarater reliability
- May not be sensitive to change associated with training
- Possibly thwarted by the number of cases and facility documentation systems
- Further psychometric development
- Reliability (interrater reliability)
- Validity (content)
39. Feature Utilization
40. Discussion
- Feature match appears to be a psychometrically sound tool
- Good test-retest reliability
- Good internal consistency
- Weighted feature match scores did show a significant difference in the features recommended, as expected (see the sketch below)
- More features are not necessarily better
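To illustrate what a weighted feature-match total can look like, here is a purely hypothetical sketch; the feature names and weights are invented for illustration and are not the study's actual weighting scheme.

```python
# Entirely hypothetical feature weights, for illustration only; the study's
# actual feature list and weighting scheme are not reproduced here.
FEATURE_WEIGHTS = {
    "adjustable rear axle position": 3.0,
    "custom seat width and depth": 2.0,
    "pressure-relieving cushion": 3.0,
    "standard sling upholstery": 1.0,
}

def weighted_feature_score(specified_features) -> float:
    """Sum the weights of the features specified on one work product."""
    return sum(FEATURE_WEIGHTS.get(f, 0.0) for f in specified_features)

pre_order = ["standard sling upholstery"]
post_order = ["adjustable rear axle position", "pressure-relieving cushion"]
print(weighted_feature_score(pre_order), weighted_feature_score(post_order))  # 1.0 5.0
```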
41. Conclusion
- Positive changes in knowledge scores following training
- Attitudes and behaviors were not significantly influenced
- Utilization practices showed improvement in the number of features specified, yet the quality of the LOMNs did not change
42. Further Measure Development
- Psychometric development
- MWC questionnaire
- WPR measures (rubric, feature match)
- Promising internal consistency
- Test-retest reliability
- Still need to determine responsiveness, validity, and reliability
- Determine whether results were due to the sensitivity of the measures OR the impact of training
43. Plan
- New RUA project funded
- Create a web-based distance education program
- Use the project's evidence-based training program
- Examine differences in effectiveness between in-person and distance training
44. Discussion
- Future opportunities
- Knowledge dissemination training
- Link clinical training to patient outcomes
45. Our Project Partners
46. Our Sponsors