Title: Evaluating Health Information Technology: A Primer
1. Evaluating Health Information Technology: A Primer
- Eric Poon, MD MPH
- Clinical Informatics Research and Development,
- Partners Information Systems
- Davis Bu, MD MA
- Center for Information Technology Leadership,
- Partners Information Systems
- AHRQ National Resource Center for Health
Information Technology
2. Pre-Conference Logistics
- To Access Slides
- Go to http://extranet.ahrq.gov/rc
- Login with username and password
- Follow the links to download slides
- Problems? Email ResourceCenter@norc.org
- Q&A Session at the End
- Dial 1 to ask a question
- Please pick up handset (not speakerphone)
- Note that this teleconference is being recorded
3. Outline
- Why evaluate?
- General Approach to Evaluation
- Deciding what to Measure
- Study Design Types
- Analytical issues in HIT evaluations
- Some practical advice on specific evaluation
techniques
4. Why Measure Impact of HIT?
- Impact of HIT often hard to predict
- Many slam dunks go awry
- Understand how to clear barriers to effective implementation
- Understand what works and what doesn't
- Justify enormous investments
- Return on investment
- Allow other institutions to make tradeoffs intelligently
- Use results to win over late adopters
- You can't manage/improve what isn't measured
- Good publicity for organization
5. General Approach to Evaluating HIT
- Understand your intervention
- Select meaningful measures
- Pick the study design
- Validate data collection methods
- Data analysis
6. Getting Started: Get to Know Your Intervention
- Clarify the question: What problem does it address?
- Think about intermediate processes
- Identify potential barriers to successful implementation
- Identify potential managerial and behavioral processes to overcome implementation barriers
7. Array of Measures
- Quality and Safety
- Clinical Outcomes
- Clinical Processes
- Knowledge
- Patient knowledge
- Provider knowledge
- Satisfaction
- Patient satisfaction
- Provider satisfaction
- Resource utilization
- Costs and charges
- Length of stay (LOS)
- Employee time/workflow
8. Introducing the Evaluation Toolkit
- Rough guides on general approach, costs, and potential pitfalls
- Major domains
- Clinical Outcomes
- Clinical Process
- Provider Adoption & Attitudes
- Patient Knowledge & Attitudes
- Workflow Impact
- Financial Impact
- Measure Characteristics
- IOM Domain
- Data Source
- Relative Cost
- Potential Pitfalls
- General Notes
- We would love to hear your feedback
9. Selecting Evaluation Measures for HIT: Three Examples
10. Computerized Provider Order Entry (CPOE) Example
- Clarify the primary question
- Does CPOE improve quality of care?
- Competing questions
- Does CPOE save money?
- What are the barriers to physician acceptance?
- Does CPOE introduce new errors?
11. CPOE: How Can It Affect Quality?
- Think about intermediate processes
- Patient data is presented to the ordering physician
- ADE alerts may be triggered and presented at the point of care (which alerts?)
- Guideline reminders may be triggered and presented at the point of care (which guidelines?)
- Medication order is entered
- Medication order is executed by pharmacy
- Medication order is executed by nursing staff
12. Does CPOE Improve Quality of Care?
13. Evaluating CPOE's Impact on Quality
- Select Appropriate Methodology
- Can existing data be leveraged (e.g., ongoing QA activities)?
- Does a concurrent control exist?
- How will the data be analyzed?
14. Electronic Medical Records (EMR) Example
- Clarify the primary question
- What are the barriers and facilitators to effective EMR implementation?
- Competing questions
- Do EMRs save money?
- Do EMRs improve quality of care?
- Do EMRs introduce new errors?
15. EMR: Dissecting the EMR Implementation Process
- Identify stakeholders
- Providers, et al.
- Catalogue stakeholder interests and values
- Workflow efficiency
- Clarify stakeholder role in implementation
- Users of the system, clinical leaders, administrative leaders
- Clarify impact of implementation on clinical processes
- User interface optimization, workflow re-engineering
- Define implementation success criteria
- Provider buy-in, provider use and acceptance
16. EMR: Understanding the Barriers and Facilitators to Implementation
17. EMR: Understanding the Barriers and Facilitators to Implementation
- Select Appropriate Methodology
- Combination of quantitative and qualitative studies
- Example efficiency measures:
- Time-motion studies: How did the system affect provider efficiency?
- Attitude surveys: How did the system affect provider perception of efficiency?
- Semi-structured interviews: How did the implementation affect stakeholder workflow? Did that effect change over time, and why?
18. Local Health Information Infrastructure (Laboratory)
- Clarify the primary question
- Can LHIIs for labs generate a positive ROI?
- Competing questions
- Can LHIIs for labs improve quality of care?
- Which architecture is best suited for LHIIs for labs?
- How do LHIIs for labs affect provider and patient perception of the health care system?
19. LHII (Laboratory): Defining the ROI
- Specify intermediate processes
- Data is pulled from local laboratories
- Previous labs pulled
- Lab order entered
- Lab order transmitted
- Administrative handling
- Lab results reported
- Lab results recorded
- Data is pulled from primary provider
- Authorization and payment is coordinated with the payer
- Implementation of the LHII
20. LHII (Laboratory): Defining the ROI
- Identify associated measures
21. LHII (Laboratory): Evaluating the ROI
- Select Appropriate Methodology
- Does a concurrent control exist?
- Are there ongoing trends over time?
- How will the data be analyzed?
22. Selecting Outcome Measures: General Comments
- Generally want to pick 1-3 outcomes of primary interest
- If you choose more, you need a multiple-comparisons correction (e.g., Bonferroni; see the sketch after this list)
- Outcome must be sufficiently frequent to be detectable
- Rare events, such as adverse events due to errors, are particularly challenging
- Important enough to provoke interest
- Whether the study is positive or negative
- How would the results change policy (local or national)?
- Process vs. outcome
- Legitimate to measure process
- Outcome often takes too long
- In many situations the link between process and outcome is clear
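To make the Bonferroni correction concrete, here is a minimal Python sketch; the outcome names and p-values are hypothetical, chosen only for illustration.

```python
# Bonferroni: divide the overall alpha by the number of primary outcomes
p_values = {"ADE rate": 0.012, "guideline compliance": 0.030,
            "length of stay": 0.200}  # hypothetical results
alpha = 0.05
threshold = alpha / len(p_values)  # 0.05 / 3 ~= 0.0167
for outcome, p in p_values.items():
    verdict = "significant" if p < threshold else "not significant"
    print(f"{outcome}: p={p:.3f} -> {verdict} at corrected alpha {threshold:.4f}")
```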
23. Study Types
- Commonly used study types
- Before-and-after (time series) trials
- Randomized Controlled Trials
- Factorial Design
- Study design often influenced by implementation
plan
24. Time Series vs. Randomized Controlled Trials
- Before-and-after trials are common in informatics
- Concurrent randomization is hard
- Don't lose the opportunity to collect baseline data!
- Off-on-off trial design is possible
- But it may not be politically/ethically acceptable to turn off a highly used feature
- RCT preferable if feasible
- Eliminates the issue of secular trend
- Balances baseline confounding
25. Randomization Considerations
- Justifiable to have a control arm (usual care) as long as benefit has not already been demonstrated
- Want to choose a truly random variable
- Not day of the week
- Legitimate to stratify on baseline variables (e.g., education for patients, computer experience for providers); a toy sketch follows this list
- Minimal number of arms
- More arms, less power
- Strongest possible intervention
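As a toy example of stratifying on a baseline variable, this Python sketch block-randomizes providers within strata; the provider list and the "experienced" flag are hypothetical.

```python
import random
from collections import defaultdict

def stratified_assign(subjects, stratum_of, arms=("intervention", "control"),
                      seed=42):
    """Alternate arms within each stratum so they stay balanced on the
    stratification variable (e.g., provider computer experience)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for subj in subjects:
        strata[stratum_of(subj)].append(subj)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)  # random order, then alternate arms
        for i, subj in enumerate(members):
            assignment[subj["id"]] = arms[i % len(arms)]
    return assignment

# Hypothetical providers stratified by prior computer experience
providers = [{"id": i, "experienced": i % 3 == 0} for i in range(20)]
print(stratified_assign(providers, lambda s: s["experienced"]))
```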
26. Unit of Randomization
- Patients
- Physicians
- Practices/wards
27. Randomization Unit: How to Decide?
- Small units (patients) vs. large units (practices/wards)
- Contamination across randomization units
- If risk of contamination is significant, consider larger units
- Effect of contamination: can underestimate impact
- However, if you see a difference, the impact is present
- Randomization by patient generally undesirable
- Contamination
- Ethical concern
28. Randomization Schemes: Simple RCT
- Burn-in period
- Give target population time to get used to the new intervention
- Data not used in final analysis
29. Randomization Schemes: Factorial Design
- May be used to concurrently evaluate more than one intervention
- Assess interventions independently and in combination (a toy assignment sketch follows this list)
- Loss of statistical power
- Usually not practical for more than 2 interventions
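A toy sketch of a 2x2 factorial assignment, with hypothetical clinic IDs and hypothetical intervention names; each clinic lands in one of the four cells formed by the two interventions.

```python
import itertools
import random

# Hypothetical clinics and two hypothetical interventions
clinics = [f"clinic_{i}" for i in range(1, 9)]
cells = list(itertools.product((0, 1), repeat=2))  # (reminders, order_sets)

rng = random.Random(7)
rng.shuffle(clinics)
# Cycle through the four cells so each combination gets equal clinics
assignment = {c: cells[i % len(cells)] for i, c in enumerate(clinics)}
for clinic, (reminders, order_sets) in sorted(assignment.items()):
    print(clinic, "reminders" if reminders else "no reminders",
          "order sets" if order_sets else "no order sets")
```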
30. Randomization Schemes: Staggered Deployment
- Advantages
- Easier for user education and training
- Can fix IT problems up front
- Need to account for secular trend
- Time variable in regression analysis (sketched below)
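A minimal sketch, assuming patient-level data with a go-live flag and a calendar-month index: putting the time variable in the regression separates the intervention effect from the secular trend. The data here are simulated for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated staggered deployment: clinics go live in different months
rng = np.random.default_rng(0)
n = 500
month = rng.integers(0, 12, n)                       # calendar month index
live = (month >= rng.integers(3, 9, n)).astype(int)  # 1 if clinic is live
p = 1 / (1 + np.exp(-(-1.0 + 0.05 * month + 0.5 * live)))
df = pd.DataFrame({"outcome": rng.binomial(1, p), "live": live,
                   "month": month})

# 'month' absorbs the secular trend; 'live' is the adjusted effect
print(smf.logit("outcome ~ live + month", data=df).fit(disp=0).params)
```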
31. Randomization Schemes: Multiple Interventions
- Time-efficient design
- Every clinic gets something (keeps clinics and IRB happy)
- Watch out for cross-arm intervention contamination
32. Inherent Limitations of RCTs in Informatics
- Blinding is seldom possible
- Effect on documentation vs. clinical action
- People always question generalizability
- Success is highly implementation dependent
- Efficacy-effectiveness gap; "invented here" effect
33. Data Collection
- Electronic data abstraction
- Convenient and time-saving, but...
- Some chart review (selected) to get information not available electronically
- Get ready for nasty surprises
- Pilot your data collection protocol early
- And then pilot some more
34. Data Collection Issue: Baseline Differences
- Randomization schemes often lead to imbalance between intervention and control arms
- Need to collect baseline data and adjust for baseline differences
- The interaction term (Time × Allocation Arm) gives the intervention effect in the regression analysis (see the sketch below)
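A minimal sketch of the interaction-term analysis, with simulated data: 'arm' marks the allocation arm, 'post' marks the time period, and the arm-by-time interaction coefficient is the intervention effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated two-arm, before/after data with a baseline difference,
# a shared time trend, and a true intervention effect
rng = np.random.default_rng(1)
n = 800
arm = rng.integers(0, 2, n)    # 1 = intervention arm
post = rng.integers(0, 2, n)   # 1 = after go-live
p = 1 / (1 + np.exp(-(-1.2 + 0.3 * arm + 0.2 * post + 0.6 * arm * post)))
df = pd.DataFrame({"outcome": rng.binomial(1, p), "arm": arm, "post": post})

# The 'arm:post' coefficient isolates the intervention effect from
# baseline arm differences and the shared secular trend
fit = smf.logit("outcome ~ arm * post", data=df).fit(disp=0)
print(fit.params["arm:post"])
```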
35. Data Collection Issue: Completeness of Follow-up
- The higher the better
- Over 90%
- 80-90%
- Less than 80%
- Intention-to-treat analysis
- In an RCT, should analyze outcomes according to
the original randomization assignment
36. A Common Analytical Issue: The Clustering Effect
- Occurs when your observations are not independent
- Example: each physician treats multiple patients
[Diagram: physicians in the intervention and control groups each treat multiple patients, and the outcome is assessed at the patient level]
37. Options for Dealing with the Clustering Effect
- Analyze at the level of the clinician
- Example: analyze the % of each MD's patients in compliance with the guideline, and make the MD the unit of analysis
- Huge drop in statistical power
- Not recommended
- Generalized Estimating Equations (a toy sketch follows this list)
- PROC GENMOD in SAS, or PROC RLOGIST in SUDAAN
- Allows you to randomize at one level (e.g., physician) and then do the analysis at another (e.g., patient)
- Accounts for correlation of behaviors within a single physician (i.e., adjusts for the fact that observations across patients are NOT independent)
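The SAS and SUDAAN procedures named above do this; as an alternative sketch, here is the same idea in Python with statsmodels, using simulated physicians who each treat multiple patients.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate 40 physicians randomized by physician, 20 patients each;
# the physician-level effect makes patients within an MD correlated
rng = np.random.default_rng(2)
rows = []
for md in range(40):
    arm = md % 2                    # physician-level randomization
    md_effect = rng.normal(0, 0.8)  # source of within-MD clustering
    for _ in range(20):
        p = 1 / (1 + np.exp(-(-0.5 + 0.6 * arm + md_effect)))
        rows.append({"physician": md, "arm": arm,
                     "compliant": rng.binomial(1, p)})
df = pd.DataFrame(rows)

# GEE: patient-level analysis with an exchangeable working correlation
# that accounts for clustering within each physician
model = smf.gee("compliant ~ arm", groups="physician", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```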
38. A Word About Surveys
- Surveys of user beliefs, attitudes, and behaviors
- Response rate drives responder bias; aim for a response rate > 50-60%
- Keep the survey concise
- Pilot the survey for readability and clarity
- Formal validation is needed if you plan to develop a scale
39. Looking at Usage Data
- Great way to tell how well the intervention is going
- Target your trouble-shooting efforts
- In terms of evaluating HIT:
- Correlate usage with implementation/training strategy
- Correlate usage with stakeholder characteristics
- Correlate usage with improved outcomes (a minimal sketch follows)
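As a minimal illustration of correlating usage with an outcome, the per-clinic figures below are hypothetical.

```python
import pandas as pd

# Hypothetical per-clinic usage rates and guideline-compliance outcomes
df = pd.DataFrame({
    "clinic": ["A", "B", "C", "D", "E"],
    "usage_rate": [0.85, 0.40, 0.65, 0.20, 0.75],
    "compliance": [0.72, 0.51, 0.63, 0.45, 0.70],
})
# Pearson correlation between system usage and the outcome
print(df["usage_rate"].corr(df["compliance"]))
```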
40. Studies on Workflow and Usability
- How to make observations?
- Direct observations
- Stimulated observations
- Random paging method
- Subjects must be motivated and cooperative
- Usability Lab
- What to look for?
- Time to accomplish specific tasks
- Need to pre-classify activities
- Handheld/Tablet PC tools may be very helpful
- Workflow analysis
- Asking users to think aloud
- Unintended consequences of HIT
41. Qualitative Methodologies
- Major techniques
- Direct observations
- Semi-structured interviews
- Focus groups
- Adds richness to the evaluation
- Explains successes and failures; generates lessons learned
- Captures the unexpected
- Great for forming hypotheses
- People love to hear stories
- Data analysis
- Goal is to make sense of your observations
- Iterative and interactive
42. Cost-Benefit Analysis
- Cost Data
- Generally available
- Caveat: allocation of indirect costs
- Financial Benefit Data
- Revenue Enhancement
- Cost Avoidance
- Benefit Allocation
- Benefits may accrue to multiple parties
- Are benefits realizable (e.g., labor savings)?
- Calculation of benefits to external parties may be of interest, even if it does not affect the ROI
43. Cost-Benefit Analysis
- Activity-Based Costing Example
- Simply put, a method for assigning costs to particular activities
- An alternate method of assigning indirect costs to the project
- Also may serve as a framework for capturing cost savings (a toy sketch follows)
http://www.pitt.edu/roztocki/abc/abctutor/
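A toy activity-based costing sketch, with a hypothetical indirect cost pool and hypothetical activities: each activity is charged its share of indirect cost in proportion to the cost driver (staff hours) it consumes.

```python
# Hypothetical indirect cost pool and activity cost driver (staff hours)
indirect_cost = 120_000.00
driver_hours = {"lab order entry": 800,
                "result reporting": 500,
                "interface support": 700}

total_hours = sum(driver_hours.values())
for activity, hours in driver_hours.items():
    allocated = indirect_cost * hours / total_hours
    print(f"{activity}: ${allocated:,.0f}")
```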
44. Concluding Remarks
- Don't bite off more than you can chew
- Pick a few study outcomes and study them well.
- It's a practical world
- Balancing operational and research needs is always a challenge.
- Life (data collection) is like a box of chocolates
- You don't know what you're going to get until you look, so look early!
45. Thank You
- Eric Poon, MD MPH
- Email epoon@partners.org
- Davis Bu, MD MA
- Email dbu@partners.org
46. Give Us Feedback!
- We are eager to hear your feedback!
- Go to http://extranet.ahrq.gov/rc
- Login with username and password
- Follow the links to provide feedback - thanks!
- Want to hear this teleconference again?
- Dial 1-800-486-4195 to replay until 5/4/05