Frameworks for Evaluating Healthcare Information Systems - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Frameworks for Evaluating Healthcare Information Systems


1
Frameworks for Evaluating Healthcare Information
Systems
  • AMIA Evaluation Working Group
  • October 24, 2005
  • Paul Gorman

2
Evaluation
  • "to judge or determine the significance, worth,
    or quality" (Webster's Dictionary)
  • Different meanings
  • Across the spectrum of health informatics
    research, practice, and education
  • Within different disciplinary traditions
  • To different HIT stakeholders

3
Why evaluate? (v1.0)
  • "Although I am happy with my research focus and
    the work I have done, how can I design and carry
    out a practical evaluation that proves the value
    of my contribution?"
  • Ted Shortliffe
  • Foreword to Friedman & Wyatt

4
Why evaluate? (v2.0)
  • How many believe we'll be better off
  • When all clinicians use EHRs?
  • When all clinicians have computer DSS?
  • These interventions are obviously beneficial
  • Editorials on reminders, decision support
  • Evaluation is an academic exercise that wastes
    resources and delays implementation
  • We know what works: Just Do It!

5
Lessons from clinical practice: Plausible is not
enough
  • Obvious Benefit (but actually harmful)
  • 1950s Oxygen
  • 1980s Antiarrhythmic
  • 1990s HRT
  • 2000s Cox2
  • Obvious Benefit (not)
  • 1950s LIMA ligation
  • 1990s Knee arthroscopy
  • Obvious Harm (but actually beneficial)
  • Beta blockers in heart failure
  • No Plausible Benefit (but beneficial)
  • Gold for RA (the "tomato effect")

6
Two-Edged Sword
7
  • "With each new discovery and invention, the
    virtues are always oversold, the drawbacks
    understated. But everyone knows science and
    technology are inevitably a mixed blessing."
  • Michael Crichton
  • Science, 2000

8
Evaluate or Implement? Yes
  • Complex social, technical, economic system
  • Multiple players, perspectives, goals
  • Pure and applied informatics
  • Spectrum of projects
  • component development
  • full-fledged system impact
  • Multidisciplinary and integrative discipline
  • Goals remote from system, confounders
  • No single method or approach

9
Integrative Discipline - Diverse Paradigms and
Methods (adapted from Jochen Moehr, U Victoria)
  • Mathematics
  • Physiology
  • Anthropology
  • Computer Science
  • Engineering
  • Management
  • Library Science
10
Choosing Methods
  • "the question being asked determines the
    appropriate research architecture, strategy, and
    tactics to be used,
  • not tradition, authority, experts, paradigms,
    or schools of thought."
  • Sackett & Wennberg
  • BMJ 1998

11
Frameworks
  • Why is evaluation in HIT so difficult?
  • How can we make sense of the studies and their
    findings?

12
1. Purpose of Evaluation (Chelimsky)
  • Evaluation for Development
  • Provide help to improve systems
  • e.g., O'Connor NE CV Surg consortium
  • Evaluation for Knowledge
  • Advance understanding in a field
  • e.g., Elstein's diagnostic reasoning work
  • Evaluation for Accountability
  • Measure results or efficiency
  • e.g., Overhage's RCT of corollary orders

13
Comparison by Evaluation Purpose
14
Equanimity
  • "mental calmness, composure, and evenness of
    temper, esp. in a difficult situation: she
    accepted both the good and the bad with
    equanimity"
  • - Oxford American Dictionary

15
2. Perspective of Evaluation (Grémy): Whose
outcomes to count? Whose outcomes count?
16
Good Solution - Whose perspective? (Glucose:
Getting Better or Worse?)
17
3. Technology Assessment Phases (Littenberg)
  • Social Outcomes: Is TPA cost-effective compared to SK?
  • Patient Outcomes: Does thrombolysis improve survival?
  • Surrogate Outcomes: Does thrombolysis open coronaries?
  • Technical Feasibility: Can thrombolysis be done safely?
  • Biologic Plausibility: Could thrombolysis help patients w/ MI?

18
Technology Assessment Phases Framework Modified
for Informatics
  • Social Outcomes: Can we afford the system? (CEA, ROI; see the sketch below)
  • Patient Outcomes: Do patients benefit from the system?
  • Surrogate Outcomes: Does the system improve structure or process?
  • Technical Feasibility: Can the system be implemented?
  • Scientific Plausibility: Does the system design make sense?
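The affordability questions at the top of this hierarchy (CEA, ROI) reduce to simple arithmetic once costs and effects have been estimated. The following is a minimal Python sketch with entirely invented figures for a hypothetical clinical information system; it only illustrates the two summary measures named on the slide, not any analysis from the talk.

```python
def roi(financial_return: float, cost: float) -> float:
    """Return on investment: net financial gain as a fraction of cost."""
    return (financial_return - cost) / cost

def icer(incremental_cost: float, incremental_effect: float) -> float:
    """Incremental cost-effectiveness ratio, e.g. dollars per QALY gained."""
    return incremental_cost / incremental_effect

# Hypothetical figures for a clinical information system vs. current practice:
system_cost = 2_000_000        # purchase + implementation (USD, invented)
financial_savings = 1_600_000  # avoided adverse events, staff time (invented)
qalys_gained = 120             # estimated additional health benefit (invented)

net_cost = system_cost - financial_savings
print(f"ROI:  {roi(financial_savings, system_cost):.0%}")      # -20%: does not pay for itself
print(f"ICER: ${icer(net_cost, qalys_gained):,.0f} per QALY")  # about $3,333 per QALY
```

On these invented numbers the system loses money in purely financial terms yet buys additional health at a low cost per QALY; which measure best answers "can we afford it?" depends on whose perspective is taken, which is exactly the point of the preceding framework.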

19
4. AMIA Fall 1991 Theme Evaluation: Information
System Validation (Eddy)
20
Eddy Information System Validation: Example -
Hypertension Advisor System
21
Combined Evaluation Framework: Clinician
information-seeking system
22
Quality of Evidence: EBM Hierarchy (Oxford Centre)
23
Applicability of Clinical Epidemiology Approach
  • "Whenever aeroplane manufacturers wanted to
    change a design feature . . . they would make a
    new batch of planes, half with the feature and
    half without, taking care not to let the pilot
    know which features were present"
  • McManus, 1996
  • in Heathfield, Evaluation of information
    technology in healthcare

24
System Development Stage (Stead et al., 1994)
25
NASA Technology Readiness
26
It Takes Time...
27
Science never proves anything
  • - Gregory Bateson

28
Type I Error
  • Finding a difference where none exists
  • Rejecting the null hypothesis when it is true
  • Probability is alpha, customarily set at 0.05
  • Examples
  • multiple post-hoc subgroup analyses
  • early termination of controlled trials
  • data-dredging in search of a p < 0.05
  • Remedies (illustrated in the sketch below)
  • Bonferroni adjustment
  • Tukey's Honestly Significant Difference
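The multiple-comparisons problem behind these examples is easy to demonstrate. The following minimal Python sketch (assuming NumPy and SciPy; not part of the original slides) simulates repeated studies in which 20 post-hoc subgroup comparisons are run on data generated under the null hypothesis, and reports how often at least one comparison reaches p < 0.05 with and without a Bonferroni adjustment.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
alpha, n_subgroups, n_trials = 0.05, 20, 1000

naive_hits = 0
bonferroni_hits = 0
for _ in range(n_trials):
    # 20 post-hoc subgroup comparisons; both arms are drawn from the same
    # distribution, so every "significant" p-value is a false positive.
    pvals = [ttest_ind(rng.normal(size=30), rng.normal(size=30)).pvalue
             for _ in range(n_subgroups)]
    naive_hits += any(p < alpha for p in pvals)
    bonferroni_hits += any(p < alpha / n_subgroups for p in pvals)

print(f"family-wise false-positive rate, unadjusted: {naive_hits / n_trials:.2f}")
print(f"family-wise false-positive rate, Bonferroni: {bonferroni_hits / n_trials:.2f}")
```

With 20 independent looks at null data, the chance of at least one spurious "significant" result is about 1 - 0.95^20 ≈ 0.64; dividing alpha by the number of comparisons brings the family-wise rate back to roughly 0.05.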

29
Type II Error
  • Failing to find a true difference
  • Failure to reject the null hypothesis when it is false
  • Probability is beta; ideally known, typically ≤ 20%
  • Alpha gets all the attention, beta often none
  • Examples
  • countless RCTs too small to find a difference
  • Remedies (see the power sketch below)
  • improve measurement (reduce variance)
  • enlarge sample size (increase precision)
  • meta-analysis (combine results of trials)
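How easily an undersized RCT commits a Type II error can be seen from the standard two-sample power calculation. The sketch below (assuming SciPy; the effect sizes are illustrative and not taken from the slides) uses the normal-approximation formula n = 2((z_{1-alpha/2} + z_{1-beta}) / d)^2 per arm, with alpha = 0.05 and 80% power.

```python
from math import ceil
from scipy.stats import norm

def n_per_arm(effect_size: float, alpha: float = 0.05, beta: float = 0.20) -> int:
    """Patients per arm needed to detect a standardized difference in means
    (Cohen's d), two-sided two-sample comparison, normal approximation."""
    z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = norm.ppf(1 - beta)         # ~0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Illustrative effect sizes (large, medium, small):
for d in (0.8, 0.5, 0.2):
    print(f"d = {d}: about {n_per_arm(d)} patients per arm")
# Roughly 25, 63, and 393 per arm: a trial sized to detect a large effect
# will routinely miss a smaller but still real one (a Type II error).
```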

30
Type III Error
  • Getting the right answer to the wrong question
  • Reasons
  • Incorrect assumptions (implicit models)
  • DDSS assumption - demise of the Oracle
  • Information needs assumption - types of info
  • Availability of data (looking under the
    streetlight)
  • Adherence to method
  • (when all you've got is a hammer)
  • Over-refinement of methods

31
Measurement
  • "A person who has one thermometer knows the
    temperature, but someone who has two is never
    sure."
  • (Science 2003;299:1641)

32
Equanimity
  • "mental calmness, composure, and evenness of
    temper, esp. in a difficult situation: she
    accepted both the good and the bad with
    equanimity"
  • - Oxford American Dictionary

33
  • "The history of discourses about appropriate
    forms of computerization is littered with utopian
    visions that do not effectively engage the
    complexities of the social worlds of the likely
    users of new technologies"
  • Rob Kling,
  • JASIST 2000

34
Program Evaluation
  • (1) assesses effectiveness of an ongoing program
    in achieving its objectives,
  • (2) relies on standards of project design to
    distinguish a program's effects from those of
    other forces, and
  • (3) aims at program improvement through a
    modification of current operations.

35
Frameworks for Evaluating HIT
  • Definitions: Operationalize Terms (week 1)
  • Frameworks for Evaluation
  • Purpose (Chelimsky)
  • Perspective
  • Level of Evaluation (Littenberg and Eddy)
  • Level of Evidence (Oxford/EBM)
  • Stage of Development (Stead et al., NASA)
  • Methodology and Error: Types I, II, III
  • Discussion: Application