Transcript and Presenter's Notes

Title: Information Management: Analysis & Communication


1
Welcome to the MHQP HealthForce MN Quality Brownbag
Room: Monthly Noon Brownbag, Fourth Thursday of
Every Month
Date  Subject
Mar 27  Intro to Webcast Series: Quality as a healthcare core competency
Apr 24  Management & Leadership: Strategic
May 22  Management & Leadership: Operational
Jun 26  Patient Safety: Strategic
Jul 24  Patient Safety: Operational
Aug 28  Information Management: Data & Measurement
Sep 25  Information Management: Analysis & Communication
Oct 23  Performance Measurement & Improvement: Education/Training
Nov 20 (3rd Thurs)  Performance Measurement & Improvement: Planning, Implementation & Project Management
Dec 18 (3rd Thurs)  Performance Measurement & Improvement: Evaluation/Integration
Jan 22  Dealing with change
Feb 26  About the CPHQ: Test-taking tips & practice questions
Mar 26  tbd
Mar ??  Weekend CPHQ Prep (date tbd)
Sep 25  Information Management: Analysis & Communication
  • ANALYSIS
  • Using comparative data
  • Interpreting benchmarking data
  • Interpreting incident/event reports
  • Interpreting outcome data
  • Intuition/stories vs objective/facts
  • COMMUNICATING RESULTS
  • Event/individual patient issues
  • Performance Improvement feedback processes
  • Reports vs Scorecards vs OLAP vs mining
  • Right information for right audiences
  • Accrediting bodies, boards, leaders, directors, managers

Contact: Skip Valusek, MHQP Education Chair,
skipvalusek@comcast.net
Slides are posted at
http://www.healthforceminnesota.org/pages/Programs/courses.html
2
Register your Attendance
  • Hopefully you provided your name & organization
    when you signed in.
  • If so:
  • Just say "Hi" in the Chat Pod and we'll capture
    your name and organization in the log.
  • If not:
  • Identify yourself and your organization in the
    Chat Pod to the left of your screen.
  • If more than one person is attending on your
    sign-in, tell us how many by saying "Hi" and the
    number of attendees.

3
Poll: Who is Attending this Session?
  • Rural / Outstate?
  • Metropolitan area?
  • Organization that has (or serves) both?

4
Poll: Who is attending? (Organization type)
  • Healthcare system
  • Hospital
  • Clinic or Clinic System
  • Long term care
  • Healthplan
  • Homecare / Hospice
  • A Quality Support Organization
  • Other? (Identify other in the Chat Pod)

5
Poll: What do you hope to gain by participating?
  1. I am a CPHQ and want to obtain CEUs for
    recertification. (Note: this is not guaranteed at
    this time; we are still working on it.)
  2. I am a healthcare quality professional and am
    interested in additional education.
  3. I am a healthcare professional interested in
    developing quality skills as a core competency.
  4. I am a healthcare professional interested in
    learning more about healthcare quality.

6
Agenda
  • ANALYSIS
  • Using comparative data
  • Interpreting benchmarking data
  • Interpreting incident/event reports
  • Interpreting outcome data
  • Intuition/stories vs objective/facts
  • COMMUNICATING RESULTS
  • Event/individual patient issues
  • Performance Improvement feedback processes
  • Reports vs Scorecards vs OLAP vs mining
  • Right information for right audiences
  • Accrediting bodies, boards, leaders, directors, managers

7
Using comparative data as Decision Support
(clinical / financial / value analytics)
  • Performance & outcomes measurement decision
    support systems can provide focus for determining
    the quality of healthcare services provided
  • Analyzing the data & information generated by an
    effective performance & outcomes measurement
    system helps identify areas for improving quality
  • Analysis emphasis:
  • change over time
  • use internal and external comparison measures

25
8
Benchmarking
  • Comparison of an organization's / department's /
    individual's results against a reference point
  • The ideal reference point is a demonstrated best
    practice
  • Healthcare quality professionals assist the
    organization & practitioners by interpreting
    benchmarking results
  • Enables the organization to set a target or goal
    for PI activities
  • Data sources:
  • Government data
  • Large healthcare alliances
  • Peer review organizations (e.g., American Heart
    Association, STS)
  • For-profit database companies
    (e.g., Solucient/Healthgrades, Premier, ACS/Midas)

28
9
Interpreting benchmarking data
  • Ask the right questions:
  • What are we doing?
  • How are we doing it?
  • What is the measure of how well we do it?
  • Why are we looking for improvement?
  • An essential part of clinical pathway development:
    who has the best practice, based on what
    measure(s)?

29
10
Retrospective/Analytic Decision Support Systems
  • Deal with strategic planning functions:
  • Strategic planning & marketing
  • Resource allocation
  • Operational evaluation & monitoring
  • Product & services evaluation
  • Medical management / Clinical analytics

35-36
11
The Infrastructure Required for Clinical Decision
Support
[Diagram: the EHR and other source systems feed an
Extract-Transform-Load (ETL) process that supplies
operational reports (e.g., patient lists) and
real-time decision support.]
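As an illustration only, a minimal ETL sketch in Python: the table and column names and the rows are hypothetical, not taken from the presentation.

    # Minimal ETL sketch: pull rows from a "source system" table, standardize
    # them, and load them into a reporting table that patient lists and
    # decision support could query. All names and data are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE source_encounters (patient_id, admit_date, diagnosis_code)")
    conn.executemany(
        "INSERT INTO source_encounters VALUES (?, ?, ?)",
        [(1, "2008-09-01", " 250.00 "), (2, "2008-09-02", None), (3, "2008-09-03", "401.9")],
    )

    # Extract: read the raw rows from the source system
    rows = conn.execute(
        "SELECT patient_id, admit_date, diagnosis_code FROM source_encounters"
    ).fetchall()

    # Transform: drop incomplete records and standardize the code format
    clean = [(pid, date, code.strip()) for pid, date, code in rows if code]

    # Load: write the cleaned rows into the reporting table
    conn.execute("CREATE TABLE report_encounters (patient_id, admit_date, diagnosis_code)")
    conn.executemany("INSERT INTO report_encounters VALUES (?, ?, ?)", clean)
    print(conn.execute("SELECT * FROM report_encounters").fetchall())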
12
Monitoring performance (P-D-S/C-A)
  • Have proposed changes actually been implemented?
    To what extent?
  • How could compliance with changes be enhanced?
  • What effect are changes having on patient
    outcomes? Are these desirable effects?
  • Should changes be modified and then tested
    further, tested longer, or ended?

13
Interpreting Incident/Event Reports
  • Is the number of reports increasing?
  • Is this because there are more cases, or because
    your culture is becoming blame-resistant/just,
    resulting in staff comfort with the process?
  • Is the harm level constant or decreasing?
  • Do you analyze:
  • Structured data fields in your event reports?
  • Patterns in the content analysis of stories?
  • Both types of data?

14
Patient Safety Poll
  • Our safety reporting is increasing with harm
    level constant or decreasing.
  • Our safety reporting is constant with no change
    in harm
  • Our safety reporting is constant with harm
    decreasing
  • Our safety reporting is decreasing with harm
    increasing.
  • Other (identify in chat pod to your left).

15
Interpreting outcome data
16
Statistical Analysis and Interpretation of Findings
  • Measurement tools/instruments are devices used to
    obtain & record data
  • Reliability: the extent to which an instrument
    yields the same results on repeated trials (e.g.,
    a scale)
  • Reliability coefficient:
  • Stability of an instrument (> .70)
  • Test/retest
  • Split-half
  • Interrater reliability:
  • two raters assign the same rating
  • reliability is precision; validity is accuracy

43-44
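A minimal sketch with invented scores (not from the presentation): test-retest reliability estimated as the Pearson correlation between two administrations, and interrater reliability approximated by simple percent agreement (a chance-corrected statistic such as kappa would be the more rigorous choice).

    import numpy as np

    # Test-retest: the same instrument administered twice to the same people
    test   = np.array([12, 15,  9, 18, 14, 11, 16, 13])
    retest = np.array([13, 14, 10, 17, 15, 11, 16, 12])
    reliability = np.corrcoef(test, retest)[0, 1]   # stability; want > .70

    # Interrater: do two raters assign the same rating?
    rater_a = ["pass", "fail", "pass", "pass", "fail", "pass"]
    rater_b = ["pass", "fail", "pass", "fail", "fail", "pass"]
    agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

    print(round(reliability, 2), round(agreement, 2))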
17
Validity of data
  • Validity: the degree to which an instrument
    accurately measures what it's intended to measure
  • Content (face):
  • adequately represents the universe of content
    (e.g., rehab specialists evaluated the FIM)
  • Construct:
  • measures the theoretical construct/trait it is
    designed to measure (e.g., risk adjustment scales
    predict the probability of outcomes such as
    morbidity and mortality)
  • Criterion-related:
  • the score on the instrument is related to the
    criterion behavior the instrument is supposed to
    measure (e.g., Multiple Affect Adjective
    Checklist: anxiety, hostility, depression)
  • Concurrent:
  • the criterion variable is obtained at the same
    time as the measurement
  • Predictive:
  • the criterion measure is obtained at a future time

44-45
18
Statistical Techniques
  • Measures of central tendency describe where the
    scores or values of a distribution cluster:
    central (middle), tendency (trend)
  • Mean (average): sum of all scores or values
    divided by the total number of scores
  • Most commonly used
  • Most sensitive to extreme scores
  • Use with interval, ratio, or ordinal data with a
    normal distribution

45
19
Statistical Techniques
  • Measures of central tendency
  • Median: the measure that corresponds to the middle
    score; the point on a numerical scale above which
    and below which 50% of the data falls
  • Arrange the values in rank order; if there is an
    odd number of values, count up or down to the
    middle value; if the total number of values is
    even, compute the mean of the two middle values
    (a short worked sketch follows below)

45-46
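A minimal worked sketch of mean vs. median on invented scores (not from the presentation), covering both the odd-count and even-count cases:

    import statistics

    scores_odd  = [3, 7, 8, 12, 41]        # odd count: the middle value is the median
    scores_even = [3, 7, 8, 12, 15, 41]    # even count: mean of the two middle values

    print(statistics.mean(scores_odd))     # 14.2 -- pulled upward by the extreme value 41
    print(statistics.median(scores_odd))   # 8
    print(statistics.median(scores_even))  # 10.0 -- (8 + 12) / 2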
20
Measures of Variability: dispersion; how measures
spread out; the degree to which values differ
  • Range: the difference between the highest and
    lowest values in a distribution of scores
  • Reported as the two values, not a distance
  • A quick estimate of variability, but unstable and
    sensitive to extreme values
  • Standard deviation: the average of deviations from
    the mean
  • The most frequently used statistic for measuring
    the degree of variability
  • Standard:
  • the average spread of scores around the mean
  • Deviation:
  • how much each score is scattered from the mean

46
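A minimal sketch on invented values (not from the presentation), reporting the range as two values and the sample standard deviation:

    import statistics

    values = [4, 6, 7, 7, 8, 10]
    value_range = (min(values), max(values))   # reported as values, not a distance
    sd = statistics.stdev(values)              # sample standard deviation: spread around the mean
    print(value_range, statistics.mean(values), round(sd, 2))   # (4, 10) 7 2.0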
21
Measures of Variability
  • Greater spread of the distribution:
  • greater dispersion or variability from the mean
  • larger standard deviation value
  • heterogeneous population
  • Values cluster around the mean:
  • smaller variability or deviation
  • smaller standard deviation
  • homogeneous population
  • Standard bell curve:
  • All scores are taken into consideration
  • Use with normally distributed interval or ratio
    data

47
22
Bell Curve
23
Parametric Tests
  • t test: used to analyze the difference between two
    means
  • When determining whether the difference between
    two group means is significant, a distinction must
    be made between independent groups and paired
    (dependent) groups

48-49
24
t Test
  • Example: test the effects of an educational
    program
  • 10 of 20 people randomly assigned to the
    experimental group receive videos, discussion, and
    lectures on quality tools
  • The remaining 10 form the control group: no
    special instruction
  • Both groups are administered a scale measuring
    attitudes toward using the tool: two-sample
    independent t test
  • Train all 20 and give a pre & post test:
    paired-sample t test (sketched in code below)

49
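A minimal sketch of the two designs using scipy, with invented attitude scores (the numbers are placeholders, not data from the presentation):

    import numpy as np
    from scipy import stats

    # Two-sample independent t test: experimental group vs. control group
    experimental = np.array([72, 85, 78, 90, 66, 81, 88, 74, 79, 83])
    control      = np.array([70, 65, 72, 68, 75, 64, 71, 69, 73, 66])
    t_ind, p_ind = stats.ttest_ind(experimental, control)

    # Paired-sample t test: the same people measured before and after training
    pre  = np.array([60, 62, 58, 65, 70, 55, 63, 61, 59, 64])
    post = np.array([68, 70, 60, 72, 74, 58, 69, 66, 65, 71])
    t_pair, p_pair = stats.ttest_rel(pre, post)

    print(round(t_ind, 2), round(p_ind, 4))
    print(round(t_pair, 2), round(p_pair, 4))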
25
Parametric Tests
  • Regression analysis: based on statistical
    correlations and associations among variables
  • Correlation evaluates the usefulness of the
    prediction equation
  • with a perfect correlation (r = 1 or r = -1), you
    can make a perfect prediction
  • the higher the correlation, the more accurate the
    prediction
  • Simple linear regression:
  • one variable (x) is used to predict a second
    variable (y) (e.g., weight from height)
  • Multiple regression analysis:
  • estimates the effects of 2 or more independent
    variables (x) on a dependent measure (y)

49
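A minimal simple-linear-regression sketch with invented height/weight pairs (placeholders, not data from the presentation):

    import numpy as np

    height = np.array([150, 160, 165, 170, 175, 180, 185])   # x, in cm
    weight = np.array([ 52,  60,  63,  68,  74,  80,  85])   # y, in kg

    slope, intercept = np.polyfit(height, weight, 1)   # least-squares fit of y = slope*x + intercept
    r = np.corrcoef(height, weight)[0, 1]              # correlation coefficient

    predicted_172 = slope * 172 + intercept            # predicted weight for a height of 172 cm
    print(round(slope, 2), round(intercept, 1), round(r, 3), round(predicted_172, 1))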
26
Nonparametric Tests
  • Chi-square: measures the statistical significance
    of a difference in proportions; the most commonly
    reported statistical test in the medical
    literature
  • QI data is counted, not measured; you can't
    calculate averages (of gender, for example) but
    you can describe ratios of counts (2x as many men
    as women in the clinic) or proportions (50% male,
    75% female)
  • The easiest statistical test to calculate manually

49
27
Example of Chi-Square
  • 15 of 30 men (50%) and 10 of 40 women (25%) failed
    appointments
  • Relative risk (RR) = 0.50 / 0.25 = 2: men are
    twice as likely to fail appointments. Could this
    have happened by chance?
  • The null hypothesis is that men and women fail to
    show at the same rate, or RR = 1
  • Chi-square indicates the likelihood of observing a
    two-fold difference in failed appointments by
    chance
  • A chi-square value of 5.84 corresponds to a
    significance (p) value of < .02 (fewer than 2 out
    of 100): a 2% probability that the difference is
    due to chance (see the sketch below)

50
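A minimal sketch running the chi-square test on the 2x2 table implied above (men: 15 failed / 15 kept; women: 10 failed / 30 kept). Note that the statistic scipy reports depends on whether the Yates continuity correction is applied and may not match the value quoted on the slide exactly:

    import numpy as np
    from scipy.stats import chi2_contingency

    table = np.array([[15, 15],    # men:   failed, kept
                      [10, 30]])   # women: failed, kept

    chi2_stat, p_value, dof, expected = chi2_contingency(table, correction=False)
    print(round(chi2_stat, 2), round(p_value, 3), dof)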
28
Confidence Intervals
  • Confidence interval: provides a range of possible
    values around a sample estimate (the best guess
    about the true value)
  • We observed that men are twice as likely as women
    to miss appointments
  • The 95% CI around the RR of 2 is 1.27 to 3.13:
    there is 95% certainty that men are between 1.27
    and 3.13 times more likely to miss an appointment;
    the 90% CI is 1.44 to 2.77

50
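As an illustration of how such an interval can be computed, a sketch of one common approximation (the log-transform method for a relative risk). This is an assumed method: the interval it produces for these counts will not necessarily reproduce the figures quoted above, which depend on the method and rounding used in the source example.

    import math

    def rr_confidence_interval(a, n1, c, n2, z=1.96):
        """Relative risk (a/n1 vs c/n2) with a log-transform confidence interval."""
        rr = (a / n1) / (c / n2)
        se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
        lower = math.exp(math.log(rr) - z * se_log_rr)
        upper = math.exp(math.log(rr) + z * se_log_rr)
        return rr, lower, upper

    # Men: 15 of 30 missed; women: 10 of 40 missed. z = 1.96 for 95%, 1.645 for 90%.
    print(rr_confidence_interval(15, 30, 10, 40))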
29
Level of Significance
  • The level of significance (p) gives the
    probability of observing a difference as large as
    the one found in the study when there is no true
    difference (i.e., the null hypothesis is true)
  • Historically, when p values are < .05, results are
    considered statistically significant
  • The p value for the missed-appointments example is
    < .02

50-51
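A quick check of the link between the chi-square statistic and its p value: the upper-tail probability of 5.84 on 1 degree of freedom is about .016, i.e., below the .02 level cited above.

    from scipy.stats import chi2

    p = chi2.sf(5.84, df=1)   # survival function = 1 - CDF = upper-tail probability
    print(round(p, 3))        # ~0.016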
30
Poll
  • Considering the accuracy and validity of current
    healthcare operational PI-type data (i.e.,
    non-clinical-research data):
  • I still like the idea of using statistical data
    analysis to track improvement
  • Where possible, it might be better to track
    changes in the distributions of data (i.e., are we
    seeing a shift in the answers?)

31
Evaluating the data/reporting/analysis components of
software proposals
  • Support for accreditation requirements
  • Displays data graphically
  • Drill-down analysis
  • Data mining and reporting/statistical analysis
  • Multiple simultaneous user access
  • Open operating system (runs on a variety of
    different hardware platforms)
  • Networking capabilities
  • Flexibility
  • Access and manage reports via an intranet web site

36
32
Intuition/Stories vs Objective/Facts
  • The emphasis today is on data/facts
  • What is the value of stories and intuition?
  • Stories allow you to do content analysis without
    pre-defined categories for data
  • Good for deeper insight into an issue and possible
    solutions
  • Bad for counting
  • Help you find patterns that aren't in the current
    categories
  • Intuition
  • Allows you to act quickly and ask questions later
  • Is a huge component of the timing aspect of
    patient safety

33
Agenda
  • ANALYSIS
  • Using comparative data
  • Interpreting benchmarking data
  • Interpreting incident/event reports
  • Interpreting outcome data
  • Intuition/stories vs objective/facts
  • COMMUNICATING RESULTS
  • Event/individual patient issues
  • Performance Improvement feedback processes
  • Reports vs Scorecards vs OLAP vs mining
  • Right information for right audiences
  • Accrediting bodies, boards, leaders, directors, managers

34
Communicating results
  • Barriers/factors for the interpretation and
    utilization of information:
  • Human: fear of data, resentment of external data,
    unrealistic expectations, lack of training
  • Statistical: flawed data, untimely data, poorly
    displayed data
  • Organizational: data overload, poor retrieval and
    display systems, lack of resources

35
Communication: Event/Individual Patient Issues;
Intuition/Stories vs Objective/Facts
  • Stories reinforce the "why" of data-driven
    improvements and help us find better solutions.
  • How many of you include stories/content analysis
    in your database?

36
Poll
  • How many of you include stories/content analysis
    in your mix of quality data?
  • Yes:
  • stories are an important part of our data and
    they are integrated into our reporting and
    analysis
  • Somewhat:
  • we occasionally use stories as a trigger for
    analysis and reinforcement of plans
  • Infrequently or not at all:
  • we haven't been successful doing the content
    analysis and pattern detection required for
    effective integration of stories

37
Performance Improvement Feedback Processes:
Reporting
  • Reporting, analysis, and interpretation make data
    meaningful
  • Report and analyze regularly
  • Validate accurate data collection
  • Display in an easily understood format
  • Brief summary (drillable to segmentation)
  • Analysis of variances; identification/explanation
    of unexpected patterns

42-43
38
Data Analysis
  • Importance of Context
  • Essential to provide contextual background
  • Graphs and tables
  • Report summarizing values
  • Identify removed outliers
  • Time order included
  • Scale awareness

43
39
Data Analysis
  • Variation:
  • Use of SPC charts
  • Process performance varies:
  • Random/common cause variation
  • Special cause variation
  • Trend identification:
  • Initiate an investigation to determine the cause
    of a trend (a minimal control-limit sketch follows
    below)

43
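A minimal individuals (XmR-style) control-limit sketch with an invented monthly rate series; the data and the 3-sigma rule shown here are illustrative assumptions, not content from the presentation.

    import numpy as np

    rates = np.array([4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 7.5, 4.1, 3.7, 4.4])  # e.g., a monthly event rate

    center = rates.mean()
    moving_range = np.abs(np.diff(rates))
    sigma_hat = moving_range.mean() / 1.128        # d2 constant for subgroups of size 2
    ucl = center + 3 * sigma_hat                   # upper control limit
    lcl = center - 3 * sigma_hat                   # lower control limit

    # Points outside the limits suggest special-cause variation worth investigating
    # (here the spike at point 7 falls above the UCL).
    special_cause = [(i + 1, x) for i, x in enumerate(rates) if x > ucl or x < lcl]
    print(round(center, 2), round(lcl, 2), round(ucl, 2), special_cause)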
40
Reports vs Scorecards vs OLAP vs Mining
  • Reports:
  • operational and management tracking
  • Scorecards:
  • progress against goals
  • (show colors over the past 4 quarters)
  • "stoplight" charts are effective at the Leadership
    level (a toy sketch follows below)
  • On-Line Analytical Processing (OLAP):
  • create cubes of data for drill-down and live,
    interactive discussion of the analysis
  • Mining:
  • using a data warehouse to discover correlations
    never previously considered
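A toy illustration of the stoplight idea; the thresholds (95% of goal for green, 90% for yellow) are invented placeholders, not from the slides.

    def stoplight(actual: float, goal: float) -> str:
        # Map a measure's performance against its goal to a scorecard color.
        ratio = actual / goal
        if ratio >= 0.95:
            return "green"
        if ratio >= 0.90:
            return "yellow"
        return "red"

    # One scorecard row: a measure's color over the past 4 quarters (goal = 95.0)
    quarterly = [84.0, 88.0, 92.0, 96.0]
    print([stoplight(q, goal=95.0) for q in quarterly])   # ['red', 'yellow', 'green', 'green']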

41
Right Information for Right Audiences:
Accrediting bodies, boards, leaders, directors,
managers
  • Right information: what's needed to make judgments
    and choices?
  • Right audiences:
  • What decisions are made at the level in question?
  • Budget
  • Immediate resource allocation
  • Strategy
  • Performance improvement priorities and actions
  • Clinical process changes

42
Summary
  • ANALYSIS
  • Using comparative data
  • Interpreting benchmarking data
  • Interpreting incident/event reports
  • Interpreting outcome data
  • Intuition/stories vs objective/facts
  • COMMUNICATING RESULTS
  • Event/individual patient issues
  • Performance Improvement feedback processes
  • Reports vs Scorecards vs OLAP vs mining
  • Right information for right audiences
  • Accrediting bodies, boards, leaders, directors, managers

43
Welcome to the MHQP HealthForce MN Quality Brownbag
Room: Monthly Noon Brownbag, Fourth Thursday of
Every Month
Date  Subject
Mar 27  Intro to Webcast Series: Quality as a healthcare core competency
Apr 24  Management & Leadership: Strategic
May 22  Management & Leadership: Operational
Jun 26  Patient Safety: Strategic
Jul 24  Patient Safety: Operational
Aug 28  Information Management: Data & Measurement
Sep 25  Information Management: Analysis & Communication
Oct 23  Performance Measurement & Improvement: Education/Training
Nov 20 (3rd Thurs)  Performance Measurement & Improvement: Planning, Implementation & Project Management
Dec 18 (3rd Thurs)  Performance Measurement & Improvement: Evaluation/Integration
Jan 22  Dealing with change
Feb 26  About the CPHQ: Test-taking tips & practice questions
Mar 26  tbd
Mar ??  Weekend CPHQ Prep (date tbd)
Next session: Oct 23  Performance Measurement &
Improvement: Education & Training (Amy Murphy from
ICSI)
  • Organizational PI training
    (quality & patient safety)
  • PI toolkit
  • Evaluating the effectiveness of training
  • Survey design and management

Questions? Contact Skip Valusek, MHQP Education
Chair, skipvalusek@comcast.net
Slides are posted at
http://www.healthforceminnesota.org/pages/Programs/courses.html