Transcript and Presenter's Notes

Title: Demystifying Evaluation


1
Demystifying Evaluation
  • April 23, 2008
  • Michelle Henry & Jen Keith
  • Philadelphia Health Management Corporation

2
Today's Topics
  • Context
  • Why evaluate?
  • Different types of evaluation
  • Definitions
  • Program Evaluation
  • Framework
  • Steps
  • Using Data
  • Collection Tools

3
Why Evaluate?
  • "Research seeks to prove; evaluation seeks to improve." (M. Patton)
  • Good news: you already do it! But it may be informal.

4
Measure Outcomes to Improve Quality & Communicate Value
  • Track implementation
  • Reassess strategies
  • Provide feedback
  • Identify strengths & challenges
  • Sharpen focus
  • Identify training needs
  • Promote program to potential participants
  • Use in program development
  • Recruit staff
  • Enhance the program's public image
  • Retain & increase funding

5
Types of Evaluation
  • Community Assessment
  • Program Evaluation
  • Multi-Program Evaluation
  • Surveillance

6
Community Assessment
  • Also known as Needs Assessment
  • Determines where problems/strengths are, scope of
    issues, who needs the program/intervention
  • Data sources include
  • Surveys
  • Key informant interviews
  • Focus groups
  • Secondary data analysis

7
Program Evaluation
  • Examines the goals, processes, and impacts of
    programs and/or policies.
  • Data collection methods can be quantitative
    and/or qualitative
  • Data sources include
  • Surveys
  • Key informant interviews
  • Focus groups
  • Secondary data analysis

8
Multi-Program Evaluation
  • Examining multiple sites or program components as
    part of a comprehensive program evaluation
  • Example: Statewide Evaluation of the PA TCP.
  • Data sources include
  • Surveys
  • Key informant interviews
  • Focus groups
  • Secondary data analysis

9
Surveillance
  • Broader perspective
  • Secondary data sources
  • BRFSS
  • Vital Statistics
  • YTS/ATS
  • Etc.

10
Process vs. Outcome Evaluation
  • Process Evaluation
  • Documents key activities, accomplishments,
    challenges, and lessons learned
  • Descriptive, but also answers the question: Is the
    program in place as intended?
  • Very helpful for internal program improvements
  • Outcome Evaluation
  • Assesses actions, outcomes, and change
  • Answers the question: What are the results of your work?
  • Very helpful for program improvement and
    discussions of funding/worth

11
What is a program objective?
  • Describes the results your program intends to achieve
    and the way in which they will be achieved
  • Serves as the basis for monitoring progress toward
    achieving your program's goals and for setting targets
    for accountability

12
Types of Objectives
  • Process Objectives: statements about the program's
    products or services delivered (outputs)
  • Outcome Objectives: statements about the program's
    expected results (outcomes)

13
Types of Outcome Objectives
  • Short-term: initial changes in the target population
    (awareness, knowledge, attitudes, skills)
  • Intermediate: interim changes toward long-term outcomes
    (norms, behavior, policy)
  • Long-term: changes in health status, morbidity, and
    mortality

14
Program Objectives Should be SMART
  • S: Specific
  • M: Measurable
  • A: Achievable
  • R: Relevant
  • T: Time-bound

15
Indicators
  • Process indicators measure activities:
  • Number of peer-led prevention sessions
  • Number of quit plans created
  • Outcome indicators measure short-term,
    intermediate, and long-term outcomes:
  • Number of clients who report reducing the number of
    cigarettes smoked
  • Number of clients who report staying quit at the
    six-month follow-up call
  • Number of residents who report increased support
    for clean indoor air policies (see the counting
    sketch below)
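Both kinds of indicators usually come down to counts pulled from tracking records. A minimal Python sketch, assuming hypothetical record fields (created_quit_plan, quit_at_6_months) that are not part of the presentation:

    # Hypothetical client tracking records; the field names are illustrative only.
    clients = [
        {"id": 1, "created_quit_plan": True,  "quit_at_6_months": True},
        {"id": 2, "created_quit_plan": True,  "quit_at_6_months": False},
        {"id": 3, "created_quit_plan": False, "quit_at_6_months": False},
    ]

    # Process indicator: number of quit plans created.
    quit_plans = sum(1 for c in clients if c["created_quit_plan"])

    # Outcome indicator: number of clients who report staying quit
    # at the six-month follow-up call.
    still_quit = sum(1 for c in clients if c["quit_at_6_months"])

    print(f"Quit plans created: {quit_plans}")        # 2
    print(f"Still quit at six months: {still_quit}")  # 1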

16
Evidence
  • Evidence-based Practice
  • Practice-based Evidence

17
CDC Evaluation Framework
Source: www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
18
Program Evaluation
  • Program evaluation should maximize:
  • Utility: What information do we need?
  • Feasibility: What information can we get?
  • Propriety: Who should be involved?
  • Accuracy: How can we get accurate information?

19
Utility is King/Queen!
  • Who needs what information?
  • How will the information be used?
  • Feasibility is often Utility's first reality check:
  • Given the money/time/effort we can contribute,
    what proof can we get?
  • But evaluation must be ethical and reliable too!

20
Evaluation 101
  • Steps in Evaluation
  • Plan to collect data
  • Collect data
  • Report data
  • Tips for each step
  • Be systematic!
  • Triangulate!
  • Share information!

21
Step 1 Plan to Collect Data
  • Needs to answer
  • What did we do?
  • How well did we do it?
  • What difference did our program make?
  • Planning tools
  • Logic Model
  • Evaluation Plan

22
Planning Tools
[Diagram: the Logic Model (Activities/Strategies/Tactics → Objectives → Goals)
aligned with the Evaluation Plan (Process Evaluation → Short-term/Intermediate
Outcome Evaluation → Long-term Outcome Evaluation).]
Program planning is linked to evaluation planning.
23
Logic Model
  • Needs to convey
  • Purpose of initiative
  • Expected actions
  • Expected results
  • Expected causal relationships
  • Serves as a map for all involved
  • Everything I needed to know about my program, I
    learned from my logic model.

A logic model makes the program theory clear; it does not make it true.
24
Source: Cooperative Extension, Program Development & Evaluation, 2003.
http://www.uwex.edu/ces/pdande/
25
Source: Measuring Program Outcomes: A Practical
Approach. United Way of America, 1996.
http://national.unitedway.org/Outcomes/Resources/MPO/model.cfm
26
Logic Model Sample Formats
Source: Taylor-Powell, 2005 presentation.
http://www.uwex.edu/ces/pdande/evaluation/pdf/nutritionconf05.pdf
27
Evaluation Plan
  • Needs to convey
  • Information to be collected
  • Data collection methods
  • Timeframes
  • "A plan well begun is a plan half done."

28
Evaluation Plan Sample Format
Source: Taylor-Powell, 2005 presentation.
http://www.uwex.edu/ces/pdande/evaluation/pdf/nutritionconf05.pdf
29
Evaluation Plan Sample Format
Source: http://www.durhamcenter.org/docs/misc/PETemp.doc
30
Step 2 Collect Data
  • Qualitative
  • Interviews
  • Focus groups
  • Case studies
  • Observation
  • Document Review
  • Basic levels of evaluation questions:
  • Reactions & feelings
  • Learning
  • Change in knowledge, attitude, perception
  • Changes in skills
  • Effectiveness
  • Quantitative
  • Survey/Questionnaire
  • Structured/semi-structured qualitative methods

31
Selecting a Collection Method
  • Consider
  • Population
  • Questions
  • Cost
  • Time
  • Personnel
  • Aim for
  • Validity
  • Reliability
  • Bias awareness/Bias reduction

32
Pilot Testing
  • Feedback from a pilot test can tell you if
    questions
  • Are clear and make sense to others
  • Provide information you need
  • Are at appropriate reading and language levels
  • Need skips or different ordering
  • Obtain feedback from
  • Co-workers
  • Persons with expertise in survey research
  • Program staff, if appropriate
  • Potential respondents
  • Revise based on pilot feedback

33
Step 3 Report Data
  • Formal Data Sharing
  • Informal Data Sharing

34
Using Data
  • Plan for useful data
  • What data do you/your stakeholders need?
  • How will answers be used?
  • Is the question necessary?
  • Identify opportunities to use data:
  • Program improvements
  • Reports & grants
  • Facilitate communication
  • Share data
  • Transparency with stakeholders
  • Inform decision making

35
Evaluating Prevention Programs
  • Your region's Busted! chapter led a one-time,
    hour-long tobacco prevention education assembly
    at a local elementary school. The assembly
    included skits about saying "no" to tobacco and
    the use of props (lungs) to show the harmful
    effects of tobacco on the body.

36
Evaluating Prevention Programs: What do you want to know?
  • What did we do?
  • How well did we do it?
  • What difference did our program make?
  • Information/data to be collected
  • Collection methods

37
Evaluating Prevention Programs: Components
  • Process
  • Program fidelity
  • Attendance
  • Busted! youth leadership
  • Outcomes
  • Knowledge and Behavior change
  • Satisfaction

38
Evaluating Prevention Programs: Evaluation Plan
  • Indicators
  • Timing
  • Data sources

39
Evaluating Prevention Programs: Using Data

40
Evaluating Policy Programs
  • Example: Integration of Tobacco and Chronic
    Disease Programs
  • BE A BRIDGE - Bridges Activity Sheets
  • Brainstorm: Clean Indoor Air

41
Clean Indoor Air
[Sample evaluation plan table for Clean Indoor Air, with rows organized by
the three evaluation questions and columns for indicators, timing, data
sources, collection methods, and targets:]
  • What did we do? # of workplaces educated re: benefits (ongoing;
    educators; true count and tracking; monthly reports)
  • How well did we do it? # of new policies and # of strengthened
    policies (quarterly; workplaces and PACT/BAB; survey and focus group;
    targets 60 and 20; others TBD, ask PO)
  • What difference did we make? # of requests for education, # of
    violations, # of people protected (quarterly, annual, bi-annual, or
    as needed; educators, workers, taxes/WP data, COC survey, PACT/TFK;
    tracking, true counts, monthly reports, interviews, secondary data;
    targets n/a and 10; others TBD)
42
Group Activity
  • Evaluating school programs
  • Pick program type
  • Prevention
  • Cessation
  • Policy
  • Draft an Evaluation Plan

43
Measuring Program Outcomes: In-house Data Collection Methods
  • 1. Extract from existing records (see the sketch
    after this list)
  • - Intake forms / Attendance forms
  • 2. Pre/post testing
  • - For specific programs or events
  • 3. Surveys
  • - Mail
  • - Telephone
  • - In-person, self-administered
  • - In-person, staff-administered

44
Pre-/Post-Measures: Uses
  • To establish the level of participants' knowledge,
    attitudes, skills, or behavior before beginning a
    program or activity
  • To document changes in attitudes, skills, or
    behavior between the beginning and end of a program
    or activity
  • Pre-/post-measures can include written tests or
    surveys, tests or ratings of skills, or physical
    measurements

45
Pre-/Post-Measures: Analysis
  • Compare pre- and post-test results for
    individuals or groups (a small sketch follows below)
  • Comparisons between groups are the best indicator of
    real change; the groups must include the same people
  • Scores can be grouped and compared based on
    participant or program characteristics
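A minimal Python sketch of the paired pre/post comparison described above; the participants and scores are made up for illustration:

    # Hypothetical pre- and post-test knowledge scores for the same participants.
    pre  = {"p1": 4, "p2": 6, "p3": 5, "p4": 3}
    post = {"p1": 7, "p2": 8, "p3": 5, "p4": 6}

    # Individual change: post minus pre, only for people with both scores.
    paired_ids = pre.keys() & post.keys()
    changes = {pid: post[pid] - pre[pid] for pid in paired_ids}

    # Group-level change: compare mean pre and post scores for the paired group.
    mean_pre = sum(pre[pid] for pid in paired_ids) / len(paired_ids)
    mean_post = sum(post[pid] for pid in paired_ids) / len(paired_ids)

    print("Per-person change:", changes)
    print(f"Group mean: {mean_pre:.1f} pre vs. {mean_post:.1f} post "
          f"(change {mean_post - mean_pre:+.1f})")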

46
Participant Surveys: Uses
  • Provide an opportunity for consumers of a program or
    service to give feedback
  • May be the only feasible way to measure outcomes
  • Useful as a source of information even if other
    measures are used

47
Participant Surveys: Design
  • Written surveys should match the reading levels of
    participants (adults or children)
  • Keep surveys short: use closed-ended and short-answer
    questions
  • Assure confidentiality or anonymity to encourage
    honest answers

48
Before Writing Your Own Questions
  • Look for existing surveys (or questions) to meet
    your needs
  • Existing instruments
    have been tested (or at
    least used) by others
  • Sources can include:
  • U.S. or state agencies
    that serve the target
    population
  • Journals and report
    appendices

49
Writing Good Questions: Issues to Consider
  • Question content
  • Question wording
  • Form of response
  • Question order
  • Layout or format

50
Wrap-Up
  • Thank you for your participation!
  • Key contact information:
  • Michelle Henry: mhenry@phmc.org
  • Jen Keith: jkeith@phmc.org