1
Evaluation: Conducting a Sound Process Evaluation
  • Barri B. Burrus, Ph.D., and Ina F. Wallace, Ph.D., RTI International
  • Presented at OAPP National Care and Prevention Conference, December 9 and 11, 2008

2
Session Overview
  • Define what process evaluation is and why it is
    important
  • Describe OAPP's expectations for process evaluation
  • Identify components to include in process
    evaluation
  • Discuss challenges for process evaluation in the
    field

3
What Is Process Evaluation?
  • Process evaluation describes the process through
    which an intervention is implemented
  • Designed to assess an intervention's
  • Implementation
  • Is it happening? (particularly important in early stages)
  • What is the integrity of implementation compared to the intervention model? (particularly important in later stages)
  • Dosage for participants (How much intervention has each participant received?)
  • Ability to effect change

4
What Is the Difference between Outcome and
Process Evaluation?
  • Outcome evaluation looks at the results, or outcomes, that occur from taking part in the intervention, relative to the comparison group
  • Answers questions about what has happened because of an intervention
  • How have knowledge, attitudes, behaviors, and practices changed?
  • How have policies, procedures, and regulations changed?
  • Process evaluation looks at the outputs of the intervention process
  • Answers questions about what has happened as a result of an intervention's implementation
  • What intervention has been delivered to or received by participants, and to whom, when, where, and in what amount?
  • How have participants reacted to the intervention?

5
Requirements of a Process Evaluation
  • A strong and detailed logic model to identify
    outputs
  • Identifies key intervention components that must
    be put into place
  • Outlines activities to be measured
  • Quantitative components through which implementation and dosage can be assessed (see the sketch after this list)
  • Examples include
  • Number of sessions offered
  • Number of attendees
  • Can also include some qualitative components
  • Be careful when asking about sensitive topics in focus groups
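
For illustration, a minimal sketch (in Python; the log format and all field names are hypothetical, not from the presentation) of how session and attendance counts might be tallied from activity-log records:

```python
from collections import defaultdict

# Hypothetical activity-log records: (session_id, participant_id)
attendance_log = [
    ("parenting_01", "teen_A"), ("parenting_01", "teen_B"),
    ("parenting_02", "teen_A"), ("abstinence_01", "teen_C"),
]

# Number of sessions offered: distinct session IDs seen in the log
sessions_offered = {session for session, _ in attendance_log}

# Number of attendees per session
attendees = defaultdict(set)
for session, participant in attendance_log:
    attendees[session].add(participant)

print(f"Sessions offered: {len(sessions_offered)}")
for session in sorted(attendees):
    print(f"{session}: {len(attendees[session])} attendees")
```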

6
Uses of Process Evaluations
  • Summative
  • To use data as mediators and moderators in
    analysis of impact
  • Exploration of dosage treatment effects
  • To help ensure the intervention is not falsely rejected
  • To document what was done, in case outcomes warrant future replication of the intervention
  • Formative
  • To identify what is working well and what needs
    improvement
  • To explore whether intervention may be harming
    participants

7
Steps in Conducting Process Evaluations
  • Identify all key components of program
  • Create or revise the logic model, including process variables
  • Determine objectives of process evaluation
  • Ensure that program implementation, dosage, and
    fidelity are measured
  • Create measures
  • Determine measurement schedule
  • Collect data
  • Determine how process measures will fit in the
    analysis
  • Include process data in impact evaluation

8
Logic Model for Care Program

  • Inputs
  • Program director
  • Tutors
  • Social workers
  • Nurses
  • Family planning educators
  • Family members
  • Parenting educators
  • Support group leaders
  • Activities
  • Tutoring
  • Mental health counseling
  • Individual nurse check-ups for mom and baby
  • 8 classes to prevent repeat pregnancies
  • Weekly group sessions for parenting education
  • Bi-weekly group sessions for building personal support and developmental assets
  • Monthly grandparent support groups
  • Monthly father support groups
  • Outputs/Process Indicators
  • # of teens attending program activities
  • # of sessions focused on personal support and developmental assets
  • # of hours of parenting education provided
  • # of grandparents attending > 1 support group
  • # of fathers attending > 1 support group
  • Short-Term Outcomes (as compared to the comparison group, teens in the intervention arm . . .)
  • Will demonstrate greater understanding of child development
  • Will show better interaction skills with their babies
  • Will have greater reduction in mental health problems and increase in coping skills
  • Will demonstrate better attitudes regarding sexual behavior
  • Long-Term Outcomes (as compared to the comparison group, teens in the intervention arm . . .)
  • Will have a higher graduation rate
  • Will have greater school attendance and higher school achievement
  • Will have lower rate of repeat pregnancies
9
Logic Model for Prevention Program
  • Inputs
  • Staff
  • Health educators
  • Teachers
  • Curricula and program materials
  • Teen advisory council
  • 8 public middle schools
  • Parents
  • Community partners
  • Activities
  • 8 abstinence education sessions
  • Tutoring/mentoring
  • Bi-weekly youth asset development sessions
  • Referral and linkage to services
  • Monthly family involvement activities
  • Weekly parental education sessions
  • Community service projects
  • Outputs/Process Indicators
  • # of teens attending program activities
  • # of referrals and/or linkages made to services
  • # of hours of parent education provided
  • # of parents attending family activities
  • Short-Term Outcomes
  • Increase percentage of teens who value abstinence
  • Increase in developmental assets among teens
  • Increase communication between parents and children about risky behaviors
  • Increase in parent monitoring
  • Long-Term Outcomes
  • Increase in teens who remain abstinent
10
Identifying Key Program Components
  • Examine activities to determine what aspects of
    the program need to be assessed
  • When cost or time is an issue, select among
    different components
  • What are the components of your interventions? How did you decide what to measure?

11
Program Implementation
  • Determine whether the program has been
    implemented
  • Are things going as planned? Is the model
    followed?
  • Is there a model to be followed?
  • Are all components sufficiently described?
  • Were those who delivered the program properly
    trained?
  • Assess each component identified as key
  • Use different strategies to assess whether
    implementation has occurred
  • Examples of ways to assess implementation
    include
  • Observation checklists
  • Time and activity logs
  • Records

12
Discussion: Program Implementation
  • What are some examples from your programs in
    which challenges to implementation were captured
    by process evaluation?
  • Possible examples
  • Delayed or no implementation
  • Intervention model not followed
  • Other events occur simultaneously
  • Low participation in the intervention activities

13
Discussion: Program Implementation (continued)
  • Some of the questions about implementation that
    could be answered include
  • Were all staff hired?
  • How many tutors were available?
  • How often did parenting education sessions occur?
  • How many visits did nurses make to mothers and to
    babies?
  • How many newsletters were sent to families?

14
Discussion: Assessing Program Implementation
  • What are strategies that could be used to assess
    implementation of . . .
  • Tutoring or mentoring?
  • Classes devoted to healthy dating?
  • Nurse visits?
  • Newsletters?

15
Discussion: Assessing Program Implementation
(continued)
  • Examples of strategies to assess implementation
    include
  • Tutoring
  • Schedule of availability
  • Classes devoted to abstinence education
  • Observation or activity logs
  • Nurse visits
  • Records or activity logs
  • Newsletters
  • Records of mailings

16
Discussion: Assessing Program Implementation
(continued)
  • If any components were NOT implemented, you
    should examine the reasons.
  • Examples of ways to examine the reasons include
  • Interviews with program directors
  • Interviews with school administrators
  • Interviews with participants
  • Interviews with parents
  • Site visits to facilities

17
Dosage
  • Determine how much intervention each of the
    participants received
  • What intervention did each participant receive, how much, and how often?
  • Assess dosage for key components
  • For "value-added" programs, assess dosage in both the intervention and comparison groups
  • Include attendance records and time spent in activities among the assessment methods (see the sketch below)
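
A minimal sketch of summarizing per-participant dosage from attendance records that capture time spent (Python; the record layout and all names are hypothetical):

```python
from collections import defaultdict

# Hypothetical attendance records: (participant_id, component, minutes)
records = [
    ("teen_A", "counseling", 50), ("teen_A", "counseling", 50),
    ("teen_A", "parenting_class", 90), ("teen_B", "parenting_class", 90),
]

# Total minutes of each key component received by each participant
dosage = defaultdict(lambda: defaultdict(int))
for participant, component, minutes in records:
    dosage[participant][component] += minutes

for participant, components in sorted(dosage.items()):
    total = sum(components.values())
    print(f"{participant}: {dict(components)} (total {total} min)")
```

For a value-added design, the same summary would be run for the comparison group so dosage in the two arms can be contrasted.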

18
Discussion: Assessing Dosage
  • How do you assess dosage for . . .
  • Counseling?
  • Abstinence education sessions?
  • Home visits?

19
Fidelity of Implementation
  • Determine how well the program was implemented
  • Is the intervention modified by staff as part
    of the process?
  • What is the quality of services/activities
    offered?
  • Are there other barriers encountered or
    facilitators identified?
  • Use knowledge of the program's goals, methods, and plans

20
Fidelity of Implementation (continued)
  • Refer to curriculum guides
  • Use different strategies to assess fidelity
  • Review lesson plans
  • Have participants and instructors complete
    checklists
  • Have observers rate quality and participant
    engagement
  • Use focus groups and interviews to obtain participants' opinions

21
Discussion: Fidelity of Implementation
  • For your interventions
  • What are some of the things you look at to assess
    fidelity?
  • What are some of the challenges you face?
  • Is there enough detail in the intervention plan
    and model to know what the intervention is
    supposed to look like?

22
Discussion: Fidelity of Implementation (continued)
  • Some of the questions that could be answered include
  • How many of the components of the curriculum were
    delivered?
  • How well did parent educators or teachers cover
    specific topics?
  • How engaged were students in the groups?
  • What problems were encountered in delivering the
    curriculum?

23
Discussion: Fidelity of Implementation (continued)
  • Develop a fidelity checklist for an observer (a scoring sketch follows this list)
  • Did the parenting educator (PE) introduce the
    concept of attachment?
  • Did the PE provide examples?
  • Were terms defined?
  • Did the PE ask students for examples?
  • Did the PE make a paper chain?
  • Did the PE model the correct way to play peek-a-boo?
  • Did students make a memory book?
  • Did students complete worksheet?
  • To what degree did students provide examples
    about their babies?
  • A lot / Some / A little / None
  • How much did they like making the memory book?
  • A lot / Some / A little / None
  • How would you rate their overall participation in this lesson?
  • A lot / Some / A little / None
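
One way such a checklist could be captured for analysis: a minimal sketch assuming yes/no adherence items and the slide's 4-point engagement scale (Python; item names are paraphrased, and the scoring scheme is an assumption):

```python
# Hypothetical observer record for one attachment lesson
# (yes/no adherence items paraphrased from the checklist above)
checklist = {
    "introduced_attachment_concept": True,
    "provided_examples": True,
    "defined_terms": False,
    "asked_students_for_examples": True,
    "made_paper_chain": True,
    "modeled_peekaboo": False,
    "students_made_memory_book": True,
    "students_completed_worksheet": True,
}

# 4-point engagement scale from the slide, mapped to numbers (an assumption)
SCALE = {"A lot": 3, "Some": 2, "A little": 1, "None": 0}
ratings = {
    "provided_examples_about_babies": "Some",
    "liked_making_memory_book": "A lot",
    "overall_participation": "Some",
}

# Adherence: share of planned components actually delivered
adherence = sum(checklist.values()) / len(checklist)
# Engagement: mean rating as a share of the scale maximum
engagement = sum(SCALE[v] for v in ratings.values()) / (3 * len(ratings))
print(f"Adherence: {adherence:.0%}, engagement: {engagement:.0%}")
```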

24
Discussion: Fidelity of Implementation (continued)
  • Other means of assessing fidelity of the
    attachment lesson include
  • Asking teen participants to indicate what was
    covered and having them rate their satisfaction
  • Having parent educators complete an activity log
    indicating whether each component was covered
  • Number of students attending and length of class
    are not indicators of fidelity (but they are
    still important to assess)

25
Process Evaluation Measurement
  • Use measures that reflect the nature of the question
  • Ratings, logs, and closed-format questionnaires yield quantitative data
  • Ethnographic reports, focus groups, and open-ended interviews yield qualitative data
  • Consider the measurement schedule (see the sketch after this list)
  • Activity logs should be completed at every encounter
  • Fidelity observation ratings should be completed often (perhaps 8 times per year)
  • Focus groups and interviews can be completed occasionally (perhaps once per year)
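
A minimal sketch of laying out that measurement schedule as data (Python; the tabular structure is an assumption, the frequencies are from the slide):

```python
# The slide's measurement schedule, expressed as (measure, data type, frequency)
schedule = [
    ("Activity logs",                "quantitative", "every encounter"),
    ("Fidelity observation ratings", "quantitative", "often (perhaps 8x/year)"),
    ("Focus groups / interviews",    "qualitative",  "occasionally (perhaps 1x/year)"),
]

for measure, data_type, frequency in schedule:
    print(f"{measure:30s} {data_type:13s} {frequency}")
```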

26
Assessing Quality of the Process Evaluation
Measures
  • Include measurement of dosage in the evaluation
    report
  • How is dosage assessed?
  • What are the actual levels of dosage?
  • Is dosage at a level appropriate for statistical analysis? (see the sketch below)
  • Ensure that the process measures are detailed
    enough to indicate whether implementation is
    occurring as expected (focus for early stages)
  • Examine whether the measures have been used
    consistently across intervention conditions
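
A quick way to check whether observed dosage can support statistical analysis is to look at its distribution; a minimal sketch (Python; the data and any cutoffs are hypothetical):

```python
import statistics

# Hypothetical per-participant dosage totals (hours of intervention received)
dosage = [0, 0, 2, 3, 3, 4, 10, 12, 12, 14]

mean = statistics.mean(dosage)
sd = statistics.stdev(dosage)
zero_share = sum(d == 0 for d in dosage) / len(dosage)

# Little variability, or many zero-dose participants, limits dose-response analysis
print(f"mean={mean:.1f} h, sd={sd:.1f} h, {zero_share:.0%} received no intervention")
```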

27
Process Measures
  • (Table of example process measures on this slide was not transcribed)
28
Discussion: Process Measures
  • What challenges do you face in developing process
    measures?
  • How do you ensure that both intervention and
    control/comparison groups are included in the
    process evaluation?

29
Using the Process Evaluation
  • Results of the program implementation evaluation will tell you which components were actually delivered and to what extent
  • Examine the degree of implementation to answer the question: "Is full implementation of all components linked to better outcomes?"
  • Results of the dosage evaluation will tell you how much of the program was delivered
  • Examine the dose-response (outcome) relationship to answer the question: "Does greater dosage lead to better outcomes?" (see the sketch after this list)
  • Results of the fidelity evaluation will tell you how good the delivered program was
  • Examine ratings of quality or engagement as moderators of the intervention to answer the question: "Do teens who are more engaged have better outcomes?"
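
As one illustration of the dose-response and moderator questions, a minimal sketch using ordinary least squares with an interaction term (Python with pandas and statsmodels; the variable names and data are hypothetical, and the presentation does not prescribe a model):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per teen, merging process and outcome data
df = pd.DataFrame({
    "outcome":    [62, 71, 68, 80, 55, 74, 77, 83, 60, 79],
    "dosage_hrs": [4, 10, 8, 14, 2, 11, 12, 16, 5, 13],
    "engagement": [1, 2, 2, 3, 0, 2, 3, 3, 1, 3],  # observer rating, 0-3
    "treatment":  [0, 1, 0, 1, 0, 1, 1, 1, 0, 1],  # 1 = intervention arm
})

# Dose-response: does greater dosage predict better outcomes?
dose_model = smf.ols("outcome ~ dosage_hrs", data=df).fit()

# Moderation: does engagement moderate the treatment effect?
moderation_model = smf.ols("outcome ~ treatment * engagement", data=df).fit()

print(dose_model.params)
print(moderation_model.params)
```

The coefficient on dosage_hrs speaks to the dose-response question; the treatment:engagement interaction speaks to whether more engaged teens show better outcomes.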

30
Process Evaluation Cautions
  • Establish priorities; it is easy to get overwhelmed
  • Communicate these through process objectives
  • Keep records
  • Work with program staff to create data collection
    systems
  • Collect reports regularly and code as you go
  • Explore ways to make information useful to
    program staff
  • Process evaluation produces key information that
    managers should really want to know
  • Check with your project officer before making
    large corrections or changes to the intervention

31
OPA's Expectations for Process Evaluations in EOY
Reports
  • Process evaluations are required components of
    end-of-year (EOY) reports
  • Do not just report that process evaluation is
    being conducted
  • Give details about what data are being collected
    and how they are being collected
  • Include data analyses in reports when they become
    available
  • Perform analyses cumulatively over time

32
OPA's Expectations for Process Evaluation
  • End-of-year reports should include clear
    descriptions of the process evaluation plan
  • Include process evaluation objectives
  • List what is being assessed to evaluate
    implementation and fidelity of key program inputs
    and activities
  • Describe how data are being collected
  • Include results

33
Process Evaluation
  • Is it a sweet experience or just the way the
    cookie crumbles?