Program Evaluation at EPA Workshop: Transcript and Presenter's Notes
1
Program Evaluation at EPA Workshop
2
Workshop Objectives
  • Learn what we mean by program evaluation
  • What it is
  • What it is not
  • Learn why its use can further Agency goals and
    objectives
  • Understand the steps that constitute a successful
    evaluation
  • Learn how to design an effective evaluation
    program
  • Discuss next steps for program evaluation
    activities

3
Overview of Program Evaluation: What is Program Evaluation?
  • In its broadest sense, program evaluation is a
    systematic way to learn from past experience by
    assessing how well a program is working.
  • It is an analytical exercise.
  • Program evaluation measures inputs (e.g.,
    resources), program activities, and program
    outcomes and tests the causal assumptions linking
    these three elements (a brief illustrative sketch
    follows this slide).
  • Unlike performance measurement, which answers what
    occurred, program evaluation helps answer why the
    observed results occurred.
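
The sketch below is not part of the original slides. It is a minimal illustration, in Python, of the first bullet above: writing down a program's inputs, activities, and outcomes and pairing each causal link with the assumption an evaluation would test. Every program element named here is hypothetical.

```python
# Minimal sketch (hypothetical program, not EPA methodology): a logic-model-style
# view of a program. An evaluation asks whether the evidence supports each causal
# link, not just whether each individual measure moved.

logic_model = {
    "inputs":     ["grant funding", "staff hours"],
    "activities": ["inspections", "technical assistance visits"],
    "outcomes":   ["facility compliance rate", "pollutant loading reduction"],
}

causal_assumptions = [
    ("inputs -> activities",   "more staff hours allow more inspections"),
    ("activities -> outcomes", "inspections raise compliance and reduce loadings"),
]

for link, assumption in causal_assumptions:
    print(f"{link}: test whether evidence supports '{assumption}'")
```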

4
Overview of Program Evaluation: What is Program Evaluation? (cont.)
  • An evaluation need not cover all that EPA might
    call a program. Evaluations can be conducted on
    broad or more specific elements of a program.
  • For example, in looking at the National Estuary
    Program, one could evaluate
  • The broad national program
  • A component of the national program, such as the
    Chesapeake Bay Estuary Program
  • A component of the Chesapeake Bay Estuary
    Program, such as the impact of NPDES permits on
    water quality in the Anacostia River
  • An evaluation can be systematic and effective
    without being elaborate or expensive.

5
Overview of Program Evaluation: What Isn't a Program Evaluation?
  • Routine oversight of national, State, local, or
    tribal programs
  • Routine collection of performance data
  • Routine budget analyses or studies focusing
    solely on resource needs
  • Collection of success stories
  • Audits aimed at identifying waste, fraud, and
    abuse

6
Overview of Program Evaluation: Why Perform Program Evaluations?
  • Help decision-makers take a comprehensive view of
    the program and ask serious questions about its
    performance and value
  • Help comply with requirements of GPRA (the
    Government Performance and Results Act) and FMFIA
    (the Federal Managers' Financial Integrity Act)
  • Report on progress
  • Communicate a program's value
  • Identify potential improvements

7
Overview of Program Evaluation: Program Evaluations and GPRA
  • The evaluation (of outcomes and the process) can
    help the Agency manage for results.
  • Program evaluation supports Agency planning
    efforts through compliance with GPRA.
  • Reporting on program evaluations is an important
    component of the Annual Performance Reports and
    EPA's strategic plan.
  • Program evaluation allows managers to look beyond
    the annual performance measures and ask why
    results occurred and how goals, activities, or
    measures need to be adjusted to improve
    performance.

8
Overview of Program Evaluation: Program Evaluations and GPRA
  • The statute

9
Overview of Program Evaluation: Typical Questions Managers Want Answered
  • What would a successful program look like?
  • Why is the program achieving/not achieving its
    goals?
  • What internal and external factors are affecting
    program effectiveness and efficiency?
  • Where are the needs/opportunities for
    improvement?
  • In the absence of this program, who and what
    would be affected?
  • Is the program causing unintended effects -
    either beneficial or adverse?

10
Overview of Program Evaluation: Various Types of Evaluations
  • Formative
  • Process
  • Outcome
  • Impact

11
Overview of Program Evaluation: Various Types of Evaluations
  • Formative evaluation: Conducted in the planning
    stage or early in the implementation of a
    program, it helps define project scope and
    identify appropriate goals and objectives. It can
    also be used to pre-test ideas, strategies, and
    communication tools.
  • Process evaluation: Determines how well the
    program is progressing in relation to its design
    (e.g., the Minnesota feedlot evaluation) and also
    points out areas where the design can be improved
    to better achieve program goals.

12
Overview of Program Evaluation: Various Types of Evaluations
  • Outcome evaluation: Measures how well the program
    is meeting its stated goals and objectives and
    assesses the reasons for differences between the
    program's objectives and its outputs and
    outcomes.
  • Impact evaluation: Focuses on the net effect of a
    program, identifying whether outcomes can be
    attributed to specific components of a program.
    For example, how much of an observed improvement
    in air quality in a region can be attributed to
    the use of reformulated gas, increased compliance
    by large facilities, new technologies, or larger
    economic impacts? (A brief illustrative sketch
    follows this slide.)
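
The sketch below is not from the original slides. With entirely hypothetical numbers, it illustrates the attribution question the impact-evaluation bullet raises by fitting a simple least-squares model relating a regional air-quality improvement to three candidate drivers. A real impact evaluation would need a defensible design (comparison groups, treatment of confounders), not a toy regression.

```python
# Minimal sketch (hypothetical data, not an EPA analysis): apportioning an observed
# air-quality improvement across candidate drivers with an ordinary least-squares fit.
import numpy as np

# Each row: [share of reformulated gas, large-facility compliance rate,
#            new-technology adoption rate] for a hypothetical region.
X = np.array([
    [0.2, 0.70, 0.10],
    [0.5, 0.80, 0.30],
    [0.3, 0.60, 0.20],
    [0.6, 0.90, 0.40],
    [0.4, 0.75, 0.25],
])
# Observed improvement, e.g., percent reduction in ozone exceedance days.
y = np.array([3.1, 6.2, 4.0, 7.5, 5.2])

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, b_gas, b_compliance, b_tech = coef
print(f"intercept={intercept:.2f}, reformulated gas={b_gas:.2f}, "
      f"compliance={b_compliance:.2f}, new technology={b_tech:.2f}")
```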

13
Overview of Program Evaluation: Lessons Learned from Experiences of PED
  • Clearly identify the customer for the evaluation
  • Get senior management buy-in
  • Determine the appropriate kind of evaluation
  • Develop a thorough workplan
  • Analysis must provide more than just a review of
    the interview transcripts
  • Communicate the results effectively

14
Key Questions to Ask Before Starting an
Evaluation
  • What is the evaluation's purpose? Who are the
    audiences for the evaluation?
  • What are the internal politics? Why do people
    want the evaluation? Are there those who will
    impede the evaluation?
  • Is this the right time to conduct the evaluation?
  • What data are needed? How will they be
    collected?
  • How will the data be analyzed and the results
    used?
  • Who should be involved in designing the
    evaluation?
  • Should the effort use internal or external
    evaluators?
  • What are the criteria for judging success?
  • What kind of support is needed to conduct the
    evaluation?

15
Key Questions to Ask: What is the Evaluation's Purpose? Who are the Audiences?
  • What is the evaluation's overriding goal? Who
    wants it? Who may not want it?
  • Is the evaluation required by law or regulation,
    or is it being undertaken as a good management
    practice?
  • Is the evaluation part of an overall agency or
    program strategic planning process?
  • What questions about the program is the
    evaluation intended to answer?
  • Who is the audience for this evaluation? What are
    its expectations for the evaluation?
  • Does the evaluation have upper management support?

16
What is the Evaluation's Purpose? Who are the Audiences?
Review of Minnesota Feedlot Regulations
  • Determine the adequacy of environmental
    regulation of feedlots by the Minnesota Pollution
    Control Agency (MPCA) and delegated county
    agencies
  • Determine options for state policy makers to
    consider in helping to improve state and county
    feedlot regulation practices
  • Used a process evaluation that consisted of a
    less structured approach and focused on
    qualitative problem assessment and program
    recommendations

17
Key Questions to Ask: Is this the Right Time to Conduct the Evaluation?
  • The best time to conduct an evaluation is when
    decision makers and other stakeholders need
    information to make important decisions
  • Has the program been in operation long enough?
  • Will key participants and stakeholders cooperate?
  • Will results be available in a timely manner?

18
Is this the right time to conduct the evaluation?
Review of Minnesota Feedlot Regulations
  • Growth of the animal agriculture industry in
    Minnesota has increased public awareness of
    feedlots and their impacts.
  • Conducted the evaluation in anticipation of
    upcoming legislative changes to the state feedlot
    rules and during a period of significant growth in
    the number of large feedlots.

19
Key Questions to Ask: What Data are Needed? How Will they be Collected?
  • How will the needed data be determined?
  • How will the data be collected?
  • What constitutes acceptable data?
  • Should the evaluation employ qualitative or
    quantitative data, or both?
  • Are the data required available and accessible?
  • What's the cost of gathering this information?

20
Key Questions to Ask: What Data are Needed? How Will they be Collected? (cont.)
  • Available Techniques to Collect Data
  • Focus groups - small group discussions designed
    to obtain in-depth qualitative information.
  • Questionnaires, Telephone and In-person
    Interviews - responses can be open-ended or more
    quantitative
  • Be aware of Paperwork Reduction Act issues/ICRs
    (Information Collection Requests)
  • Review of documents and other materials
  • Comparisons with other programs, either with a
    control group or a non-equivalent comparison group
    (e.g., a similar program in a different area); a
    brief illustrative sketch follows this slide.
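
The sketch below is not from the original slides. It is a minimal illustration, with made-up numbers, of the last bullet's comparison-group idea: contrasting an outcome measure between program sites and a non-equivalent comparison group. In practice an evaluator would also examine baseline differences and confounders before attributing the gap to the program.

```python
# Minimal sketch (hypothetical numbers): a simple difference in means between
# program sites and a non-equivalent comparison group of similar sites.
from statistics import mean

program_sites    = [72, 68, 75, 80, 77]   # e.g., percent of facilities in compliance
comparison_sites = [61, 65, 58, 63, 60]   # similar sites not covered by the program

gap = mean(program_sites) - mean(comparison_sites)
print(f"Program mean:    {mean(program_sites):.1f}")
print(f"Comparison mean: {mean(comparison_sites):.1f}")
print(f"Observed gap:    {gap:.1f} percentage points")
```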

21
Key Questions to Ask: What Data are Needed? How Will they be Collected? (cont.)
22
What data are needed? How will they be collected?
Review of Minnesota Feedlot Regulations
  • Assessed program performance in six areas:
    permitting, environmental review, oversight,
    county programs, feedlot rules, and MPCA
    resources.
  • Collected data through interviews and surveys
    with MPCA staff, county regulatory staff,
    livestock producers, environmental groups,
    concerned citizens, and other state and national
    experts.
  • Data collection also included a detailed
    examination of MPCA permit, enforcement, and
    environmental review files, a review of reports
    and literature on feedlot issues, and a
    comparison with other states' feedlot regulation
    programs.

23
Key Questions to Ask: How Will the Data be Analyzed and the Results Used?
[Logic model diagram for a Pesticide Information Program, with farmer experience and the economics of pesticide alternatives shown as external factors. Source: Wisler 1995.]
24
Key Questions to Ask: How Will the Data be Analyzed and the Results Used? (cont.)
  • Decisionmakers should have a clear idea of how
    data will be analyzed (at least in general terms)
    and how they intend to use the results.
  • Will the information the evaluation generates be
    sufficient to influence decisions about the
    program?
  • Are upper management and other stakeholders
    committed to making changes as a result of the
    evaluation?

25
How will the data be analyzed and the results used?
Review of Minnesota Feedlot Regulations
  • Made recommendations to increase the
    effectiveness of feedlot programs. Examples
    include re-allocating staff, improving tracking
    of complaints, improving oversight of county
    feedlot programs, providing regular status reports
    on water quality enforcement cases, streamlining
    permitting, and reducing risks associated with
    existing small feedlots.
  • Results can be used to justify additional state
    resources.
  • Upper management committed to responding quickly
    to implement the recommendations.

26
Key Questions to Ask: Who Should Be Involved in Designing the Evaluation?
  • Early and substantial involvement of knowledgeable
    program staff can often improve the process. Both
    staff and managers must be included throughout
    the process. They can
  • Ensure that the views of constituencies internal
    and external to the Agency are considered and that
    evaluators are made aware of important program
    nuances and sensitivities
  • Identify sources of data on the costs, quality,
    outcomes, and value of the programs
  • Communicate findings to other important
    constituencies

27
Who should be involved in designing the evaluation?
Review of Minnesota Feedlot Regulations
  • Evaluation was conducted by the Minnesota Office
    of the Legislative Auditor.
  • Considered many stakeholders (MPCA staff and
    management, county staff, regulated community,
    concerned citizens, and environmental groups).

28
Key Questions to Ask: Should the Effort Use Internal or External Evaluators?
  • Internal evaluators (from within the program's own office)
  • Serve as a link between program managers and top
    organization officials.
  • Use their knowledge of the organization's
    decisionmaking process, traditions, and culture.
  • Can identify and work to avoid bias in data
    collection and evaluation.
  • Have a long-term commitment to the organization.

29
Key Questions to Ask: Should the Effort Use Internal or External Evaluators? (cont.)
  • External evaluators
  • Provide a fresh, focused, and potentially more
    objective look at a program.
  • Perform effective comparative evaluations.
  • Bring a level of skepticism that agency employees
    cannot.
  • Help managers and staff focus their needs and
    precisely determine their expectations of the
    evaluation.

30
Should the effort use internal or external evaluators?
Review of Minnesota Feedlot Regulations
  • External (Minnesota Office of the Legislative
    Auditor)
  • Maintained objective tone and performed
    comparative evaluations.

31
Key Questions to Ask: What are the Criteria for Judging Success?
  • Successful evaluations are those that are
  • Relevant
  • Timely
  • Technically correct
  • Clear in the options presented to decisionmakers
  • Easy to read
  • Used to make changes and improvements, if
    necessary

32
What are the criteria for judging success?
Review of Minnesota Feedlot Regulations
  • Timely and relevant: MPCA is restructuring its
    feedlot regulatory programs and considering the
    need for additional regulations.
  • Presented clear findings and recommendations.
  • Easy to read.
  • Full detailed report and useful summary available
    on the Agency's Web site at
    www.auditor.leg.state.mn.us/ped/1000/pe9904.htm.