Planning For Evaluations Module 2 - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Planning For Evaluations Module 2
  • Essential Skills Series
  • An Introduction to Evaluation Concepts and
    Practice
  • Canadian Evaluation Society

Date_________________ Location______________
2
Workshop Agenda
Registration  8:30 - 9:00 a.m.
Evaluation Frameworks  9:00 - 10:00 a.m.
Logic Models  10:00 - 10:15 a.m.
Break  10:15 - 10:30 a.m.
Logic Models (continued)  10:30 a.m. - 12:00 p.m.  Small group exercises
Lunch  12:00 - 1:00 p.m.
Performance Measurement Plan  1:00 - 2:15 p.m.  Small group exercises
Break  2:15 - 2:30 p.m.
Performance Measurement Plan (continued)  2:30 - 3:00 p.m.
Evaluation Plan and Reporting Strategy  3:00 - 4:00 p.m.
Planning and Managing Frameworks  4:00 - 4:20 p.m.  Group discussion
Workshop Evaluation  4:20 - 4:30 p.m.
3
Workshop Objectives
  • Understanding of evaluation frameworks
  • Understanding of logic models
  • Awareness of performance measurement plans
  • A basic understanding of evaluation planning and
    reporting
  • Awareness of principles for managing frameworks

4
Section 1.
Evaluation Frameworks
5
Planning For Evaluations
  • Why Plan?
  • Increase understanding of evaluation process and
    rationale
  • Enhance relevance of evaluation for
    decision-making
  • Build support for ongoing performance measurement
    and evaluation
  • Help ensure data are available for eventual
    evaluation
  • Make the evaluation process manageable
  • Facilitate the coordination of efforts
  • When Does Planning Take Place?
  • Ideally, when the program or policy is being
    designed (evaluation framework)
  • Just before the evaluation is conducted

6
Evaluation and the Management Life-cycle
[Diagram: the management life-cycle (identify a need; initial situation; design an intervention; program/policy start-up; reconsider, redesign, expand, reduce or end) mapped against evaluation activities (needs assessment; evaluation framework; evaluability assessment/planning; baseline; ongoing performance/outcome monitoring/measurement; formative/mid-term/process evaluation; summative/outcome/impact evaluation) and outcome levels (immediate, intermediate, final)]
Source: Adapted from Birch-Jones, J., Integrating
PM and Evaluation: Bridging the Chasm, CES-NCR, 2002.
7
Preliminary Considerations for Evaluation Plan
Development
  • Why the evaluation is being conducted
  • Who the client is for the evaluation
  • The decisions the evaluation is intended to
    support
  • The evaluation questions that will provide the
    evidence to help with the decisions
  • The outline of linkages between inputs,
    activities, outputs, and outcomes for the
    proposed policy, program, initiative, or function
    (e.g., logic model, cause-and-effect, and
    implementation theories)
  • Performance indicators and availability of
    performance information
  • The methods that would be appropriate for the
    evaluation
  • Time lines
  • Resource implications
  • Source: Health Canada (July 2008). Evaluation
    Project Workplan Assessment Guide

8
Why is the Evaluation Being Conducted? And Who is
the Client?
  • What decisions are to be taken?
  • Accountability?
  • Learning?
  • Who is the client / are the clients?
  • Internal vs. External
  • Staff and operational management vs. senior
    management
  • Line vs. policy
  • Others

9
What's in a Framework?
  • Key Elements
  • Profile/background/context
  • Results logic
  • Evaluation Issues/Questions
  • Performance Measurement and Evaluation Strategy
  • A framework is a plan for an evaluation
  • Blueprint for results-based management
  • Provides information on what a program, policy or
    initiative is expected to achieve and how this
    will be demonstrated

10
Evaluation Frameworks
  • One key to a good framework is understanding the
    need for the evaluation
  • The need for an evaluation may come from
  • statutory/funding requirements
  • program sunset/renewal/redesign
  • senior management concern
  • combination
  • Rationale for evaluation will influence scope and
    focus
  • knowing management's rationale and level of
    support for the evaluation is key strategic
    information

11
Evaluation Frameworks
  • Another key to a good framework is understanding
    the program, policy, or initiative itself
  • How do you find out about a program?
  • review documents (reports, other studies, files,
    legislation)
  • conduct a review of literature
  • browse the website (if applicable)
  • examine program data (files and databases)
  • conduct interviews (managers, staff, clients)
  • observe program/staff in action
  • What do you find out?
  • how the program works
  • what's important to the program
  • what's working well and what's not
  • potential evaluation issues
  • key information upon which to base the profile


12
Evaluation Frameworks: Profiles
  • What's in a Profile?
  • The profile section
  • describes the policy, program or initiative
  • provides context and rationale
  • provides a clear picture of what the policy or
    program or initiative is intended to achieve and
    how it intends to do so
  • Concise description of
  • origin and rationale (demonstrated need for the
    program, policy or initiative)
  • resources allocated and how these will be used
  • key stakeholders including delivery partners and
    primary intended beneficiaries (clients or target
    population)
  • objectives of the program, policy or initiative
    and how these link to the organization's
    strategic outcomes

Source: Treasury Board Secretariat (2005).
Preparing and Using Results-based Management and
Accountability Frameworks.
http://www.tbs-sct.gc.ca/eval/pubs/RMAF-CGRR/guide/guide_e.pdf.
Accessed March 17, 2009.
13
Evaluation Frameworks
  • Tips for Developing a Good Profile
  • Keep in mind that you are trying to describe the
    program, not evaluate it (yet)
  • Use the key source documents available
  • Avoid cheerleading in the profile; use neutral
    wording
  • Have program staff write the profile; the
    evaluator provides advice/suggestions
  • Be prepared to clarify objectives/goals that are
    not clear
  • Revisit the profile after developing the logic
    model to ensure consistency between stated and
    actual outcomes (rhetoric versus reality)

14
Section 2.
Logic Models
15
Logic Models
  • A logic model identifies the linkages between the
    activities of a policy, program or initiative and
    the achievement of its outcomes
  • Serves as a road map, showing the chain of
    results connecting activities to the final
    outcomes and what progress looks like along the
    way
  • Logic models test whether a policy, program or
    initiative makes sense from a logical
    perspective
  • Provides a fundamental backdrop on which the
    performance measurement and evaluation plans are
    based


Source: Treasury Board Secretariat (2005).
Preparing and Using Results-based Management and
Accountability Frameworks.
http://www.tbs-sct.gc.ca/eval/pubs/RMAF-CGRR/guide/guide_e.pdf.
Accessed March 17, 2009.
16
Logic Models: Key Elements
  • Inputs: The financial and non-financial
    resources used to produce outputs and accomplish
    outcomes.
  • Activities: An operation or work process
    internal to an organisation, intended to produce
    specific outputs (e.g. products or services).
    Activities are the primary link in the chain
    through which outcomes are achieved.
  • Outputs: Direct products or services stemming
    from the activities of a policy, program, or
    initiative, and delivered to a target group or
    population. Usually things you can count.
  • Outcomes: An external consequence attributed to
    an organisation, policy, program or initiative
    that is considered significant in relation to its
    commitments. Outcomes may be described as
    immediate, intermediate or final (end), direct or
    indirect, intended or unintended. A good outcome
    statement represents the type of change wanted,
    includes reference to the target population or
    intended beneficiary, and does not include
    reference to the "how".

Source: TBS Results-based Management Lexicon,
http://www.tbs-sct.gc.ca/rma/lex-lex_e.asp.
Accessed Dec 15, 2008.
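For readers who keep program documentation in structured form, the four elements above can also be sketched as a simple data structure. The snippet below is a minimal, illustrative Python sketch; the class and field names are hypothetical, introduced only for illustration, and are not part of the CES or TBS material.

    # Minimal sketch: a logic model as a data structure (hypothetical names).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Outcome:
        statement: str   # the change wanted, and in whom (no "how")
        level: str       # "immediate", "intermediate", or "final"

    @dataclass
    class LogicModel:
        inputs: List[str] = field(default_factory=list)       # financial and non-financial resources
        activities: List[str] = field(default_factory=list)   # internal work processes
        outputs: List[str] = field(default_factory=list)      # direct, countable products or services
        outcomes: List[Outcome] = field(default_factory=list) # external consequences, by level

    # Illustrative instance for a hypothetical smoking cessation program
    model = LogicModel(
        inputs=["funding", "trained facilitators"],
        activities=["deliver cessation classes"],
        outputs=["classes taught", "participants served"],
        outcomes=[
            Outcome("participants stop smoking", "immediate"),
            Outcome("reduced smoking-related illness", "final"),
        ],
    )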
17
Logic Models: Flow Chart Example
[Flow chart graphic not reproduced in transcript]
18
Logic Models: Flow Chart Example (2)
[Flow chart graphic not reproduced in transcript]
Source: Adapted from An Evaluation Framework
for Community Health Programs, The Center for
the Advancement of Community Based Public Health,
2000.
19
Logic Models: Results Chain
[Diagram: inputs (resources), activities, and outputs sit within the area of control (internal to the organization) and relate to efficiency; immediate outcomes (direct), intermediate outcomes (indirect), and the ultimate outcome sit within the area of influence (external to the organization), are subject to external factors, and relate to effectiveness]
Source: Treasury Board Secretariat, Results-based
Management and Accountability Framework Guidance,
2001. www.tbs-sct.gc.ca/eval/pubs/RMAF-CGRR/rmafcgrr_e.asp
20
A Logic Model With Assumptions and Factors
Source: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html.
Retrieved November 14, 2008, from the University
of Wisconsin-Extension website.
21
Program Model Components (Recap)
INPUTS: Resources placed into the program. Examples: money, staff, operational expenses, capital assets.
ACTIVITIES: Core program tasks. Examples: teaching, presentations, counselling, mentoring, treatment.
OUTPUTS: Products and services of the program. Examples: number of materials distributed, classes taught, sessions conducted, hours of service delivered; number of participants (reach).
OUTCOMES (Short-Term → Long-Term): Impact or effectiveness of the program. Examples: change in knowledge/awareness, skills, attitudes/opinions → behaviour → condition/status (morbidity, mortality).
Source: Adapted from Measuring Program Outcomes:
A Practical Approach. United Way of America, 1996.
22
Benefits of the Logic Model
  • Helps with understanding overall structure,
    function of program as well as rationale behind
    activities
  • Helps to ensure that program activities and
    intended results correspond
  • Helps identify key questions for the evaluation
  • Helps communicate the elements of the program to
    policy makers, staff, external funding agencies,
    media, and colleagues
  • Helps to reveal where steps in the program break
    down


Source: Adapted from K. Farell et al.,
Evaluation made very easy, accessible, and
logical, 2002.
23
Limitations of the Logic Model
  • Initially time consuming (days/weeks/months)
  • Requires patience
  • Does not always capture all aspects of the
    program (e.g., program costs may not be included
    in the model)
  • May not represent all external influences and
    factors


Source: Adapted from K. Farell et al.,
Evaluation made very easy, accessible, and
logical, 2002.
24
Characteristics Of Good Logic Models
  • Activities and outputs are distinct from
    outcomes; the general rule is that if you control
    it, it's an activity/output; if you can only
    influence it, then it's an outcome
  • Outputs demonstrate that you are busy
  • Outcomes have a distinct who and what: what
    change in whom?
  • Outcomes demonstrate that you are making a
    difference
  • Outcome statements are simply worded, contain
    only one outcome, and no "hows"
  • The linkages between outputs and outcomes are
    clear
  • Outcomes are not tied to particular timeframes
  • No more than 2 or 3 final outcomes
  • Linked to strategic outcomes and program goals
    but reflect reality


Sources: US GPRA Guidance 1999; Mayne (various);
Montague, Focusing on Inputs, Outputs and
Outcomes, Canadian Journal of Program Evaluation,
2000.
25
Outcome Examples
Type of Change - Illustration
Change in circumstances - Children safely reunited with their families of origin from foster care
Change in status - Unemployed to employed
Change in behaviour - Truants will regularly attend school
Change in functioning - Increased self-care; getting to work on time
Change in attitude - Greater self-respect
Change in knowledge - Understand the needs and capabilities of children at different ages
Change in skills - Increased reading level; able to parent appropriately
Maintenance - Continue to live safely at home (e.g., the elderly)
Prevention - Teenagers will not use drugs
Source: Patton, M.Q., Utilization-Focused
Evaluation: The New Century Text, 1997.
26
Thinking About What to Include Where
  • Some program components are difficult to classify
    as activities, outputs, or outcomes. These
    examples provide general guidelines (although
    exceptions may be appropriate)
  • recruiting and training staff and volunteers,
    purchasing or upgrading equipment, and various
    support and maintenance activities are usually
    foundational (and not on a logic model)
  • number of participants served is sometimes an
    output, but if participation is discretionary,
    could be considered an early outcome
  • participant satisfaction is usually an early
    outcome


27
Where to Start?
  • There is no single way to create a logic model.
    Where you start often depends on the
    developmental stage of the program
  • Strive for simplicity and don't be
    over-inclusive. Don't include all the
    implementation details. Try to fit the whole
    logic model on one page.
  • Discuss the logic model with staff involved at
    all levels in the program (or involve them in a
    logic model development workshop)
  • Post-it notes are a great tool for logic model
    development


Source: Adapted from Porteous, N.L. et al.,
Program Evaluation Tool Kit: A Blueprint for
Public Health Management, 1997.
28
Small Group Exercise 1
  • Educational grants program (grants provided to
    students for post-secondary education, targeting
    low-income population)
  • What would be some outputs?
  • What would be some immediate outcomes?
  • What would be some intermediate outcomes?
  • What would be some final outcomes?


29
Small Group Exercise 2: Building an Evaluation
Framework
  • Clarifying the Program's Logic and Theory of
    Change
  • The problem gambling needs assessment found that
    gambling has become a very important pastime for
    young adults aged 18-24 years. Analysis of
    Rockwood's databases showed a steady increase in
    the number of young adults with gambling
    problems. Focus groups painted a bleak picture:
    although casinos, lotteries and mass media are
    aggressively marketing gambling to young adults,
    the message has not gotten out to young adults
    about the dangers of problem gambling and the
    importance of seeking prompt treatment. In fact,
    a recent study has found twice as many young
    adults aged 18 to 24 developed severe or moderate
    gambling problems as the overall population.
    Other findings, such as the fourfold rise between
    2001 and 2005 of young adults now playing online
    poker, indicate this trend is likely to continue
    or increase.
  • As one response, the Rockwood Program Planning
    Team decided to offer a Gambling Prevention
    Program for young adults. The prevention program
    intends to make young adults aware of dangerous
    gambling behaviour (such as skipping class or
    work to gamble) in themselves or their friends,
    help them to see gambling as a form of occasional
    entertainment rather than a glamorous and
    lucrative lifestyle, minimize serious financial
    problems, and encourage young adults with
    gambling problems to get help when necessary.
  • Program activities include staging gambling
    awareness events in schools featuring
    celebrities, placing edgy public service
    announcements in the mass media, displaying
    posters in malls and on mass transit that
    discourage gambling, and having booths about
    problem gambling at health fairs and staffing
    them with trained peers who will provide advice
    in a non-threatening and non-judgmental way.

30
Small Group Exercise 2
  • Using case study material
  • Select a logic model template
  • Identify some key activities and outputs,
    immediate outcomes, intermediate outcomes and
    final outcomes
  • Present them in a logic model format
  • Spend a few minutes discussing what you found
    challenging when developing your logic model


31
Small Group Exercise 2 Worksheet
32
Section 3.
Performance Measurement Plan
33
Performance Measurement Plan
  • A performance measurement plan identifies
  • What needs to be measured on an ongoing basis
  • How
  • How often, and
  • What is the cost of the performance measurement
    plan (where possible)
  • Use performance monitoring to produce yearly
    performance reports of program results for
    funders, boards, senior managers, staff, and
    public
  • Performance monitoring systems are only tools.
    When properly developed and used, they can reveal
    problems, point to solutions and are a check on
    the effectiveness of solutions once implemented
  • Performance measurement plans help organizations
    to avoid the tendency to concentrate on the
    things that are easiest to measure, such as "the
    number of people trained" (service process
    measure) rather than "the number of people with
    jobs" (outcome measure)

34
Performance Measurement Plan (2)
  • Performance indicators show whether an output was
    produced or a specific outcome was achieved
    (success)
  • Describes specific, observable, measurable
    characteristics or changes that represent
    achievement of an output or outcome
  • Can be quantitative or qualitative (mix of both
    is good)
  • Indicators include
  • number/percent of ...
  • incidence of ...
  • proportion of

Source: Hatry, Harry P. (1999). Performance
Measurement: Getting Results. The Urban Institute
Press, Washington, D.C.
35
Examples Of Indicators
Smoking Cessation Program
  • Indicators
  • Number and percent of participants who report
    that they have quit smoking by the end of the
    class
  • Number and percent of participants who have not
    relapsed 6 months after program completion
  • Outcome
  • Participants stop smoking

Information and Referral Service
  • Outcome
  • Callers access services to which they are
    referred or about which they are given information
  • Indicators
  • Number and percentage of community agencies that
    report an increase in new participants who came
    to their agency as a result of a call to the
    information and referral line

Source: Outcome Measurement: Showing Results in
the Nonprofit Sector, United Way of America, 1999.
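For teams that track indicator data in spreadsheets or scripts, the "number and percent" indicators above can be computed mechanically once the underlying records exist. The sketch below is a minimal, illustrative Python example using hypothetical participant records; it is not part of the United Way material.

    # Minimal sketch: computing "number and percent of participants" indicators
    # from hypothetical records (illustrative data only).
    participants = [
        {"id": 1, "quit_at_end": True,  "relapse_free_6mo": True},
        {"id": 2, "quit_at_end": True,  "relapse_free_6mo": False},
        {"id": 3, "quit_at_end": False, "relapse_free_6mo": False},
    ]

    def number_and_percent(records, flag):
        """Return the count and percentage of records where `flag` is True."""
        count = sum(1 for r in records if r[flag])
        percent = 100.0 * count / len(records) if records else 0.0
        return count, percent

    quit_n, quit_pct = number_and_percent(participants, "quit_at_end")
    stay_n, stay_pct = number_and_percent(participants, "relapse_free_6mo")
    print(f"Quit smoking by end of class: {quit_n} ({quit_pct:.0f}%)")
    print(f"Not relapsed 6 months after completion: {stay_n} ({stay_pct:.0f}%)")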
36
Examples Of Indicators (2)
37
Examples Of Indicators (3)
  • Number and percent of newborns weighing at least
    5.5 pounds and scoring 7 or above on APGAR scale
  • Number and percent of teen mothers who graduate
    from high school over a 4-year period
  • Percent of participants who recall content of
    brochure, posters, or presentations
  • Number of action plans developed by participants
  • Number and percent of participants who have not
    relapsed six months after program completion
  • Number and percent of participants who
    demonstrate increase in ability to read, write,
    and speak English by end of program
  • Number and percent of youth who return home
  • Number and percent of teen mothers using a
    recommended form of birth control
  • Number and percent of legislators who voted in
    support of your position before and after an
    advocacy campaign

Source: Outcome Measurement: Showing Results in
the Nonprofit Sector, United Way of America, 1999.
38
Selecting Indicators
  • Things to think about
  • Useful to decision makers
  • Decisions to be based on results information
  • Understandable to everyone
  • Clearly defined
  • Measurable
  • Use proxy indicators when necessary
  • Obtain baseline data and set targets (where
    possible)
  • Involve stakeholders
  • Use quantitative and qualitative indicators as
    appropriate
  • Try to limit the number of indicators

39
Criteria For Good Performance Indicators
  • Relevance
  • Importance
  • Understandability
  • Program influence or control over the outcome
  • Feasibility
  • Cost of collecting the indicator data
  • Uniqueness
  • Manipulability
  • Comprehensiveness

Source: Hatry, Harry P. (1999). Performance
Measurement: Getting Results. The Urban Institute
Press, Washington, D.C.
40
Measurement Through Monitoring
  • A monitoring system is typically key for
    performance measurement
  • Key monitoring system steps
  • Step 1: Obtain Top-level Support
  • Step 2: Develop Logic Model
  • Step 3: Select Measures
  • Step 4: Decide When to Measure
  • Step 5: Select a Data Collection Method
  • Step 6: Implement the Plan
  • Step 7: Analyze Data
  • Step 8: Use Data

41
Strengths and Weaknesses of Monitoring Systems
  • STRENGTHS
  • Provides information about process and key
    results
  • Easy to understand and use
  • Improves accountability for effective programs
  • Improves program management
  • Provides early warning about potential problems
    with achievement of objectives
  • Facilitates evaluation
  • WEAKNESSES
  • Depends on setting realistic and appropriate
    measures
  • Requires good grasp of client outcomes
  • Can be expensive and time-consuming to collect
    data and to maintain the system
  • Does not establish causal relationship between
    services and outcomes (Describes rather than
    explains)
  • Difficult to compare programs in different
    settings, since indicators provide little
    contextual information

42
Small Group Exercise 3
Gambling Education Program
For each performance indicator listed, have your
group identify whether it is a PROCESS or OUTCOME
indicator.
1. Thirty students attend the gambling education workshops.  Process _____  Outcome _____
2. Six group workshops are conducted.  Process _____  Outcome _____
3. Students' awareness of problem gambling increases.  Process _____  Outcome _____
4. Students behave in a way that reduces the risk of developing gambling problems.  Process _____  Outcome _____
5. Students participate in role plays and group discussions.  Process _____  Outcome _____
43
Small Group Exercise 4
  • Thinking about your logic model components for an
    educational grants program for students
  • For the immediate, intermediate and final
    outcomes developed in the previous exercise,
    develop at least 1 indicator for each
  • Discuss how this information could be used
  • Spend a few minutes discussing what you found
    challenging about identifying indicators

44
Small Group Exercise 4 Worksheet
Educational Grants Program
  • Outcome
  • Indicators

45
Small Group Exercise 5
  • Using the case study material and your logic
    model
  • For the immediate and intermediate outcomes
    developed in the previous exercise, develop at
    least one indicator for each outcome
  • Discuss how this information could be used
  • Spend a few minutes discussing what you found
    challenging about identifying indicators

46
Small Group Exercise 5 Worksheet
Case
  • Outcome
  • Indicators

47
Performance Measurement Plan
Template columns: (1) Expected Outputs and Outcomes; (2) Indicator; (3) Baseline Measure; (4) Data Source; (5) Collection Method; (6) Responsibility for Collection; (7) Timing / Frequency of Measurement (Ongoing Measurement, Formative Evaluation, Summative Evaluation).
Template rows: Outputs (Output 1, Output 2, Output 3, ... Output x); Outcomes (Immediate Outcome 1, Immediate Outcome 2, ... Immediate Outcome x; Intermediate Outcome 1, ... Intermediate Outcome x; Final Outcome 1, Final Outcome 2); Estimated Performance Measurement (PM) Costs.
Source: Health Canada (2006). Results-based
Management and Accountability Framework
Assessment Guide.
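For teams that maintain the plan in a spreadsheet or script rather than a document, one filled-in row of the template above might look like the following minimal Python sketch; all values and key names are hypothetical, for illustration only.

    # Minimal sketch: one hypothetical row of a performance measurement plan,
    # keyed by the template columns above (illustrative values only).
    pm_plan_row = {
        "expected_output_or_outcome": "Immediate Outcome 1: participants stop smoking",
        "indicator": "Number and percent of participants who report quitting by end of class",
        "baseline_measure": "Pre-program survey result",
        "data_source": "Participant exit survey",
        "collection_method": "Self-administered questionnaire",
        "responsibility_for_collection": "Program coordinator",
        "timing_frequency": {
            "ongoing_measurement": "End of each class cycle",
            "formative_evaluation": "Mid-term",
            "summative_evaluation": "Final year",
        },
    }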
48
Section 4.
Evaluation Plan
49
Evaluation Plan
  • The evaluation plan sets out a strategy for a
    periodic, in-depth look at how well a policy,
    program or initiative is doing
  • It identifies
  • the key evaluation issues/questions that should
    be addressed
  • the key pieces of information that need to be
    collected to answer the evaluation questions
  • the data collection plan
  • potential costs for an evaluation
  • suggested timing for evaluation
  • reporting requirements
  • Builds on a performance measurement plan
  • adds context and depth to information being
    collected through ongoing performance
    measurement.

50
Evaluation Plan: Issues and Questions
  • Evaluation questions usually relate to 3 kinds of
    issues
  • Relevance
  • Does the policy, program or initiative continue
    to be consistent with organizational priorities,
    and does it realistically address an actual need?
  • Success (Effectiveness)
  • Is the policy, program or initiative effective in
    meeting its intended outcomes, within budget and
    without unwanted negative outcomes? Is the
    policy, program or initiative making progress
    toward the achievement of the final outcomes?
  • Cost-effectiveness and alternatives
  • Are the most appropriate and efficient means
    being used to achieve outcomes, relative to
    alternative design and delivery approaches?
  • Evaluation also addresses design and delivery
    issues

51
Evaluation Plan: Issues and Questions (cont'd)
  • Some issues and questions may be more relevant
    during an early evaluation (i.e., formative
    evaluation), some later (i.e., summative
    evaluation), and others may be relevant to both
  • The key to a good evaluation plan lies in linking
    the evaluation issues and questions back to the
    program rationale (the need) and logic model
    (what success will look like)
  • Mid-term evaluation usually looks at
    implementation and delivery issues (for a new
    program), as well as early outcomes, and the
    adequacy of the performance measurement plan
  • Data from a performance measurement plan can
    contribute to questions about design and delivery
    as well as early outcomes
  • Summative evaluation usually looks at performance
    information on immediate and intermediate
    outcomes, as well as their contribution to final
    outcomes
  • Summative evaluations may also look at relevance
    and cost-effectiveness and alternatives, as well
    as any other important issues that are identified

52
Evaluation Plan: Issues and Questions (cont'd)
  • Evaluation issues are often identified by program
    staff and stakeholders
  • Interviews with a small number of program staff,
    senior management, partner organizations, and
    clients is an excellent way to identify potential
    evaluation issues
  • At this point, you will have a long list of
    potential evaluation issues that will need to
    be prioritized, with any "nice to know"/less
    important issues eliminated

53
Evaluation Plan: Data Requirements
  • Based on the evaluation issues/questions, the
    evaluation data requirements can be identified
  • these are the key pieces of information that need
    to be collected to answer the issue/question
  • if derived from the logic model, might already
    have a measurement plan (e.g., for issues related
    to success)
  • for new issues, will need to identify indicators
    and a data collection strategy
  • Once developed, can select the best set of
    indicators, based on consideration of
  • reliability, validity and credibility of the
    indicator
  • cost-effectiveness (in terms of cost to collect
    and to process)
  • whether it is directly linked to the evaluation
    question
  • Data collection strategy identifies
  • indicator
  • data source
  • method of data collection
  • frequency of data collection
  • potential costs for the evaluation
  • Similar to performance measurement plan but more
    comprehensive

54
Evaluation Information Summary Table (Example)
Template columns: (1) Evaluation Issues and Questions; (2) Indicator; (3) Baseline Measure; (4) Data Source; (5) Collection Method; (6) Responsibility for Collection; (7) Timing / Frequency of Measurement (Ongoing Performance Measurement, Formative Evaluation, Summative Evaluation).
Template rows: Relevance (Question 1, Question 2); Implementation (Question 1, ... Question x); Effectiveness (Question 1, Question 2, ... Question x); Cost Effectiveness (Question 1, Question 2); Expenditure Review Committee Questions / Issues (Question 1, ... Question x); Estimated Costs.
Source: Health Canada (2008). Evaluation Project
Workplan Assessment Guide.
55
Section 5.
Reporting Strategy
56
Reporting Strategy
  • Identifies what plans are in place to
    systematically report on the results of ongoing
    performance measurement and evaluation
  • Includes information on what, where and when, for
    example
  • Baseline measurement - (typically no report)
  • Ongoing performance measurement - Annual Report
  • Formative evaluation study - Mid-term Report
  • Summative evaluation study - Final Evaluation
    Report
  • Think about who you are reporting to
  • Senior decision-makers, central agencies, funders
  • Program managers
  • Stakeholders
  • Clients/communities
  • May also be useful to incorporate other reporting
    requirements which may draw on the same evidence
    (e.g., federal government Departmental
    Performance Reports and Reports on Plans and
    Priorities)

57
Small Group Exercise 6
  • Considering your case, logic model and indicators
    as well as the typical issues shown both in
    this session and in module 1 (slides 10, 36 and
    37)
  • What would be some issues, indicators, sources
    and methods? (Address other elements of the
    Evaluation Information Summary Table if you wish.)

58
Section 6.
Planning and Managing Frameworks
59
Managing Frameworks: Lessons Learned
  • Allow for some months of elapsed time between
    start and finish of an evaluation framework and
    plan
  • Managers might not see the difference between
    performance measurement and evaluation
  • Past experience will influence their perception
    of what it is/entails
  • results reporting, balanced score card, deputy
    dashboard, etc.
  • traditional frameworks and evaluations
  • May need to adjust perceptions to new way of
    thinking and approach to performance measurement
    and evaluation
  • Need to clarify terminology (early and often)

60
Managing Frameworks: Lessons Learned (2)
  • Need to clearly define roles and responsibilities
    early on
  • Decide on management structure for the framework
    and plan
  • Working Group/Steering or Advisory Committee
  • internal/external
  • who is lead?
  • who reviews what?
  • who signs off/approves?
  • The simpler the co-ordinating mechanism, the
    better

61
Managing Frameworks: Lessons Learned (3)
  • Logic model is the key to bridging the chasm
    between activities and reporting on outcomes
  • Facilitated inclusive approach to logic model
    development is key
  • Takes longer but critical to understanding and
    buy-in
  • Requires a unique approach - developmental
    facilitation
  • Can involve staff, Board, partners, funders,
    clients, users, regulatees
  • OK to let them know that we are all learning this
    together
  • Involve them early on in the process
  • Partners and co-deliverers need to understand and
    agree on performance logic and measures before
    they can provide results information
  • The framework doesn't have to be perfect; it will
    evolve as program capacity grows
  • Need to review and adjust performance measurement
    system over time
  • Implementation of a framework is key; otherwise,
    it is back to outputs, opinions and anecdotes for
    the evaluation
  • Development of a framework is only the first step
    in a lengthy process

62
Managing Frameworks: Lessons Learned (Group
Discussions)
  • Do you have any lessons learned to share around
    managing evaluation frameworks?