Title: World Bank Independent Evaluation Group
Slide 1: World Bank Independent Evaluation Group
- How to Increase the Utilization of Evaluations
- Michael Bamberger
Slide 2: Session outline
- Defining and measuring evaluation utilization
- Reasons why many evaluations are under-utilized
- Examples of evaluation utilization: the World Bank Influential Evaluations study
- Ways to strengthen utilization
- If time permits: further discussion on presenting the message
Slide 3: 1. Defining and measuring evaluation utilization
Slide 4: Defining evaluation outcomes and impacts
- Use
  - How evaluation findings are utilized by policymakers, managers and others
- Influence
  - How the evaluation influenced decisions and actions
- Consequences
  - How the process of conducting the evaluation, its findings and its recommendations affected the agencies involved and the target populations
  - Consequences can be positive or negative, and expected or unanticipated
Slide 5: Measuring evaluation outcomes and impacts
- Changes in individuals'
  - Knowledge
  - Attitudes
  - Behavior
- Changes in organizational behavior
- Changes in program design or implementation
- Changes in policies and planning
- Decisions on project continuation, expansion and funding
Slide 6: Measurement issues
- Time horizon
- Intensity
- Reporting bias
  - Many agencies do not acknowledge they have been influenced
- Attribution
  - How do we know the observed changes were due to the evaluation and not to other, unrelated factors?
Slide 7: Attribution analysis
- How do we know if observed changes were due to the evaluation and not to other influences?
  - Stakeholder interviews
  - Surveys (pretest/posttest or posttest only)
  - Analysis of planning and policy documents for evidence of evaluation influence
  - Analysis of mass media
  - Key informants
Slide 8: Examples of attribution methodology used in Influential Evaluations
- See Influential Evaluations: Detailed Case Studies, pp. 69-72
Slide 9: Attribution analysis framework
- 1. Identify potential impacts (effects)
- 2. Assess whether there is a plausible case for attributing part of the effects to the evaluation
- 3. Estimate what proportion of the effects can be attributed to the evaluation
- Triangulation is used across all three steps
Slide 10: A. Comparison of two user surveys / stakeholder opinion survey
Bangalore Citizens Report Cards
- A comparison of sample surveys of service users (in 1993 and 1999) found reported improvement of services (the potential impact)
- A sample of 35 public service agencies, policy-makers, mass media and civil society representatives corroborated the surveys' influence on the improvements
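The 1993/1999 pretest-posttest comparison above can be checked with a simple two-proportion z-test. A minimal sketch in Python follows; all counts and sample sizes are invented purely for illustration and are not figures from the Bangalore surveys.

```python
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Z statistic for the difference between two independent proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: users rating services "satisfactory" in each wave.
z = two_proportion_z(hits_a=120, n_a=400,   # earlier wave: 30% satisfied
                     hits_b=210, n_b=420)   # later wave: 50% satisfied
print(f"z = {z:.2f}")  # z above 1.96 means the change is unlikely to be chance
```

Note that a significant difference only establishes the potential impact; it is the corroborating stakeholder survey described on the slide that links the change to the evaluation itself.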
Slide 11: B. Testimonials from stakeholders
Bulgaria Metallurgical Project and Indonesia Village Water Supply
- Clients (the metallurgical company, the Development Bank, the AUSAID Project Dept, the Indonesian water agency) were asked to send testimonials by letter or e-mail confirming the influence of the evaluation
- In Bulgaria, clients were asked to confirm the validity of the benefit projections and that the benefits were attributable to the evaluation
Slide 12: C. Expert assessment / paper trail
India Employment Assurance Program
- It was considered too bureaucratically difficult to solicit the Government's views on the effectiveness of a government agency (the Evaluation Organization)
- Opinions of the Bank Resident Mission and other experts were solicited on the influence of the evaluation and the credibility of the estimates of cost savings and employment generation
- Paper trail: specific references in the Five Year Plan and follow-up sector planning documents to how the evaluation was used
Slide 13: D. Logical deduction from secondary sources
Public expenditure tracking study (education), Uganda
- A follow-up PETS study estimated increased funds utilization (the potential evaluation impact)
- Extensive coverage of the report in the media
- Government documents show how the findings were used
- Reports show how community groups use the budget information posted in schools and the media
Slide 14: 2. Reasons why evaluations are under-utilized
Slide 15: 1. Lack of ownership
- Evaluation focus and design are determined by donor agencies or outside experts with little real input from the client
- The "goals definition game" alienates clients
- Limited consultation with, and feedback to, clients
- The evaluation is seen as a threat
Slide 16: 2. Timing
- The evaluation findings are presented:
  - too late to be useful, or
  - too soon, before policymakers or managers have started to focus on the issues discussed in the report
Slide 17: 3. Poor communication between evaluator and client
- Clients are not kept in the loop
- Clients may not like the evaluator's communication style
- Language problems
- Conceptual problems
- The "objectivity" paradigm limits contact and communication between evaluator and client
- The client does not share information with other stakeholders
Slide 18: 4. Lack of flexibility and responsiveness to client needs
- Rigid designs that cannot be adapted to client needs or changing circumstances
- Quasi-experimental designs that cannot adapt indicators and data collection methods to changing circumstances
- The "objective" stance of the evaluator limits interaction with clients
- Timing too early or too late
(continued on next slide)
Slide 19: (continued)
- Finance ministries try to force evaluation indicators and focus to correspond to budget line items
- National evaluation systems sometimes introduce top-down, uniform evaluation/reporting systems that do not reflect the reality of different agencies
Slide 20: 5. Resource constraints
- Budget constraints affect:
  - Data collection
  - Data analysis
  - Bringing staff together to participate in the evaluation process
  - Translation into local languages
  - Report preparation and dissemination
- Limited local expertise
Slide 21: 6. Time constraints
- Too many demands on clients' and stakeholders' time
- The evaluators do not have enough time to:
  - Consult with clients during evaluation planning
  - Design and implement the evaluation properly
  - Discuss the draft report with clients
  - Organize effective dissemination meetings
Slide 22: 7. Relevance
- The evaluation does not address the priority information needs of clients
- Much of the information is not considered useful
- The information is not analyzed and presented in the way that clients want:
  - Too detailed
  - Too general
Slide 23: Factors external to the evaluation affecting utilization
- Problems with the evaluation system
- Dissemination mechanisms
- Political ethics (attitudes to transparency)
- Clients' lack of long-term vision
- Government perception of evaluation
Slide 24: 3. Examples of evaluation utilization: the World Bank Influential Evaluations study
Slide 25: How are evaluations used? When are they influential?
- The evaluation is never the only factor: how does the evaluation complement other sources of information and advice?
- Political cover for difficult decisions
- Identifying "winners" and "losers" and showing how negative impacts can be mitigated
- Credibility and perceived independence of the evaluator may be critical
(continued on next slide)
Slide 26: (continued)
- The "big picture": helping decision-makers understand the influence of the social, economic and political context
- Helping managers understand how political and other pressures limit project access to certain groups
- Providing new knowledge or understanding
- A catalytic function: bringing people together or forcing action
Slide 27: Types of influence that evaluations can have
- India Employment Assurance
  - A broader interagency perspective helped identify duplications and potential cost savings
  - The Evaluation Office had high-level access to the Planning Commission
- India Citizen Report Cards
  - Alerting management to service problems and providing quantitative data to civil society pressure groups
- Indonesia Village Water Supply
  - Making policy-makers aware of the importance of gender issues and participatory approaches
(continued on next slide)
Slide 28: (continued)
- Large Dams
  - Created political space for introducing new social and environmental criteria for evaluating dams
  - Launched a dialogue that facilitated the creation of the World Commission on Dams
- Pakistan Wheat Flour Ration Shops
  - Political cover for a sensitive political decision
  - Showed how to mitigate negative consequences
(continued on next slide)
Slide 29: (continued)
- Uganda Education Expenditures
  - Developed a methodology to document what everyone suspected (expenditure wastage)
  - Provided documentation to civil society to press for improvements
- Bulgaria Metallurgical Project
  - Alerted the borrowers and the Development Bank to new EU legislation
  - Showed how to avoid fines
  - Showed how to advance the launch of mineral production
(continued on next slide)
Slide 30: (continued)
- China Forestry Policy
  - Legitimized questioning of the logging ban
  - Promoted more in-depth policy research
  - Facilitated the creation of the Forestry Task Force
Slide 31: What difference did the evaluation make?
- Major cost savings (India, Bulgaria, Pakistan)
- Increased financial benefits (Uganda, Bulgaria)
- Forced action (Bangalore, Uganda)
- Strengthened gender-sensitive and participatory planning and management of water (Indonesia)
- Introduced social assessment of dams but discouraged future investments (Dams)
(continued on next slide)
Slide 32: (continued)
- Increased efficiency of service delivery (India, Bangalore, Indonesia)
- Facilitated creation of important policy agencies (Dams, China)
Slide 33: 4. Ways to strengthen evaluation utilization
Slide 34: Ways to strengthen evaluation utilization
- 1. Deciding what to evaluate
- 2. Timing
  - When to start
  - When to present the findings
- 3. Deciding how to evaluate
  - Choosing the right methodology
- 4. Ensuring effective buy-in
  - Stakeholder analysis and building alliances
  - The importance of the scoping phase
  - Formative evaluation strategies
  - Constant communication with clients
Slide 35: (continued)
- 5. Evaluation capacity building
- 6. Deciding what to say (see next section)
- 7. Deciding how to say it (see following section)
  - Effective communication strategies
- 8. Developing a follow-up action plan
Slide 36: 6. Deciding what to say
- Technical level
- Amount of detail
- Focus on a few key messages
- Target messages to key audiences
Slide 37: Sources of lessons about a program
- Evaluation findings
- Experience of practitioners
- Feedback from program participants
- Expert opinion
- Cross-discipline connections and patterns
- Strength of linkages to outcomes
Slide 38: Identifying evaluation lessons and generating meaning
- Tactics for generating meaning (Handout 1)
- Identifying high quality lessons (Handout 2)
Slide 39: 7. Presenting the message
- Communication style and choice of media (Handout 3)
- Focus the report on intended users
- Quantitative and qualitative communication styles (Handout 4)
- The client's preferred communication style (Handout 5)
- Making claims
- The importance of graphics
- Who receives the evaluation report and who is invited to comment
Slide 40: If time permits ...
- More detailed discussion on presenting the message
Slide 41: Presenting the message
- Communication style and choice of media
- Utilization-focused reporting principles
- Quantitative and qualitative communication styles
- The client's preferred communication style
- Rules for written reports
- Making claims
- The importance of graphics
- Who receives the report and who is invited to comment
Slide 42: 1. Communication style and choice of media
- Continuous communication throughout the evaluation
- No surprises
- Educating the client in how to think about evaluation
- Short versus long reports
- Combining verbal and written presentations
- Slide shows
- Informal versus formal
Slide 43: Communication style (continued)
- Alternative media
  - Internet
  - Video
  - Theater
  - Dance
  - Murals/paintings
  - Posters
  - Signs
Slide 44: Communication style (continued)
- Using the right language
  - Making sure the written report is available in stakeholder languages
  - Economical ways to translate
- Personal testimony
- Project visits
- Working with the mass media
Slide 45: 2. Utilization-focused reporting principles
- Be intentional and purposeful about reporting
- Focus reports on primary intended users
- Avoid surprising stakeholders
- Think positive about negatives
- Distinguish dissemination from use
- Source: Patton (1997), pp. 330-337
Slide 46: 3. Quantitative and qualitative communication styles
- Tables versus text
- Types of evidence
  - Statistical analysis
  - Case studies
  - Stories
  - Photos
  - Boxes
  - Site visits
  - Testimonials
Slide 47: 4. The client's preferred communication style
- Written and/or verbal
- Quantitative/qualitative
- Multiple presentations to different audiences
Slide 48: Understanding the communication style of the decision-maker
- High intellectual level: Robert McNamara, Elliot Richardson
  - "These two individuals were perfectly capable of understanding the most complex issues and absorbing details ... absorbing the complexity, fully considering it in their own minds."
  - Lawrence Lynn, Professor of Public Administration, the Kennedy School. Source: Patton (1997), pp. 58-9
Slide 49: Decision-makers' communication styles (continued)
- Short personal stories: Ronald Reagan
  - Preferred Reader's Digest-type personal stories and anecdotes
Slide 50: Decision-makers' communication styles (continued)
- Political animals: Joseph Califano, U.S. Secretary of Health, Education and Welfare
  - "Califano is a political animal and has a relatively short attention span ... highly intelligent, but an action-oriented person. The problem that his political advisers had is that they tried to educate him in the classical rational way, without reference to any political priorities ... or presenting alternatives that would appeal to a political, action-oriented individual."
  - Source: Patton (1997), p. 59
Slide 51: 5. Rules for written reports
- Show, don't tell
- Be brief and clear
- Build the story with paragraphs
- Write clear sentences
- Omit needless words
- Avoid jargon
- Source: Vaughan and Buss
Slide 52: Reporting results
- Arranging data for ease of interpretation
- Focusing the analysis
- Simplicity in data presentation
- Interpretation and judgment
- Making claims (see next slide)
- Useful recommendations
- A futures perspective on recommendations
- Source: Patton (1997), pp. 321-4
Slide 53: 6. Making claims
- Claims can be classified along two dimensions:
  - Importance of claims: major or minor
  - Rigor of claims: strong or weak
Slide 54: Characteristics of claims of major importance
- Involve having an impact
- Deal with an important social problem
- Affect large numbers of people
- Save money and/or time
- Enhance quality
- Show something can really be done about a problem
- Involve a model or approach that could be replicated
Slide 55: Characteristics of strong claims
- Valid, believable evidence
- Data cover a long period of time
- The claim is about a clear intervention with solid documentation
- The claim is about clearly specified outcomes and impacts
- Includes comparisons: to program goals, over time, with other groups, and with general trends or norms
Slide 56: Strong claims (continued)
- Evidence for claims includes replication:
  - More than one site
  - More than one staff member obtains the outcomes
  - Same results from different cohort groups
  - Different programs obtained comparable results using the approach
- Based on more than one kind of evidence
- Clear, logical linkages between the intervention and the claimed outcomes
- Evaluators independent of staff
Slide 57: 7. The importance of graphics
- Line graphs: trends over time
- Pie charts: parts of a whole
- Cluster bar charts: comparing several items
- Combination charts: bar chart plus trend line
- Don't overload graphs
Slide 58: 8. Who receives the evaluation report and who is invited to comment?
- Restricted to a few key decision-makers and managers
- Participatory presentation/consultation with stakeholders
- Public comment
  - Public hearings
  - Via civil society
  - Copies available to the press