Title: The ALNAP Meta-evaluation
The ALNAP Meta-evaluation
- Tony Beck
- Presentation for the IDEAS Conference, Delhi, 14th April 2005
Outline
- Background
- The ALNAP Quality Proforma
- Agency visits
- Findings from the agency visits
- Findings from the Quality Proforma
What is ALNAP and its meta-evaluation?
- An overview of the quality of evaluation of humanitarian action (EHA)
- Identification of strengths and weaknesses
- Recommendations for improvement across the sector and in individual agencies
Process
- Review of evaluation reports against a set of standards
- Visits to and interaction with agency evaluation offices
- Focus
- 2001-2002: Accountability
- 2003-2005: Accountability and good practice, dialogue, interaction
The ALNAP Quality Proforma
- ALNAP's meta-evaluation tool
- Draws on good practice in EHA and evaluation in general
- Revised and peer reviewed in 2004
The ALNAP Quality Proforma
- Made up of seven sections
- Terms of reference
- Methods, practice and constraints
- Contextual analysis
- Analysis of intervention
- Assessing the report
- Overall comments
The ALNAP Quality Proforma
- 4-point rating scale:
- A = good
- B = satisfactory
- C = unsatisfactory
- D = poor
- Guidance notes for meta-evaluators, e.g. "Consideration given to confidentiality and dignity?"
- Guidance: "The evaluation report should detail how the overall approach and methods will protect confidentiality and promote respect for stakeholders' dignity and self-worth."
The ALNAP Quality Proforma
- Coverage
- 2001-2005: 197 evaluations
- Process
- Two meta-evaluators
- Reconciliation of ratings
- Analysis by section (a rough illustration of the rating process follows below)
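To picture how ratings from two meta-evaluators could be reconciled and summarised by Proforma section, here is a minimal sketch. It is purely illustrative: the section names, ratings and reconciliation rule are assumptions for this example, not ALNAP's actual procedure, which the presentation does not detail.

```python
# Illustrative sketch only: the ratings, section names and reconciliation
# rule are invented; they are not the actual ALNAP procedure.
from collections import Counter

# 4-point Quality Proforma scale: A = good ... D = poor
SCALE = {"A": 4, "B": 3, "C": 2, "D": 1}
LETTER = {v: k for k, v in SCALE.items()}

def reconcile(r1: str, r2: str) -> str:
    """Hypothetical rule: identical ratings stand; differing ratings are
    averaged and rounded down (in practice the two meta-evaluators would
    discuss and agree a final rating)."""
    if r1 == r2:
        return r1
    return LETTER[(SCALE[r1] + SCALE[r2]) // 2]

# Ratings by Proforma section for one evaluation report (invented data)
evaluator_1 = {"Terms of reference": "B", "Methods": "C", "Contextual analysis": "A"}
evaluator_2 = {"Terms of reference": "B", "Methods": "B", "Contextual analysis": "B"}

reconciled = {s: reconcile(evaluator_1[s], evaluator_2[s]) for s in evaluator_1}
print(reconciled)                    # per-section ratings for this report
print(Counter(reconciled.values()))  # distribution of ratings across sections
```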
Mainstreaming of the Quality Proforma
- By ECHO, to revise terms of reference (lesson learning, protection, identification of users, prioritisation, time frame and users of recommendations, etc.)
- DEC Southern Africa evaluation (7 agency reports rated)
- Groupe URD (for planning of evaluations)
Agencies included in dialogue, 2003-04
- CAFOD, Danida, ECHO, ICRC, OCHA, OFDA, Oxfam, SC-UK, SIDA, UNHCR, and WHO
Purpose of agency dialogue
- Agency response to the initial two years of use of the Quality Proforma
- To discuss Quality Proforma ratings and agency strengths and weaknesses
- To discuss processes leading to good evaluation practice
- To discuss good practice
Findings from dialogue with evaluation managers
- Some areas affecting evaluation quality are not currently captured by the Quality Proforma, e.g.:
- Evaluation quality depends on subtle negotiations within agencies
- In most cases, evaluation funds are not being allocated for follow-up
- Follow-up to recommendations is complex
- More agencies are using tracking matrices
Findings from dialogue with evaluation managers: the EHA market
- The main constraint to improved evaluation quality is agencies' access to available evaluators with appropriate skills
- Does the EHA market need further regulation?
Findings from the Proforma
- [Seven slides of charts presenting Quality Proforma ratings; figures not reproduced in this text version]
Findings from the Proforma - 2005
- Improvement of between 10 and 30 per cent in most of the areas noted above
- Too early to disaggregate or to suggest why this improvement has taken place
- Still a number of areas of generic weakness
Conclusions: process
- Meta-evaluations need to include interaction with those being meta-evaluated
- Agency visits have been important in discussing constraints to improved evaluation quality
- Meta-evaluations need to maintain an appropriate balance between accountability functions and the need to improve evaluation quality through lesson learning
Conclusions: findings
- EHA demonstrates some areas of strength, and improvement over four years, e.g. use of most of the DAC criteria and analysis of HR
- Many evaluative areas need to be strengthened, e.g. gender, identification of use and users, participation of primary stakeholders, and transparency of the methodologies used