Title: Do we know how to create jobs?

1 Do we know how to create jobs?
Evaluation lessons from a Systematic Review
by Michael Grimm and Anna Luisa Paffhausen, University of Passau, Germany
Commissioned by KfW Entwicklungsbank Evaluation Unit
Disclaimer The views and opinions expressed in
the Systematic Review are those of the authors
and do not necessarily reflect the official
policy or position of any agency of the German
Development Cooperation.
2 Systematic reviews
...a way to overcome the problem of limited external validity of experimental (RCT) and quasi-experimental outcome/impact measurements.
OECD DAC EvalNet 12-13 February 2014, Paris
3 Search strategy and intervention categories
2,275 records identified through database searching
20 additional records identified through other sources
1,924 records screened after duplicates removed
139 studies assessed for eligibility
55 studies included, incl. 27 RCT studies and 93 impact estimates

Intervention types (not adding up to 55 because some studies cover more than one intervention type):
- Access to finance and insurance: 26 studies / 13 RCTs
- Entrepreneurship training: 20 studies / 16 RCTs
- Business development services and targeted subsidies: 10 studies / 1 RCT
- Improvements of business environment / incentives to formalise: 5 studies / 1 RCT
4 Key findings on employment
[Chart comparing employment effects across the four intervention categories: access to finance and insurance; entrepreneurship training; business development services and targeted subsidies; business environment / incentives to formalise]
5 The method bias
Studies that are based on RCTs show a lower share of significantly positive employment effects than studies that rely on quasi-experimental methods.
- Do quasi-experimental studies over-estimate employment effects due to un-eliminated biases?
- ...or are employment effects small in RCT measurements because this method is very often applied to rather small programmes in relatively poor areas?
6 Implications for development policy, evaluation and research

Policy
- It is a long way from policy inputs to employment impacts: it seems easier to achieve effects on management practices, sales or profits than employment effects.
- A major push is needed to have employment impact.
- Many of the interventions included in the review strive first of all for income stabilisation and poverty reduction, not for employment creation.
- Targeting seems to be key.
- It seems to be easier to create new businesses than to foster the growth of existing firms.

Evaluation and research
- Evidence is still sketchy, particularly for Sub-Saharan Africa and Asia.
- There is an almost complete lack of evidence on long-term effects and cost-effectiveness.
- A dilemma: for inclusion in a systematic review, studies have to meet the highest methodological quality standards. However, not all programmes are suited for (quasi-)experimental evaluation designs.
- How to deal with the dilemma: do not neglect the findings of other types of evaluations until research comes up with new quality standards that cover a wider range of programme designs.
7 Thank you!