1
Impact evaluation: The quantitative methods with applications
  • Evaluating the Impact of Projects and Programs
  • Beijing, China, April 10-14, 2006
  • Shahid Khandker, WBI

2
5 Non-experimental Methods to Construct Data on the Counterfactual
  • Matching
  • Propensity-score matching
  • Difference-in-difference
  • Matched double difference
  • Instrumental variables

3
1. Matching
  • Matched comparators identify the counterfactual.
  • Match participants to non-participants from a larger survey.
  • The matches are chosen on the basis of similarities in observed characteristics (a formal statement follows below).
  • This assumes no selection bias based on unobservable heterogeneity.
  • The validity of matching methods depends heavily on data quality.
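A compact statement of the identifying assumption behind matching, in notation of our own (the slide's formulas are not preserved in this transcript): letting $D \in \{0,1\}$ indicate participation and $Y^C$ the counterfactual outcome,
\[
\big(Y^C \perp D\big) \mid X \quad\Longrightarrow\quad E\big[Y^C \mid X, D=1\big] = E\big[Y^C \mid X, D=0\big],
\]
so matched non-participants with the same X reveal the participants' counterfactual.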

4
2. Propensity-score Matching (PSM)
Propensity-score matching matches on the basis of the probability of participation.
  • Ideally we would match on the entire vector X of observed characteristics. However, this is practically impossible: X could be huge.
  • Rosenbaum and Rubin showed it is sufficient to match on the propensity score, P(X) (see the result below).
  • This assumes that participation is independent of outcomes given X. If there is no bias given X, then there is no bias given P(X).
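Sketched in the same notation, the Rosenbaum and Rubin result reduces the matching problem to a single dimension:
\[
P(X) \equiv \Pr(D = 1 \mid X), \qquad \big(Y^C \perp D\big) \mid X \;\Longrightarrow\; \big(Y^C \perp D\big) \mid P(X).
\]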

5
Steps in Score Matching
  • (i) Obtain representative, highly comparable surveys of the non-participants and participants.
  • (ii) Pool the two samples and estimate a logit/probit model of program participation. The predicted values are the propensity scores (steps (i)-(iv) are sketched in code below).
  • (iii) Restrict the samples to assure common support.
  • (iv) Failure of common support is an important source of bias in observational studies (Heckman et al.).
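A minimal Python sketch of steps (i)-(iv), assuming a pooled pandas DataFrame df with a participation dummy d, covariates x1 and x2, and an outcome y (all names are hypothetical, not from the slides):

import statsmodels.api as sm

# (ii) Pool the two samples and fit a logit model of program participation
X = sm.add_constant(df[["x1", "x2"]])   # observed characteristics
logit = sm.Logit(df["d"], X).fit()

# The predicted values are the propensity scores
df["pscore"] = logit.predict(X)

# (iii)-(iv) Restrict both samples to the region of common support:
# the overlap of the two groups' score ranges
lo = max(df.loc[df["d"] == 1, "pscore"].min(),
         df.loc[df["d"] == 0, "pscore"].min())
hi = min(df.loc[df["d"] == 1, "pscore"].max(),
         df.loc[df["d"] == 0, "pscore"].max())
on_support = df[df["pscore"].between(lo, hi)]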

6
[Figure: density of propensity scores for participants]
7
[Figure: density of propensity scores for non-participants]
8
[Figure: density of propensity scores for non-participants]
9
  • (v) For each participant, find a sample of non-participants that have similar propensity scores.
  • (vi) Compare the outcome indicators. The difference is the estimate of the gain due to the program for that observation.
  • (vii) Calculate the mean of these individual gains to obtain the average overall gain. Various weighting schemes are possible (a single nearest-neighbour scheme is sketched below).
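Continuing the sketch above, a single nearest-neighbour version of steps (v)-(vii); kernel, radius, or many-neighbour schemes would replace the one-to-one match:

import numpy as np

treated = on_support[on_support["d"] == 1]
controls = on_support[on_support["d"] == 0]

gains = []
for _, row in treated.iterrows():
    # (v) the non-participant with the closest propensity score
    j = (controls["pscore"] - row["pscore"]).abs().idxmin()
    # (vi) the outcome difference estimates the gain for this participant
    gains.append(row["y"] - controls.loc[j, "y"])

# (vii) the mean of the individual gains is the average overall gain
att = np.mean(gains)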
10
The Mean Impact Estimator
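The formula on this slide is not preserved in the transcript; a standard form of the mean impact estimator under matching, with weight $W_{ij}$ given to non-participant $j$ when matched to participant $i$, is
\[
\widehat{ATT} = \frac{1}{N_T} \sum_{i \in T} \Big( Y_i - \sum_{j \in C} W_{ij} Y_j \Big),
\]
where $T$ and $C$ are the participant and comparison samples; the choice of $W_{ij}$ is the weighting scheme.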
11
How does PSM compare to an experiment?
  • PSM is the observational analogue of an
    experiment in which placement is independent of
    outcomes
  • The difference is that a pure experiment does not
    require the untestable assumption of independence
    conditional on observables
  • Thus PSM requires good data.

12
How does PSM perform relative to other methods?
  • In comparison with the results of a randomized experiment on a US training program, PSM gave a good approximation (Heckman et al.; Dehejia and Wahba).
  • It performed better than the non-experimental regression-based methods studied by LaLonde for the same program.

13
  • 3. Difference-in-difference
  • Observed changes over time for non-participants provide the counterfactual for participants.
  • Collect baseline data on non-participants and (probable) participants before the program.
  • Compare with data collected after the program.
  • Subtract the two differences, or use a regression with a dummy variable for participants (sketched below).
  • This allows for selection bias, but the bias must be time-invariant and additive.
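A minimal sketch of the regression form, assuming a pandas DataFrame panel with outcome y, participant dummy d, and post-program dummy post (hypothetical names):

import statsmodels.formula.api as smf

# The coefficient on the interaction d:post is the
# difference-in-difference estimate of the program's impact
dd = smf.ols("y ~ d + post + d:post", data=panel).fit()
print(dd.params["d:post"])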

14
Selection Bias
[Figure: selection bias]
15
DD requires that the bias is additive and time-invariant
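In notation of our own (the slide's figure is not preserved): writing the selection bias at date $t$ as
\[
B_t = E\big[Y^C_t \mid D=1\big] - E\big[Y^C_t \mid D=0\big],
\]
DD removes the bias exactly when $B_{t_1} = B_{t_0}$.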

16
Method fails if the comparison group is on a different trajectory

17
  • Outcome indicator: $Y_{it}$
  • Where:
  • $G_{it} = Y^T_{it} - Y^C_{it}$ : impact (gain)
  • $Y^C_{it}$ : counterfactual
  • $\tilde{Y}^C_{it}$ : comparison group

18
  • Difference-in-Difference
  • The double difference gives the mean impact (i) if the change over time for the comparison group reveals the counterfactual, and (ii) if the baseline is uncontaminated by the program (see the estimator below).
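The slide's equations are not preserved; in standard notation the double-difference estimate described here is
\[
DD = \big(\bar{Y}^T_{t_1} - \bar{Y}^T_{t_0}\big) - \big(\bar{Y}^C_{t_1} - \bar{Y}^C_{t_0}\big),
\]
which equals the mean impact under conditions (i) and (ii).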

19
4. Matched double difference
  • Matching helps control for bias in diff-in-diff.
  • Score-match participants and non-participants based on observed characteristics in the baseline.
  • Then do a double difference (sketched below).
  • This deals with observable heterogeneity in initial conditions that can influence subsequent changes over time.
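Reusing the matched pairs from the PSM sketch above, with hypothetical baseline and follow-up outcome columns y0 and y1:

import numpy as np

dd_gains = []
for _, row in treated.iterrows():
    # match on the baseline propensity score
    j = (controls["pscore"] - row["pscore"]).abs().idxmin()
    # double difference for the matched pair
    dd_gains.append((row["y1"] - row["y0"])
                    - (controls.loc[j, "y1"] - controls.loc[j, "y0"]))
matched_dd = np.mean(dd_gains)   # average impact estimate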

20
5. Instrumental Variables
  • Identify exogenous variation using a third variable.
  • Outcome regression: $Y_i = \alpha + \beta D_i + \varepsilon_i$, where $D_i \in \{0,1\}$ is our program dummy, which is not random.
  • The instrument (Z) influences participation, but does not affect outcomes given participation (the exclusion restriction).
  • This identifies the exogenous variation in outcomes due to the program.
  • Treatment regression: $D_i = \gamma + \delta Z_i + u_i$ (a 2SLS sketch follows below).
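A minimal two-stage least squares sketch of this setup, assuming numeric arrays y, d, and z (hypothetical data, not from the slides):

import statsmodels.api as sm

# First stage: treatment regression of D on Z
first = sm.OLS(d, sm.add_constant(z)).fit()
d_hat = first.predict()

# Second stage: outcome on predicted participation;
# the slope is the IV (2SLS) estimate of impact
second = sm.OLS(y, sm.add_constant(d_hat)).fit()
print(second.params[1])

Note that standard errors from a manual second stage like this are not valid; dedicated IV routines adjust them.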

21
Reduced-form outcome regression: $Y_i = \pi_0 + \pi_1 Z_i + v_i$, where $\pi_1 = \beta \delta$ and $v_i = \varepsilon_i + \beta u_i$. The instrumental-variables (two-stage least squares) estimator of impact is $\hat{\beta}_{IV} = \hat{\pi}_1 / \hat{\delta}$.
22
However, IVE is only a local effect
  • IVE identifies the effect for those induced to switch by the instrument: the local average treatment effect.
  • Suppose Z takes two values. Then the effect of the program is $\hat{\beta} = \dfrac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]}$.
  • Take care in extrapolating to the whole population.
  • Valid instruments can be difficult to find.
  • Exclusion restrictions are often questionable.

23
Selected References
  • Ravallion, Martin. "The Mystery of Vanishing Benefits." The World Bank Economic Review 15(1): 115-140. Washington, D.C.: The World Bank, 2001.
  • Khandker, Shahidur R. "Micro-finance and Poverty: Evidence Using Panel Data from Bangladesh." The World Bank Economic Review, October 2005, pp. 263-286.
  • Khandker, Shahidur R. Fighting Poverty with Micro-credit: Experience in Bangladesh. New York, NY: Oxford University Press, 1998.
  • Pitt, Mark M. and Shahidur R. Khandker. "The Impact of Group-Based Credit Programs on Poor Households in Bangladesh: Does the Gender of Participants Matter?" Journal of Political Economy 106 (October 1998): 958-996.