Page 1: MEASURING IMPACT

www.worldbank.org/hdchiefeconomist

The World Bank

Human Development Network

Spanish Impact Evaluation Fund

Page 2: MEASURING IMPACT

MEASURING IMPACT
Impact Evaluation Methods for Policy Makers

This presentation is supporting material for the "Impact Evaluation in Practice" book. It is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B. and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington DC (www.worldbank.org/ieinpractice). The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

Page 3: MEASURING IMPACT

1. Causal Inference

Counterfactuals

False Counterfactuals
o Before & After (Pre & Post)
o Enrolled & Not Enrolled (Apples & Oranges)

Page 4: MEASURING IMPACT

2. IE Methods Toolbox

Randomized Assignment
Randomized Offering/Promotion
Discontinuity Design
Difference-in-Differences (Diff-in-Diff)
Matching (P-Score matching)

Page 5: MEASURING IMPACT

2. IE Methods Toolbox

Randomized Assignment
Randomized Offering/Promotion
Discontinuity Design
Difference-in-Differences (Diff-in-Diff)
Matching (P-Score matching)

Page 6: MEASURING IMPACT

Choosing your IE method(s)

Key information you will need to identify the right method for your program:

Prospective or retrospective evaluation?
Eligibility rules and criteria?
Roll-out plan (pipeline)?
Is the number of eligible units larger than the available resources at a given point in time?
o Poverty targeting?
o Geographic targeting?
o Budget and capacity constraints?
o Excess demand for the program?
o Etc.

Page 7: MEASURING IMPACT

Choosing your IE method(s)

Choose the best possible design given the operational context:

Best Design
o Best comparison group you can find + least operational risk

Have we controlled for everything?
o Internal validity
o Good comparison group

Is the result valid for everyone?
o External validity
o Local versus global treatment effect
o Evaluation results apply to the population we're interested in

Page 8: MEASURING IMPACT

2. IE Methods Toolbox

Randomized Assignment
Randomized Offering/Promotion
Discontinuity Design
Difference-in-Differences (Diff-in-Diff)
Matching (P-Score matching)

Page 9: MEASURING IMPACT

Randomized Treatments & Comparisono Randomize!o Lottery for who is offered benefitso Fair, transparent and ethical way to assign benefits to

equally deserving populations.

Eligibles > Number of Benefits

o Give each eligible unit the same chance of receiving treatment

o Compare those offered treatment with those not offered treatment (comparisons).

Oversubscription

o Give each eligible unit the same chance of receiving treatment first, second, third…

o Compare those offered treatment first, with those offered later (comparisons).

Randomized Phase In
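
The assignment rules on this slide are mechanical enough to sketch in code. Below is a minimal Python sketch (not part of the original material; the unit counts and variable names are illustrative) that draws an oversubscription lottery and a randomized phase-in order over a list of eligible units.

import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)  # fixed seed so the lottery is reproducible and auditable

eligible = pd.DataFrame({"unit_id": range(1000)})  # hypothetical list of eligible units

# Oversubscription lottery: every eligible unit has the same chance of being offered the benefit.
n_benefits = 400
offered = rng.choice(eligible["unit_id"].to_numpy(), size=n_benefits, replace=False)
eligible["treatment"] = eligible["unit_id"].isin(offered).astype(int)

# Randomized phase-in: every eligible unit is assigned a random wave (first, second, third...).
n_waves = 3
waves = np.repeat(np.arange(1, n_waves + 1), int(np.ceil(len(eligible) / n_waves)))[: len(eligible)]
eligible["wave"] = rng.permutation(waves)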

Page 10: MEASURING IMPACT

Figure: Randomized treatments and comparisons.
1. Population (eligible and ineligible units)
2. Evaluation sample (external validity)
3. Randomize treatment: treatment and comparison groups (internal validity)
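
A short sketch of the three steps in the figure (illustrative, not from the deck; the eligibility rule and sample sizes are assumptions):

import numpy as np

rng = np.random.default_rng(0)

population = np.arange(10_000)                             # 1. population of units
eligible = population[population % 2 == 0]                 # hypothetical eligibility rule

sample = rng.choice(eligible, size=1_000, replace=False)   # 2. evaluation sample (external validity)

treatment = rng.choice(sample, size=500, replace=False)    # 3. randomized treatment (internal validity)
comparison = np.setdiff1d(sample, treatment)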

Page 11: MEASURING IMPACT

Unit of Randomization

Choose according to the type of program:
o Individual/Household
o School/Health clinic/Catchment area
o Block/Village/Community
o Ward/District/Region

Keep in mind:
o You need a "sufficiently large" number of units to detect the minimum desired impact: power (see the sketch below).
o Spillovers/contamination
o Operational and survey costs

As a rule of thumb, randomize at the smallest viable unit of implementation.
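
How large is "sufficiently large"? A back-of-the-envelope power calculation is sketched below (illustrative, not from the deck; the minimum detectable effect, cluster size and intra-cluster correlation are assumptions), with a crude design-effect adjustment for randomizing at the cluster (e.g. community) level.

from statsmodels.stats.power import TTestIndPower

mde = 0.2            # minimum detectable effect in standard-deviation units (assumption)
n_per_arm = TTestIndPower().solve_power(effect_size=mde, alpha=0.05, power=0.8)

# When whole clusters of m units are randomized and outcomes are correlated within
# clusters (intra-cluster correlation rho), inflate the sample size by the design effect.
m, rho = 30, 0.05    # hypothetical cluster size and ICC
design_effect = 1 + (m - 1) * rho
print(round(n_per_arm), round(n_per_arm * design_effect))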

Page 12: MEASURING IMPACT

Case 3: Randomized Assignment
Progresa CCT program

Unit of randomization: community
o 320 treatment communities (14,446 households): first transfers in April 1998.
o 186 comparison communities (9,630 households): first transfers in November 1999.

506 communities in the evaluation sample. Randomized phase-in.

Page 13: MEASURING IMPACT

Case 3: Randomized Assignment

Figure: Evaluation timeline. 320 treatment communities and 186 comparison communities, observed over the comparison period from T=0 to T=1.

Page 14: MEASURING IMPACT

Case 3: Randomized Assignment

How do we know we have good clones?

In the absence of Progresa, the treatment and comparison groups should be identical. Let's compare their characteristics at baseline (T=0).
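
A minimal sketch of this balance check (illustrative, not from the deck; the file and column names are assumptions): compare mean baseline characteristics of the two groups with t-tests.

import pandas as pd
from scipy import stats

df = pd.read_csv("progresa_baseline.csv")                    # hypothetical baseline data with a 0/1 'treatment' column

for var in ["consumption", "head_age", "head_education"]:    # illustrative variable names
    t = df.loc[df["treatment"] == 1, var]
    c = df.loc[df["treatment"] == 0, var]
    tstat, _ = stats.ttest_ind(t, c, equal_var=False)        # Welch t-test of equal means
    print(f"{var}: treatment {t.mean():.2f}, comparison {c.mean():.2f}, t = {tstat:.2f}")

With randomized assignment we expect most differences to be small and statistically insignificant, as in the tables on the next two slides.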

Page 15: MEASURING IMPACT

Case 3: Randomized Assignment
Balance at Baseline

Variable                               Treatment   Comparison   T-stat
Consumption ($ monthly per capita)     233.4       233.47       -0.39
Head's age (years)                     41.6        42.3         -1.2
Spouse's age (years)                   36.8        36.8         -0.38
Head's education (years)               2.9         2.8          2.16**
Spouse's education (years)             2.7         2.6          0.006

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).

Page 16: MEASURING IMPACT

Case 3: Randomized Assignment
Balance at Baseline

Variable                               Treatment   Comparison   T-stat
Head is female (=1)                    0.07        0.07         -0.66
Indigenous (=1)                        0.42        0.42         -0.21
Number of household members            5.7         5.7          1.21
Bathroom (=1)                          0.57        0.56         1.04
Hectares of land                       1.67        1.71         -1.35
Distance to hospital (km)              109         106          1.02

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).

Page 17: MEASURING IMPACT

Case 3: Randomized Assignment

                                   Treatment Group             Counterfactual               Impact
                                   (randomized to treatment)   (randomized to comparison)   (Y | P=1) - (Y | P=0)
Baseline (T=0) consumption (Y)     233.47                      233.40                       0.07
Follow-up (T=1) consumption (Y)    268.75                      239.5                        29.25**

Estimated impact on consumption (Y):
  Linear regression                29.25**
  Multivariate linear regression   29.75**

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).
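
The regression estimates in the table can be reproduced with ordinary least squares on the follow-up data: the coefficient on the treatment indicator is the estimated impact. A minimal sketch (illustrative, not from the deck; the file and column names are assumptions):

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("progresa_followup.csv")   # hypothetical follow-up data with a 0/1 'treatment' column

# Simple regression: difference in mean consumption between treatment and comparison.
simple = smf.ols("consumption ~ treatment", data=df).fit()

# Multivariate regression: add baseline covariates; cluster standard errors at the
# unit of randomization (community), assuming a 'community' identifier is available.
multivariate = smf.ols(
    "consumption ~ treatment + head_age + head_education + hh_size", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["community"]})

print(simple.params["treatment"], multivariate.params["treatment"])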

Page 18: MEASURING IMPACT

Progresa Policy Recommendation?

Impact of Progresa on Consumption (Y)

Case 1: Before & After
  Multivariate linear regression   34.28**

Case 2: Enrolled & Not Enrolled
  Linear regression                -22**
  Multivariate linear regression   -4.15

Case 3: Randomized Assignment
  Multivariate linear regression   29.75**

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).

Page 19: MEASURING IMPACT

Keep in Mind: Randomized Assignment

Randomized assignment, with large enough samples, produces two statistically equivalent groups: we have identified the perfect clone.

Randomized beneficiary
Randomized comparison

Feasible for prospective evaluations with oversubscription/excess demand. Most pilots and new programs fall into this category.

