
PREM Week, April 2009

Arianna Legovini, Head, Development Impact Evaluation Initiative (DIME), World Bank

Impact Evaluation for Real Time Decision Making

Do we know…

Can leadership and motivation alone break the poverty trap? –Rwanda rapid results initiative

Can participation change local norms and institutions? –Sierra Leone institutional reform & capacity building (GoBifo)

What type of information can enhance local accountability? –Uganda schools’ budget

What is the best way to select local projects? –Indonesia direct voting versus representatives’ decisions

Do audits increase accountability in public administration? –Indonesia, Brazil

Of course we know!?

These are difficult questions… We turn to our best judgment for guidance and pick an incentive, subsidy level, a voting scheme, a package of services…

Is there any other incentive, subsidy, scheme or package that will do better?

The decision process is complex

A few big decisions are taken during design but many more decisions are taken during roll out & implementation

Design → Early roll-out → Implementation

Developing a decision tree for public procurement …

Road Authority procurement of rural roads O&M
- Small contracts
  - Fixed-cost contract
    - Conceal engineering costs: allow or deny renegotiation
    - Advertise engineering costs: allow or deny renegotiation
  - Adjustable contract
    - Conceal engineering costs: allow or deny renegotiation
    - Advertise engineering costs: allow or deny renegotiation
- Large contracts
  - Fixed-cost contract
    - Conceal engineering costs: allow or deny renegotiation
    - Advertise engineering costs: allow or deny renegotiation
  - Adjustable contract
    - Conceal engineering costs: allow or deny renegotiation
    - Advertise engineering costs: allow or deny renegotiation
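Each root-to-leaf path in this tree is one implementation alternative, and the number of alternatives multiplies at every node. A minimal sketch of that bookkeeping in Python (only the node labels come from the tree above; everything else is illustrative):

```python
# Enumerate every implementation alternative as the cross-product of the choice nodes.
from itertools import product

contract_size = ["small contracts", "large contracts"]
contract_type = ["fixed-cost contract", "adjustable contract"]
cost_disclosure = ["conceal engineering costs", "advertise engineering costs"]
renegotiation = ["allow renegotiation", "deny renegotiation"]

alternatives = list(product(contract_size, contract_type, cost_disclosure, renegotiation))
print(len(alternatives))        # 16 leaves: 2 x 2 x 2 x 2 testable alternatives
for leaf in alternatives[:3]:   # show a few root-to-leaf paths
    print(" -> ".join(leaf))
```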

How to select between plausible alternatives?

Establish which decisions will be taken upfront and which will be tested during roll-out

Scientifically test critical nodes: measure the impact of one option relative to another or to no intervention

Pick better and discard worse during implementation

Cannot learn everything at once: select carefully what you want to test by involving all relevant partners

Impact evaluation: application of the scientific method to understand and measure behavioral response

Hypothesis
▪ If we subdivide contracts, then more companies will bid for the contract

Testing
▪ Randomly assign which bids will be subdivided into smaller contracts and compare the number of bidders and the bidding price relative to engineering costs

Observations
▪ Smaller contract size increases the number of bidders
▪ Bidding prices fall
▪ Costs of administering contracts rise

Conclusion
▪ Smaller contract size increases competition in the bidding procedure
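The testing step above is essentially a two-arm randomized trial. A hedged sketch of what that comparison looks like, using simulated data only (the outcome process and all numbers are assumptions for illustration, not results from the slides):

```python
import random
import statistics

random.seed(0)

# Hypothetical procurement lots; half are randomly assigned to subdivision.
lots = list(range(200))
random.shuffle(lots)
treated, control = lots[:100], lots[100:]

def observed_bidders(subdivided):
    # Placeholder outcome process (assumed): subdivision attracts roughly
    # two extra bidders on top of a noisy baseline of about five.
    base = random.gauss(5, 2)
    return max(0.0, base + (2.0 if subdivided else 0.0))

treat_outcomes = [observed_bidders(True) for _ in treated]
ctrl_outcomes = [observed_bidders(False) for _ in control]

# Because assignment was random, a simple difference in means estimates the effect.
effect = statistics.mean(treat_outcomes) - statistics.mean(ctrl_outcomes)
print(f"Estimated effect of subdividing contracts on number of bidders: {effect:.2f}")
```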

Walk along a decision tree for public procurement …

(Same decision tree as above: contract size → contract type → conceal or advertise engineering costs → allow or deny renegotiation)

In Oklahoma, publication of engineering costs lowered procurement costs by 4.6%

Andhra Pradesh e-procurement increased bidder participation by 25%

What is Impact Evaluation?

Counterfactual analysis isolates the causal effect of an intervention on an outcome:
▪ Effect of contract size on number of bidders
▪ Effect of renegotiation option on bidding price
▪ Effect of information on price

Ideally we would compare the same individual with & without the option, information, etc. at the same point in time to measure the effect. This is impossible.

Impact evaluation uses large numbers (individuals, communities) to estimate the effect

How is this done?

Select one group to receive the treatment (contract alternative, information…)

Find a comparison group to serve as counterfactual (other alternative, no information…)

Use these counterfactual criteria:
▪ Treated & comparison groups have identical initial average characteristics (observed and unobserved)
▪ The only difference is the treatment
▪ Therefore the only reason for a difference in outcomes is the treatment (or differential treatment)
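A minimal sketch of why random assignment delivers these criteria: with a large number of units, randomly split groups have nearly identical average baseline characteristics, observed or unobserved. All names and data below are made up for illustration:

```python
import random
import statistics

random.seed(1)

# Hypothetical units (e.g., districts) with an observable baseline characteristic.
units = [{"id": i, "baseline_score": random.gauss(50.0, 10.0)} for i in range(500)]

# Random assignment: shuffle, then split into treated and comparison groups.
random.shuffle(units)
treated, comparison = units[:250], units[250:]

for name, group in [("treated", treated), ("comparison", comparison)]:
    mean_baseline = statistics.mean(u["baseline_score"] for u in group)
    print(f"{name:>10}: mean baseline score = {mean_baseline:.1f}")

# With enough units the two means are close, so the comparison group is a valid
# counterfactual: later differences in outcomes can be attributed to the treatment.
```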

How is monitoring different from impact evaluation?

The monitoring story:

In 1996, transfers to schools were 20% of the budget allocation

Government started publishing school allocations in newspapers

By 2001, transfers jumped to 80% of budget allocation, a gain of 60 percentage points (80 - 20)

[Chart: transfers as % of budget allocation. Before (1996): 20. After (2001): 80. Change attributed to information: 60.]

How is monitoring different from impact evaluation?

[Chart: transfers as % of budget allocation. Before (1996): 20. After (2001): 80 for high-exposure schools vs 36 for low-exposure schools. The before-after change of 60 reflects information & other factors; the impact of information is 44.]

The Impact evaluation story

In 1996, a low awareness year, transfers to schools were 20% of budget allocation

After the 1996 PETS (Public Expenditure Tracking Survey) was published, there was much public discussion, and budgets were published in local newspapers

By 2001, transfers to schools close to newspaper outlets increased to 80% of budget allocation

Transfers to schools far from newspaper outlets increased to 36%

Newspaper information increased transfers by 44 percentage points (80 - 36)
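The two estimates differ only in the comparison being made, and the arithmetic is worth spelling out. Using the figures quoted in this example (20, 80 and 36):

```python
# Reproducing the arithmetic behind the two stories, using the figures quoted above.
before = 20               # 1996: % of budget allocation reaching schools
after_high_exposure = 80  # 2001: schools close to newspaper outlets
after_low_exposure = 36   # 2001: schools far from newspaper outlets

monitoring_change = after_high_exposure - before                   # 60: information + other factors
impact_of_information = after_high_exposure - after_low_exposure   # 44: information alone
print(monitoring_change, impact_of_information)                    # prints: 60 44
```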


How to use Monitoring & Impact Evaluation?

Use monitoring to track implementation efficiency (input → output)

Use impact evaluation to measure effectiveness (output → outcome)

[Results-chain diagram: $$$ → INPUTS → OUTPUTS → OUTCOMES, with behavior driving the output-to-outcome step; monitoring covers the input-output link (efficiency), impact evaluation the output-outcome link (effectiveness)]

Question types and methods

Descriptive analysis (monitoring and process evaluation):
▪ Is the program being implemented efficiently?
▪ Is the program targeting the right population?
▪ Are outcomes moving in the right direction?

Causal analysis (impact evaluation):
▪ What was the effect of the program on outcomes?
▪ How would outcomes change under alternative program designs?
▪ Is the program cost-effective?

When would you use M&E and when IE?

▪ Are transfers to localities being delivered as planned? – M&E
▪ Does information reduce capture? – IE
▪ What are the trends in public procurement? – M&E
▪ Do contractual incentives increase timeliness in delivery of services? – IE

Uganda Community-Based Nutrition: a failed project

In year 3, communities stopped receiving funds; Parliament reacted negatively; the intervention was stopped

…but there were strong impact evaluation results in years 1-2

Children in treatment scored half a standard deviation better than children in the control

Recently, the Presidency asked to take a second look at the evaluation: saving the baby?

Why evaluate: babies & bath water

Why Evaluate?

▪ Improve quality of programs
▪ Separate institutional performance from quality of intervention
▪ Test alternatives and inform design in real time
▪ Increase program effectiveness
▪ Answer the “so what” questions
▪ Build government institutions for evidence-based policy-making
▪ Plan for implementation of options, not solutions
▪ Find out what alternatives work best
▪ Adopt a better way of doing business and taking decisions

Institutional framework

PM/Presidency: communicate to constituencies

Treasury/Finance: allocate budget

Line ministries: deliver programs and negotiate budget

[Diagram linking the three actors through campaign promises, budget and service delivery, and showing the evidence each needs for accountability: effects of government programs, cost-effectiveness of different programs, and cost-effectiveness of alternatives and effects of sector programs]

Shifting Program Paradigm

From: a program is a set of activities designed to deliver expected results; the program will either deliver or not

To: a program is a menu of alternatives with a learning strategy to find out which work best; change programs over time to deliver more results

Shifting Evaluation Paradigm

From retrospective, external, independent evaluation:
▪ Top down
▪ Determine whether the program worked or not

To prospective, internal, operationally driven impact evaluation (externally validated):
▪ Set the program learning agenda bottom up
▪ Consider plausible implementation alternatives
▪ Test scientifically and adopt the best
▪ Just-in-time advice to improve effectiveness of the program over time

Operational questions: managing for results

▪ Question design choices of the program: institutional arrangements, delivery mechanisms, packages, pricing/incentive schemes
▪ Use random trials to test alternatives
▪ Focus on short-term outcomes: take-up rates, use, adoption
▪ Follow-up data collection and analysis 3-6-12 months after exposure
▪ Measure impact of alternative treatments on short-term outcomes and identify the “best”
▪ Change the program to adopt the best alternative
▪ Start over (a sketch of this loop follows below)
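A hedged sketch of one pass through this loop: compare take-up across randomly assigned design alternatives at follow-up, adopt the best-performing arm, and start over with new variants. Arm names and data below are purely illustrative assumptions, not from the slides:

```python
import statistics

# Hypothetical follow-up data: one record per household, 1 = adopted, 0 = did not,
# grouped by the design alternative each household was randomly assigned to.
follow_up = {
    "flat subsidy":     [1, 0, 1, 0, 0, 1, 0, 1],
    "voucher":          [1, 1, 1, 0, 1, 1, 0, 1],
    "information only": [0, 0, 1, 0, 0, 0, 1, 0],
}

# Short-term outcome: take-up rate per arm.
take_up_rates = {arm: statistics.mean(obs) for arm, obs in follow_up.items()}
best_arm = max(take_up_rates, key=take_up_rates.get)

print(take_up_rates)
print(f"Adopt '{best_arm}' in the next implementation round, then test new variants.")
```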

Policy questions: accountability

How much does the program deliver? Is it cost-effective?

▪ Use the most rigorous method of evaluation possible
▪ Focus on higher-level outcomes: educational achievement, health status, income
▪ Measure impact of the operation on stated objectives and a metric of common outcomes
▪ One-, two-, three-year horizon
▪ Compare with results from other programs
▪ Inform the budget process and allocations

Is impact evaluation a one shot analytical product?

This is a technical assistance product to change the way decisions are taken

It is about building a relationship

It adds results-based decision tools to complement existing sector skills

The relationship delivers not one but a series of analytical products

It must provide useful (actionable) information at each step of the impact evaluation

DIME: Development Impact Evaluation

Objective
▪ To build know-how in implementing agencies and work with Bank operations to generate operational knowledge

Programmatic Activities
▪ Annual workshops for skill development
▪ Community of practice for South-to-South learning
▪ Technical advisory group to assure quality of analytical work

In-Country Activities
▪ Technical support and field coordination through the project cycle

DIME programs

Thank you

Email: alegovini@worldbank.org
Web: www.worldbank.org/dime