Monitoring and Evaluation: Lesson 2

Page 1: Monitoring and Evaluation: Lesson 2

DESIGNING AND BUILDING A RESULTS-BASED MONITORING AND EVALUATION SYSTEM:

BY MESHACK KORIR

Page 2: Monitoring and Evaluation: Lesson 2

TEN STEPS TO DESIGNING, BUILDING AND SUSTAINING A RESULTS-BASED MONITORING AND EVALUATION SYSTEM

1. Conducting a Readiness Assessment
2. Agreeing on Outcomes to Monitor and Evaluate
3. Selecting Key Indicators to Monitor Outcomes
4. Baseline Data on Indicators: Where Are We Today?
5. Planning for Improvement: Selecting Results Targets
6. Monitoring for Results
7. The Role of Evaluations
8. Reporting Your Findings
9. Using Your Findings
10. Sustaining the M&E System Within Your Organization

Page 3: Monitoring and Evaluation: Lesson 2

THE POWER OF MEASURING RESULTS

If you do not measure results, you cannot tell success from failure

If you cannot see success, you cannot reward it

If you cannot reward success, you are probably rewarding failure

If you cannot see success, you cannot learn from it

If you cannot recognize failure, you cannot correct it

If you can demonstrate results, you can win public support

Adapted from Osborne & Gaebler, 1992

Page 4: Monitoring and Evaluation: Lesson 2


INTRODUCTION TO RESULTS-BASED MONITORING AND EVALUATION

Results-based monitoring and evaluation measures how well governments are performing

Results-based monitoring and evaluation is a management tool!

Results-based monitoring and evaluation emphasizes assessing how outcomes are being achieved over time

Page 5: Monitoring and Evaluation: Lesson 2

REMEMBER

Monitoring and evaluation are two separate but interrelated strategies for collecting data and reporting findings on how well (or how poorly) the public sector is performing

During this workshop, we will be discussing:

Monitoring as a tool

Evaluation as a tool

How the two interrelate to support good public management

The ten steps to build a results-based monitoring and evaluation system to measure government performance

Page 6: Monitoring and Evaluation: Lesson 2


REASONS TO DO RESULTS-BASED M&E

Provides crucial information about public sector performance

Provides a view over time on the status of a project, program, or policy

Promotes credibility and public confidence by reporting on the results of programs

Helps formulate and justify budget requests

Identifies potentially promising programs or practices

Page 7: Monitoring and Evaluation: Lesson 2

REASONS TO DO RESULTS-BASED M&E (CONT.)

Focuses attention on achieving outcomes important to the organization and its stakeholders

Provides timely, frequent information to staff

Helps establish key goals and objectives

Permits managers to identify and take action to correct weaknesses

Supports a development agenda that is shifting towards greater accountability for aid lending

Page 8: Monitoring and Evaluation: Lesson 2

DEFINITION

Results-Based Monitoring (what we will call “monitoring”) is a continuous process of collecting and analyzing information to compare how well a project, program, or policy is performing against expected results

Page 9: Monitoring and Evaluation: Lesson 2

MAJOR ACTIVITIES WHERE RESULTS MONITORING IS NEEDED

Setting goals and objectives

Reporting to Parliament and other stakeholders

Managing projects, programs and policies

Reporting to donors

Allocating resources

Page 10: Monitoring and Evaluation: Lesson 2

A NEW EMPHASIS ON BOTH IMPLEMENTATION AND RESULTS-BASED MONITORING

Traditional monitoring focuses on implementation monitoring

This involves tracking inputs (resources, strategies), activities (what actually took place) and outputs (the products or services produced)

This approach focuses on monitoring how well a project, program or policy is being implemented

Note: Often used to assess compliance with work plans and budget

Page 11: Monitoring and Evaluation: Lesson 2

Results-based monitoring involves the regular collection of information on how effectively government (or any organization) is performing

Results-based monitoring demonstrates whether a project, program, or policy is achieving its stated goals

Page 12: Monitoring and Evaluation: Lesson 2

RESULTS-BASED MONITORING

Inputs: financial, human, and material resources

Activities: tasks personnel undertake to transform inputs into outputs

Outputs: products and services produced

Outcomes: intermediate effects of outputs on clients

Goal (Impacts): long-term, widespread improvement in society

Inputs, activities, and outputs make up implementation; outcomes and impacts are the results.

Binnendijk, 2000

Page 13: Monitoring and Evaluation: Lesson 2

RESULTS-BASED MONITORING: EXAMPLE ADULT LITERACY

Inputs: facilities, trainers, materials

Activities: literacy training courses

Outputs: number of adults completing literacy courses

Outcomes: increased literacy skills; more employment opportunities

Goal (Impacts): higher income levels; increased access to higher-skill jobs
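As a concrete aside, the adult literacy chain above can also be written down as a simple data structure. The Python sketch below is purely illustrative and not part of the original slides; the ResultsChain class, its field names, and the helper methods are assumptions chosen to restate the inputs-to-impact levels.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class ResultsChain:
    """Illustrative model of a results chain: inputs -> activities -> outputs -> outcomes -> impact."""
    inputs: List[str]      # financial, human, and material resources
    activities: List[str]  # tasks that transform inputs into outputs
    outputs: List[str]     # products and services produced
    outcomes: List[str]    # intermediate effects of outputs on clients
    impact: str            # long-term, widespread improvement in society

    def implementation(self) -> Dict[str, object]:
        """Implementation monitoring tracks inputs, activities, and outputs."""
        return {"inputs": self.inputs, "activities": self.activities, "outputs": self.outputs}

    def results(self) -> Dict[str, object]:
        """Results monitoring tracks outcomes and impact."""
        return {"outcomes": self.outcomes, "impact": self.impact}


# The adult literacy example from this slide, expressed in the structure above
adult_literacy = ResultsChain(
    inputs=["facilities", "trainers", "materials"],
    activities=["literacy training courses"],
    outputs=["number of adults completing literacy courses"],
    outcomes=["increased literacy skills", "more employment opportunities"],
    impact="higher income levels; increased access to higher-skill jobs",
)

print(adult_literacy.implementation())  # the implementation side of the chain
print(adult_literacy.results())         # the results side of the chain
```

Separating the implementation() and results() views mirrors the distinction drawn on the previous slide between implementation monitoring (inputs through outputs) and results monitoring (outcomes and impact).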

Page 14: Monitoring and Evaluation: Lesson 2

DEFINITION

Results-Based Evaluation: An assessment of a planned, ongoing, or completed intervention to determine its relevance, efficiency, effectiveness, impact and sustainability. The intent is to incorporate lessons learned into the decision-making process.

Page 15: Monitoring and Evaluation: Lesson 2

EVALUATION ADDRESSES

“Why” questions: What caused the changes we are monitoring?

“How” questions: What was the sequence or process that led to successful (or unsuccessful) outcomes?

Compliance/accountability questions: Did the promised activities actually take place, and as they were planned?

Process/implementation questions: Was the implementation process followed as anticipated, and with what consequences?

Page 16: Monitoring and Evaluation: Lesson 2


DESIGNING GOOD EVALUATIONS

Getting the questions right is critical

Answering the questions is critical

Supporting public sector decision-making with credible and useful information is critical

Page 17: Monitoring and Evaluation: Lesson 2

DESIGNING GOOD EVALUATIONS

“Better to have an approximate answer to the right question than an exact answer to the wrong question.”

Page 18: Monitoring and Evaluation: Lesson 2

DESIGNING GOOD EVALUATIONS

“Better to be approximately correct than precisely wrong.”

Paraphrased from Bertrand Russell

Page 19: Monitoring and Evaluation: Lesson 2

RESULTS-BASED MONITORING AND EVALUATION

Monitoring:

Clarifies program objectives

Links activities and their resources to objectives

Translates objectives into performance indicators and sets targets

Routinely collects data on these indicators and compares actual results with targets (illustrated in the sketch below)

Reports progress to managers and alerts them to problems

Evaluation:

Analyzes why intended results were or were not achieved

Assesses specific causal contributions of activities to results

Examines the implementation process

Explores unintended results

Provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement
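To make the monitoring column concrete, here is a minimal, illustrative Python sketch (not part of the original slides) of routinely comparing actual indicator values with targets and alerting managers to problems. The Indicator class, the literacy-related indicator names, and all numbers are hypothetical assumptions.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Indicator:
    """A performance indicator with its baseline, target, and latest actual value (hypothetical)."""
    name: str
    baseline: float
    target: float
    actual: float

    def shortfall(self) -> float:
        """How far the actual value falls below the target (0 if the target is met)."""
        return max(self.target - self.actual, 0.0)


def monitoring_report(indicators: List[Indicator]) -> None:
    """Routine monitoring: compare actual results with targets and flag problems for managers."""
    for ind in indicators:
        if ind.actual >= ind.target:
            status = "on track"
        else:
            status = f"ALERT: short of target by {ind.shortfall():.1f}"
        print(f"{ind.name}: baseline={ind.baseline}, target={ind.target}, actual={ind.actual} -> {status}")


# Hypothetical indicators for the adult literacy example from the earlier slide
monitoring_report([
    Indicator("Adults completing literacy courses", baseline=1000, target=1500, actual=1350),
    Indicator("Share of graduates gaining employment (%)", baseline=20, target=35, actual=38),
])
```

In this made-up example the first indicator is flagged as short of its target, which is exactly the kind of alert the monitoring column describes; evaluation would then ask why the shortfall occurred.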

Page 20: Monitoring and Evaluation: Lesson 2

ACHIEVING RESULTS THROUGH PARTNERSHIP

[Diagram: a shared goal cascades into outcomes, targets, and means & strategies, with each level achieved jointly by multiple partners (Partner 1, Partner 2, Partner 3).]

