
Monitoring and Evaluation

A Designed Process for an M & E System for the Health Sector

By: Fikru Tessema (B.Sc., M.Sc.)

With the World Bank

January 2008, Addis Ababa

HOW DO WE KNOW THAT WE HAVE MADE A DIFFERENCE?

Table of Contents

1. INTRODUCTION
   1.1 Monitoring
   1.2 Evaluation
   1.3 M & E Conceptual Framework
2. PLANNING FOR MONITORING AND EVALUATION
   2.1 Principles of Planning
   2.2 Objectives of M & E
   2.3 Planning Work Process of M & E
3. THE MONITORING WORK PROCESS
   3.1 Conducting monitoring/routine data collection
   3.2 Scope of monitoring
   3.3 Responsibility for monitoring
4. PERFORMANCE MEASUREMENT AND QUALITY IMPROVEMENT
   4.1 Report reviewing
   4.2 Key stakeholder forum (KSF)/Joint coordination forum (JCF)/Outcome forum
   4.3 Annual review (AR)
   4.4 Quality Improvement
5. REGULAR FOLLOW-UP/SUPPORTIVE SUPERVISION
   5.1 Give support
   5.2 Composition of the team
6. THE EVALUATION WORK PROCESS
   6.1 Preparing for an evaluation
   6.2 Involving partners and stakeholders
   6.3 Defining the scope
   6.4 Drafting the terms of reference (TOR) for evaluation
   6.5 Budgeting
   6.6 Selecting the evaluation team
   6.7 Managing an evaluation
7. USE OF MONITORING AND EVALUATION INFORMATION
   7.1 Informed decision making
   7.2 Feedback
   7.3 Publication of evidence
   7.4 Work process redesigning
8. VALIDATION
   8.1 Tools of validation
   8.2 Rating system for validation
   8.3 Rewarding for good performance
REFERENCES

1. INTRODUCTION

A good monitoring and evaluation system is the only way of tracking what is being done and knowing whether the strategies/interventions being undertaken in the health sector are making a difference. Making health services accessible and of good quality in particular requires establishing a strong M&E system, because health service delivery is expensive, sensitive, and hinges on the critical issue of human development. Continuous assessment/tracking is necessary given that new strategies/interventions are constantly being proposed. Efforts must be made to identify the strategies/interventions that are more effective and to give them greater weight among the strategies/interventions in the health sector.

1.1 Monitoring

Monitoring is the tracking of a plan during its implementation to ensure that activities and procedures are proceeding as planned and on schedule. The objective of monitoring is to improve management and the optimum use of resources, and to make timely decisions to resolve constraints and/or problems of implementation. In practical terms, monitoring the HSDP and the annual plan means finding out whether implementation is according to plan and taking the required action when deviation from the plan is detected. Monitoring happens regularly throughout all steps of intervention in the health sector. It includes the collection and review of information available from HMIS sources, supervisory visits, review meetings and annual reports.

1.2 Evaluation

Evaluation is the systematic collection of information about activities and outcomes in order to make judgments about health sector strategies/interventions, improve their effectiveness, and/or inform decision makers about the development of future health sector strategies/interventions. It is an independent and impartial assessment of performance, carried out by specialists who have not been involved in the day-to-day health sector strategies/interventions. The purpose of evaluation is to assess whether the implementation of health sector strategies/interventions has gone according to plan in channelling funds so that the desired access to and quality of health services are achieved. It is carried out systematically and objectively at the end of the plan period and aims at determining the worth of activities in the light of the resources utilized, so as to improve future strategies/interventions in the sector.

Source: WB, Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners

1.3 M & E Work Process High Level Mapping

[Figure: high-level mapping of the M&E work process. Elements include: program/project data/routine data collection; data analysis/aggregation; performance monitoring; integrated supportive supervision; support/capacity building; redefine indicators; redefine role and responsibility; evaluation/operational research for process outcomes; accountability/responsibility; formatting/sustaining; validation; rewarding; information for decision making; informed decision making.]

2. PLANNING FOR MONITORING AND EVALUATION

2.1 Principles of Planning

M&E is used as a tool to ensure the production of outputs and progress towards the outcomes of tasks. Evaluation is an important monitoring tool, and monitoring is an important input to evaluation. Because monitoring and evaluation are so interrelated, they have to be planned together.

Meaningful information about the outcomes and outputs of strategies/interventions in the health sector needs to be captured, regardless of the unit of analysis used by a monitoring and evaluation plan. An evaluation and monitoring plan should cover intended immediate results as outputs and

medium- and long-term results as outcomes of strategies/interventions in the health sector. All operating units in the health system, whether at federal, regional or woreda level, should prepare a mandatory monitoring and evaluation plan for the strategies/interventions in their respective areas.

Planning of M&E is not primarily about scheduling (the timing and selection of tools); it is about determining the best approach to strategies/interventions in the health sector depending on the needs and the nature of what is being monitored or evaluated, i.e., matching physical targets with the available resources. In the planning of the health sector, the plans used as monitoring tools are the MDG, SPM/HSDP or Annual Plan, to ensure the matching of physical targets with available resources and improve access to and quality of health services for the nation.

Figure 1: Illustration of the core elements of the long-term, medium-term and operational plans of the health sector

2.2 Objectives of M & E

Monitoring and evaluation helps managers/staff:
- to learn from experience/improve the program,
- to make more informed decisions,
- to be accountable and reposition ourselves, and
- to build capacities.

The objectives are linked together in a continuous work process. Learning from the past contributes to more informed decision-making. Better decisions lead to greater accountability to stakeholders. Better decisions also improve performance through capacity building, allowing health sector activities to be repositioned continually.

[Figure 1 content:
- Overall long-term plan (MDG): the larger context; contains substantive information and management decisions; managed by the Top management/management council committee and monitored to keep on track.
- Overall health sector plan (HSDP): focuses on outcomes and the contributions that outputs make to outcomes; managed by programme managers and monitored to keep on track.
- Operational health sector plan (Annual): focuses on individual activities and outputs; managed by case teams/case workers/performers and monitored to keep on track.]

Figure 2: Illustration of objectives of monitoring and evaluation

2.3 Planning Work process of M & E

2.3.1 Planning monitoring

Steps in planning monitoring, to assess how strategies/interventions are progressing and working in the health sector:

a) Assess the need for monitoring

Federal: This is done by assessing the nature of the health sector strategies/interventions that are expected to contribute to the improvement and development of health services. In assessing the nature of the strategies/interventions, focus on what performance information is needed at federal level, which elements are most important to keep tracking, and what would indicate progress or success in health sector strategies/interventions.

Regional: This can be done by assessing the nature of the health sector strategies/interventions that are expected to contribute to the improvement and development of health services in the region. In assessing the nature of the strategies/interventions, focus on what performance information is needed at regional level, which elements are most important to keep tracking, and what would indicate progress or success in the health sector.

Woreda: This is likewise done by assessing the nature of the health sector strategies/interventions that are expected to contribute to the improvement and development of health services in the woreda. In assessing the nature of the strategies/interventions, focus on what performance information is needed at woreda level, which elements are most important to keep tracking, and what would indicate progress or success in the health sector.

b) Assess current monitoring

Federal: To assess current monitoring federally, look at the monitoring tools being used in the health sector that are intended to contribute to a given outcome of health services at federal level. In assessing the current monitoring, the following questions have to be answered: Are these tools providing the necessary information nationally? Do they involve the key development partners acting federally? Is monitoring focusing on the key issues for efficiency nationally? Are there possibilities for greater efficiency and coordination nationally? This will help to identify performance gaps in the health sector nationally.

Regional: To assess current monitoring regionally, look at the monitoring tools being used in the health sector that are intended to contribute to a given outcome of health services at regional level. In assessing the current monitoring, the following questions have to be answered: Are these tools providing the necessary information regionally? Do they involve the key development partners acting regionally? Is monitoring focusing on the key issues for efficiency regionally? Are there possibilities for greater efficiency and coordination regionally? This will help to identify performance gaps in the health sector regionally.

Woreda: To assess current monitoring at woreda level, look at the monitoring tools being used in the health sector that are intended to contribute to a given outcome of health services at woreda level. In assessing the current monitoring, the following questions have to be answered: Are these tools providing the necessary information at woreda level? Do they involve the key development partners acting in the woreda? Is monitoring focusing on the key issues for efficiency at woreda level? Are there possibilities for greater efficiency and coordination in the woreda? This will help to identify performance gaps in primary health services at woreda level.

c) Review monitoring scope or tools

This is done by assessing the need for additional or specific monitoring scope or tools to suit the strategies/interventions of a work process or activity. For example, large or complex work processes may require more detail about strategies/interventions, downstream activities may require additional participation by beneficiaries, and innovative pilot activities may generate specific lessons learned that should be captured through monitoring. This will help to identify potential sources of resources in the analysis, compared against the effectiveness of soliciting those resources.


d) Adapt and/or design monitoring mechanisms

The mechanisms used should provide sufficient tracking of progress towards health sector goals. Tracking can be done at all levels to monitor strategies/interventions. At federal level, it may involve the Top management/management council committee with the Directorates/work process owners and the development partners working in the health sector, focusing on performance outcome monitoring; midlevel management, i.e., Directorates/work process owners with their teams and development partners, focusing on progress towards intermediate performance outcomes; and low-level management, i.e., teams with experts, focusing on health sector performance outputs and follow-up. This will help to identify efficiency in resource utilization in the analysis, as compared with the available resources and institutional arrangements.

Figure 3: Illustration of tracking of progress through inter-related elements

[Figure 3 content: inputs (finance) are delivered through projects/programmes, producing outputs (intermediary results) and outcomes (development changes). Tracking involves the Top management/management council (management committee with Directorates/process owners and development partners) and midlevel management (Directorates/process owners with teams, plus low-level management, i.e., teams with experts), in partnership with development partners and stakeholders.]

2.3.2 Planning evaluation

Steps in planning evaluation, to assess outcomes within the work process of strategies/interventions in the health sector:

a) Assess a need for evaluation

Within the pre-determined intervals of each plan year, the coordinating

department/M&E unit/section should submit a request for evaluation of

strategies/interventions in the health sector to the Top

management/management council. The M&E unit/section should conduct an


evaluability assessment (EA) to examine the readiness of the strategies/interventions in the health sector for evaluation. The main purposes of the EA are to make the necessary adjustments in the work process/activity documents so that the strategies/interventions are ready for the evaluation, and to communicate the proposed evaluation of the strategies/interventions in the health sector to stakeholders and agree on necessary modifications. It also helps to secure maximal participation of stakeholders during the evaluation work process for strategies/interventions in the sector.

b) Plan for evaluation

Within the second quarter of each plan year, the coordinating/M&E unit/section

prepares and submits the evaluation plan to the planning department and Top

management/management council. An evaluation plan is based on strategic and

selective decisions by Top management/management council about what to evaluate

and when. The plan is then used to ensure that evaluation activities are on track.

c) Review plans for evaluation

The plan is kept up to date continuously, annually or periodically depending on local needs, becoming in essence a rolling plan. For example, if a planning department plans its first outcome evaluation three years into the health sector work process, the M&E unit/section may not need to revisit the evaluation plan for two years, that is, until the year prior to the evaluation. Any revision of the evaluation plan over the course of the cycle should be presented first by the M&E unit/section to the planning department. Sometimes revision is required when circumstances change, such as a loss or gain in resources or a change in the national context.

d) Reach consensus on the focus of evaluation

Evaluation is not an easy task that can be done holistically. The Top management/management council has to reach a consensus on the focus of the evaluation of the health sector: it can be the work process or the outcome of strategies/interventions for activities/work processes. Within each focus of the evaluation, core indicators then have to be selected and agreed. Based on the decision made by the Top management/management council, when planning outcome evaluations in particular, it may pick and choose which outcomes to evaluate, covering at least the mandatory minimum number of outcomes that must be evaluated. The M&E unit/section should assist in selecting which outcomes to evaluate.


3. THE MONITORING WORK PROCESS

3.1 Conducting monitoring/routine data collection

The credibility of data depends to a large extent on the manner in which monitoring is conducted. Minimum standards for work process data/routine data collection in the health sector are as follows:

a) Use clear indicators

For a better assessment of progress towards strategies/interventions in the health sector, develop core indicators and baselines. This will help to link to the HMIS for regular data collection for strategies/interventions.

Key steps in selecting indicators for strategies/interventions:

Set baseline data and targets: An outcome indicator for strategies/interventions has two components: a baseline and a target. The baseline is the situation before a work process or activity begins; it is the starting point for results monitoring. The target is what the situation is expected to be at the end of the work process or activity. (Output indicators rarely require a baseline, since outputs are newly produced and the baseline is that they do not exist.)
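The baseline/target pair defines a simple progress measure. As a minimal sketch (the function name, field names and example values are illustrative, not from the handbook), progress toward an outcome target can be computed as the share of the planned change achieved so far:

```python
def progress_to_target(baseline: float, current: float, target: float) -> float:
    """Share of the planned change achieved so far.

    0.0 means no movement from the baseline; 1.0 means the target for
    the plan period has been reached. Values above 1.0 indicate
    over-achievement; negative values indicate regression.
    """
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hypothetical example: an outcome indicator rises from a baseline of
# 20% to a current value of 35%, against a 50% target for the period.
share = progress_to_target(baseline=20.0, current=35.0, target=50.0)
print(f"{share:.0%} of the planned change achieved")  # 50% of the planned change achieved
```

Interpreting monitoring data against this normalized share, rather than the raw indicator value, makes progress comparable across indicators with different units.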

Use proxy indicators when necessary: Cost, complexity and/or the timeliness of data collection may prevent a result from being measured directly. In such cases, proxy indicators may reveal performance trends and make managers aware of potential problems or areas of success.

Use disaggregated data: Good indicators for strategies/interventions are based on basic disaggregated data specifying resource mapping to fill gaps, sources for soliciting, utilization and liquidation.

Involve stakeholders: Participation should be encouraged in the selection of both output and outcome indicators for strategies/interventions. Participation tends to promote ownership of, and responsibility for, the planned results and agreement on their achievement in strategies/interventions. A preliminary list of output indicators should be selected at the formulation stage of interventions in the health sector, with the direct involvement of the M&E unit/section designated to manage the M&E for interventions in the health sector and of other stakeholders. Development partners should be involved in the selection of outcome indicators through the coordinating department of the MoH for the work processes of formulating interventions performance measurement.

Distinguish between quantitative and qualitative indicators: Both

quantitative and qualitative indicators should be selected based on the nature of the particular aspects of interventions in the health sector.

Limit the number of indicators: Too many indicators usually prove counterproductive. From the available information, develop a few credible and well-analyzed indicators that substantively capture positive changes in interventions in the health sector. Be selective by striking a good balance

between what should be and what can be measured.

A sample format suggesting how indicators fit into monitoring and evaluating the performance of the health sector is indicated below.

Sample format (column headers): Intended Results (output/outcome) | Type of Indicator | Data Source(s) | Method of Data Collection/Analysis | Frequency of Data Collection/Analysis | Who is Responsible | Who will Use the Information
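One way to operationalize this sample format is as one structured record per indicator. This is a minimal sketch: the class, field names and example values below are hypothetical illustrations mirroring the column headers, not part of the handbook.

```python
from dataclasses import dataclass

@dataclass
class IndicatorPlanRow:
    """One row of the M&E planning matrix, mirroring the sample format."""
    intended_result: str          # the output or outcome the indicator tracks
    indicator_type: str           # e.g. "output" or "outcome"
    data_sources: list            # e.g. HMIS reports, supervisory visits
    collection_method: str        # how the data are collected/analyzed
    collection_frequency: str     # how often they are collected/analyzed
    responsible: str              # who is responsible for collection
    information_users: list       # who will use the information

# Hypothetical example row for a routine (HMIS-linked) indicator.
row = IndicatorPlanRow(
    intended_result="Improved access to primary health services",
    indicator_type="outcome",
    data_sources=["HMIS reports", "supervisory visits"],
    collection_method="routine report review",
    collection_frequency="quarterly",
    responsible="M&E unit/section",
    information_users=["Top management/management council", "RHBs"],
)
print(row.indicator_type)  # outcome
```

Keeping each indicator as a complete record of this shape makes it easy to check that no column of the matrix has been left blank before the plan is approved.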

b) Do regular data collection on activities

Routine data collection will be part of the health management information system of the health sector. It is based on the agreed and identified indicators for the sector and includes intervention-specific indicators. Sources of data will be the community and health facilities/health offices at all levels. There may also be data to be collected from NGOs participating in the sector. Data collectors will be health professionals at all levels, with the support of data clerks. Information flow and frequency of collection will follow the same paths as the national HMIS, except for exceptional data for exceptional uses, such as international communication for the WASH movement.

The HEWs will record their daily performance and will prepare monthly, quarterly and annual activity and progress reports and submit them both to the kebele council and to the WHOs. Reporting will be done on agreed reporting formats for each level. The NGOs and private organizations will do the same. The reports from the WHOs to the RHBs will be made each month. The WHOs will analyze and compile the reports for submission to the RHBs. The RHBs will in turn analyze, compile and submit the reports to the FMoH and give the necessary feedback to the woredas and all concerned bodies. Finally, the FMoH will again analyze and compile the regional reports.
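The reporting chain above (HEW → woreda → region → federal) is essentially successive aggregation of the same counts at higher levels. A minimal sketch of how monthly counts roll up; the region/woreda names and numbers are purely illustrative:

```python
from collections import defaultdict

# Illustrative monthly service counts reported per woreda,
# keyed as (region, woreda) -> count for one indicator.
woreda_reports = {
    ("Oromia", "Woreda A"): 120,
    ("Oromia", "Woreda B"): 95,
    ("Amhara", "Woreda C"): 140,
}

# WHO -> RHB: each region compiles the reports of its woredas.
regional_totals = defaultdict(int)
for (region, _woreda), count in woreda_reports.items():
    regional_totals[region] += count

# RHB -> FMoH: the federal level compiles the regional reports.
national_total = sum(regional_totals.values())

print(dict(regional_totals))  # {'Oromia': 215, 'Amhara': 140}
print(national_total)         # 355
```

Because each level keeps its own compiled figures, feedback can flow back down the same chain: a region can compare any woreda's submission against the regional compilation it reported upward.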


c) Focus on results and follow-up

Look for "what is going well" and "what is not progressing" in terms of progress towards the intended results in the health sector. Then record this in reports, make recommendations, and follow up with decisions and action.

d) Depend to a large measure on good design (logic model)

If an activity/work process is poorly designed or based on faulty assumptions, even the best monitoring is unlikely to ensure the success of interventions in the health sector. Particularly important is the design of a realistic results chain of outcomes, outputs and activities in the work process of interventions for the health sector. Avoid using monitoring to correct recurring problems that need permanent solutions.

e) Do regular visits by the M&E unit/section

Verify and validate progress through regular visits. In addition, organize visits and/or joint meetings dedicated to assessing progress, looking at the big picture and analyzing problem areas. Ensure continuous documentation of achievements and challenges as they occur; do not wait until the last moment to try to remember what happened.

f) Conduct participatory monitoring mechanisms

Participatory monitoring ensures commitment, ownership, follow-up and feedback on the performance of interventions in the health sector. It is indispensable for outcome monitoring of interventions in the health sector, where progress cannot be assessed without some knowledge of what development partners and stakeholders are doing in the interventions. Participatory mechanisms include the outcome forum, stakeholder meetings, steering committees/the Top management/management council, and beneficiaries/the community.

g) Do regular analysis of reports/data aggregation

For instance, the annual activity/work process report is one of the minimum standards for good monitoring of interventions in the health sector. Data from such reports should be aggregated and prepared to suit the intended audience of top and midlevel management and other partners and stakeholders. The findings are used for decision-making on planning and supporting interventions in the health sector.


3.2 Scope of monitoring

Minimum standards that monitoring efforts should address in the health sector

are as follows:

a) Progress towards outcomes

This entails periodically analyzing the extent to which intended outcomes of interventions in the health sector have actually been achieved.

b) Factors contributing to or impeding achievement of the

outcome

This necessitates monitoring the context of interventions under the HSDP in relation to the socioeconomic developments taking place in the country.

c) Development partners' contributions to the outcomes through outputs

These outputs of interventions in the health sector may be generated by work processes, activities, policy advice, advocacy and other activities in the health sector. Monitoring of outputs entails analyzing whether or not outputs are in the process of being produced as planned and whether or not the outputs are contributing to the outcomes of interventions in the health sector.

d) Partnership/harmonization, contracting, branding, active resource hunting and similar interventions

This deals with partnership/harmonization, contracting, branding, active resource hunting and similar interventions, as well as their formation and functioning in the health sector. The M&E unit/section may add additional elements where needed for management or analysis, while keeping a realistic scope in view of available capacities. The scope of monitoring of interventions includes providing decision makers and managers/department heads with information that will be used as a basis for making decisions and taking action on interventions in the health sector.

3.3 Responsibility for monitoring

The responsibilities for monitoring of interventions in the health sector differ at each work process level, with the focus on higher-level results at each higher level of the work process. The Top management/management council should focus on the Millennium Development Targets and country work process outcomes; department heads should focus on the performance of

intermediate outcomes, and Case teams/case workers/performers should focus on the implementation of activities/work processes and outputs. Development partners can provide support at all levels.

a) Top management/management council

The Top management/management council takes on a greater role in advocacy and partnership building for interventions in the health sector. It makes the strategic choice of monitoring mechanisms for the overall domestic and external interventions in the health sector. The role of the Top management/management council is to ensure that the interventions developed contribute as well as possible to the attainment of the goals of the MDG and the HSDP. This is accomplished in partnership with key international and national stakeholders/partners and with particular attention to the coordinating department/M&E unit/section.

The Top management/management council actively leads the quarterly and annual interventions review work processes, develops advocacy and partnership interventions for domestic and external resources, promotes better monitoring of domestic and external resource results, and fosters a learning environment. In general, the Top management/management council sets the framework for managing and prioritizing the domestic and external resources plan and the partnerships supporting the sector. Together with partners, the Top management/management council also ensures that periodic assessments review whether or not the approach followed is the best way to produce the intended interventions outcomes.

At this level, the focus is on all of the MDG and HSDP as well as development partners' contributions to country interventions priorities. The annual review is the main vehicle for such monitoring. It draws out general lessons learned and distills trends in interventions achievements, overall performance and problem areas, whether or not they are related to specific outcomes of interventions in the health sector.

b) Directorates/work process owners

At this level, individual work process/activity-level finance monitoring is the main responsibility of the department heads. They are responsible for performance monitoring of progress towards effective resource utilization. They ensure monitoring reports on domestic and external finance at work process level, across regions and from different sources, and bring the reports together to provide complete information on the progress of interventions in the health sector. The coordinating Directorates/work process owners/M&E unit/section managing the monitoring of the health sector ensure the interface between the desired results and the finances mobilized and utilized to meet the expectations of the target beneficiaries.


Progress towards interventions in the health sector cannot be assessed by activity/work process reports and indicators alone; the Directorates/work process owners have to scan the environment continuously: keeping abreast of evolving perceptions of key stakeholders and the contributions of partners, analyzing reports received from other stakeholders, using evaluations to provide feedback on progress and, ideally, conducting client/customer surveys to find out whether perceptions of progress hold true.

c) Case teams/case workers/performers

Case teams/case workers/performers are responsible for monitoring the implementation of interventions: input, work process and output management. They also monitor the implementation of interventions carried out by other

implementers. The Case teams/case workers/performers should submit periodic and annual activity/work process reports on interventions in the health sector

to the Directorates/work process owners; these reports provide critical information and lessons learned regarding the effectiveness of the implementation of interventions in the health sector to the Top management/management

council. Case teams/case workers/performers should also contribute to the implementation of a partnership/contract strategy for interventions in the

health sector developed by the top/midlevel management. The periodic and annual intervention reporting is made by Case teams/case

workers/performers with specific attention to outputs, and is analyzed by the M&E unit/section to enhance the use of resource information at all levels and prepare it for higher-level decision-making. The Case teams/case

workers/performers, with the M&E unit/section, ensure detailed finance monitoring of all deliverables as well as implementation tasks. Since Case

teams/case workers/performers are often experts in their fields, monitoring at the activity/work process level may also entail some assessment of the outcome status of performance and thus provide input to the top-level outcome

monitoring of interventions in the health sector.

This will help to ensure the link with regional day-to-day monitoring of the health sector, and with in-house performance monitoring.


Figure 4: Monitoring responsibilities in the health sector

The diagram summarizes monitoring responsibilities by level:

Top level management: progress towards country-level strategic outcomes, ensuring the interface between the desired results and performance inputs.

Midlevel management (regional performance monitoring): ensuring the interface between the performance processes and strategic outcomes.

Low level management (woreda performance monitoring): progress towards strategic outputs, ensuring the interface between the performance inputs and outputs.


4. PERFORMANCE MEASUREMENT AND QUALITY IMPROVEMENT

Indicators are part of performance measurement but they are not the only part.

To assess performance, it is necessary to know more than the actual achievements; that is the quality component. What is also required is information about how they were achieved, the factors that influenced this positively or

negatively, whether the achievements were exceptionally good or bad, who was mainly responsible, and so on.

Figure 5: Dimensions for Performance Assessment

4.1 Report reviewing

MoH policy: Performance reviews can be weekly, monthly, quarterly or annual. The MoH should receive the annual achievement report, which serves

as the basis for assessing the performance of work processes and activities in terms of their contributions to intended health care service outcomes through outputs and partnership work at all levels. As a self-assessment report by

activity/work process management at all levels to the MoH, the annual performance report does not require a cumbersome preparatory work process. A standard performance report format should be used, based on the agreed core

indicators. It can readily be used to prompt dialogue with partners and stakeholders. The performance report, which is prepared annually for larger

activities/work processes, is essential input to the annual review.

The dimensions shown in Figure 5 are:

Verification: whether progress towards results has taken place.

Systematic analysis: analysis of achievements against goals, taking account of the reasons behind performance and influencing factors.

Judgment: a judgment of progress, good or bad, based on indicators; this can also include rating on other performance dimensions.
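To make the three dimensions of performance assessment concrete, they can be combined in a small illustrative sketch. The function and field names below are ours, not part of the MoH framework, and the example assumes a simple numeric indicator where meeting the target counts as good:

```python
def assess_performance(actual, target, reasons, influencing_factors):
    """Combine the three assessment dimensions into one record:
    verification, analysis of reasons/factors, and judgment by indicator."""
    verified = actual is not None              # has any progress been measured?
    judgment = "good" if verified and actual >= target else "bad"
    return {
        "verification": verified,
        "judgment": judgment,                  # based on the indicator
        "analysis": {"reasons": reasons,       # why performance was what it was
                     "factors": influencing_factors},
    }

# Example: an output indicator at 75 against a target of 80
record = assess_performance(actual=75, target=80,
                            reasons=["late budget release"],
                            influencing_factors=["drought"])
# record["judgment"] == "bad"
```

A real assessment would of course weigh the analysis, not just the indicator; the sketch only shows how the three dimensions fit together.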


Timing: The reporting period of the annual performance report should be every

12 months, coinciding with the review period, i.e., the Ethiopian Fiscal Year, as part of performance reports.

Purpose: The annual activity/work process report provides a self-assessment by the activity/work process management and is part of the review of the

activity's/work process's performance. The annual performance report should provide an accurate update on activity/work process supporting results,

identify major constraints and propose future directions. It analyzes the underlying factors contributing to any lack of progress in interventions so that activity/work process management can learn from experience and improve

intervention performance.

Preparation: The annual performance report is prepared by the coordinating

department in the ministry. The M&E unit/section often liaises with all levels to convey key concerns as input to the report. The annual performance

report is a report from the interventions/activity/work process management at all levels to the coordinating department/M&E unit/section and other stakeholders. This helps align with the implementation of harmonization (one report and

one budget).

4.2 Key stakeholder forum (KSF)/Joint coordination forum (JCF)/Outcome forum

Another important way of monitoring interventions is the use of coordination mechanisms that bring together partners and stakeholders for discussion and analysis. This is generally known as the use of a key stakeholder forum/joint

coordination forum for interventions in the health sector.

MoH policy: The MoH has to employ mechanisms that involve partners and allow for periodic discussion and analysis around outcomes of interventions in the sector. For ease of reference, coordination mechanisms that monitor outcomes

of interventions in the sector are referred to as the key stakeholder forum (KSF)/joint coordination forum (JCF). Such a forum focuses on the monitoring of

outcomes of interventions in the sector and the contribution of immediate outputs of interventions to those outcomes. Ideally, an outcome forum should use existing mechanisms such as established work

process steering committees, thematic groups or sectoral coordination groups. If regular mechanisms do not exist, the MoH may bring key partners together at periodic meetings.

Purpose: The key stakeholder forum/joint coordination forum ensures continuous

outcome assessment of interventions in the sector, which serves to enhance progress towards long-term results. The forum also promotes partnerships. Bringing together different activities/work processes concerned with a single


shared outcome may help ensure synergy and reinforce a common strategy

among MoH activities/work processes and partners towards long-term results.

Participation: Participants in the outcome forum include the MoH, development partners, NGOs and GOs. External development partners and NGOs should participate at least on a regular basis, but may not wish to attend all meetings.

Focus: What does the outcome forum look at in interventions in the sector? The

outcome forum assesses the status of strategic outputs and related initiatives by partners, all of which contribute to an intended outcome of interventions in the sector. The forum does so by examining performance information from

activities/work processes, national reports, donor reports and other sources. A central task is to agree on a monitoring plan for the outcome and oversee its implementation. The forum also serves as the focal team for outcome evaluations of

interventions in the sector. An outcome forum should be a vehicle for documenting and disseminating lessons learned in interventions in the sector.

This may help alert MoH management to problems or issues that might be common across results of monitoring of interventions in the sector. The outcome forum should not increase transaction costs by looking at all activity/work

process details.

Organization: The coordinating department (e.g., MoH PPD) is responsible for ensuring that there is consultation and analysis of interventions in the sector to support the monitoring of outcomes of interventions. For practical reasons, the

department generally uses existing fora, if available. If there are many fora, some of them can be clustered, perhaps under the strategic areas of support (SAS) or thematic areas of interventions in the sector.

4.3 Annual review (AR)

MoH policy: The MoH takes a firm stand on an annual review of interventions in the sector that connects reporting, feedback, evaluation and learning to assess

finance performance as a basis for the annual review meeting (ARM). It is essential that the annual review meeting is prepared from analysis based on

consultations with partners. The AR should be held towards the end of the year in order to feed into the ARM. The review has to be fully managed at the country level by the coordinating department/M&E unit/section as decided by

the MoH.

Purpose: The AR is a management dialogue at country level to assess

intervention progress towards results (outputs and outcomes) that can be used for building consensus and a mutual understanding between the MoH and

its partners/stakeholders around common outcomes (results) of interventions in the sector. It involves an assessment by the coordinating department/M&E unit/section with partners/stakeholders. The AR is the key vehicle for learning


by determining overall and general lessons learned and reviewing

recommendations of outcome evaluations of interventions in the sector.

Participation: All MoH officials (Directorates/work process owners and sections), development partners, NGOs and GOs are involved in the review of interventions in the sector to varying degrees. The Regional Health Bureaus

may also decide on the extent of their involvement, perhaps taking part directly or electronically, or simply being informed of major findings of interventions in

the sector.

Organization: The scope of the review of interventions in the sector must be

balanced between its complexity and its added value. It depends on how well the MoH has involved partners in the issues during the year; for example, many of the AR issues would already have been covered if outcome monitoring of

interventions in the sector with partners/stakeholders has been regular, leading to a simpler AR. A focused approach is recommended so that the key

issues and/or outcomes of interventions in the sector are addressed: What do we want to get out of the annual review? Who needs to be involved? What should or shouldn't we focus on?

Documentation: There is no formal documentation required for the AR of interventions in the sector, as it may take different approaches, e.g., the HMIS. This helps align with, and make use of, the standardized data collection of the existing

system.

4.4 Quality Improvement

Work processes/activities should not be carried out merely for the sake of performing things or accomplishing assignments. They have to make sense and

yield something useful. In practice, this means meeting the needs and expectations of clients or customers of health care delivery services with a minimum of effort, rework and waste. Similarly, quality improvement is

the concerted effort to continuously do things better until they are done right every time (PPM&E), with continuous measurement and improvement of the

work process. It also aims to improve the work processes and systems needed to provide a health care delivery service that meets or exceeds consumer needs and expectations.


5. REGULAR FOLLOW UP/SUPPORTIVE SUPERVISION

5.1 Give support

The performance monitoring of interventions in the health sector does not

end with weekly, monthly, quarterly or annual reviews of performance. Rather, the findings, conclusions and recommendations from

performance monitoring should be internalized and acted upon; hence it is need-based and ensures effective and efficient utilization of resources (time, money, human and logistics). Its approach therefore seeks the integration of

professionals with different purposes but working towards the same outcome. The final step in managing and conducting M&E of interventions

in the health sector is thus to follow up on the development change and give support nationally at the federal level, regionally at the respective bureaus of health, and locally at

grassroots level. This will help to ensure that maximum effort is put towards contributing to development changes at all levels, and to identify areas to be evaluated to check development changes there.

5.2 Composition of the team

Management Level | Composition of the team | Focus of the supervision | Time of the supervision | Expected outputs

Federal MoH | Virtual team (from each work process) | Progress towards achieving development changes: outcomes | Yearly, to the regions | Corrective measures on the spot, for encouragement

Regional HB | Virtual team (from each work process) | Progress towards performing each activity: intermediate outcomes | Twice a year, to woredas | Corrective measures on the spot, for encouragement

Woreda HO | Virtual team (from each work process) | Progress in the work process of carrying out each activity: outputs | Quarterly, to health posts | Corrective measures on the spot, for encouragement


6. THE EVALUATION WORK PROCESS

6.1 Preparing for an evaluation

A variety of evaluations, each with different purposes and timing, can take place during the national intervention cycle. The coordinating department/M&E unit/section should strive to identify, at least generally, the purpose, timing

and the readiness for evaluation of interventions in the health sector in a comprehensive and coherent manner, and do so as early as possible.

The timing of an evaluation of interventions in the sector should be directly linked to its purpose. The purpose of an evaluation should dictate its timing

and scope. When preparing for an evaluation, it can be helpful to look at the scheduled dates for an evaluation of interventions in the sector, estimate the time needed to prepare and conduct the evaluation, and then anticipate by

when the preparation work process needs to begin.

a) Ensure intervention/work process readiness for evaluation

An evaluability assessment (EA) is a stepping-stone toward any type of evaluation, whether it is a large work process or outcome evaluation or a smaller, internal assessment of work

process performance. While the EA is taking place, the evaluator also works with work process staff, funding agencies, administrators and participants to help the work process get ready for an evaluation.

For example, he/she can help clarify work process goals by making them more realistic and meaningful. This is a major advantage of having an EA: it improves a future evaluation by formalizing the agreement between the

evaluator and decision makers on what is important in the work process, anticipating

evaluation problems, and smoothing the overall work process.

How to Perform an Evaluability Assessment

An EA has five crucial tasks that an evaluator must successfully complete:

Task 1: Study the work process history, design and operation;

Task 2: Watch the work process in action;

Task 3: Determine the work process's capacity for data collection, management and analysis;

Task 4: Assess the likelihood that the work process will reach its goals and objectives; and

Task 5: Show why an evaluation will or will not help the work process and its stakeholders.


How to Ensure that the work process is evaluable

Each work process should have a clearly structured model. Within the model, the goals and objectives should be measurable so that the degree to which they have been achieved can be assessed. Work process managers must

think objectively: what data can be collected that will provide clear evidence that the goals and objectives have been met? Overall, work process managers should ensure that their work processes are serving those they

set out to serve, that they are collecting relevant data in an organized and consistent fashion, that they are staffed with people with the appropriate

qualifications and knowledge, and that the work process activities are being implemented as designed. If all of these elements are in place, a work process will most likely prove to be evaluable.

b) Timing, purpose and duration of evaluations of interventions in the health sector

TIMING | EXAMPLES OF PURPOSES | DURATION

Early in the activity/work process cycle (Years 1-2) | To check early strategy for particularly ambitious results (work process/output) | Shorter

Middle of the activity/work process cycle (Years 2-3) | To prompt mid-course adjustments in work process/output production | Medium

End of the activity/work process cycle (Years 4-5) | To check strategy for results (outcomes) and learn lessons for the next health care supporting strategy formulation | Longer

6.2 Involving partners and stakeholders

An emphasis on results places an even greater emphasis on the involvement of development partners (those with whom the MoH is actively engaged in pursuing results) and stakeholders (those with a role and/or interest in the results) in

evaluation exercises of all kinds. In particular, key partners should be involved in every step of an outcome evaluation of interventions in the sector. Likewise,

stakeholders affected by an evaluation should also be involved, even if they are not directly involved in the work process or outcome. Stakeholders can be involved, for example, through a stakeholder meeting to discuss the initial

findings of the evaluation team.

Procedures for involving partners and stakeholders, and standards for the

entire evaluation work process of interventions in the health sector include the following:

a) Make a preliminary selection of partners and stakeholders to contact in

the early stages of evaluation planning (i.e., when selecting the outcome


evaluation of interventions in the sector, defining the scope, deciding on timing and so on);

b) Share the TORs and CVs of suitable candidates for the evaluation team and obtain feedback from stakeholders and partners, who can play a valuable role in defining the scope of the outcome evaluation;

c) Introduce team members to partners and stakeholders;

d) Invite partners and stakeholders to workshops with the evaluation team (i.e., when they form the evaluation questions, present the evaluation report, etc.);

e) Organize a joint analysis with partners of relevant documentation for the evaluation and make the analysis available for future examination by the evaluation team;

f) Organize joint field missions with key partners/stakeholders when relevant;

g) Organize a meeting with partners and stakeholders after the first draft of the evaluation report is produced to discuss the findings with them;

h) Follow up with partners and stakeholders to help ensure that the lessons learned and recommendations of the evaluation of interventions in the sector are internalized.

6.3 Defining the scope

The scope of an evaluation of interventions in the sector might address:

a) Identification of innovative methodologies of interventions in the sector that contribute to key development issues;

b) Level of participation of stakeholders in the achievement of effective interventions in the sector;

c) Identification of direct and indirect beneficiaries and how they have benefited from interventions in the health sector;

d) Implementation and/or management issues of interventions in the health sector if they are suspected of being problematic, including the timeliness of outputs, the degree of stakeholder and partner involvement in the completion of the outputs, and how work processes were managed/carried out (were the work processes transparent and participatory, for example?).

6.4 Drafting the terms of reference (TOR) for evaluation

At a minimum, it is expected that the terms of reference for all evaluations will contain the following information:

Introduction: A brief description of what is to be evaluated (interventions in the health sector);

Objectives: Why the evaluation is being undertaken and a list of the main stakeholders and partners;


Scope: What issues (interventions in the health sector), subjects, areas and timeframe the evaluation will cover;

Products expected from the evaluation: What products the evaluation is expected to generate (e.g., resources mapping and gap analysis work process, finance sources and soliciting work process, utilization work process, recommendations, lessons learned, rating on performance);

Evaluation focus and approach: The focus of the evaluation (work process, outcome) and the approach (formative, outcome);

Methodology of evaluation: The methodology suggested to the evaluation team;

Evaluation team: Composition and areas of expertise;

Implementation arrangements: Who will manage the evaluation and how it is organized.

6.5 Budgeting

Budgeting for an evaluation of interventions in the health sector depends upon the complexity of the interventions to be evaluated and the purpose of the exercise.

It should not be more than 2-3% of the work process/activity budget. These factors dictate the timeframe and the number of evaluators needed. When budgeting

for an evaluation of interventions, the coordinating department/M&E unit/section should consider the following factors:

a) The scope, complexity and time commitments of the evaluation;

b) The need to minimize time and expense;

c) The use of field visits and interviews;

d) The use of national consultants;

e) The areas of expertise needed among the evaluators.
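As a quick illustration of the 2-3% ceiling above, the allowable budget range can be computed directly. This is a sketch: the function name and the example figure are ours, only the percentage band comes from the guideline.

```python
def evaluation_budget_range(activity_budget, low=0.02, high=0.03):
    """Return the (minimum, maximum) evaluation budget under the 2-3% guideline."""
    return (activity_budget * low, activity_budget * high)

# Example: a work process/activity with a total budget of 1,000,000
low_end, high_end = evaluation_budget_range(1_000_000)
# low_end == 20000.0, high_end == 30000.0
```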

6.6 Selecting the evaluation team

Areas of expertise to be considered in the team composition include the following:

a) Technical knowledge of and experience in health sector supporting mechanisms/thematic areas, depending on the specific focus of the evaluation;

b) Knowledge of the national situation and context of interventions in the health sector;

c) Results-based management expertise;

d) Capacity building expertise;

e) Familiarity with policymaking work processes in the finance sector (design, adoption, implementation) if the evaluation is to touch upon policy advice and policy dialogue issues.


6.7 Managing an evaluation

Tasks involved in managing an evaluation of interventions in the health sector

should touch on data collection and analysis, backstopping and feedback, reporting and follow-up.

a) Collecting and analyzing data

Both qualitative and quantitative methods are used. The methods respond to different objectives and use different instruments and methodologies, yet they are

highly complementary. Preparing for an evaluation normally requires a combination of both types of methods.

b) Backstopping and feedback

The coordinating department/M&E unit/section is responsible for liaising with partners, backstopping and providing technical feedback to the evaluation team. The M&E unit/section should be in constant liaison with the evaluation

team. These well-informed staff members push the evaluation team to justify its conclusions and back them up with evidence, and help deepen and clarify the

evaluation team's discussions of interventions in the health sector.

c) Reporting

The evaluation team is bound by the TOR to ensure that the selected issues of

interventions in the health sector are adequately addressed in the report, although some flexibility allows the team to add issues it feels are particularly pertinent and to develop a judgment matrix, conclusions and

recommendations. Generally, the team leader drafts a table of contents at the earliest stage of the evaluation. Once the first draft of the evaluation report is submitted, the coordinating department or relevant M&E unit/section should

analyze it and provide comments. After comments are incorporated, the final draft version should be circulated among partners to obtain their valuable feedback.

The evaluation team leader is responsible for incorporating comments into the final version of the report, and then for submitting it to the Top management/management council of the Ministry.


7. USE OF MONITORING AND EVALUATION INFORMATION

Monitoring and evaluation should provide information and facts that, when accepted and internalized, become knowledge that promotes learning. Learning must therefore be incorporated into the overall work process cycle for

interventions in the health sector through an effective feedback system. Information must be disseminated and available to potential users in order to

become applied knowledge.

Key principles/values for the use of evaluative evidence for interventions in the health sector:

a) Record and share lessons learned;

b) Keep an open mind;

c) Plan evaluations strategically;

d) Involve stakeholders strategically;

e) Provide real-time information;

f) Link knowledge to users;

g) Apply what has been learned;

h) Monitor how new knowledge is applied.

Monitoring and evaluation should also indicate under what conditions, in different focus areas, lessons learned should be shared at regional and woreda levels, through the coordinating department/M&E unit/section.

Key principles/values of learning in the use of monitoring and evaluative evidence of interventions in the health sector:

a) Help others actively interpret rather than record information on interventions so they can construct new knowledge for themselves;

b) Use timely, effective and innovative information management for interventions;

c) Derive performance standards and learning from the various units/constituencies/communities of practice with which the MoH works, to make assessments of interventions in the health sector more participatory;

d) Situate abstract tasks in authentic contexts so that the relevance of the task is apparent and others can embed new knowledge in interventions;

e) Extend to others the opportunity to work at problem solving by actively sharing skills and expertise in interventions with one another;

f) Unbind knowledge from a single specific context of interventions in order to maximize knowledge transfer;

g) Enable others (regions, woredas) to recognize and respect what they already know about interventions as well as the knowledge that exists within their community;

h) Provide others with many examples of a new concept in interventions;


i) Strengthen one's own and others' ability to judge when new knowledge of interventions should be used.

7.1 INFORMED DECISION MAKING

Finally, at all levels, informed decision making is an important step in deciding whether to continue the implementation of the work process, stop it, or terminate it. Decisions can be made collectively, or by the work process owner or a performer, for early corrective measures.

7.2 FEEDBACK

Key principles of feedback from monitoring and evaluation of interventions in the health sector:

1. Elaborate interventions based on intended outcomes;

2. Define, for each department level and development partner, the purpose of generating knowledge or decision-making information and its scope of interventions;

3. Define monitoring priorities oriented to outputs and outcomes, and have reference points or standards against which judgments can be made about feedback;

4. Select knowledge and information indicators based on corporate/strategic priorities, use and user;

5. Incorporate a timescale covering future changes in interventions;

6. Constantly inquire, through feedback mechanisms, about why events appear to have happened or to be happening in interventions;

7. Identify the extent of the effect that health sector interventions are having as compared to other factors influencing a development situation;

8. Specify where, when and how information will be interpreted, communicated and disseminated, including consultations as inputs to routine work processes of interventions in the health sector;

9. Share knowledge on documenting, analyzing and reviewing comparative experiences in intervention design, partnerships, monitoring and evaluation activities;

10. Target strategically generating information on interventions in the sector that is appropriate for different users and timely in relation to decision-making and accountability requirements;

11. Cross-check and ensure the quality of evaluative evidence to produce valid and relevant feedback on interventions in the health sector.


7.3 PUBLICATION OF EVIDENCE

Publication of monitoring and evaluation results of interventions in the health

sector should follow a clear format in order to treat the evidence fairly, to produce compelling analytic conclusions and to rule out ambiguity.

Standards for publication of monitoring and evaluation results of interventions:

a) Designed for a specific audience;

b) Relevant to decision-making needs, especially for the Top management/management council;

c) Available when the "window of opportunity" for decision-making arises (i.e., timely);

d) Easily and quickly understood;

e) Based on sound methodological principles;

f) Delivered through recognized channels;

g) Areas of uncertainty and their significance clearly identified;

h) Accompanied by full acknowledgement of data or information sources;

i) Provides information on both tangible and intangible products and work processes of development;

j) Available at minimal cost in terms of time, money and administrative costs.

7.4 WORK PROCESS REDESIGNING

7.4.1 Check for sustaining adequate performance

M&E ensures that best practices of the respective work process are sustained, avoids repeating past mistakes and, above all, maximizes quality improvement with a continuum of care/service.

7.4.2 Check for formatting/redesigning inadequate performance

Formatting/redesigning of the respective work process also remains a consequence of M&E, with determination of the possible worthiness of performance in

the light of the resources utilized. Check the standard operating procedures and manuals that may contribute to inadequate performance. Check also the

data collected, because it might be data that the work processes do not use.

7.4.3 Verify/redefine indicators

It might be the indicator that is responsible for the adequate/inadequate implementation of the process. It might be an indicator that cannot be

measured. Check the indicator's measurability and redefine it.
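A measurability check of this kind can be sketched as follows. The field names below are illustrative assumptions, not a prescribed MoH indicator format:

```python
def check_indicator(indicator):
    """Return a list of reasons why an indicator cannot be measured or judged."""
    problems = []
    if not indicator.get("definition"):
        problems.append("no clear definition")
    if indicator.get("baseline") is None:
        problems.append("no baseline to measure change from")
    if indicator.get("target") is None:
        problems.append("no target to judge progress against")
    if not indicator.get("data_source"):
        problems.append("no data source, so it cannot be measured")
    return problems  # an empty list means the indicator is usable as defined

# Example: a coverage indicator that lacks a data source
issues = check_indicator({"definition": "ANC coverage (%)",
                          "baseline": 40, "target": 60})
# issues == ["no data source, so it cannot be measured"]
```

An indicator that fails any of these checks is a candidate for redefinition before the work process is judged against it.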


8. VALIDATION

Validation of each intervention's/activity's/work process's success/

accomplishment/achievement is a means of maximizing participation and improvement. Validation entails checking or verifying whether or not the

reported progress/performance is accurate, with the right roles and responsibilities, proper information flow (data transfer) and acceptable sources.

8.1 Tools of validation:

a) Field visits/on-spot checking

Field visits are frequently used as a monitoring mechanism. Minimum

standards for a field visit should include: a field visit policy, the timing of the visit, its purpose in terms of monitoring, and a focus on what to look for in order to measure intervention progress.

b) Work process of field visit

MoH policy: Field visits may be undertaken by top, midlevel and low level work process/activity managers and/or a team with partners and

stakeholders.

Timing: A field visit may be planned for any time of the year. If it is undertaken in the first half of the year, just after the annual review meeting, it may be oriented towards the validation of progress towards

achievements/accomplishments. If it is undertaken in the latter part of the year, it may be oriented towards the validation of intervention results. The reports of field visits are action-oriented and brief, and are submitted within a week of

return to the coordinating department/M&E unit/section.

Purpose: Field visits serve the purpose of validating health sector interventions. They validate the intervention results reported by work processes and activities, in particular for larger, key work processes and activities that

are essential for health sector outcomes. They involve an assessment of health sector intervention progress, results and problems. They may also include

visits to the activity/work process management at all levels. Visits are increasingly joint efforts of several partners involving clusters of work processes and activities within a health sector intervention outcome. A team of work

process/activity managers, for example, may undertake a series of visits to activities that are contributing to one particular health sector intervention outcome. Several partners might also join together to visit all health care

service financial activities within a specific geographical area. Such joint efforts are often an efficient way to obtain a comprehensive overview of health

sector intervention progress at all levels.


Focus: What should we look at during a field visit? The emphasis is on observing the progress being made towards the attainment of health sector intervention results (outcome and outputs) that contribute to the goals of the health care services. The field visiting team/M&E unit/section should also look at the contribution of strategic partnerships (MoUs/contracts signed with regions regarding performance agreements) and rate progress towards the outputs and outcome of the interventions. In a change from past practice, detailed implementation issues will no longer be the main focus of field visits.

8.2 Rating system for validation

For outcomes, the rating compares the evidence of movement from the baseline, either towards or away from the end/target, as measured by the outcome indicator(s) for the intervention.

The most commonly used rating mechanisms are:

o Positive change: positive movement from baseline to target as measured by the outcome indicator(s) for interventions;

o Negative change: reversal to a level below the baseline as measured by the outcome indicator(s) for interventions;

o Unchanged: no perceptible change between baseline and target as measured by the outcome indicator(s) for interventions.
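As an illustration only, the outcome rating rule can be sketched as a small function. This is a hypothetical helper, not part of any MoH or M&E system; the function name, signature and the `higher_is_better` parameter are assumptions.

```python
def rate_outcome(baseline: float, latest: float, higher_is_better: bool = True) -> str:
    """Classify an outcome indicator's movement relative to its baseline.

    Returns one of: "positive change", "negative change", "unchanged".
    """
    # Normalise direction so that a positive delta always means improvement
    # (some indicators, e.g. mortality rates, improve by decreasing).
    delta = (latest - baseline) if higher_is_better else (baseline - latest)
    if delta > 0:
        return "positive change"   # movement from baseline towards the target
    if delta < 0:
        return "negative change"   # reversal to a level below the baseline
    return "unchanged"             # no perceptible change

# Example: immunization coverage rose from a 40% baseline to 55%.
print(rate_outcome(40.0, 55.0))  # -> positive change
```

In practice the judgement also involves the target and data quality, but the three-way comparison against the baseline is the core of the rating.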

For outputs, the rating system should also have three points: no, partial and yes. The three ratings reflect the degree to which an output's targets have been met, and serve as a proxy assessment of how successful an organizational unit has been in achieving its intervention outputs. They compare the baseline (the non-existence of the output) with the target (the production of the output). The "partially achieved" category is meant to capture those en-route or particularly ambitious outputs that may take considerable inputs and time to come to fruition.

The most commonly used rating mechanisms are:

o No: not achieved;

o Partial: only if two-thirds or more of a quantitative target is achieved;

o Yes: achieved.
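The three-point output rating, including the two-thirds threshold for "partial", can likewise be sketched as a hypothetical helper (the name and signature are assumptions, not an official tool):

```python
def rate_output(achieved: float, target: float) -> str:
    """Three-point output rating: "no", "partial" or "yes".

    "partial" applies only when two-thirds or more of a quantitative
    target has been achieved.
    """
    if target <= 0:
        raise ValueError("target must be a positive quantity")
    ratio = achieved / target
    if ratio >= 1.0:
        return "yes"       # target fully achieved
    if ratio >= 2.0 / 3.0:
        return "partial"   # en route: two-thirds or more of the target
    return "no"            # not achieved

# Example: 7 of 10 planned health posts built.
print(rate_output(7, 10))  # -> partial
```

Such a rule only works for quantitative targets; qualitative outputs still require a judgement call against the stated target.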

8.3 Rewarding good performance

For positive change and achievement, a reward system has to be in place to recognize good performance and to encourage those achieving little or no outcome or output. Check accountability/responsibility for under-achievement and redefine roles and responsibilities where needed.



