EVALUATIONS ACTIVITY FINAL PROGRESS REPORT
Contract Number: AID-527-C-13-00002
April 1, 2013 – March 31, 2018
April 2018

This document was prepared by Partners for Global Research and Development LLC (PGRD) for review by the United States Agency for International Development (USAID). The Evaluations Project is made possible by the American people through USAID.

EVALUATIONS ACTIVITY
FINAL PROGRESS REPORT
APRIL 1, 2013 – MARCH 31, 2018
April 30, 2018
Contract Number: AID-527-C-13-00002
Prepared for: Miriam Choy, COR – USAID
Submitted by: Partners for Global Research and Development LLC (PGRD)


CONTENTS

Acronyms and Abbreviations
Resumen Ejecutivo (Executive Summary)
Executive Summary
Introduction
  Background
  Results Framework
Component 1: Evaluation Studies Are Used to Improve Programming
  Performance Evaluations and Assessments
  Products and activities promoting the use of studies
  Use of studies
Component 2: Implementing Partners Are Able to Manage Per Results
  Support for USAID Development Objective Monitoring & Evaluation Plans
  Support to the USAID/Peru Evaluation Team
  M&E Training and Technical Assistance for Partners
Component 3: Local Capacities for Evaluation Strengthened
  Evaluation Training and Technical Assistance for Evaluation Networks
Lessons Learned
  Lessons Regarding the Design of Activities
  Lessons Regarding the Implementation of Evaluations and Studies
  Lessons Regarding the Use of Evaluations
  Lessons Regarding Capacity Building
Recommendations
Annex A: Index of reports, deliverables and related products
Annex B: USAID Evaluations Processes


ACRONYMS AND ABBREVIATIONS

CDCS  Country Development Cooperation Strategy
CGE  Compromisos de Gestión Escolar
CLA  Collaborating, Learning and Adapting
COR  Contracting Officer's Representative
DO  Development Objective
DRE  Dirección Regional de Educación
DRESM  Dirección Regional de Educación San Martín / Regional Office of Education, San Martin
DREU  Dirección Regional de Educación Ucayali / Regional Office of Education, Ucayali
EFS  Evaluation Fact Sheets
ENAP  National School of Public Administration
GORESAM  Gobierno Regional de San Martín / Regional Government of San Martin
GOREU  Gobierno Regional de Ucayali / Regional Government of Ucayali
GRDS  Gerencia Regional de Desarrollo Social
HICD  Human and Institutional Capacity Development
IP  Implementing partner
IR  Intermediate result
LANN  Liderando los Aprendizajes de Niñas y Niños
MINEDU  Ministry of Education
MOOC  Massive Open Online Course
M&E  Monitoring and Evaluation
NGO  Non-governmental organization
PEEL  Enseñar es Liderar / Teaching is Leading Project
PEI  Institutional Strategic Plan
PERUME  Peruvian Network of Monitoring and Evaluation
PGRD  Partners for Global Research and Development
RPO  Regional Program Office
ROAA  Regional Office of Acquisition and Assistance
SER  Asociación Servicios Educativos Rurales
SIAGIE  Sistema de Información de Apoyo a la Gestión de la Institución Educativa
SOW  Statement of work
SYS  Systematization
TBD  To be defined
UGEL  Unidad de Gestión Educativa Local
UPCH  Universidad Peruana Cayetano Heredia
USAID  United States Agency for International Development


Resumen Ejecutivo (Executive Summary)

Evaluations was implemented by the joint venture Partners for Global Research and Development (PGRD) between April 2013 and March 2018, under contract AID-527-C-13-00002, with an obligated budget of USD 7,141,515. Its main purpose was to promote evidence-based planning and programming, drawing on studies and evaluations, and to contribute to developing the monitoring and evaluation (M&E) capacities of USAID/Peru and its implementing partners.

The activity was carried out within the framework of the USAID Evaluation Policy (2011 and 2016), which sought more rigorous, higher-quality evaluations to improve the effectiveness of USAID programs and to provide information for planning new interventions. Evaluations benefited from five years in which USAID reviewed and improved its operational guidelines (ADS) for project design, evaluation development, performance monitoring, and knowledge management. At the same time, the activity unfolded in a local context in which the Peruvian government rolled out results-based budget management and incorporated functional monitoring and evaluation units in various ministries, seeking to improve policies and their implementation, especially those concerning social inclusion, strengthening democracy, and environmental protection. Evaluations worked in close collaboration with local actors who helped validate its tools, especially training tools; adapted to changes both at the local level and in the USAID Evaluation Policy; and contributed lessons drawn from the production of studies and evaluations and from training initiatives aimed at partners and government officials.

Evaluations was organized into three components to provide the following lines of support:

1. Design and implement impact and performance evaluations and conduct studies in support of USAID programs;

2. Provide technical assistance and build monitoring and evaluation capacity for USAID/Peru, implementing partners, and Peruvian government counterparts;

3. Strengthen the capacity of local organizations to design and implement evaluations and studies in accordance with international standards.

Under the first component, Evaluations carried out 16 evaluations, one of them an impact evaluation, and 11 studies, covering activities under USAID's three Development Objectives. All of the evaluations generated recommendations aimed at improving the evaluated activity or future Mission activities, and 10 evaluations produced consensus-based plans for operationalizing the recommendations. In monitoring the use of the studies, Evaluations documented the application of 85% of them. The team found no evidence that three evaluations and one assessment were used to inform the design or planning of new strategies or interventions, mainly because the Mission did not continue with those lines of intervention.

Under this component, Evaluations launched several strategies to promote the use of studies and evaluations, such as involving the actors linked to the intervention (office chiefs, activity supervisors, and implementers) from the start of each study, presenting and discussing initial results one week after fieldwork concluded, promoting presentations to wider audiences, and posting reports to the USAID Development Experience Clearinghouse (DEC) and the PGRD website. In the final year, Evaluations produced 15 two-page briefs with the most relevant findings, conclusions, and recommendations of the studies; presented executive summaries in more reader-friendly graphic formats; and generated discussions with USAID teams based on cross-cutting analysis of the various studies and evaluations, some of which were written up as learning briefs on topics of interest for development cooperation.

Under the second component, Evaluations supported the USAID/Peru M&E team in implementing the operational standards that improve the monitoring of its activities, and designed and launched a plan to improve the M&E capacities of the partners implementing USAID activities, especially under cooperative agreement and grant mechanisms. For the first task, Evaluations served as a link between USAID and its partners, principally to identify how the partners' activities contributed to Mission results and objectives, standardizing indicators, measurement instruments, and planning formats and improving data quality.

To design the capacity development plan, Evaluations created two diagnostic instruments: one to measure the training needs of people performing M&E functions, and another to measure organizational performance in the functional areas of planning, monitoring, evaluation, and knowledge management. The results of applying both instruments served both as a baseline for measuring improvements in capacity and as a means of identifying the type of training and technical assistance that would help improve M&E practice among implementing partners. Evaluations designed a university-level postgraduate program (Diplomado) whose academic content was validated by local and Latin American experts before delivery, and subsequently validated with two cohorts of students. Fifty-two M&E officers from 20 USAID partner organizations, including seven government organizations, participated in the postgraduate program. In addition, 11 partner organizations took part in M&E workshops and received assistance from Evaluations in preparing their M&E plans.

Drawing on the lessons and materials developed for the postgraduate program, and with the collaboration of several graduates, Evaluations delivered a training program in the Regional Governments (GR) and the Directorates of Education of San Martin and Ucayali to install their M&E systems, including the publication of performance dashboards on both GRs' web pages. These processes were consolidated in a Toolbox that presents the guidelines, diagnostic instruments, guides, training manuals, and standards for installing an M&E system in a government office.

Evaluations extended the reach of its M&E capacity building products by developing an interactive online course based on the Evidence Management Module. Under an agreement between the National School of Public Administration (ENAP) and USAID, Evaluations collaborated with ENAP to develop this Massive Open Online Course (MOOC), entitled 'Comunicación de Evidencias de M&E' (Communicating M&E Evidence). ENAP will offer the course online through its educational platform.

For the third component, Evaluations formalized agreements with two local evaluation networks: Eval Perú, made up of evaluation consultants, and Red Perume, composed of staff from the M&E units of various institutions, mainly in government. Eval Perú was an important ally in Evaluations' training and technical assistance activities and in disseminating Evaluations products to the international evaluation community. Evaluations supported the institutional development of both networks according to the needs identified in an improvement plan tailored to each network, and financed the participation of 21 members in selected courses of the postgraduate M&E program. Of the 21 participants, 13 obtained the Diplomado, financing the remaining courses on their own.

The Diplomado is now housed at the Universidad Peruana Cayetano Heredia and is in its second cohort since Evaluations ended; the Diplomado's academic materials have been distributed to other universities in Lima. One year after the program concluded, 40 Diplomado participants were visited at 16 organizations; all reported that they are applying what they learned, introducing improvements to M&E systems and instruments, with an emphasis on monitoring and, to a lesser extent, evaluation.


Executive Summary

Evaluations was implemented by the joint venture Partners for Global Research and Development (PGRD) between April 2013 and March 2018, under contract AID-527-C-13-00002, with an obligated budget of USD 7,141,515. The main purpose of the activity was to support evidence-based planning and programming through evaluations and studies, and to develop the monitoring and evaluation (M&E) capabilities of USAID/Peru and, especially, of its implementing partners.

The USAID Evaluation Policy (2011 and 2016) provided the framework for the activity. This policy sought to establish more rigorous and higher-quality evaluations to improve the effectiveness of programs and inform the planning of new interventions. Evaluations benefited from a period in which USAID consistently reviewed and improved its operational guidelines (ADS) for project design, evaluation development, performance monitoring, and knowledge management. At the same time, the activity took advantage of a local context in which the Peruvian government instituted results-based budget management and incorporated functional monitoring and evaluation units in various ministries in order to improve policies and their implementation, especially those addressing social inclusion, strengthening democracy and caring for the environment. Evaluations was developed in close collaboration with local actors, who contributed to the validation of tools, especially training tools. The activity adapted to changes at the local level and within USAID policy, generated learning through the production of studies and evaluations, and strengthened M&E capacity among implementing partners and public institutions.

Evaluations was organized into three components to provide the following lines of support:

1. Design and implement both impact and performance evaluation studies and conduct assessments supporting USAID programs;

2. Provide performance monitoring and evaluation technical assistance (TA), and capacity building for USAID Peru, its implementing partners, and host country government counterparts; and

3. Strengthen the capacity of local evaluation institutions to design and implement evaluation studies and assessments according to international standards.

Under the first component, Evaluations conducted 16 evaluations, including one impact evaluation, and 11 assessments related to the activities implemented under the three USAID/Peru Development Objectives. The evaluations generated recommendations aimed at improving the evaluated activity or future activities of the Mission, and Evaluations facilitated the development of plans to operationalize the recommendations for ten of the studies. In measuring the use of these studies, Evaluations documented the application of 85% of the studies to improve programming. The team did not find evidence showing that three evaluations and one assessment were used to inform the design or planning of new strategies or interventions, most likely because the Mission did not continue with these lines of intervention.

In this component, Evaluations adopted various strategies to promote the use of studies and evaluations, such as involving intervention stakeholders from the beginning of each study, presenting and discussing initial results a week after fieldwork was completed, promoting presentations to wider audiences, and posting study reports to the USAID Development Experience Clearinghouse (DEC) and the PGRD website. In the last year, Evaluations produced 15 two-page study fact sheets presenting the studies' most relevant findings, conclusions and recommendations, produced stand-alone versions of each report's executive summary, and generated discussions with the USAID team based on cross-sectional analyses of different studies and evaluations. Some of these analyses took the form of learning briefs on topics of interest for development cooperation.

Under the second component, Evaluations supported the USAID/Peru M&E team in adopting operational standards to improve Mission M&E practices, and designed and implemented a plan to strengthen M&E capabilities among USAID partners, especially those implementing cooperative agreements and grants.


To achieve the former, Evaluations helped USAID coordinate performance reporting from its partners, primarily to strengthen the measurement of how partner activities contributed to Mission results and objectives by standardizing indicators, measurement instruments, and planning formats and by improving data quality.

To develop the capacity development plan, Evaluations designed two assessment tools: one to measure the training needs of professionals performing M&E functions, and another to measure organizational performance in the areas of planning, monitoring, evaluation and knowledge management. The results of applying both instruments allowed Evaluations to define the type of training and technical assistance that would contribute to improving M&E practices, and provided a baseline for measuring performance improvement. Evaluations developed a postgraduate program with a university degree ('Diplomado'), whose academic contents were validated by local and Latin American regional experts. The program was then delivered to two cohorts comprising 52 M&E managers from 20 organizations, including seven partnering government institutions. In addition, eleven implementing partners participated in M&E workshops and received assistance from Evaluations to develop their M&E plans.

Based on the learning and materials from the postgraduate program, and in collaboration with several graduates, Evaluations developed a training program for the Regional Governments (RGs) and the Directorates of Education of San Martin and Ucayali to support the development of their M&E systems, including the integration of performance indicators into the web pages of both RGs. These processes culminated in a Monitoring Toolbox, which presents the guidelines, assessment tools, guides, and training manuals for developing an M&E system in a public institution.

Evaluations further extended the reach of its M&E capacity building products by developing an interactive online course based on the Diplomado Evidence Management Module. Under an agreement between the National School of Public Administration (ENAP) and USAID, Evaluations collaborated with ENAP to develop this Massive Open Online Course (MOOC), entitled 'Communicating M&E Results'. ENAP will offer the course through its online educational platform.

To strengthen local evaluation institutions, Evaluations established formal agreements with two local evaluation networks: Eval Peru, whose membership consists of evaluation consultants, and Red Perume, composed of staff from the M&E units of various, primarily public, institutions. Eval Peru was an important ally in Evaluations' training and technical assistance activities and in the dissemination of Evaluations products throughout the international evaluation community. Evaluations supported the institutional development of both networks according to the needs identified in each network's organizational strengthening plan, and financed the participation of 21 members in selected courses of the M&E Diplomado. Of the 21 participants, 13 financed the remaining courses on their own and obtained full credit for the Diplomado.

The post-graduate Diplomado continues as an academic offering of the Universidad Peruana Cayetano Heredia and is in its fourth cycle. To further promote the use of the program, Evaluations disseminated the Diplomado academic materials to seven additional universities and organizations in Lima. One year after their graduation, an analyst surveyed 40 Diplomado participants at 16 organizations. All of the graduates indicated that they are applying their new skills, and their supervisors confirmed that they are introducing improvements in M&E systems and instruments, chiefly in monitoring and, to a lesser extent, in evaluation and dissemination.


Introduction

This Final Report for the USAID/Peru Evaluations activity presents an overview of the activity's evaluation and capacity building interventions, together with an accounting of performance results in each of these areas. The report also looks behind the deliverables and performance indicators to examine the factors that helped the activity succeed and the obstacles that challenged the achievement of outcomes, such as the use of studies to support programming and improved performance monitoring by USAID implementing partners.

The Evaluations activity provided USAID/Peru with monitoring and evaluation services and assistance to implement the USAID Evaluation Policy. Its ultimate goal was to promote and foster evidence-based programming by carrying out studies and evaluations and by contributing to the M&E capacities of USAID and its implementing partners (IPs).

The Activity was organized into three areas of support:

1. Design and implement both impact and performance evaluation studies and conduct assessments supporting USAID programs;

2. Provide performance monitoring and evaluation technical assistance (TA), and capacity building for USAID Peru, its implementing partners, and host country government counterparts; and

3. Strengthen the capacity of local evaluation institutions to design and implement evaluation studies and assessments according to international standards.

These translated into three activity Components:

Component 1: The performance of high-quality evaluations and studies concerning the Peru bilateral and South America regional portfolios. The evaluations produced under Evaluations measured and documented activity achievements and limitations while making recommendations to improve activity results and inform policies and related activities. Assessments answered development questions intended to provide USAID with empirical evidence supporting programmatic decision-making and planning.

Component 2: Providing technical assistance to USAID Peru to improve its performance monitoring, and providing technical assistance and training to USAID Peru’s implementing partners —especially to local and host country government counterparts— to strengthen their performance monitoring and evaluation capacities.

Component 3: Strengthening the technical capacities of two local monitoring and evaluation institutions so that they are able to design and conduct effective performance and impact development evaluations according to international standards.

Background

The Evaluation Policy, released in January 2011 and updated in October 2016, aimed to improve accountability, learning and evidence-based decision-making. It was designed under the USAID Forward reform, which included a renewed effort to strengthen the use of monitoring and evaluation (M&E), to build local capacity to improve aid effectiveness, and to strengthen collaboration and partnership with other donors.

The Evaluation Policy gives priority to building local evaluation capacity and using local expertise in evaluations, as well as supporting partner government and civil society capacity to use performance management, undertake evaluations and use the results generated.


By design, Evaluations drew primarily on local capacities while contributing to the development of new local capabilities. The activity began with two Peruvian partner institutions with deep research experience; over the implementation period, PGRD worked with six additional partner organizations. To advance its capacity development strategy, Evaluations convened experts from universities and non-governmental organizations who validated tools and content and contributed to the development of interventions. Collaboration with these actors resulted in new initiatives and created a constructive feedback loop between the activity and the Peruvian monitoring and evaluation community. In current USAID terminology, Evaluations promoted and implemented Collaborating, Learning and Adapting (CLA) practices.

The policy context in Peru facilitated the activity's efforts to define and respond to priority M&E capacity building needs. The National Policy for the Modernization of Public Management (PNMGP) seeks to modernize all public entities through the adoption of results-based management, and recognizes monitoring, evaluation and knowledge management as one of the pillars of results-based programming. In addition, strengthening the evaluation culture in public management is a national priority, and the Organization for Economic Cooperation and Development (OECD) country program prioritizes improvements in the methodologies for monitoring and evaluation, dissemination and training.

Results Framework

The Evaluations results framework reflects the unique position the activity held within USAID/Peru. First, the results of its research and capacity development activities related exclusively to providing institutional support and capacity building to USAID and its designated partners and development stakeholders; these results were not tied to a single USAID Development Objective. Second, in supporting the implementation of the USAID Evaluation Policy, the activity functioned as an extension of the Mission's M&E capacity. To respond to these characteristics, activity interventions were carried out in very close coordination and collaboration with multiple USAID staff, including the Activity Contracting Officer's Representative (COR) and alternate COR, representatives of the Program Office, the individual members of the USAID M&E team, managers of activities that were evaluated, and technical stakeholders in the various assessments.

The Results Framework links the activity’s tasks to one main objective, to foster evidence-based programming, and three intermediate results (IRs), as illustrated in Graphic 1.

Graphic 1: USAID/Peru Evaluations Results Framework


Under IR1, end-users, including USAID representatives, would use the results of studies and evaluations conducted by the activity to improve programs and projects.

Under IR2, implementing partners would use results-based management after improving their monitoring capacity.

Under IR3, selected local partners would improve their performance in M&E.

As presented in the M&E Plan, Evaluations had a results-oriented monitoring approach that included monitoring the implementation and achievement of planned activities and deliverables as well as the activity’s highest-level indicators. For this purpose, the activity developed a system for recording and organizing performance data that includes the following instruments:

1. Annual survey on the use of evaluation and study findings or recommendations submitted to USAID;

2. Biweekly activity management reports;

3. An assessment tool for measuring implementing partners’ use of best practices in planning, monitoring, evaluation, management, and the use of evidence;

4. An M&E self-assessment survey to define competency strengthening needs;

5. Attendance records and training registration sheets;

6. An assessment tool for measuring evaluation networks' use of best practices;

7. Registry of M&E Plan activities;

8. Registry of support activities for USAID; and,

9. Documentation of follow-up with participants of the Postgraduate Program.

The first instrument was complemented with in-depth qualitative interviews with USAID staff about the reasons why evaluation and study findings were, or were not, fully used.

Since the objective of the activity was to achieve evidence-based programming, the team also focused on knowledge generation and use. The Evaluations team reflected on the lessons learned through the implementation of the three components and shared its conclusions regularly with USAID. These lessons were included in the semi-annual and annual reports. To disseminate this learning, Evaluations prepared learning briefs on specific themes relevant to other USAID projects and to the development community in general. In addition, the Evaluations team sought spaces, such as regional conferences, to disseminate the activity's evaluations, documents and tools so that they could inform similar activities.

Complete data from the Evaluations M&E system and supporting documents remain with the USAID/Peru Program Office. The reports, learning briefs, dissemination and training materials are published in the USAID Development Experience Clearinghouse (DEC) and on the PGRD Evaluations web page.


Component 1: Evaluation Studies Are Used to Improve Programming

The foundation of this component is the performance of high-quality evaluations and assessments of the Peru bilateral and South America regional portfolios, in accordance with the USAID Evaluation Policy. Beyond conducting evaluations and assessments, Evaluations endeavored to create products and carry out activities that promoted and facilitated the use of studies to improve programming. Table 1 presents a selection of the most important Component 1 performance indicators; these indicators are discussed throughout this section.

Table 1: Key Indicators for Component 1

| N° | Indicator | Year 1 (Target/Result) | Year 2 (Target/Result) | Year 3 (Target/Result) | Year 4 (Target/Result) | Year 5 (Target/Result) | Total |
|---|---|---|---|---|---|---|---|
| 1.1 | % of evaluations/studies used to improve programs or projects | – | – | 100% / 75% | 100% / 85% | 100% / 85% | 85% |
| 1.1.1 | # of recommendation plans agreed for implementation | – / 0 | – / 2 | – / 2 | – / 5 | – / 1 | 10 |
| 1.2.1 | # of evaluation/study reports approved | 8 / 2 | 9 / 4 | 3 / 8 | 4 / 10 | 4 / 3 | 27 |
| 1.2.2 | # of evaluation/study final reports completed | – / 2 | 9 / 9 | – / 6 | 4 / 7 | 4 / 3 | 27 |

Performance Evaluations and Assessments

Performance evaluations (PE) assess the results of USAID programs and identify lessons learned concerning which development interventions accomplish the expected results and which do not. Assessments support learning and planning by analyzing specific or crosscutting issues relevant to USAID technical areas, such as gender. The Evaluations activity adopted a 10-step approach to guide the implementation of studies, from conception to the timely delivery of reports that meet the standards of the USAID Evaluation Policy. This approach provided a blueprint for the participation of multiple stakeholders in the research process, including USAID technical leads, the evaluation team, and implementing partners.

Evaluations completed a total of 27 studies, including 16 performance evaluations and 11 assessments.1 Table 2 presents a breakdown of these studies according to the Development Objective to which they pertain. A total of 23 studies directly addressed the needs of one of the three DOs, while four studies provided analyses of cross-cutting strategic issues. The activity conducted three performance evaluations for regional activities. An index of all of the studies is presented in Annex A.

Table 2: Studies and derivative products by Development Objective

| Development Objective | Evaluations | Assessments | Total Studies | Executive Summaries | Fact Sheets | Evaluation Recommendation Plans |
|---|---|---|---|---|---|---|
| DO1 | 2 | 3 | 5 | 7 | 2 | 2 |
| DO2 | 10 | 3 | 13 | 12 | 8 | 6 |
| DO3 | 4 | 1 | 5 | 5 | 4 | 2 |
| Crosscutting | – | 4 | 4 | 3 | 1 | – |
| TOTAL | 16 | 11 | 27 | 27 | 15 | 10 |

1 While the contract considered the implementation of 28 studies, USAID ultimately determined that the project would implement 27 and not 28 studies.


The activity conducted ten final evaluations; in most of them, the preliminary findings were presented after the implementing partners had already closed the activity. One of these final evaluations was sequenced to inform a new project, so that the evaluation recommendations could serve as inputs to the design. In the case of the four mid-term evaluations, study recommendations were applied to the second phase of the projects and were available to inform follow-on designs. The remaining two evaluations were one special evaluation providing an overview of the results from five activities and one retrospective impact evaluation of Alternative Development Program activities that had ended before Evaluations began. The activity's eleven assessments provided empirical evidence on a range of topics, allowing USAID to reach conclusions about existing strategies deployed under multiple projects at a time, address cross-cutting issues such as gender mainstreaming, and inform the elaboration of its new country development strategy.

Products and activities promoting the use of studies

The value of the activity's deliverables is proportional to the use USAID makes of them. Throughout the activity period, Evaluations endeavored to exploit every opportunity to promote the uptake of study results in programming. These efforts ranged from early presentations of study results to polished study summaries and post-study presentations and dissemination. The following paragraphs summarize these efforts and products.

Planning the study process to improve quality and timing: Evaluations and USAID recognized that the first key to promoting the use of studies was to compress the study implementation period in order to get results into the hands of decision-makers. However, we also recognized that imposing arbitrary time limitations on study implementation can threaten the quality of results by undermining the collaborative process required to ensure that studies are well-calibrated to their end-users’ needs and eroding the methodological rigor with which the study is implemented. Therefore, to deliver study results in a defined time period while enhancing study quality, Evaluations established and continually refined a 10-step implementation process (see Graphic 2). Annex B presents a detailed explanation of the 10-step process. This process incorporated design elements that codified study stakeholder collaboration, set clear expectations regarding the study timeline, and preserved methodological rigor.

Graphic 2: The ten evaluation 'moments'

1. Defining study questions
2. SOW
3. Selection of proposals
4. Signing of contract
5. Launching of evaluation
6. Inception report
7. Preliminary findings and fieldwork report
8. Draft report
9. Recommendation plan
10. Final report approved and uploaded to DEC


The process achieved collaboration and ownership by specifying where involvement from USAID and other stakeholders was essential to ensuring high-quality, useful results. Key participants' roles were defined and scheduled in the study implementation process before the study began. This approach promoted active participation and ownership by USAID activity managers, implementing partners and others with a stake in the study, while also establishing a joint commitment to turn-around times.

The implementation system enhanced studies' rigor and their responsiveness to purpose by engaging evaluation teams intensively during the early stages of implementation. This engagement, initiated through an orientation workshop, ensured that the evaluation team shared an understanding of the evaluation questions, client expectations, and the intended use of findings and recommendations. Evaluation teams concluded this stage with an inception report, which set expectations for how and when fieldwork, analysis, and other stages were to be executed and defined the relationship between evaluation questions, methodology and data.

The formalization of this process at the beginning of the third year generated immediate results, including the completion of 12 study designs in a single year, compared with 13 study designs completed over the first two years. The process succeeded in coordinating more engagement by USAID technical staff at clearly defined moments, each with a specific agenda for contributing to study completion and quality.

Preliminary Findings Presentations: To accelerate the delivery of empirical data and analysis in support of programming, Evaluations included a 'Preliminary Findings' presentation with each study from Year 2 onward. The presentation of initial findings was delivered to a targeted audience of USAID staff members expected to use the evaluation results. Participants were given a first look at study findings and conclusions and provided feedback to help inform the evaluation team's continued analysis of findings and the elaboration of conclusions and recommendations.

The PGRD team worked with evaluation teams to refine the set of findings immediately following the conclusion of fieldwork and data analysis. USAID representatives valued these presentations for providing early evidence to support programming, and Evaluations found that the preliminary findings presentations often garnered the largest USAID audiences and the most enthusiastic responses. For example, during the Conflict Management and Mitigation (CMM) evaluation, USAID found the preliminary findings and recommendations so compelling that it scheduled an additional presentation for the USAID/Washington Conflict Management and Mitigation Office. The presentation of an ethnographic study conducted in communities participating in Alternative Development interventions was particularly popular, generating questions and comments from USAID experts from across the Mission; based on this presentation, USAID had Evaluations present the study results to the Military Group at the US Embassy. A presentation of a capacity building assessment for DEVIDA was highly valued by DO1 leaders and provided early input for the DO results framework.

Recommendation Implementation Plans: As per ADS 201.3.5.18, "…Mission and Washington [Operating Units] must develop a post-evaluation action plan upon completion of an evaluation." Evaluations sought to support USAID in addressing this requirement by developing recommendation implementation plans2 for the application of the most salient evaluation recommendations, delivered either as an annex or as a separate document. For select evaluations, the statement of work (SOW) mandated that these plans serve as a tracker linking recommendations to specific actions and, in particular, to the timeframe agreed for their implementation.
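As a rough sketch of the tracker structure just described, the schema below links each recommendation to agreed actions and an implementation timeframe; the field names and the example entry are hypothetical, not drawn from an actual plan.

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class TrackedRecommendation:
    """One row of a recommendation implementation plan (hypothetical schema)."""
    recommendation: str   # the evaluation recommendation, as worded in the report
    actions: List[str]    # specific actions agreed with USAID and stakeholders
    responsible: str      # unit or person accountable for follow-through
    deadline: date        # timeframe agreed for implementation

# Illustrative entry only; not taken from a real plan.
row = TrackedRecommendation(
    recommendation="Align activity indicators with the DO results framework",
    actions=["Revise the activity M&E plan", "Update the affected PIRS"],
    responsible="Activity manager",
    deadline=date(2017, 6, 30),
)
print(f"{row.recommendation} -> due {row.deadline.isoformat()}")
```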

To enhance the utility of these plans to USAID, Evaluations facilitated consensus-building around the recommendation implementation plans with USAID and key stakeholders. Following the completion of a study, Evaluations facilitated discussions with activity managers, the Program Office and the evaluation team to unpack the evaluation's or study's conclusions and recommendations into specific actions, culminating in the development of an evidence-driven action plan. This action plan was then submitted for consideration by Mission decision-makers as part of USAID internal processes. This consensus-building process resulted in the approval of ten recommendation implementation plans, a list of which is presented in Annex A.

2 The requirement of an action plan was included for the first time in the ADS 201 09/07/2016 revision; Evaluations included these recommendation implementation plans in the SOWs prepared in 2015.

Since the implementation of recommendations fell outside the activity's purview, the activity could not measure how these tools facilitated the use of evaluations. In the absence of information regarding the application of these plans, Evaluations agreed with USAID that, for Year 5, it would prepare recommendation plans only for mid-term evaluations or when USAID explicitly requested a plan before the evaluation was complete.

The ten recommendation implementation plans applied to nine evaluations and one assessment. Since Evaluations initiated the recommendation plan process in Year 2, no plans were established for the five evaluations completed before that time. Of the remaining evaluations, only two were omitted from the recommendation planning process. One, a rapid evaluation of the Peru Cacao Alliance activity, provided a specific analysis that supported the Mission's decision regarding a follow-on activity and did not require a recommendation plan. The other, a final evaluation of the ProDecentralization activity, concluded after USAID determined that plans should be prepared only for mid-term evaluations.

Evaluation Fact Sheets: The PGRD team produced a series of 15 Study Fact Sheets presenting a targeted selection of the most applicable and relevant findings and recommendations from evaluation and assessment final reports. The contents and messaging for each fact sheet were defined in collaboration with USAID representatives and designed to facilitate the dissemination of each study's most important findings, conclusions and recommendations. The products are visually engaging, easing consumption of study results and stimulating interest in the full reports among individuals who would not otherwise read them. In addition to the Fact Sheets, Evaluations produced ad hoc informational products responding to the needs of specific audiences. For example, the team prepared an info poster synthesizing the results of an Adaptation to Climate Change assessment, as well as informational maps depicting what had been accomplished in each of the five activities considered in the assessment. Annex A presents a list of these communication products.

Executive Summaries: Each study includes an executive summary that provides a concise yet thorough accounting of the study's purpose, methodology, findings, conclusions and recommendations. Evaluations converted these summaries into professionally formatted stand-alone documents. Like the Fact Sheets, these documents facilitate the dissemination of each study's most important results.

Learning briefs and products: Evaluations prepared three reader-friendly Learning Briefs, which analyze findings, conclusions and recommendations across multiple studies and promote the application of activity design and evaluation best practices. These briefs focus on the cumulative learning and best practices drawn from the implementation of the evaluations and assessments. Although there was no explicit USAID learning agenda, themes were selected with USAID based on their potential to contribute to Mission and implementing partner practices. They included: an analysis of how activities benefited from designs that reflected a thorough knowledge of targeted beneficiary populations; an analysis of how USAID/Peru and Evaluations implemented best practices for promoting the use of evaluations; and an analysis of the design elements that contribute to the success of knowledge and skill transfer interventions. Evaluations disseminated the briefs through presentations, discussions and learning sessions, initially with USAID and then with IPs. These briefs are available for wider dissemination in digital format through the activity website.3

3 http://www.pgrd.org/projects/peru-evaluations/publications/


In addition to formal learning briefs, the Evaluations team prepared four analytical products to support USAID in learning from evaluation results and the evaluation process. Evaluations presented these analyses to USAID in various presentations and workshops, drawing lessons from common themes across a variety of evaluations and assessments. The topics included: lessons for activity design; contents and format for a learning final report; an initial approach to implementing Collaborating, Learning, and Adapting (CLA) in USAID/Peru; and an initial approach to embedding sustainability in activity design.

Dissemination activities (presentations, participation in conferences, publications): All studies concluded with a presentation of the final report. The activity also presented all learning products and analyses at USAID, with the participation of implementing partners when possible. Presentations were directed to audiences of USAID's choosing, including the IP, counterpart representatives or others. When USAID requested, the activity provided customized presentations for select audiences. These included presentations of the CMM evaluation to the USAID/Washington CMM office, the ethnographic study to the US Military Group, the Environmental Justice Sector Assessment to the Minister of Justice, and an Adaptation to Climate Change assessment to the new implementing partner that would continue a similar initiative, as well as to representatives of other international donors. To avoid interfering in strategic or programmatic relationships, the USAID Evaluations COR provided direction regarding the partners and stakeholders that the activity should engage in briefings.

Use of studies

Over the last three years of implementation, Evaluations worked to establish a quantitative measure of study use at USAID/Peru through a voluntary online survey of staff members who were involved in developing, or were targeted as clients of, activity studies. The survey provides a minimum count of the studies that were used and the purposes they served: because it relied on voluntary responses, the 65% of non-respondents may have used studies without reporting those uses. To complement the survey, Evaluations also drew on evidence from external audiences regarding the use of studies to influence public policy and programming.

The activity gathered evidence demonstrating that 85% of studies (23 of 27) were used to influence programming or policies. The USAID survey indicated that 78% of studies (21 of 27) were used to influence programming, and we confirmed that external audiences used two additional studies to inform public programming. The Ucayali Economic Analysis was conducted to support the Ministry of Production, which confirmed having used the study to support its Productive Diversity project. The Environmental Justice Sector Assessment included a recommendation for the Ministry of Justice to separate two environmental prosecutors' offices; during a presentation of the study, the Minister of Justice confirmed agreement with this recommendation and directed the staff present to carry out the separation, which the Ministry proceeded to implement.
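The percentages above follow directly from the counts in the text; here is a minimal sketch of the arithmetic, with the counts taken from this report and the variable names purely illustrative.

```python
# Usage rates recomputed from the underlying counts reported above.
TOTAL_STUDIES = 27
survey_reported_uses = 21    # studies USAID survey respondents reported using
external_confirmed_uses = 2  # additional uses confirmed by external audiences

used_total = survey_reported_uses + external_confirmed_uses
print(f"Survey-reported use: {survey_reported_uses / TOTAL_STUDIES:.0%}")  # 78%
print(f"Documented use overall: {used_total / TOTAL_STUDIES:.0%}")         # 85%
```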

Survey respondents indicated that the 'used' studies served three different purposes on average; that is, a study informed the modification of the activity being evaluated, informed the design of other activities, influenced USAID programming, strategies or policy, or was used in policy dialogue with other entities. Survey and external data indicated that all five DO1 studies and all four crosscutting studies supported programming. Respondents put the usage of DO3 studies at 80% and DO2 studies at 77%, or four of five and ten of thirteen studies, respectively. It is informative to consider possible explanations for why respondents and external audiences may not have used the four studies for which Evaluations has no evidence of use. These included:

1. The evaluation of the Cordillera Azul activity responded to an audit finding, not to a specific need identified by the Mission.


2. The regional Maximus activity did not have a similar follow-on activity targeting persons with disabilities that the evaluation could have informed.

3. The evaluation of the Pro-Integridad activity went through a drawn-out review process that may have delayed its utility to users beyond the activity manager. We have no other specific information on why it may not have been put to use.

4. The Education Sector Assessment provided an analysis of a sector in which no immediate follow-on project was planned, which may have limited its use among USAID staff. It may, however, have served the needs of audiences in the education sector outside of USAID.

Component 2: Implementing Partners Are Able to Manage Per Results

Evaluations supported USAID and its IPs in complying with USAID Evaluation Policy requirements while strengthening organizational performance monitoring. Our capacity building agenda was anchored in support with concrete, measurable objectives. The most prevalent of these performance targets were the following processes and management documents:

• USAID M&E plans for each of the three Development Objectives, and for the Mission itself, included in the Country Development Cooperation Strategy (CDCS).

• Implementing partners' M&E Plans aligned with the USAID results framework.

• Tools for tracking implementing partners' and DOs' M&E Plans.

• Strengthening the capacity of local IPs to develop and implement effective M&E Plans.

Table 3: Key Indicators for Component 2

| N° | Indicator | Year 1 (T/R) | Year 2 (T/R) | Year 3 (T/R) | Year 4 (T/R) | Year 5 (T/R) | LOP (T/R) |
|---|---|---|---|---|---|---|---|
| 2.1 | # of IPs that improve their institutional performance in M&E based on best practices | – | – | – | 9 / 2 | 6 / 4 | 6 / 6 |
| | Regional Governments | – | – | – | – | – / 3 | 3 / 3 |
| | Non-government organizations | – | – | – | – / 2 | – / 1 | 3 / 3 |
| 2.2 | # of IPs with PMP and reporting results | – | – | – | 9 / 4 | 6 / 2 | 6 / 6 |
| 2.1.1 | # of new or modified M&E Plans approved (cumulative) | 3 / 0 | 10 / 7 | 15 / 14 | 15 / 25 | 15 / 28 | 15 / 28 |
| 2.1.2 | # of USAID monitoring processes and/or activities strengthened (cumulative) | – / 3 | – / 4 | – / 5 | – / 6 | – / 6 | – / 6 |
| 2.2.1 | # of participants of the M&E Postgraduate Program who obtain academic certification | – | – | – | 52 / 44 | – | 52 / 44 |
| 2.2.4 | # of implementing partner organizations participating in M&E workshops (cumulative) | 5 / 14 | 10 / 22 | 15 / 31 | 15 / 33 | – / 33 | 15 / 33 |
| | (annual results) | 14 | 14 | 20 | 24 | 5 | – |
| 2.2.5 | # of individuals trained in M&E workshops | 25 / 134 | 25 / 184 | 25 / 185 | – / 285 | – / 108 | 75 / 896 |
| 2.2.7 | # of capacity building hours in M&E delivered | 400 / 258 | 1,040 / 1,425 | 1,760 / 2,528 | 1,920 / 3,897 | – / 1,134 | 5,120 / 9,242 |
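Note that the LOP Result column in Table 3 aggregates annual results in two ways: indicators counted per year (such as individuals trained) sum across years, while cumulative indicators keep their final-year value. The sketch below recomputes three LOP figures from the table; the dictionary structure is illustrative only, not the activity's reporting system.

```python
# LOP results recomputed from the annual Result columns of Table 3.
# "sum": annual counts are added; "last": the indicator is cumulative,
# so the LOP value is simply the final year's figure.
annual_results = {
    "2.2.5 individuals trained in M&E workshops": ([134, 184, 185, 285, 108], "sum"),
    "2.2.7 capacity building hours delivered": ([258, 1425, 2528, 3897, 1134], "sum"),
    "2.1.1 M&E plans approved (cumulative)": ([0, 7, 14, 25, 28], "last"),
}

for name, (values, rule) in annual_results.items():
    lop = sum(values) if rule == "sum" else values[-1]
    print(f"{name}: LOP result = {lop}")
# -> 896, 9242, and 28, matching the LOP Result column.
```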

The activity sought to achieve these processes and products while creating a lasting improvement in local M&E capabilities among the partner organizations by adopting a capacity building approach to the


delivery of technical assistance. This overall approach promoted an M&E culture during program implementation, and the use and management of M&E evidence for the early identification and resolution of issues before they became problems. Whenever possible, this capacity building approach served as the framework for the delivery of a) TA to USAID to improve its performance management tasks; and b) technical assistance, training and coaching to USAID’s implementing partners, especially to local and host country government counterparts, to strengthen their M&E performance capacities.

Component 2 included three main activity areas:

• Facilitating the preparation of USAID/Peru M&E Plans;

• Supporting the USAID/Peru Monitoring and Evaluation Team; and,

• Providing M&E training and technical assistance for partners.

Support for USAID Development Objective Monitoring & Evaluation Plans

Evaluations supported two of the three Development Objective teams to complete M&E plans, as well as supporting an overall mission M&E Plan (see Table 4). USAID formally approved the M&E plan for DO2. While Evaluations provided TA to DO3 in completing its M&E plan, USAID leadership did not formally approve the plan before the elaboration of a new CDCS, which began in 2016. The activity initially provided TA and support for the elaboration of the DO1 M&E plan; however, the DO team suspended its work on the plan for internal reasons and did not take it up again before work on the new CDCS began.

Table 4: Progress supporting USAID M&E Plans

| DO | M&E Plan milestones completed (by activity year) |
|----|--------------------------------------------------|
| DO1 | M&E Plan elaboration begun in Y2; suspended before completion |
| DO2 | LF and RF elaborated and approved; PIRS, Results Tracking Table, AIDTracker entry, evaluation questions, and schedule for M&E activities completed; M&E Plan elaborated and formally approved (all in Y2) |
| DO3 | LF and RF elaborated and approved in Y1; PIRS, Results Tracking Table, evaluation questions, schedule for M&E activities, and M&E Plan elaborated in Y2; plan not formally approved |
| Mission | M&E Plan elaboration begun in Y2, with remaining milestones completed in Y3; plan not formally approved |

Legend: LF and RF = Logical Framework and Results Framework; PIRS = Performance Indicator Reference Sheets.

Support for CDCS and PMP: The Evaluations team provided support to USAID in developing the M&E Plan for the Mission. To this end, the team incorporated a series of updates into the M&E Plan, including: evaluations planned for the Mission, the DO2 results framework and PIRS, and an updated Results Tracking Table with information from the DO1 and DO2 indicators as included in their Performance Plan Reports (PPR). In addition, the team defined and consolidated data from years 2010 to 2014 for Mission performance and context indicators. Finally, the team transitioned the draft Mission M&E Plan into a new USAID format.

DO1 M&E Plan: PGRD supported the DO M&E point of contact (MPOC) to review indicators and develop the DO1 M&E plan, primarily in Year 2 of the activity. However, the temporary assignment of the MPOC to a foreign post and the simultaneous change in leadership at the Peruvian counterpart agency in early 2015 (the end of Year 2) led the Mission to suspend this effort. The DO did not resume the elaboration of its M&E plan before the mission initiated the development of a new CDCS in 2016.


DO2 M&E Plan: In the activity’s second year, Evaluations assisted the DO2 team to elaborate, finish and secure approval for the DO M&E Plan in December 2014. To achieve this, the Evaluations team supported the following processes:

• Elaboration and approval of the DO2 Logical Framework (LF) and Results Framework (RF).

• Facilitation of a meeting with ten representatives from USAID and twelve implementing partners. The purpose of the meeting was to present the Country Development Cooperation Strategy (CDCS) and the DO2 results framework and its indicators to the IPs, to identify the contribution of each IP to the DO2 performance indicators, and to define these indicators. The products of this meeting served as input for the development of the M&E Plan.

• Preparation of the DO2 M&E Plan in coordination with the activity COR.

• Submission of the M&E Plan in November 2014 to the Mission Deputy Director, who approved it on December 12th.

DO3 M&E Plan: In Year 2, Evaluations supported USAID to draft the DO3 M&E Plan based on the logical framework and results framework approved in the first year. The team drafted several sections of the document, including: Performance Indicators Reference Sheets (PIRS), Results Tracking Table (RTT), evaluation questions and a schedule for M&E activities. Regular collaboration with the DO3 MPOC enhanced the efficiency of Evaluations support. USAID leadership did not formally approve the complete DO3 plan before work began on the new CDCS.

Support to the USAID/Peru Evaluation Team

In addition to M&E plans, the activity provided TA to support USAID performance monitoring functions over the life of the activity. All Evaluations TA to USAID responded to direct requests for technical assistance, with Evaluations playing a support role in facilitating USAID’s performance of its M&E obligations. Through the life of the activity, Evaluations strengthened six USAID performance monitoring processes (see Table 3, indicator 2.1.2) through the provision of technical assistance. These processes included the design and strengthening of tools to track performance monitoring and its compliance with the USAID ADS across the mission and across all implementing partners’ activities, the calculation of mission indicators under the PMP, supporting data quality reviews, supporting the mission’s transition to the AIDTracker Plus performance monitoring information system, and providing technical facilitation of meetings with implementing partners. The following paragraphs summarize the results of these activities.

Supporting USAID migration to AIDTracker Plus: The activity supported USAID to be an early adopter of the AIDTracker Plus monitoring information system. To achieve this, PGRD synthesized 503 indicators from the USAID Mission and 23 implementing partners. By harmonizing similar indicators, Evaluations eliminated redundancies and unnecessary indicators, distilling 503 mission indicators down to 301, or from roughly 20 per activity to 14 per activity. This reduction streamlined the performance monitoring system and enhanced its analytical power by increasing the amount of comparable data while reducing the overall administrative burden.
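A minimal sketch of the kind of harmonization logic this consolidation implies, assuming indicators are matched on normalized names; the indicator names, the normalization rule and the function names below are illustrative assumptions, not the activity’s actual method or data:

```python
from collections import defaultdict

def normalize(name: str) -> str:
    """Reduce an indicator name to a comparable key (illustrative rule)."""
    return " ".join(name.lower().replace("#", "number").split())

def harmonize(indicators_by_activity: dict) -> dict:
    """Group activity-level indicators sharing a normalized name, so a
    single mission-level indicator can replace several near-duplicates."""
    merged = defaultdict(list)
    for activity, indicators in indicators_by_activity.items():
        for indicator in indicators:
            merged[normalize(indicator)].append((activity, indicator))
    return merged

# Hypothetical example: two activities report the same concept under
# slightly different names; harmonization collapses them to one indicator.
raw = {
    "Activity A": ["# of individuals trained", "Hectares under management"],
    "Activity B": ["Number of individuals trained"],
}
merged = harmonize(raw)
print(sum(len(v) for v in raw.values()), "->", len(merged))  # prints: 3 -> 2
```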

Support for the application of DO M&E Plans among implementing partners: Once M&E Plans had been completed with DO2 and DO3, PGRD planned and facilitated workshops with implementing partners to initiate the use of these DO M&E plans. These meetings, hosted by USAID DO representatives, presented the M&E plans and guided the standardization of DO indicators among all DO implementing partners. PGRD continued planning and implementing follow-up meetings on an annual basis. These annual meetings served as a venue for USAID and its partners to review monitoring issues, to identify


and resolve indicator definition and calculation issues, and to confirm the universal adoption of standardized indicator measurement across the DO partners.

Ad hoc support for USAID performance monitoring: Evaluations played an ongoing role in supporting USAID performance-monitoring functions based on USAID requests. This support often included technical assistance in reviewing and improving M&E plans submitted by partners. Specifically, the team reviewed M&E plans to ensure they met ADS standards, were consistent with the USAID PMP and DO M&E plans and other related IP M&E plans, and used indicators that would contribute directly to the measurement of USAID development objectives. The team also supported the calculation of USAID PMP and DO M&E Plan indicators. Table 5 presents details regarding some of the support Evaluations provided in response to USAID requests.

Table 5: USAID Evaluations support for USAID performance monitoring functions

| Target | PGRD Support | Year |
|--------|--------------|------|
| Elaborated M&E Plans for four activities | At USAID’s request, PGRD provided TA and a series of workshops to support three international and one local implementing partner to elaborate M&E Plans, all of which were approved by their activity managers.4 | Year 2 |
| Facilitating implementing partner adoption of the DO2 M&E Plan | An activity M&E workshop for DO2 education partners presented the DO2 Results Framework and its indicators, and updated indicator definitions. | Year 3 |
| Finalize the M&E Plan for the “Capacity Development and Engagement program” | TA to make the M&E Plans from two UPCH activities consistent: “Liderando los Aprendizajes de Niñas y Niños” (LANN) and “Enseñar es Liderar” (EEL). | Year 4 |
| Improve data collection and indicator reporting among nine DO3 partners | Planning and facilitating a meeting to present new guidelines for the USAID Performance Progress Report, including new standard indicators and the reporting matrix. | Year 4 |
| Promoting improved M&E among activity managers | Organization of a workshop supporting activity managers’ application of M&E results and tools, including sixteen participants from DO2 and DO3. | Year 4 |
| Adapting to a new version of ADS 201 (July 2016) | Preparation of an assessment of important ADS 201 revisions, followed by awareness raising among activity managers regarding these changes. | Year 4 |
| CDCS development objective indicator measurement | Review and update of four performance indicators and twelve context indicators for the years 2010 to 2016 to enable USAID to measure the degree to which targeted development objectives had been met. | Year 4 |
| USAID/Peru M&E team participated in technical meetings in Washington, DC with other M&E specialists | Provision of M&E capacity building materials, e.g. validated best practices for measuring institutional performance and revised, validated lessons regarding the implementation of M&E processes. | Year 5 |
| Annual calculation of context and performance indicators | Annual support for the updating of DO results tracking tables according to M&E Plans using the latest PPR data. | Years 1-5 |
| USAID PMP performance indicator calculations | Using national data, calculation of four mission indicators to measure results across 632 schools assisted by three USAID activities. | Years 2-5 |
| Calculate education activity performance indicators | Calculation of the achievement indicators for 193 schools participating in USAID activities (PEEL and LANN) and 135 control schools, including 34 schools from the Programa Logro de Aprendizajes. | Year 5 |

Annual USAID M&E team workshops: The Evaluations team worked with the USAID M&E team to develop annual workshops oriented towards strengthening USAID M&E performance and practices. Held in April/May, these events evolved from USAID M&E planning into capacity building activities.

4 These projects included: USAID/Health Policy (Abt Associates), USAID/Healthy Communities and Municipalities II (Management Sciences for Health), International Institute for Democracy and Electoral Assistance (International IDEA), and Grupo Propuesta Ciudadana (PROPARTICIPACION).


• The first annual workshop with the USAID M&E team and representatives of each development objective took place in April 2013, shortly after the activity began. During this workshop, the USAID M&E Team worked with Evaluations to define the research and capacity building agenda that would drive the activity’s first annual work plan.

• The second M&E team workshop included the participation of technical office representatives in reviewing progress in the development of DO M&E plans, defining research agendas and providing monitoring TA to DO implementing partners. The workshop concluded with inputs for programming activities for the following year.

• The Year 4 USAID M&E team workshop, held in April 2016, included presentations from Evaluations regarding the critical issues affecting the implementation and evaluation of activities that had been identified in activity design and in M&E plans over the first four years of the activity. Participants also conducted an analysis of selected indicators using the SMART criteria, and discussed the selection of indicators to measure gender and capacity building for inclusion in the M&E Plan of the new CDCS.

• For Year 5, PGRD held a two-day M&E team workshop in April 2017. The workshop focused on the use of evidence in program design, the format and contents of final reports, dimensions and indicators to measure capacity building, and the effective communication of evidence.

M&E Training and Technical Assistance for Partners

Strengthening the capacity of select USAID implementing partners and stakeholders to design and carry out effective performance monitoring was the signature line of work under Component 2. USAID Evaluations supported the building of partner M&E capacities and systems in parallel with program implementation, an approach that naturally integrated training, technical assistance and the guided application of new M&E skills and tools. Beyond performance measurement, M&E capacity building promoted the use and management of M&E evidence for the early identification and resolution of implementation challenges.

The activity organized its capacity building using the Human and Institutional Capacity Development (HICD) model, the USAID model for sustainable performance improvement. The model consists of two components: the first aimed at improving the performance of the institution based on best practices for monitoring and evaluation, and the second at improving the competencies of the professionals who perform monitoring and evaluation functions.

USAID tasked Evaluations with strengthening the M&E capacities of at least 15 implementing partners. By providing partners with three major forms of assistance, the activity was able to provide support to 33 organizations identified by USAID. The volume of support to these organizations was impressive: Evaluations provided 9,242 hours of expert technical assistance and trained 896 individuals (see Table 3).5 These organizations included a mix of USAID implementing partners and public sector stakeholders with key roles in contributing to USAID DO objectives. The three types of support provided by Evaluations included:

1. Intensive instruction and practical application through a Postgraduate M&E Diplomado program established by the Evaluations activity;

2. Training workshops on M&E topics defined according to the needs identified in organizational M&E assessments; and,

3. The implementation of ad-hoc technical assistance plans customized to each organization’s needs.

5 Hours of technical assistance count the time experts spent delivering assistance. Individuals trained counts attendance at separate trainings, i.e., an individual may be counted multiple times.


The participating organizations were able to achieve results that are emblematic of the activity’s thorough and holistic capacity building approach. USAID and Evaluations selected key performance benchmarks to ensure that capacity building was demonstrably contributing to improved M&E performance among the supported organizations. Chief among these was the design, approval and implementation of M&E plans. Evaluations supported a total of 22 organizations to prepare 27 M&E plans approved by a corresponding authority, including:

• 9 implementing partner plans approved by USAID;

• 13 plans approved as part of the Diplomado, i.e. by UPCH for graduation; and,

• 5 plans approved by national and regional public institutions (MINAM, GRSM, GRU, and the DREs of both Ucayali and San Martin).

Each of these partners, as of the end of the activity, is implementing its M&E plan, including the collection and processing of data and the production of monitoring reports. USAID has independently verified that the 16 participating partners with ongoing USAID activities are submitting performance monitoring reports consistent with their approved M&E plans. The successful establishment and implementation of M&E Plans by so many participating organizations is the product of applying the HICD model to understand and address both institutional and individual professional capacity building needs. The model, described above, is presented in Graphic 3.

Graphic 3: Approach for capacity building in M&E

For the first component, the team used the Performance Improvement Methodology (MMD), which standardizes and streamlines organizational processes through the adoption of best practices and performance milestones. The process for improving performance consists of four stages: (i) measurement of current performance and identification of performance gaps, (ii) analysis of gaps and identification of root causes, (iii) development of a performance improvement plan, and (iv) implementation of the improvement plan and feedback actions involving recognition and reinforcement of progress.

The second component strengthened and improved individual professionals’ M&E skills, and comprised three practicum-focused activities: (i) implementation of a postgraduate Diploma in M&E, (ii) regional training workshops in M&E, and (iii) virtual M&E courses.


The activities of the two components are coordinated to ensure that the individual capacity building activities directly contribute to the improvement of institutional performance by closing M&E performance gaps and improving the use of evidence to inform program management. Evaluations created capacity measurement tools to support the measurement of capacities for both institutions and individuals. These tools are accompanied by automated analytics that synthesize the results from multiple applications of the capacity measurement tool. These tools are available for download at the USAID Development Experience Clearinghouse and at the Evaluations publications download page: http://www.pgrd.org/projects/peru-evaluations/publications/.

Post-Graduate Monitoring and Evaluation Diplomado

USAID Evaluations’ establishment of an accredited post-graduate program to prepare development sector professionals to carry out monitoring and evaluation activities was the signature capacity building intervention of the activity. This Post-Graduate Diploma in Monitoring and Evaluation of Development Plans, Programs and Projects not only contributed to strengthening implementing partner capabilities and to strengthening evaluation capacities under Component 3; it also established an ongoing academic offering at a major national university, the Universidad Peruana Cayetano Heredia (UPCH).

The post-graduate program is built on four modules designed to develop and strengthen the essential competencies required to establish and implement M&E functions at organizations implementing social and economic development activities. These modules included:

1. Key Skills for Monitoring and Evaluation;
2. Monitoring of Plans, Programs, and Projects;
3. Evaluation of Plans, Programs, and Projects; and,
4. Evidence Management (i.e. dissemination and application of results to support decision making).

Evaluations developed the content and related activities for these modules expressly to cultivate participant competencies reflecting M&E best practices for the four dimensions of M&E. Over Year 2, the team guided each of the modules through the process of being reviewed, modified and validated by a group of experts representing universities, public sector, NGOs and USAID. The activity also incorporated elements to address the specific M&E requirements of GOP public institutions and programs based on comments and suggestions received from members of the public sector and international donors.

A competitive solicitation culminated in the selection of the Universidad Peruana Cayetano Heredia (UPCH) to conduct the Diplomado as a formal academic offering. After UPCH adopted the curricula and materials, on June 23, 2015, its University Council formally approved the Postgraduate Program through a resolution that describes the Diploma curriculum in terms of the Peruvian University Law and institutionalizes the Diploma under the University’s Postgraduate School. In its final form, the Diplomado revolved around the development of five essential M&E products; it included classroom-based instruction, used an online platform and video lectures, and coached teams through the practical development of the products. These products included:

• M&E Plan;
• Monitoring Report template;
• Evaluation design and Statement of Work;
• Monitoring results communications plan; and,
• Organizational M&E code of ethics.

USAID identified twenty organizations to send staff to participate in two activity-funded instances of the Diplomado. This group included three Ministries, four regional government institutions, eight Peruvian NGOs, four international NGOs, and the UPCH. These organizations sent 53 professionals to


participate in the Diplomado, 49 of whom completed the entire program. All of these professionals were responsible for M&E functions at their respective employers, were personally interested in the program, and were committed to completing the specific M&E products that the program was designed to ‘produce’ through the practical application of learned competencies.

In completing the Diplomado, these 49 graduates collaborated with the other representatives from their organizations to produce the M&E products for each of the 20 partner organizations. As a result, three quarters of the organizations adopted all five products and all of the organizations had adopted at least three of the five products. Table 6 provides summary details regarding the participating institutions and the products their participants elaborated for their benefit.

Table 6: 20 Implementing Partners’ Development of M&E Tools

| M&E Products | Partners adopting |
|--------------|-------------------|
| M&E Code of Ethics | 19 |
| Plan de M&E | 19 |
| Monitoring Reports | 19 |
| SOW and design of an evaluation | 17 |
| Results Dissemination Plan | 17 |

| Partner Type | # |
|--------------|---|
| National Government (Ministry) | 3 |
| Regional Government | 3 |
| Local NGO | 5 |
| International NGO | 3 |
| University | 1 |

| Products developed per organization | # |
|-------------------------------------|---|
| 5 products | 15 |
| 4 products | 1 |
| 3 products | 4 |

| Partner correlation with USAID DOs | # |
|------------------------------------|---|
| DO1 | 3 |
| DO2 | 6 |
| DO3 | 5 |
| Crosscutting | 1 |

Determined to understand how the Diplomado did or did not improve M&E performance at the individual and organizational level, Evaluations conducted a study following up with students at various stages. The study provides a rare opportunity to see what results this type of intensive capacity building produces in the medium term. We hope its results are used to support this multi-faceted approach and to strengthen the Diplomado and the delivery of M&E support, specifically by closing gaps measured through individual and organizational competency assessments. The study draws on different data sources taken at different points in time:

1. Measured responses of participants during the program and immediately following graduation;
2. Measured learning using baseline and post-graduation testing;
3. Assessed application of learning by documenting graduates’ professional activities, as observed by the graduates and their supervisors one year later, wherever they were working; and,
4. Documented institutional changes and improvements, as observed by graduates and supervisors one year later.


Graphic 4: M&E competency development by Diplomado participants6

Graduates who took both pre- and post-Diplomado assessments (44 of 49) demonstrated increases in all four competency areas (see Graphic 4). The increases were greatest for the skills related to monitoring, evaluation and evidence management, where participants raised their assessment scores by 1.0 to 1.25 points on average. Notably, the average competency for all four areas exceeded 2.5, approaching the advanced level of three and well above the minimum level of competency (2 = “adequate”) expected for this group of professionals.7
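The arithmetic behind these figures is simple pre/post averaging on the report’s four-point scale; the sketch below illustrates it with invented placeholder scores (the real study used 44 assessments, and its data are not reproduced here):

```python
# Pre/post competency averaging on the 1-4 scale described in the text
# (1 = Beginner, 2 = Adequate, 3 = Advanced, 4 = Complete competency).
# The two records below are hypothetical placeholders, not study data.
AREAS = ["Key Competencies", "Monitoring", "Evaluation", "Evidence Management"]

# Each record is one graduate's self-assessment: {area: (pre, post)}
assessments = [
    {"Key Competencies": (2.5, 2.8), "Monitoring": (1.8, 3.0),
     "Evaluation": (1.5, 2.7), "Evidence Management": (1.6, 2.6)},
    {"Key Competencies": (2.7, 2.9), "Monitoring": (2.0, 3.1),
     "Evaluation": (1.7, 2.8), "Evidence Management": (1.5, 2.7)},
]

for area in AREAS:
    pre = sum(a[area][0] for a in assessments) / len(assessments)
    post = sum(a[area][1] for a in assessments) / len(assessments)
    print(f"{area}: {pre:.2f} -> {post:.2f} (average gain {post - pre:+.2f})")
```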

All of the graduates indicated that they were regularly applying the skills acquired in the Diplomado at work, and supervisors confirmed that graduates were contributing in the M&E field at their institutions. All respondent graduates reported applying their monitoring competencies; two thirds applied their evaluation skills (especially by developing evaluation frameworks and questions to direct monitoring efforts); and one third used their knowledge of communicating M&E results. An interesting dichotomy emerged: graduates identified the ‘Key Competencies’ as the skills they were applying at work, while supervisors emphasized the technical skills the graduates were applying.

Most supervisors indicated that graduates had strengthened the monitoring functions of the host organizations. In some NGOs, these participants had succeeded in elevating the level at which M&E functions are applied at their organizations, that is, they had managed to foment the application of new M&E functions at the organizational level rather than just for the activity that had originally been targeted by USAID and Evaluations. Supervisors at these organizations had fully appreciated the value of

6 “Contribuciones del Programa de posgrado en monitoreo y evaluación de planes, programas y proyectos en el desarrollo de las capacidades institucionales y de las personas de instituciones involucradas, 2017”, Graphic 5, Luis Soberón A., PGRD, March 2018.

7 The targeted minimum level of achievement was level 2. The scale was defined as: 1 = Beginner, 2 = Adequate, but would benefit from reinforcement, 3 = Advanced, 4 = Complete competency/no training needed.



having and applying an updated monitoring plan. In some cases, organizations were using M&E data to inform program budgeting.

One quarter of the respondents had changed employers, but all had remained in relevant public and private entities in the development sector. Study results suggest that these graduates continued to emphasize the utility of monitoring functions and were contributing to an expansion of these functions at their new employers. These findings suggest that the investment in skill building will contribute to development objectives despite personnel rotation.

The case of three graduates who work as an M&E team at the NGO AIDER highlights the significant advances an M&E team can make when sufficient capacity building is complemented by supportive leadership. Using their course as a platform, the participants worked at the national and regional levels to integrate M&E work across three levels of the NGO, substantially strengthening the organization’s monitoring functions. Faced with a lack of funding for organizational evaluations, the NGO’s leadership supports the group in holding ‘learning’ meetings to draw conclusions from monitoring and answer key questions, approximating the benefits that regular evaluations would provide.

Perpetuating the sustained use of the Diplomado and associated M&E tools

One of the Evaluations objectives was to ensure that an M&E course initiated under the activity was offered and conducted into the future. To achieve this, the Evaluations activity transferred the program to local institutions in several ways. First, after two cohorts had completed the Diplomado, USAID authorized the transfer of the program and all related materials to the partner university, UPCH. As an incentive and in recognition of their contributions to establishing the program, USAID provided the UPCH with one year of exclusive rights to the program content and materials. After successfully recruiting and conducting a third offering, UPCH is now recruiting for a fourth cycle of the post-graduate program.

After the one year of exclusive rights to the Diplomado program ended, Evaluations reached out to institutions potentially interested in offering the Diplomado or using its contents for similar courses. Based on this interest, Evaluations conducted a workshop to present and transfer the Diplomado contents and materials to seven organizations: the Escuela Nacional de Administración Publica (ENAP), the Centro Nacional de Planeamiento Estratégico (CEPLAN), the Universidad Nacional Mayor de San Marcos, the Universidad Nacional la Agraria, the Universidad del Pacifico, the Pontificia Universidad Catolica de Peru, and GRADE. GRADE is already providing post-graduate programs in M&E, and they now credit the Diplomado materials in their promotional materials.

Interest from ENAP over the last year led USAID to establish an agreement with ENAP under which Evaluations developed an interactive online course based on the Evidence Management Module. This Massive Open Online Course (MOOC), entitled ‘Communicating M&E Results’, was adapted by Evaluations in collaboration with ENAP, which took responsibility for the pedagogical rigor of the course. Now that the course is complete, ENAP will provide the technological platform to offer it to the public free of charge. Course graduates will receive a certification from ENAP upon passing an optional final exam.

Evaluations created a competency profile for M&E professionals to support its work assessing professionals and developing the Diplomado curricula. By establishing straightforward standards for the competencies related to the four general M&E dimensions, this profile provides the basis for assessing M&E capacities, defining an M&E professional development plan, designing capacity building interventions, and similar tasks. Evaluations transferred this profile and the associated assessments to the two M&E networks supported under Component 3, as well as integrating it into the Evaluations M&E System Toolbox. In


addition to being published online by these networks, the profile and the associated assessment tools are available on the USAID DEC and PGRD website.8

In parallel to the competency profile, Evaluations articulated best M&E practices for organizations to support the assessment of M&E capacity and the design of capacity building plans. This Best Practices tool allows organizations to measure their M&E performance as well as to define and target improvements. The tool defines 14 best practices divided into planning, monitoring, evaluation, use of evidence, M&E capacity development, and principles and ethical values. This tool also forms part of the Evaluations M&E System Toolbox and has been transferred to and published online by the two networks working under Component 3. The Organizational Best Practices and the associated assessment tool are available on the USAID DEC and PGRD website.8
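To illustrate how an adoption score of this kind can be computed, a small sketch follows; the grouping into 14 practices mirrors the text, but the per-practice adoption levels and the scoring rule (a simple mean, expressed as a percentage) are assumptions for illustration, not the tool’s published rubric:

```python
# Hypothetical scoring of an organizational best-practices assessment.
# Each practice is rated 0.0 (absent), 0.5 (partial) or 1.0 (adopted);
# the overall score is the mean across all 14 practices, as a percentage.
practices = {
    "Planning": [1.0, 0.5],
    "Monitoring": [1.0, 1.0, 0.5],
    "Evaluation": [0.5, 0.0, 0.5],
    "Use of evidence": [1.0, 0.5],
    "M&E capacity development": [0.5, 0.0],
    "Principles and ethical values": [1.0, 1.0],
}

levels = [v for group in practices.values() for v in group]
assert len(levels) == 14  # the tool defines 14 best practices
score = 100 * sum(levels) / len(levels)
print(f"Best-practices adoption score: {score:.1f}%")  # > 60% = intermediate
```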

Development of M&E Systems by Three Regional Public Partners

In Year 4, USAID requested that Evaluations provide technical assistance for four USAID public implementing partners in addition to the capacity building, training and TA provided through the M&E Postgraduate Diploma and the Regional Workshops. These partners included the San Martin Regional Office of Education (DRESM), the Ucayali Regional Office of Education (DREU), the Regional Government of Ucayali (GOREU), and the Regional Government of San Martin (GORESAM). Technical assistance focused on closing performance gaps related to M&E best practices prioritized by each institution.9

These technical assistance activities evolved into a robust assistance package for the development of M&E systems for select activities supported by USAID. At the request of USAID, Evaluations provided a dedicated M&E expert in San Martin and in Ucayali to work directly with these regional teams. Evaluations and each organization worked to implement capacity development plans that culminated in the implementation of monitoring information systems providing online access to regional performance indicators.

The plans required each public partner to adopt formal resolutions validating its progress before advancing to the next stage. These key milestones included:

• Institutional Strengthening Plans (TA plans): the ISP included commitments by each partner to implement and participate fully in the establishment and strengthening of monitoring and evaluation capacities, including the inclusion of such commitments in the regional budget;

• Institutional M&E Plans; and,

• Monitoring Information Systems.

Three of the four partners completed the entire M&E system development plan, including the Regional Government of Ucayali, the Ucayali DRE, and the San Martin DRE. Each of the partners concluded the activity with functioning online monitoring information systems that provide designated authorities with performance management data as well as the public with access to information on how their government is performing in the delivery of services related to the targeted activities. In the case of the Regional Government of Ucayali, the partner scaled up the intervention to involve activities and public services that were not originally included in the M&E system development plan. The following provide links to the three online monitoring systems:

o http://monitoreo.regionucayali.gob.pe/
o http://www.indicadores.dreucayali.gob.pe/app/
o http://www.dresanmartin.gob.pe/sistema-monitoreo-evaluacion

8 http://www.pgrd.org/projects/peru-evaluations/publications/
9 Performance gaps were identified jointly with the implementing partner, as part of the assessment sessions organized for this purpose.


The Regional Government of San Martin initially participated with the objective of developing an M&E system for the prevention of gender-based violence. However, after a change in leadership, a lack of engagement led USAID to suspend support for the capacity building assistance. The regional government approved an institutional strengthening plan but never approved the M&E plan. It did, however, establish monitoring indicators and had collected data to measure them at the time support ended.

The entire M&E system development process completed by the three government entities is depicted in Graphic 5.

Graphic 5: Processes and Tools for the Development of an M&E System

The success of this comprehensive intervention was the product of committed institutions carrying out a series of replicable steps using tools standardized by USAID Evaluations. The elements of this process included:

• An institutional decision to undertake M&E system building focused on specific objectives or activities. As an institutional counterpart, USAID played a key role in achieving this commitment. Team members worked with the regional teams to build support for the intervention based on a recognition of its value to the mutually supported activities.

• Each participating institution designated members of its staff with the mandate to lead the establishment of the M&E system.

• Institutional capacity measurement using the Evaluations Organizational Best Practices assessment tool.

• Development of institutional strengthening plans and a TA plan reflecting host organizational priorities and a shared recognition of the performance gaps being addressed.

• Capacity assessments of the professionals responsible for implementing the M&E system as well as the participation of these professionals in skill-building activities, as needed.

• Each partner institution had Diplomado graduates working on the capacity building activities, which gave them shared expectations regarding the strengthening plan as well as a complete set of M&E skills.

• Three workshops supported the development of three critical products: the M&E Plan, the elaboration of report templates; and, a results communications plan.

• Evaluations TA coordinators co-located with the institutions to support the elaboration and implementation of their plans. The sustained support and close collaboration that the coordinator in Ucayali maintained with the partner teams positioned Evaluations to contribute to the Ucayali


government’s decision to expand the scope of the M&E systems to the institutional level rather than limiting them to the selected activities.

• Evaluations provided TA supporting the elaboration of the monitoring information systems. While the primary development of the information systems was financed by the host institutions, Evaluations worked with the implementing contractors to provide technical oversight and guidance to ensure the systems complied with the contractual standards and system specifications.

These institutions are already demonstrating the benefits of their recently completed monitoring systems.

In San Martin, the DRE took the initiative to implement a national standardized test, the Regional Learning Evaluation (ERA). This independent implementation suggests that the USAID Teaching is Leading Project (PEEL) motivated the region to administer the test even though widespread strikes had derailed its implementation at the national level. The DRE used the online information system developed with Evaluations as a platform to publish the results (see http://web.regionsanmartin.gob.pe:8080/DRE/resultados_era). This is an important success in several respects: the DRE demonstrated that it values and has taken full ownership of the monitoring system; it has already demonstrated the ability to expand the online platform; and it used the system as a vehicle for providing transparent access to education sector results, a practice promoted as an important M&E value during capacity building.

In Ucayali, the GOREU worked with USAID/PGRD and UNICEF to sponsor the forum, "Use of evidence in development plans and programs - Experiences of the Ucayali Region and in the Amazon context” on Oct 18, 2017. This forum formed part of the “2nd Latin American Evidence Week” in Pucallpa (organized with the support of Evaluations) and was attended by 325 participants from 81 institutions.10 The success of the forum as a venue to publicize the results of the monitoring system led GOREU to institutionalize the week of evidence in the Ucayali Region as an annual event to be held each October. This was approved with Regional Executive Resolution No. 0884-2017-GRU-GR, on November 17, 2017.

M&E System Toolkit – Evaluations used the systems building experience to produce a toolkit containing all of the tools and processes undertaken with the three regional partners to establish their M&E systems. The toolkit is organized into six products that correspond to the M&E System Process (see Graphic 5):

1. Guidelines for the implementation of a Monitoring and Evaluation System
2. M&E Strengthening Plan: Guide for Public Institutions
3. Preparation of a Monitoring and Evaluation Plan: Guide for Public Institutions
4. Monitoring and Evaluation Plan: Facilitator's Guide
5. Monitoring Report: Facilitator's Guide
6. Monitoring and Evaluation Communication Plan: Facilitator's Guide

Evaluations disseminated the toolkit in digital and physical formats to numerous partners and produced several kits for USAID. At the closing event for the two M&E networks working under Component 3, EvalPeru and PERUME, the networks presented the toolkit themselves, demonstrating a strong sense of ownership of these tools. Dissemination efforts have resulted in several institutions posting the toolkit for download on their websites, including the M&E networks, the UPCH,

10 A synthesis of the event can be found at https://www.facebook.com/region.deucayali/videos/1890560537928973/


and the Regional Government of Ucayali. USAID posted the toolkit to the agency’s official Facebook page, where the account manager registered 3,360 visitors accessing the tools on the first day.11

Component 3: Local Capacities for Evaluation Strengthened

The purpose of Component 3 was to develop and strengthen the technical capacities of two local evaluation institutions so that they are able to design and conduct effective performance and impact evaluations according to international standards. Ultimately, the activity achieved this objective with more than two organizations by working on two fronts.

First, the formal implementation of Component 3 consisted of the elaboration and implementation of a plan to strengthen the capacities of two local M&E networks: non-governmental organizations of professionals dedicated to applying M&E methodologies that meet the highest professional standards, as defined by competencies, skills and organizational structure, based on the Human and Institutional Capacity Development (HICD) approach adopted by USAID. One of these organizations, the Red Peruana de Monitoreo y Evaluación (PERUME), is dedicated to strengthening monitoring and evaluation as a contribution to citizen oversight and to supporting collaboration among actors from the public and private sectors, civil society and international organizations linked to the development and evaluation of public policies for decision-making. The other, the Asociación Red Peruana de Evaluación (EvalPerú), consists of private sector professionals dedicated to the monitoring, evaluation and systematization of policies, programs, and projects. It is affiliated with the Latin American Network for Monitoring, Evaluation and Systematization (ReLAC).

Second, Evaluations made deliberate use of the evaluation implementation process under Component 1 to strengthen the capacity of subcontracted evaluation teams to implement evaluations. This ‘learning-through-doing’ approach addressed the need for evaluation products that met the rigor demanded under the USAID Evaluation Policy as well as the push to ensure products were optimally useful to and used by their intended audiences. Evaluation and assessment team leaders from Evaluations partners were interviewed to assess the activity’s contribution to improving the quality of studies conducted under it. The following quotes highlight some of the most important of these contributions:

Dr. Willy Lescano (Universidad Peruana Cayetano Heredia): “I have to say, this first experience has helped us a lot, because it was the first of three consecutive program evaluations that we did in a year and a half. After evaluating AMI we evaluated the MIDIS performance incentive fund in Peru, and later we did an evaluation of public policies on adolescence and youth for UNICEF. So, in this sense, the learning that occurred from successfully competing for and implementing this evaluation with PGRD and USAID provided the experience needed to do something relatively similar in other areas and with other donors.”

Dr. Edmundo Beteta (Centro de Consultoría y Servicios Integrados de la Pontificia Universidad Católica del Perú): “PGRD… on one hand generates a rigorous standard for evaluation methodologies and on the other engages evaluators and those being evaluated in dialogue to generate mutual learning so that evaluation results are based on rigorous methodologies that lead to recommendations that are feasible to implement.”

Dr. José Luis Escaffi (AC Pública): “The rigor of well-planned research before beginning fieldwork, thoroughly agreeing on the hypotheses, thoroughly agreeing on the instruments, the actors that were going to be interviewed, those who had to work in workshops and plan fieldwork, including the appointments. That discipline I think is good discipline.”

11 EvalPeru posted the tools to this page: http://evalperu.org/destacados/el-legado-del-proyecto-evaluations-caja-de-

herramientas-para-el-diseno-e-implementacion.


Dr. Raúl Andrade (AC Pública): “They put us on alert to identify improvements that could be made to the reports. I believe that we delivered high quality final products, not just because of the work we did, but because of this interaction and this continuous dialogue with PGRD staff.”

Dr. Nathan Nadramija (Metis Gaia): “In addition to us at METIS GAIA, this has helped us to improve parts of other evaluations we have completed, especially preliminary findings. Preliminary findings had never been requested by any other clients, and it is a key milestone.”

Dr. Manuel Glave (Grupo de Análisis para el Desarrollo): “[Evaluations] played a strategic role in helping the evaluation process make a significant leap in quality beyond the dynamics of the day-to-day.”

Table 7: Key Indicators for Component 3

| N° | Indicator | Y1 Target | Y1 Result | Y2 Target | Y2 Result | Y3 Target | Y3 Result | Y4 Target | Y4 Result | Y5 Target | Y5 Result | Total Target | Total Result |
|----|-----------|----|----|----|----|----|----|----|----|----|----|----|----|
| 3.1 | # of local evaluation organizations that improve their performance | – | – | – | – | – | – | – | – | 2 | 2 | 2 | 2 |
| 3.1.2 | # of local evaluation organizations with evaluation tools meeting international quality standards | – | – | – | – | – | – | 2 | 2 | – | – | 2 | 2 |
| 3.1.1 | # of participants of the M&E Postgraduate Modules who obtain the academic certification | – | – | – | – | – | – | – | 13 | – | – | – | 13 |
| | Key Competencies Module | – | – | – | – | – | – | 21 | 20 | – | – | 21 | 20 |
| | Evaluation Module | – | – | – | – | – | – | 18 | 14 | – | – | 18 | 14 |
| | Use of Evidence Module | – | – | – | – | – | – | 21 | 16 | – | – | 21 | 16 |
| 3.1.4 | # of individuals trained in M&E workshops | – | 50 | 20 | 39 | 40 | 37 | 60 | 39 | – | 16 | 120 | 181 |
| | # of women trained in M&E workshops | – | 30 | – | 20 | – | 22 | – | 23 | – | 8 | – | 103 |
| 3.1.6 | # of capacity building hours in M&E delivered | – | 6 | 200 | 17 | 200 | 241 | 200 | 183 | – | 7 | 600 | 453 |

Evaluation Training and Technical Assistance for Evaluation Networks

In its first year, Evaluations received approval from USAID for a plan to strengthen the evaluation networks of EvalPerú and PERUME under the activities of Component 3. Under this plan, Evaluations implemented capacity building interventions using the HICD approach with the two networks, in parallel. Because the two institutions constitute networks rather than development or academic organizations, the strengthening of these networks was oriented strongly towards building the capacity of their members, as well as positioning each network to continue contributing to the best practices of M&E across the development sector of Peru. Therefore, the plan had the following objectives:

• Improve each organization’s performance based on the assessment of organizational good practices. This improvement would be measured by comparing the baseline measurements taken in 2013 and follow-up assessments from late 2017;

• Identify new performance gaps at the end of the activity;

• Identify and analyze the new causes generating the performance gaps; and,

• Update each organization’s Performance Improvement Plan.

Evaluations approached the strengthening of these institutions and their members using the same set of capacity building tools developed under Component 2, namely:

• The post-graduate Diplomado program;

• M&E workshops to build competencies; and,


• Technical Assistance to the organizations in developing and strengthening institutional tools, such as strategic plans, and the adoption of other good practices.

Evaluations provided the members of both networks with entry to three of the four post-graduate modules under the Diplomado: Key Competencies, Evaluation and Evidence Management. A total of 21 network affiliates attended the M&E postgraduate program. Of these, 13 attended the three modules subsidized by USAID and then opted to pay their own way through the fourth Monitoring module in order to obtain the M&E Diploma. Participants who did not attend all four modules obtained a certificate for each satisfactorily completed module (see Table 8). Two participants from the EvalPeru network applied for and obtained a certification of competencies under the M&E program’s validation option; they paid for the validation exam and, after passing, obtained the Diploma.

Table 8: Number of Diplomado Participants from Evaluation Networks

| Evaluation Network | Participants with Diploma | Participants with Certificate | Total number of participants |
|--------------------|---------------------------|-------------------------------|------------------------------|
| PERUME | 11 | 7 | 18 |
| EvalPerú | 4 | 1 | 5 |
| Total | 15 | 8 | 23 |

PGRD measured the M&E competencies of 15 of the participants before the program started and after it finished. The baseline data, treated as a sample, identified which areas needed strengthening among the networks’ members; the final data measured the progress made by individual participants.

Graphic 6 shows that network participants mainly needed to develop their technical competencies in evaluation and in evidence management, where their competencies were originally assessed as basic. Their competencies in monitoring required only marginal strengthening. After participating in the M&E program, participants improved all of their technical competencies, and those related to monitoring and evaluation reached an advanced level (a score of 3+).


Graphic 6: Competency development by Diplomado participants from PERUME and EvalPerú12

Measurement of key competencies (soft skills such as communication, leadership, and negotiation) rated high in the first measurement and increased only slightly in the final measurement. After discussing these results with the participants and the training staff, we concluded that participants were more critical of their abilities in the second self-assessment. This is a common pattern in self-assessments, as participants gain a better understanding of the complexity of acquiring soft competencies during the learning process.

Almost all of the graduates responding to a post-Diplomado survey recommended the program for the development of M&E skills, and they valued the competencies developed under the Managing Evidence module most. The respondents confirmed what their willingness to fund their own participation in the Monitoring module suggested: all believed they would be able to put their new skills to use in their professional work, and 87% felt they would achieve a better position in their workplace.

Among those interviewed in 2018 regarding their application of M&E skills, 75% attested to working on evaluations, while all were applying monitoring skills and 25% were working on the communication of evidence. Exactly half of the respondents cited roles they were playing in planning and managing evaluations at their workplaces. Examples of network members playing an active role in strengthening the evaluation of public projects abound.

Two of the EvalPeru members participated on evaluation teams under the USAID Evaluations activity: Susana Guevara and Alejandro Bardales. Another four taught Diplomado modules: Amalia Cuba, Emma Rotondo, Brenda Bucheli, Luis Soberon.

Members of the PERUME network are all employees of public ministries, which suggests that the documented skills increases, confirmed by the post-Diplomado study as being applied, will

12 “Contribuciones del Programa de posgrado en monitoreo y evaluación de planes, programas y proyectos en el desarrollo de las capacidades institucionales y de las personas de instituciones involucradas, 2017”, Graphic 6, Luis Soberón A., PGRD, March 2018.



contribute to improving programs and influencing policies. Over time, this should offer substantial added value to the public sector. Several graduates have already developed highly visible ministerial publications: at the Ministry of Production, graduates served as the technical specialists producing “Monitoreo y Evaluación del Desarrollo Productivo” and “Evidentia: Evaluaciones Ejecutivas”, while at SERFOR a graduate served as the technical coordinator for the “Formulation Process of the Principal Standards, Criteria and Indicators for the Monitoring and Evaluation of the National Policy for Forest and Jungle Fauna”.

A large proportion of the members of the Red PERUME work for the Ministry of Women and Vulnerable Populations (MIMP), which has served as the ministerial anchor for the network. These graduates have contributed to strengthening the ministry’s monitoring system through their work in a diverse set of ministerial positions.

Organizational Capacity Development of the M&E Networks

By the activity’s end, PERUME had raised its adoption of institutional best practices from 39% to 61%, exceeding the 60% threshold that marks an intermediate level of organizational best practices.

The PERUME network’s baseline assessment concluded that its performance gaps stemmed from the limits of volunteer labor, which made it difficult to quickly form groups responsible for elaborating network products. Despite this limitation, the network had managed to maintain its institutional health through an effective electronic network and an annual congress. At the baseline assessment, the network had not established any of the following: training to address members’ performance gaps in evaluation, a competency-based certification system for evaluators, strategies to inform national and subnational government policies, or mechanisms for communicating and disseminating evaluation results, methodologies and practices.

PERUME's progress in adopting best practices towards the end of the activity period was inhibited by the loss of its institutional sponsorship from the MIMP, which had previously authorized its personnel to use their time to manage the network. Regardless, the network succeeded in institutionalizing a number of best practices, including:

• With activity support, PERUME created its first strategic plan and adopted a code of ethics. The network also established a workplan for implementing its strategy in 2016.

• The network maintained its active online member communication platform.

• Through its collaboration with Evaluations, the network provided its members with adequate evaluation training opportunities.

• The network also succeeded in promoting the use of evidence to support public policies.

• The network dramatically improved its communication and dissemination of evaluation results, methodologies and practices through its support for the annual M&E conferences held in collaboration with Evaluations.

In 2013, the EvalPeru baseline showed strong performance in three of eleven Good Practices: the network was formally organized around instruments and guidelines; it communicated internally and managed itself democratically; and it established alliances with clearly defined objectives and responsibilities. Like the PERUME network, it had made no progress in providing training to address members' performance gaps in evaluation, developing a competency-based certification system for evaluators, or developing strategies to inform national and subnational government policies. It had made some progress (50%) in establishing good practices in communicating and disseminating evaluation results, methodologies and practices. The network also lacked effective governance, including


a strategic plan or even a forum to discuss such issues. This owed to a lack of time among its members, exacerbated by the network being 100% volunteer-dependent.

By the end of 2017, EvalPerú had incorporated new best practices that increased its overall best practices score from 43.8% to 72.9%, reaching an intermediate level (a sketch of how such an index can be computed follows the list below). The organization made the following improvements:

• Despite not having an updated strategic plan, the network carried out activities aligned with its current mission, such as the 2015 Latin American Evaluation Conference and the 2016-2017 P2P with IOCE-Evalpartners.

• EvalPerú established a code of ethics and benefited from white papers developed by participants in the Diplomado.

• The network has improved internal financing from its members and has temporary IOCE financing for specific activities. A new business plan will be formulated with the new members.

• With Evaluations support, the network has established an active online platform. EvalPerú does not yet have an electronic library (under construction) or an online response system, owing to the limited time of its volunteers.

• The network has established standards and now provides continuous capacity building to its members.

• The network improved its communications and dissemination of evaluation results, methodologies and practices, as well as establishing links with academic and other specialized programs, with Evaluations support.
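The network capacity scores cited here (for example, EvalPerú's move from 43.8% to 72.9%) are consistent with an index computed as the average completion level across the assessed good practices. The following is a minimal sketch of that arithmetic only; the practice names and ratings are hypothetical, not the actual assessment data.

```python
# A minimal sketch of a best-practices index: each practice is rated by its
# completion level (0.0 = absent, 0.5 = partial, 1.0 = fully adopted) and the
# index is the mean across practices, expressed as a percentage.
# Practice names and ratings below are hypothetical, for illustration only.

baseline_ratings = {
    "Formal organization (instruments and guidelines)": 1.0,
    "Internal communication and democratic management": 1.0,
    "Alliances with defined objectives and responsibilities": 1.0,
    "Training to address members' performance gaps": 0.0,
    "Competency-based certification system": 0.0,
    "Strategies to inform government policies": 0.0,
    "Communication and dissemination of results": 0.5,
    "Links with academic and specialized programs": 0.0,
}

def best_practices_index(ratings):
    """Average completion level across practices, as a percentage."""
    return 100.0 * sum(ratings.values()) / len(ratings)

print(f"Best-practices index: {best_practices_index(baseline_ratings):.1f}%")
```

Under this assumed scoring, the hypothetical ratings above yield an index of 43.8%; adopting further practices raises the average toward the intermediate and advanced thresholds.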

Over the course of the activity, Evaluations collaborated with the two networks to organize and participate in several events advocating for high-quality M&E standards and disseminating evaluation products. These events provided a platform to disseminate activity materials and methodologies such as the organizational and individual assessments, the M&E competency profile, and the M&E System Toolbox. The networks acted as advocates for these tools and their positive effects on their organizations and individuals. The events included three international events, two held in Lima and one in Mexico. Evaluations also worked with the PERUME network to organize four national events, the 'Encuentros Nacionales'.


Lessons Learned

The Evaluations team documented lessons learned over the last three years of the activity. The following paragraphs present a selection of these lessons that have been discussed internally as well as with the USAID M&E team.

Lessons Regarding the Design of Activities

Based on ADS guidelines13 and best practice, the Activity identified several activity design elements whose faults negatively impacted the evaluation process:

• Activity designs often have difficulty articulating the activity's theory of change, which is critical for identifying how activity components and assumptions are linked to the achievement of objectives, and therefore how an intervention's effectiveness may be evaluated.

• Statements defining measurable activity objectives are often unclear and reflect results beyond the scope of the activity.

• Activities generally include too many indicators, have inadequately defined targets, do not always measure the intended objectives, and sometimes do not reflect activity performance.

The measurement of impact, or the attribution of results directly to an activity, requires that activities include several essential design elements and ample advance planning (a worked sketch of the attribution logic follows the list below). These design elements include:

• Identification of the variable(s) that the activity intends to impact (e.g., income, poverty reduction, or change in behavior).

• The inclusion of sufficient budget and time, in advance of activity initiation, for the generation of baseline and post-intervention data for two randomly selected, significantly comparable populations of intended beneficiaries (e.g., households, schools, communities) that will serve as control and intervention groups.

• Assurance of adequate data availability for the design and implementation of activities aimed at measuring impact/attribution.

• Assurance of sufficient time to obtain measurable impact, depending on the variable(s) to be studied, the scale of the project, and the characteristics of the host population.
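To make the attribution logic behind these design elements concrete, the sketch below shows how baseline and post-intervention data from comparable intervention and control groups combine into a simple difference-in-differences estimate. It is a minimal illustration only; all group names and values are hypothetical, not data from any activity.

```python
# Illustrative difference-in-differences sketch: the attributable effect is
# the change in the intervention group's mean outcome minus the change in the
# control group's mean, which nets out the secular trend both groups share.

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Estimate the intervention effect from group means at two points in time."""
    mean = lambda xs: sum(xs) / len(xs)
    treat_change = mean(treat_post) - mean(treat_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treat_change - control_change

# Hypothetical household incomes at baseline and endline (illustration only).
treatment_baseline = [410, 395, 430, 405]
treatment_endline  = [480, 470, 500, 465]
control_baseline   = [400, 415, 390, 420]
control_endline    = [425, 440, 410, 445]

effect = diff_in_diff(treatment_baseline, treatment_endline,
                      control_baseline, control_endline)
print(f"Estimated attributable effect: {effect:.1f}")
```

Without the baseline data and the comparable control group listed above, only the raw before-and-after change in the intervention group would be observable, and the effect could not be separated from the general trend.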

Lessons Regarding the Implementation of Evaluations and Studies

Over the course of the activity, Evaluations strengthened and refined the evaluation process so that it served as a tool for continuously shaping studies. This led to a number of lessons learned:

• External peer reviewers were valuable; we would use them even more often to provide quality control and enhancement. Reviewers should complement the team's technical and methodological profile. Their function need not be formal, as sometimes the best support can be applied at different moments, such as providing oversight of field work. This role should be adapted to the needs of each study.

13 We note that activity designs were analyzed using current ADS standards; these activities were designed when those standards were not in use. This analysis considers how the quality of these design factors affected the evaluation processes.


• There is an essential difference in report-writing approaches between USAID's focus on concise, spare narrative and Peruvian researchers' and academics' tendency to include large amounts of detail directly in narrative reports.

• It is more productive to work with companies and their directly employed teams because they have an established work dynamic. Additionally, outside experts tend to be excluded from the work after their budget is spent, which can affect the mix of expertise available to the team later in the study process.

• It was worth being strict in the management of evaluation teams, despite the fact that teams universally expressed a desire for more flexibility. Even those who complained about the strict management approach recognized that it had added value by the time the product was complete. This approach helped guide the evaluators down a path that would produce a product meeting USAID expectations and standards. Detailed planning and fieldwork supervision were particularly important; we anticipated problems in order to prevent them, learning from previous studies what was likely to take place.

• Despite our success in enforcing strict study and methodological processes, evaluation teams often had blind spots when it came to technical approaches and decisions. For example, one team failed to interview any female key informants because they assumed this exclusion would not affect the results, without any evidence for that assumption.

Lessons Regarding the Use of Evaluations

In considering what lessons can be drawn from the Evaluations activity in terms of maximizing the use of studies, it is useful to take a supply-and-demand perspective: Evaluations represents the supply side, while USAID represents the demand for this research. While our survey of USAID study users established that at least 89% of studies were used, for an average of three programming purposes each, the activity had hoped to achieve a 100% rate of usage. In retrospect, it may have been almost inevitable that some studies would not be used; for example, studies that are conducted because they are mandated by an audit but do not respond to a programmatic need. Nevertheless, there is room for improvement.
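For illustration, usage figures of this kind can be tabulated directly from survey records. In the sketch below the study names, purposes, and counts are hypothetical, not the activity's actual survey data; it simply shows the arithmetic behind a usage rate and an average number of purposes per used study.

```python
# Hypothetical tabulation of study-use survey responses: each record lists the
# programming purposes (if any) for which a study was reportedly used.
responses = {
    "Study A": ["activity design", "ongoing management", "policy dialogue"],
    "Study B": ["follow-on design", "indicator revision"],
    "Study C": [],  # e.g., an audit-mandated study with no programmatic use
    "Study D": ["activity design", "learning agenda", "budget decisions",
                "policy dialogue"],
}

used = [purposes for purposes in responses.values() if purposes]
usage_rate = 100 * len(used) / len(responses)
avg_purposes = sum(len(p) for p in used) / len(used)
print(f"Usage rate: {usage_rate:.0f}%; "
      f"average purposes per used study: {avg_purposes:.1f}")
```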

An understanding of the factors driving the demand for study results, and of the factors that determine the degree to which studies are used, could help USAID both tailor the supply of studies and influence the factors that shape demand. With more information regarding this demand, USAID will be positioned to optimize its planning and use of empirical evidence in the future.

To better understand the dynamics behind study use within the mission, Evaluations complemented its quantitative survey with a qualitative study exploring this topic at the end of the activity. Based on USAID staff interviews from this study, the following factors contribute to the greater use of studies and evaluations:

1. Timing: Reports that are not timed correctly do not allow their results to be included in the design of follow-on activities. DOs and Mission leadership need plans that map design and decision needs to the studies and evaluations that will inform them. In Evaluations' interviews, USAID staff overwhelmingly affirmed that mid-term evaluations are more useful than close-of-activity evaluations because of their timing: they support improvements to ongoing activities and inform the design of follow-on activities before the current activity ends. However, there is a tendency to prioritize end-of-activity evaluations, which constituted 75% of the evaluations under this activity.


2. Report Length: The current evaluation reporting format requirements result in long reports. Had these formats not been mandated by USAID policy, adopting a shorter, executive-length format might have made reports more accessible and encouraged busy USAID staff members to read them in full. Achieving this, given local norms, would require coaching on the increased use of annexes to organize the data supporting the core findings of a shorter report. Evaluations' use of Fact Sheets and professionally produced Executive Summaries in English and Spanish supports the use of studies among busy professionals; these products would benefit from wider dissemination among mission staff.

3. Accessibility: Easy-to-use repositories containing the executive summaries, fact sheets and infographics of the studies and evaluations would facilitate their use. Mission staff cited the desire for an internal digital repository of study materials.

4. Participation: The decision of activity managers to get involved in the different evaluation stages increases the probability of use and learning. Studies in which USAID counterparts participated actively in design and implementation were better used and better received; in other words, they responded well to the research 'demand'.

Lessons Regarding Capacity Building

Capacity building activities benefited from the continuous application of the HICD model, which helped Evaluations continually target its activities to address existing needs. Evaluations drew a number of lessons regarding how to design and deliver HICD activities to make them effective and efficient.

• Designing the Diplomado based on the gaps measured among institutions was critical to its success. Basing the approach on the needs of the institutions helped the program to be highly relevant to both institutions and professionals, rather than having it be driven by a theoretical agenda.

• Evaluations developed the curricula for the Diplomado before selecting an academic institution to implement the program. Similarly, the Diplomado established the curriculum for the online course in results communications before engaging with its host, ENAP. Reflecting on these experiences, we realize that the process would have been more efficient if the coursework had been developed in collaboration with the implementing institutions from the beginning. While the results were excellent, the process of revising and adapting the materials duplicated some efforts and was time-consuming.

• Having a regional advisor was critical to the success of the support for regional governments. These advisors were most effective when co-located with the teams they supported and when oriented towards providing comprehensive support, not simply producing deliverables. This worked well in Ucayali, where the advisor supported the regional teams in adopting the monitoring system at the institutional level and worked in a participatory manner, which helped build capacity and magnified the impact of his support. In San Martin, the coordinator worked in a more isolated manner and did not focus on building capacity; that is, he delivered a product rather than strengthening the team and securing buy-in for the adoption of a system.

The success of the joint institutional strengthening plans benefited from several design characteristics, all of which contributed to fully engaging the partners:

• The clear identification of a host institutional team with a mandate and clear functions vis-à-vis achieving the objectives of the institutional strengthening plan. This step takes different forms in


different organizations, such as public institutions versus NGOs. But in every case, clear counterparts for the TA, with a mandate from institutional leadership, were key for the plans to succeed.

• The inclusion of commitments for all parties, based on milestones requiring institutional approval, provided a start/stop litmus test for the continuation of assistance to partners. Ultimately, this feature prevented the waste of resources on TA to a partner whose new leadership had abandoned the institutional commitment to capacity building.

• The inclusion of co-financing in TA plans ensured that our partners had skin in the game – they were fully vested and valued the results. This was most apparent in the development of information systems, where the regional governments were financing the system implementation while Evaluations provided technical oversight of the process.

• Including the development of tools that will facilitate the continuation of results provides valuable resources that an organization might not otherwise be able to obtain after assistance has ended. This was the case for the regional governments' information systems.

The M&E networks are vulnerable to destabilization. Both networks demonstrated significant organizational and technical development, and both have shown resolve to continue developing their organizations within the particular constraints of member-driven models.

• In the case of PERUME, the MIMP acted as a sponsoring public institution by providing resources for the management of the network. However, a change of government upset plans to transfer the network to MIDIS. Its resulting ambiguous status with MIMP is eroding its leadership, which has slowed its development and activities. It is also suffering from the loss of its international affiliate, REDLACME, a regional network that endorsed it and that has now become part of COPLAC (another IDB project). Despite these setbacks, PERUME network members indicate that they consider the network to remain relevant and intend to analyze its institutional situation when reviewing their strategic plan.

• EvalPerú generally improved its organizational capacity; however, progress was slow. Organizational leadership has identified the root cause of this slow development as its dependence on volunteer work, which limits the time dedicated by its members, especially the Board of Directors. In this context of limited resources, the network's management has resolved to optimize its selection of strategic activities and opportunities that will have the most impact on its management and its members' participation.

Recommendations

The recommendations in the following paragraphs are selected for their relevance to future USAID programming, efforts to conduct evaluations and assessments, and continued capacity building.

1. To strengthen the potential for each activity to be successfully evaluated, conduct early reviews of activities to ensure that each ongoing and new activity establishes a plan that reflects USAID priorities for the activity, evaluation questions, clear life-of-activity objectives, assumptions, and indicators that will allow for the measurement of success. This process can be facilitated by asking, "What do we want to share and substantiate in five years?" The answers should drive the evaluation and learning agenda for each activity, and activity designs should be reviewed to ensure results can be measured to provide evidence supporting these post-evaluation objectives.

2. Standardize indicators and concepts that need to be compared across activities in order to provide evidence for USAID learning objectives. USAID has models and standards that would


support this process (e.g., the HICD model); a sketch of what a standardized indicator definition might look like follows this list. Topics that cut across the USAID portfolio include institutional capacity, capacity building, gender, and sustainability.

3. Use regulations as a platform for systematizing staff participation in, and use of, studies. Establish criteria to guide staff in the active participation and use of studies. Ideally, leadership would lead a learning agenda that drives a research plan determining the timing and content of evaluations, assessments and data collection. This would help establish a systematic approach to optimizing the utility of research products to USAID.

4. In establishing an agenda and schedule for evaluations, consider emphasizing mid-term evaluations rather than the more common default of post-intervention evaluations. USAID professionals found mid-term evaluations useful for strengthening ongoing activities as well as providing input for the design of follow-on activities. Post-intervention evaluations are generally completed after new activities have been designed and cannot influence ongoing activities, which limits their impact.

5. The activity's success in supporting the development of M&E systems and building capacity with public and private partners was attributable to a formula for success and sustainability that should be replicated in similar interventions: secure commitment from the beneficiary; condition continued support on meeting performance benchmarks; include shared funding; include the development of tools that will contribute to continuing advances; and identify a qualified team with a formal mandate and an organizational role to achieve the targeted results.

6. Capacity building or training should include follow-up on the learning. This follow-up should be explicitly defined at the beginning of the activity, with a design based on the results being pursued, and should be measured at the level of the individual and their organizational role. Sharing this intention at the outset also serves as a positive motivation for participants to apply their skills. Generally, people have been very open to this follow-up, including those located remotely (all except one student). There is no reason not to include such follow-up; it is inexpensive (the study cost about $10k for about 40 student follow-ups, DO1 excluded, or roughly $250 per participant).

7. Consultants providing broad technical assistance with the objective of building sustainable capacities should co-locate with the institution they are supporting. To effectively influence team performance, involve targeted teams in building and sustaining systems, and take advantage of ad hoc capacity building opportunities, TA providers should work directly with the corresponding teams, demonstrate good interpersonal and coaching skills, and have demonstrable expertise in the relevant technical themes.
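As an illustration of recommendation 2, the following minimal sketch shows how a standardized indicator definition could be encoded as data so that it can be compared across activities. The field names are hypothetical, loosely modeled on a performance indicator reference sheet; this is not an existing USAID schema.

```python
# A hypothetical, minimal encoding of a standardized indicator definition.
# Field names are illustrative assumptions, not an official USAID schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class IndicatorDefinition:
    name: str                 # standard indicator name shared across activities
    unit: str                 # unit of measure, so values are comparable
    disaggregations: tuple    # required breakdowns, e.g. by sex or region
    collection_method: str    # how the data are gathered
    reporting_frequency: str  # how often results are reported

# Example: a cross-cutting institutional capacity indicator (hypothetical).
capacity_index = IndicatorDefinition(
    name="Organizational capacity index (HICD-based)",
    unit="percent of best practices adopted",
    disaggregations=("institution type", "region"),
    collection_method="organizational best-practices assessment",
    reporting_frequency="annual",
)

print(f"{capacity_index.name}: measured in {capacity_index.unit}")
```

Defining the cross-cutting indicators once in this form, and reusing the definitions across activities, is what makes portfolio-level comparisons of institutional capacity, gender, and sustainability results possible.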


Annex A: Index of Reports, Deliverables and Related Products

Table 9: List of Study Reports Uploaded to the Development Experience Clearinghouse (study name; type; year; language; DEC link)

1. Performance Evaluation of Promoting Long-Term Sustainability of Parque Nacional Cordillera Azul Project. Performance Evaluation, 2013, English. http://pdf.usaid.gov/pdf_docs/pa00jjsf.pdf

2. Fortaleciendo el Sistema Descentralizado de Salud a partir de la Reducción de la Desnutrición Crónica Infantil. Una Mirada a los Sistemas Regionales de Perú: San Martín y Ucayali. Sector Assessment, 2014, Spanish. http://pdf.usaid.gov/pdf_docs/pa00m38k.pdf

3. Enhancing Forestry Governance in the Peruvian Amazon: Mid-Term Evaluation of Peru Forest Sector Initiative. Performance Evaluation, 2014, English. http://pdf.usaid.gov/pdf_docs/pa00jx3d.pdf

4. Género para Asegurar el Desarrollo Sostenible: Análisis del Portafolio de Programas de USAID/Perú. Sector Assessment, 2014, Spanish. http://pdf.usaid.gov/pdf_docs/pa00m38b.pdf / English version: Gender Analysis for Strategic Plan Implementation: Analysis of USAID/Peru Program Portfolio. http://pdf.usaid.gov/pdf_docs/pa00m38h.pdf

5. Mid-Term Performance Evaluation of Amazon Malaria Initiative. Performance Evaluation, 2014, English. http://pdf.usaid.gov/pdf_docs/pa00k8cz.pdf

6. Final Performance Evaluation of the Peru Quality Basic Education Reform Support Program. Performance Evaluation, 2015, English. http://pdf.usaid.gov/pdf_docs/pa00khdf.pdf / Spanish version: Evaluación Final de Desempeño del Programa de Apoyo a la Reforma de la Educación en el Perú. http://pdf.usaid.gov/pdf_docs/pa00m9x6.pdf

7. Retrospective Impact Evaluation of Alternative Development Program in Huanuco, San Martin and Ucayali (2007-2012). Impact Evaluation, 2014, English. http://pdf.usaid.gov/pdf_docs/pa00kcbv.pdf

8. Evaluación Final del Proyecto Maximus – Deporte y Discapacidad. Performance Evaluation, 2015, Spanish. http://pdf.usaid.gov/pdf_docs/pa00kz2j.pdf

9. Midterm Evaluation of the Peru Decentralization and Local Governance Project - Executive Report. Performance Evaluation, 2015, English. http://pdf.usaid.gov/pdf_docs/pa00krvc.pdf

10. Mid-Term Evaluation of the Technical Assistance Program for the Ministry of Environment, Peru. Performance Evaluation, 2015, English. http://pdf.usaid.gov/pdf_docs/pa00ks6h.pdf / Spanish version: Evaluación de Medio Término del Programa de Asistencia Técnica al Ministerio del Ambiente de Perú. http://pdf.usaid.gov/pdf_docs/pa00m9xb.pdf

11. Diagnóstico de las Barreras al Crecimiento Económico en Ucayali - Perú. Sector Assessment, 2015, Spanish. http://pdf.usaid.gov/pdf_docs/pa00m38t.pdf

12. Estudio del Sistema de Justicia Penal en Materia Ambiental en el Perú - Informe Ejecutivo. Sector Assessment, 2015, Spanish. http://pdf.usaid.gov/pdf_docs/pa00krvq.pdf / English version: Study on the Criminal Justice System in Environmental Matters in Peru - Executive Report. http://pdf.usaid.gov/pdf_docs/pa00krvp.pdf

13. Evaluación de las Actividades de Manejo y Mitigación de Conflictos en Perú. Performance Evaluation, 2015, Spanish. http://pdf.usaid.gov/pdf_docs/pa00kxnz.pdf

14. Rapid Evaluation of the Peru Cocoa Alliance. Performance Evaluation, 2016, English. http://pdf.usaid.gov/pdf_docs/pa00m1n4.pdf

15. Evaluación de la Transferencia, Expansión y Sostenibilidad del Modelo Municipios y Comunidades Saludables en Perú. Performance Evaluation, 2016, Spanish. http://pdf.usaid.gov/pdf_docs/pa00m676.pdf

16. Evaluación Final de Desempeño del Proyecto Promoción de la Justicia y la Integridad en la Administración Pública. Performance Evaluation, 2016, Spanish. http://pdf.usaid.gov/pdf_docs/pa00mfqr.pdf

17. Estudio de la Situación Actual del Sector Educación en el Perú. Sector Assessment, 2017, Spanish. http://pdf.usaid.gov/pdf_docs/pa00mhcs.pdf

18. La economía del VRAEM. Diagnóstico y opciones de política. Sector Assessment, 2016, Spanish. http://www.cies.org.pe/sites/default/files/files/otrasinvestigaciones/archivos/01-vraem_final.pdf

19. Toma de Decisiones en Hogares y Cultivo de Coca en Shanantia. Ethnography, 2017, Spanish. http://pdf.usaid.gov/pdf_docs/PA00MPD3.pdf / Toma de Decisiones del Antiguo Productor de Coca en una Comunidad del Monzón. Ethnography, Spanish. http://pdf.usaid.gov/pdf_docs/PA00MPCZ.pdf / Toma de Decisiones del Antiguo Productor de Coca en una Comunidad del Monzón: Río Espino. Ethnography, Spanish. http://pdf.usaid.gov/pdf_docs/PA00N45T.pdf

20. DEVIDA Institutional Capacity. Sector Assessment, 2017, Spanish. https://pdf.usaid.gov/pdf_docs/PA00MR7C.pdf / English version: https://pdf.usaid.gov/pdf_docs/PA00T5SB.pdf

21. Evaluación Final de Desempeño del Proyecto Peru Bosques. Performance Evaluation, 2017, Spanish. http://pdf.usaid.gov/pdf_docs/PA00MHCR.pdf

22. Evaluación Final del Proyecto Apoyo a la Expansión de la Metodología de Escuelas Activas en Perú. Performance Evaluation, 2017, Spanish. http://pdf.usaid.gov/pdf_docs/PA00MNMR.pdf

23. Evaluación Final de Desempeño de la Iniciativa Amazónica contra la Malaria. Performance Evaluation, 2017, Spanish. http://pdf.usaid.gov/pdf_docs/PA00MNMM.pdf

24. Análisis de Género: Perú 2016. Sector Assessment, 2017, Spanish. http://pdf.usaid.gov/pdf_docs/PA00MNMG.pdf / English version: Gender Analysis: Peru 2016. https://pdf.usaid.gov/pdf_docs/PA00T2C2.pdf

25. Desarrollo de Capacidades en Proyectos Seleccionados de USAID|PERÚ. Cross-Cutting Assessment, 2018, Spanish. http://pdf.usaid.gov/pdf_docs/PA00N45V.pdf / English version: Capacity Building in Selected Projects in Peru. https://pdf.usaid.gov/pdf_docs/PA00T4ZM.pdf

26. Evaluación Final de Desempeño del Programa ProDescentralización. Performance Evaluation, 2018, Spanish. https://pdf.usaid.gov/pdf_docs/PA00SZWN.pdf / English version: Final Performance Evaluation of Peru Decentralization Program. Upload pending.

27. Sistematización de la Participación Comunitaria en Cinco Proyectos de Cambio Climático. Case Study, 2018, Spanish. https://pdf.usaid.gov/pdf_docs/PA00SW56.pdf / English version: Documentation of Community Participation in Five Climate Change Projects. https://pdf.usaid.gov/pdf_docs/PA00T4XZ.pdf


Table 10: Evaluation Recommendation Use Plans Agreed Upon with USAID (plan and year approved)

1. Retrospective Impact Evaluation of Alternative Development Program in Huanuco, San Martin and Ucayali (2007-2012): Year 2
2. Midterm Evaluation of the Peru Decentralization and Local Governance Project: Year 2
3. Mid-term Evaluation of the Technical Assistance Program for the Ministry of Environment, Peru: Year 3
4. Evaluation of Conflict Management and Mitigation Activities in Peru: Year 3
5. Evaluation of the Transfer, Expansion and Sustainability of the Healthy Municipalities and Communities Model in Peru: Year 4
6. Final Performance Evaluation of the Promotion of Justice and Integrity in Public Administration Project (Pro-Integridad): Year 4
7. Final Evaluation of the Peru Bosques Project: Year 4
8. Final Performance Evaluation of the Amazon Initiative Against Malaria: Year 4
9. DEVIDA Institutional Capacity Assessment: Year 4
10. Final Evaluation of the Support Project for the Expansion of the Active Schools Methodology in Peru (CEPCO): Year 5

TOTAL: 10 plans

Table 11: Dissemination Products Based on Studies and Evaluations

Fact Sheets (all produced in English and Spanish):
• Barriers to Economic Growth in the VRAEM (Apurimac, Ene, Mantaro Rivers' Valleys) (Diagnosis)
• Decision-Making Regarding the Cultivation of Coca (Ethnography)
• Healthy Communities and Municipalities: A Versatile Model for Articulating Stakeholders to Promote Development (Evaluation)
• Peru Bosques Project: Support for Forest Sector Reform (Evaluation)
• Peru Forest Sector Initiative: Contributions to Improving Forest Sector Governance (Evaluation)
• Support and Expansion of the Active Schools Methodology in Peru: Productive Enterprises in Secondary Schools (Evaluation)
• Amazon Malaria Initiative: Capacity Strengthening for Prevention and Control (Evaluation)
• Situational Assessment of the Education Sector in Peru: Opportunities to Strengthen Basic Education (Diagnosis)
• Technical Assistance Program: Supporting the Management Strengthening of the Ministry of the Environment of Peru (Evaluation)
• Evaluation of Conflict Management and Mitigation Activities in Peru: Improving Attitudes to Facilitate Dialogue (Evaluation)
• Project for the Promotion of Justice and Integrity in Public Administration (Pro-Integridad): Fighting Corruption (Evaluation)
• Maximus Project - Disabilities and Sports: Promoting Social Inclusion (Evaluation)
• Project Pro Decentralization (PRODES III): Strengthening Management in Subnational Governments (Evaluation)
• Capacity Development Strategies and Methodologies Applied in Seven USAID Projects (Case Study)
• Climate Change Adaptation Experiences from Five Projects in Rural Communities (Case Study)

Learning Briefs:
• Applying Recommendations of Evaluation Utilization in USAID/Peru (English only)
• Knowing Your Beneficiary Leads to Increased Effectiveness (English and Spanish)
• Knowledge Transfer to Achieve Outcomes (English only)

Executive Summaries (in English and Spanish unless otherwise noted):
• Mid-Term Performance Evaluation of the Amazon Malaria Initiative (AMI)
• Final Performance Evaluation of the Amazon Malaria Initiative
• Final Evaluation of the Project Support and Expansion of the Active Schools Methodology
• Evaluation of Conflict Management and Mitigation Activities in Peru
• Institutional Strengthening Needs Assessment for the Comisión Nacional para el Desarrollo y Vida sin Drogas de Perú
• Brief Assessment of the Current Situation Within the Education Sector in Peru
• Household Decision-Making and Coca Cultivation: Household Ethnographies in Shanantia, Huipoca in the Department of Ucayali, Peru
• Decision-Making of Former Coca Producer in One Community in Monzon: Agua Blanca
• Gender Analysis for Strategic Plan Implementation: Analysis of the USAID/Peru Program Portfolio
• Evaluation of the Transfer, Expansion and Sustainability of the Healthy Communities and Municipalities Model in Peru
• Strengthening the Decentralized Health System Based on the Reduction of Chronic Child Malnutrition: A Look at the Regional Health Systems of Peru - San Martin and Ucayali
• Final Evaluation of the Maximus Project - Sports and Disability
• Mid-Term Performance Evaluation of the Technical Assistance Program (TAP) for the Ministry of Environment (MINAM)
• Rapid Evaluation of the Peru Cocoa Alliance
• Final Performance Evaluation of the Peru Bosques Project
• Enhancing Forestry Governance in the Peruvian Amazon: Midterm Evaluation of Peru Forest Sector Initiative (English only)
• Final Performance Evaluation of the Promoting Justice and Integrity in Public Administration Activity (Pro-Integridad)
• Retrospective Impact Evaluation of Alternative Development Program in Huanuco, San Martin and Ucayali (2007-2012) (English only)
• Final Performance Evaluation of the Peru Quality Basic Education Reform Support Program (SUMA)
• Diagnóstico: Barreras al Crecimiento Económico en Ucayali – Perú (Spanish only)
• Capacity Development in Selected USAID/Peru Projects
• Final Performance Evaluation of the Prodecentralization Program III
• Case Study of the Experience of Five USAID Projects Implementing Climate Change Adaptation in Rural Communities
• Decision-Making of Former Coca Producer in One Community in Monzon: Rio Espino

Infographics (Spanish):
• Perú: Análisis de Género 2016
• La Experiencia de Cinco Proyectos de Adaptación al Cambio Climático en Comunidades Rurales
• ¿Cómo Nos Estamos Adaptando al Cambio Climático? Trabajo Conjunto con la Asociación Especializada para el Desarrollo Sostenible (AEDES)
• ¿Cómo Nos Estamos Adaptando al Cambio Climático? Trabajo Conjunto con el Centro Agronómico Tropical de Investigación y Enseñanza (CATIE)
• ¿Cómo Nos Estamos Adaptando al Cambio Climático? Trabajo Conjunto con Lutheran World Relief (LWR)
• ¿Cómo Nos Estamos Adaptando al Cambio Climático? Trabajo Conjunto con el Instituto de Montaña
• ¿Cómo Nos Estamos Adaptando al Cambio Climático? Trabajo Conjunto con The Nature Conservancy (TNC)

Dissemination Article (Spanish):
• El Sistema Regional de Salud de San Martín y su política de reducción de la desnutrición infantil: aplicación en el Perú del Enfoque de Evaluación de Sistemas de Salud / Health Systems Assessment. An Fac Med. 2015;76(3):269-76. http://dx.doi.org/10.15381/anales.v76i3.11238

Table 12: Toolbox for the Design and Implementation of a Monitoring System (figures indicate the number of documents or files per item)

Introductory Documents:
1. Sistema de Monitoreo y Evaluación: Orientaciones para su Implementación (1)
2. Formats and documents (10)

Performance Improvement Planning:
3. Plan de Mejora en Monitoreo y Evaluación: Guía para Organizaciones de Gobierno (1)
4. Formats and documents (4)

M&E Planning:
5. Elaboración del Plan de Monitoreo y Evaluación: Guía para Organizaciones de Gobierno
6. Formats and documents (10)

M&E Planning Workshop:
7. Plan de Monitoreo y Evaluación: Guía del Facilitador (1)
8. Check lists (2)
9. Workshop guidelines (8)
10. Evaluation formats (3)
11. Technical assistance guidelines (3)
12. Selected bibliography documents (8)
13. PowerPoint presentations (9)

Reports Elaboration Workshop:
14. Reporte de Monitoreo: Guía del Facilitador
15. Check lists (2)
16. Workshop guidelines (5)
17. Evaluation forms (3)
18. PowerPoint presentations (4)

M&E Communication Workshop:
19. Plan de Comunicación de Monitoreo y Evaluación: Guía del Facilitador (1)
20. Check lists (2)
21. Workshop guidelines (5)
22. Workshop formats (2)
23. Evaluation forms (3)
24. PowerPoint presentations (2)

Measuring Tools:
25. M&E Competencies self-diagnosis (1)
26. M&E Good Practices Assessment (1)

Table 13: Diplomado Materials (figures indicate the number of documents or files per item)

General Documents:
1. Plan Curricular (1)
2. Monitoring and Evaluation Plan (1)
3. Professors' guidelines (1)
4. Participants' guidelines (1)

Introductory Workshop:
5. Syllabus (1)
6. PowerPoint presentations (2)
7. Evaluation formats (3)

Final Integration Workshop:
8. Syllabus (1)
9. Professors' guidelines (1)

Module on Key Skills for Monitoring and Evaluation:
10. Syllabus (1)
11. Manual on Key Competencies (7 units) (1)
12. Selected bibliography documents (27)
13. PowerPoint presentations (7)
14. Evaluation formats (1)
15. Videos (3)

Module on Evidence Management:
16. Syllabus (1)
17. Manual on Evidence Management (6 units) (1)
18. Selected bibliography documents (11)
19. PowerPoint presentations (15)
20. Workshop guidelines (1)
21. Deliverables guidelines (2)
22. Evaluation formats (1)
23. Videos (4)

Module on Monitoring:
24. Syllabus (1)
25. Manual on Monitoring (1)
26. Selected bibliography documents (59)
27. PowerPoint presentations (2)
28. Workshop guidelines (3)
29. Deliverables guidelines (4)
30. Evaluation formats (2)
31. Videos (2)

Module on Evaluation:
32. Syllabus (1)
33. Manual on Evaluation (10 units) (2)
34. Selected bibliography documents (45)
35. PowerPoint presentations (15)
36. Workshop guidelines (8)
37. Deliverables guidelines (1)
38. Evaluation formats (4)
39. Videos (4)


Annex B: USAID Evaluations Processes

Table 14: The Study Implementation Process, "10 Moments"

1st moment – Defining Study Questions

Evaluation questions identified in consensus with client, responding to client needs. Client participation is critical to ensure that the client identifies and understands the genesis of the questions that it wants to answer since they will drive the study design and results.

2nd moment – SOW

Timely SOW approval is critical to start the implementation process. The SOW provides USAID with a suggested implementation design and timeline.

3rd and 4th moments – Selection of Proposals and Signing of Contract

As of Y3, calls for proposals are sent via direct invitation to a greater pool of established Peruvian firms and advertised to qualified individual consultants through online evaluation networks. Additionally, Project Evaluations screens and recruits highly qualified international experts where local talent is scarce or unavailable, including for USAID-specific mechanisms (e.g., the Global Development Alliance).

5th moment – Launching of Evaluation

Implemented as of Y3, orientation workshops ensure alignment of expectations among USAID, the evaluation team(s), and Project Evaluations. Here, the evaluation team hears firsthand the client's expectations and context regarding the project, alongside a more strategic and programmatic perspective. The presence of project and Program Office CORs provides an institutional memory irreplaceable by Project Evaluations.

6th moment – Inception Report

This report serves to ensure that the evaluation team understands the nuances of the evaluation questions and objectives, establishes a detailed plan, and is prepared to implement. The detailed context and methodology allow the evaluation team and the Project to align expectations on how and when fieldwork, analysis, and other stages will be executed. Identifying the relationship between evaluation questions, methodology, and data sources is aimed at ensuring the evaluation questions will be answered through evidence-based findings.

7th moment – Fieldwork Report and Findings Presentation

Upon fieldwork completion, findings are presented to USAID in order to receive comments and suggestions to address in the final report. This presentation takes place approximately two weeks after fieldwork is completed, allowing the team sufficient time to process the data.

8th moment – Draft Report Sent

Project Evaluations submits the draft report of the evaluation/assessment study, which incorporates the USAID feedback received during the presentation of findings upon fieldwork completion.

9th moment – Plan of Recommendations

USAID, the evaluation team, and Project Evaluations discuss and agree on a set of recommendations to be evaluated and, hopefully, implemented by USAID, implementing partners, and other stakeholders. Recommendations relate to the design and/or implementation phases, whether by USAID and/or by implementing agencies.


10th moment – Final Report Approved and Uploaded to the DEC

Refers to submission of the final report and its approval by USAID. The final report incorporates all comments from USAID and Project Evaluations and must satisfy the Project's quality standards regarding the methodological implementation of the evaluation, providing evidence-based findings. Lastly, the approved final report is uploaded to the DEC.

