Webinar on Meta-evaluation Approaches
to Improve Evaluation Practice
25 October 2019
Zimbabwe, Kenya
Mónica Lomeña-Gelis, Principal Evaluation Officer at
Independent Development Evaluation of the African Development Bank Group,
Abidjan (Ivory Coast)
María Bustelo Ruesta, Director of the Master in Evaluation of
Programmes and Public Policies and Professor of Political Science and Public Administration at
Universidad Complutense de Madrid, Spain
Meta-evaluation: the concept
Michael Scriven, “Evaluation Thesaurus”:
“The evaluation of evaluations - indirectly, the evaluation of evaluators - represents both an ethical and a scientific obligation when the wellbeing of others is at stake”.
Joint Committee on Standards for Educational Evaluation:
1994. Standard A12 Metaevaluation: “The evaluation itself should be formatively and summatively evaluated against these and other pertinent standards, so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses”.
2011. Standards E2 Internal Metaevaluation & E3 External Metaevaluation
Michael Q. Patton: “The evaluation of the evaluation based on a series of norms and professional principles”.
Cooksy & Caracelli: “Systematic reviews of evaluations to determine the quality of their processes and results”.
There has been more focus on evaluation synthesis methodologies around evaluation results
(Olsen & O’Reilly, 2011).
What are other meta-evaluative approaches?
Source: modified from (Olsen & O’Reilly, 2011).
- Evaluation synthesis (synthesis evaluation): summarizing evaluation results
- Narrative/Research review: descriptive account for summarizing findings
- Meta-analysis: statistical procedure for comparing findings of quantitative evaluations
- Systematic review: use of a rigorous peer-review protocol to summarize evidence around a research question
- Meta-evaluation: evaluation of evaluations (their designs, processes, results and utilization)
EVALUATION SYNTHESIS
Synthesizing evaluation RESULTS (from which meta-analysis is a type)
The focus is on interventions and policies
METAEVALUATION
Evaluation of evaluation PROCESSES (how evaluation is conceived, done and used)
The focus is on the evaluation of those interventions and policies
It is important to distinguish between two very different exercises:
Source: Bustelo, M. (2002). Meta-evaluation as a tool for the improvement and development of the evaluation function in public administrations. Paper presented at the European Evaluation Society Biennial Conference, Seville, Spain, October 2002. https://evaluationcanada.ca/distribution/20021010_bustelo_maria.pdf
MEv functions
1. Quality control of evaluations:
“Who evaluates the evaluator?” (Scriven).
It is related to controlling evaluator bias and to ensuring the credibility of evaluations.
2. Comparative analysis of the evaluation function in various countries (Rist, 1990; Ballart, 1993; Derlien, 1998)
Rather than focusing on the quality of the evaluations studied, it focuses on their contribution to the development of the evaluation function in a policy field, an organization, an institution or a political system.
MEv functions (II)
3. Choice of which evaluation results can be synthesized
The knowledge about the quality of evaluations that MEv generates can inform decisions about which studies to include in an evaluation synthesis.
4. Identification of evaluation training needs
The MEv of multiple studies helps to identify the strengths and weaknesses of evaluative practice in order to develop evaluation capacity programmes.
Types of MEv
Source: adapted from Bustelo, 2001; Bustelo, 2002; Stufflebeam, 1974 & 2001; Cooksy & Caracelli, 2005; Scriven, 2011; Yarbrough et al., 2011.
Types of MEv (II)
Source: adapted from Bustelo, 2001; Bustelo, 2002; Stufflebeam, 1974 & 2001; Cooksy & Caracelli, 2005; Scriven, 2011; Yarbrough et al., 2011.
First example: MEv of gender policies in Spain
Unit of analysis: eleven gender equality plans (evaluated or not), and a discourse analysis about evaluation in the national agencies executing the gender plans.
1. Evaluation planning and evaluative strategies
- Responsiveness to their context
- Clarity of the evaluation objectives
- Institutional structures for the evaluation
- Different types of evaluations used
- Resources used in evaluations
2. Key elements of the evaluations
- Stakeholders involved in the evaluation processes
- Moment and timing of the evaluation
- Evaluation criteria and indicators
- Procedures and tools for data collection and analysis
3. Utilization and impact of evaluations
- Adequacy and usefulness of the produced information
- Communication and dissemination of evaluation results
- Impact of the evaluation on policies and organizations
Meta-evaluation criteria (analysis dimensions)
The logic of these evaluation questions, and of the judgement of the evaluation processes, was built around six main criteria:
1. The centrality of the evaluation process in the institution conducting the evaluation;
2. Responsiveness of the evaluation to the plan or policy context, and clarity (explicitness) of the evaluation purposes;
3. Clarity and centrality of the evaluation criteria (of what is evaluated); the techniques for data collection and analysis should be chosen after the evaluation criteria are defined, and not vice versa;
4. Adequate management of evaluation resources, including (i) good use of the different types of evaluation, (ii) the existence of adequate coordination structures that allow reliable and collaborative information gathering, (iii) good management of time and timetables, and (iv) sufficient investment of resources in evaluation;
5. Sufficient elaboration of the information gathered during the evaluation processes (systematic judgement of the information in the light of the previously set evaluation criteria);
6. The existence of good communication and dissemination processes for the evaluation results and reports.
Gender-responsive evaluation
It is necessary to distinguish between:
• Evaluation of gender policies
As a policy tool, evaluation can be especially fruitful for capturing the important changes and shifts in gender policies, for improving them, and for answering to what extent these policies are successful. As an integral part of the intervention, evaluation can guide developments, further needs and new areas for development.
• Evaluation from a gender perspective
As part of the policy-making process, and following the aim of the gender mainstreaming strategy, evaluation should itself be conducted from a gender perspective, with a gender lens.
Gender Responsive (Meta)evaluation
How is gender included in the different evaluation phases?
Figure 20. Stages to develop our meta-evaluation analytical framework. Source: inspired by Forss et al. (2008); Yarbrough et al. (2011); Davidson (2012).
Twelve MEv criteria covering evaluation design, process, results and utilization, with associated dimensions and rubrics.
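As an illustration of how rubric-based criteria can be applied across a set of evaluations, here is a hypothetical sketch. The criterion names, the 1-4 scale, and the rating labels are assumptions for illustration only; they are not the actual twelve criteria or rubrics used in the Senegal study.

```python
# Hypothetical sketch of scoring evaluations against MEv criteria
# using a rubric. Criteria, scale, and labels are invented.

RUBRIC = {4: "Highly satisfactory", 3: "Satisfactory",
          2: "Unsatisfactory", 1: "Highly unsatisfactory"}

# Ratings (1-4) for each evaluation on a few illustrative criteria:
scores = {
    "Evaluation A": {"design": 3, "process": 4, "results": 3, "utilization": 2},
    "Evaluation B": {"design": 2, "process": 2, "results": 3, "utilization": 1},
}

def summarize(scores):
    """Average each evaluation's ratings and attach the rubric label."""
    summary = {}
    for name, ratings in scores.items():
        avg = sum(ratings.values()) / len(ratings)
        summary[name] = (round(avg, 2), RUBRIC[round(avg)])
    return summary

print(summarize(scores))
```

Aggregating ratings like this across many evaluations is what lets a MEv identify patterns of strength and weakness in evaluative practice (function 4 above), rather than judging any single study in isolation.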
Second example: MEv of 40 evaluations in Senegal
Actual practice of MEv in development aid evaluation
Conclusions about the usefulness of MEv
• MEv can be useful for the improvement and development of the evaluation function in many settings, especially those with a limited evaluation culture and a low level of evaluation institutionalization;
• The use of standards, guidelines and professional competencies of the evaluation discipline can guide critical reflection about a set of real-world evaluations, going beyond a narrow conception of evaluation quality;
• The review of evaluation reports needs to be complemented with interviews in order to grasp dimensions related to evaluation utilization and to better understand the constraints of real-world evaluation processes (evaluation design vs. actual delivery, responsiveness to the information needs of different audiences, etc.);
• Following the trend towards evaluation professionalization, research whose object of study is the evaluation function can help to improve its usefulness to public policy making and development effectiveness.
Some bibliographic resources
AEA. (2004). American Evaluation Association Guiding Principles for Evaluators. Fairhaven MA, USA:
American Evaluation Association. Retrieved from www.eval.org
AfrEA. (2007a). African Evaluation Guidelines - Standards and Norms. Niamey. www.afrea.org
Baba, T. (2007). Meta-evaluation report of research studies, evaluations and reviews conducted by the
UNICEF Pacific Office during programme cycle 2003-2007. Suva. http://www.unicef.org/pacificislands/resources_9975.html
Baslé, M. (2013). Méta-évaluation des politiques publiques et qualité des évaluations. In Séminaire du
Réseau des chercheures en évaluation des politiques publiques de la Société Française d’Evaluation.
BMZ. (2009). Evaluation in German Development. A system’s review. Bonn. www.bmz.de
Bustelo, M. (2002). Metaevaluation as a tool for the improvement of the evaluation function in public
administrations. In European Evaluation Society Conference (pp. 1–15).
https://evaluationcanada.ca/distribution/20021010_bustelo_maria.pdf
CES. (2010). Competencies for Canadian Evaluation Practice. Canadian Evaluation Society. www.evaluationcanada.ca
DAC-OECD. (2010, November 3). DAC Quality Standards for Development Evaluation. http://doi.org/10.1787/9789264083905-en
DANIDA. (2004). Meta-Evaluation. Private and business sector development interventions. www.evaluation.dk
Eriksson, J. (2011). A Meta-Evaluation of USAID Foreign Assistance Evaluations. Washington DC. www.usaid.org
Some bibliographic resources (II)
IDEAS. (2012). Competencies for Development Evaluation Evaluators, Managers, and Commissioners.
International Development Evaluation Association. www.ideas-int.org
Ingram, G., Fostved, N., & Lele, U. (2003a). The CGIAR at 31: An Independent Meta-Evaluation of the
Consultative Group on International Agricultural Research. The CGIAR in Africa: Past, Present,
and Future. www.cgiar.org
Ingram, G., Fostved, N., & Lele, U. (2003b). The CGIAR at 31: An Independent Meta-Evaluation of the
Consultative Group on International Agricultural Research. Vol 1: overview report. (Vol. 1).
Washington DC. www.cgiar.org
Lele, U., Barrett, C., Eicher, C. K., & Gardner, B. (2003). The CGIAR at 31: A Meta-Evaluation of the
Consultative Group on International Agricultural Research Volume 3: Annexes.
www.cgiar.org
Lomeña-Gelis, M. (2015). A Meta-evaluation of Sustainable Land Management Initiatives in Senegal. PhD thesis on Sustainability.
University Research Institute for Sustainability Science and Technology, Universitat Politecnica de Catalunya (Spain).
https://upcommons.upc.edu/handle/2117/95787
Olsen, K., & O’Reilly, S. (2011). Evaluation Methodologies. A brief review of meta-evaluation,
systematic review and synthesis evaluation methodologies and their applicability to complex
evaluations within the context of international development. (Vol. 44). www.iodparc.com
Some bibliographic resources (IV)
Shah, F., & Patch, J. (2011). Meta-Review of AusAID Education Sector Evaluations, 2006-2011.
http://auserf.com.au/wpcontent/files_mf/1367729332ERF10239_EducationMetaEvaluation.pdf
Sheeran, A. (2008). UNICEF Child Protection Meta-Evaluation. Seattle. www.unicef
Stufflebeam, D. L. (1999). Program Evaluations Metaevaluation Checklist (based on the Program
Evaluation Standards). www.wmich.edu
Stufflebeam, D. L. (2007). CIPP evaluation model checklist. Evaluation checklists project.
www.wmich.edu/evalctr/checklists
UNEG. (2005). Norms for Evaluation in the UN System. New York: United Nations Evaluation Group. www.uneval.org
UNEG. (2010a). Quality Checklist for Evaluation Reports. New York: United Nations Evaluation Group. www.uneval.org
UNEG. (2010b). Quality Checklist for Evaluation Terms of Reference and Inception Reports. New York:
United Nations Evaluation Group. www.uneval.org
UNIDO. (2010). Meta Evaluation UNIDO Integrated Programmes, 2007-2009. Vienna. www.unido.org
Universalia. (2003). Meta-Evaluation. An analysis of IUCN Evaluations. 2000-2002. www.iucn.org
Wingate, L. A. (2009). The Program Evaluation Standards applied for meta-evaluation purposes:
investigating interrater reliability and implications for use. Western Michigan University.
www.wmich.edu
Some bibliographic resources (V)
World Bank. (2011). Writing terms of reference for an evaluation: a how-to guide (IEG Blue Booklet
Series). Washington DC. www.worldbank.org
Wörlen, C. (2011). Meta-Evaluation of climate mitigation evaluations. Washington DC. www.thegef.org
Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The Programme Evaluation
Standards. A guide for evaluators and evaluation users. Joint Committee on Standards for
Educational Evaluation. Thousand Oaks: SAGE Publications. www.jcsee.org
https://www.slideshare.net/DawitWolde/meta-evaluation-and-evaluation-disseminationmanual-3
https://www.slideshare.net/davidpassmore/metaevaluation-theory
https://unesdoc.unesco.org/ark:/48223/pf0000247262
https://www.russellsage.org/publications/handbook-research-synthesis-and-meta-analysis-second-edition
https://read.oecd-ilibrary.org/development/strengthening-accountability-in-aid-for-trade/the-oecd-meta-evaluation-overview-of-evaluations_9789264123212-8-en#page1
To know more about synthesis and meta-analysis in development evaluation:
https://www.evalforward.org/events/webinar-use-synthesis-and-meta-analysis-development-evaluation
Thank you, merci, gracias!
Mónica Lomeña-Gelis, Principal Evaluation Officer, Independent Development
Evaluation (IDEV), African Development Bank Group
María Bustelo, Director of the Master in Evaluation of Programmes
and Public Policies, Universidad Complutense de Madrid