
Evaluation of Humanitarian Action (EHA)

Course Reference Bibliography

This bibliography has been developed to support the beginner/intermediate and advanced EHA courses run by Channel Research on behalf of ALNAP. The course is also supported by a reference manual and a course-specific set of case studies.

ALNAP: http://www.alnap.org
Channel Research: http://www.channelresearch.com

This bibliography has been developed by John Cosgrave.

Version 1.30


Training on the evaluation of Humanitarian Action: Bibliography. Channel Research / ALNAP


TABLE OF CONTENTS

Introduction
Bibliography
    Evaluation Text (10)
    Guidelines (18)
    Evaluation Policy (5)
    Interview (14)
    Focus Groups (5)
    Case Study (4)
    After Action Review (3)
    Content Analysis (5)
    Ethnography (3)
    Surveys and Statistical Methods (14)
    Qualitative Analysis (5)
    Participatory Appraisal (19)
    Other Methods (28)
    Standards and Ethics (9)
    Framework for evaluation (10)
    Evaluation: Evaluability (3)
    Joint Evaluation (3)
    Real Time Evaluation (3)
    Impact Evaluation (10)
    Evaluation Other (7)
    Communicating (16)
    Utilisation (6)
    Monitoring and Evaluation (5)
    Research Text (9)
    Other Checklist (10)
    Knowledge Management (22)
    Learning (5)
    Management (3)
    Epistemology (3)


INTRODUCTION

The evaluation references listed here are subject to the following criteria:

They should be about, or touch on, evaluation issues and methods. The list does not contain evaluation summaries, such as the ALNAP annual reviews, which are themselves very useful resources for evaluators.

They are in the author’s library, either in electronic or paper format. This limits the huge range of potential resources to those that have been purchased or consulted by the author.

Each of the references is presented with a brief abstract. The items are arranged alphabetically by author within each group. The references are grouped as shown in the following table. As with all classification exercises, the classification of some items is debatable. The group is somewhat eclectic, as it represents a working evaluator's collection of reference sources. Clearly the reference sources in the collection depend on the type of evaluations conducted.

An on-line version of the bibliography is available at http://www.interworks.eu/evalbook.html

Group Name / No. of references

Evaluation Text 10

Guidelines 18

Evaluation Policy 5

Interview 14

Focus Groups 5

Case Study 4

After Action Review 3

Content Analysis 5

Ethnography 3

Surveys and Statistical Methods 14

Qualitative Analysis 5

Participatory Appraisal 19

Other Methods 28

Standards and Ethics 9

Framework for evaluation 10

Evaluation: Evaluability 3

Joint Evaluation 3

Real Time Evaluation 3

Impact Evaluation 10

Evaluation Other 7

Communicating 16

Utilisation 6

Monitoring and Evaluation 5

Research Text 9

Other Checklist 10

Knowledge Management 22

Learning 5

Management 3

Epistemology 3


BIBLIOGRAPHY

EVALUATION TEXT (10)

Bamberger, M., & Rugh, J. (2008). RealWorld Evaluation: Working under Budget, Time, Data, and Political Constraints: Overview (American Evaluation Association, 5 November 2008, Professional Development Workshop Session 21). Last viewed on 13 July 2009. URL: http://www.realworldevaluation.org/uploads/RWE_Overview_Handbook_AEA_Nov_2008.doc

Notes: This is the summary chapter from the RealWorld Evaluation text by Bamberger et al. (2006). In this chapter the authors discuss how RealWorld Evaluation (RWE) approaches can be applied at each stage of the design and implementation of a typical evaluation. They identify RWE issues that can come up—that is, where there are constraints related to funding, time, availability of data, and clients' preconceptions—and suggest how the RWE approach can help to address those constraints. Readers new to the evaluation field might also find this chapter useful as a general introduction to the planning, design, implementation, dissemination, and use of any evaluation. This chapter is designed to be both an introduction to RWE and a useful condensation of many of the main points of the book. Figure 1 summarizes the seven steps of the RWE approach, and this overview also includes references to other chapters of the book where more detailed coverage of particular issues can be found. A few of the important tables from other chapters have been included at the end of this stand-alone version.

Bamberger, M., Rugh, J., & Mabry, L. (2006). RealWorld Evaluation: Working under budget, time, data, and political constraints. Thousand Oaks: Sage

Notes: This text provides specific guidance on how to conduct evaluations when working under resource and/or data constraints. The authors demonstrate ways for addressing each of these constraints through practical examples from both developed and developing countries to show how adapting to different types of constraints can lead to successful evaluations. A short booklet that summarises some of the key material in this text is available from the World Bank as "Conducting Quality Impact Evaluations under Budget, Time and Data Constraints."

Chelimsky, E., & Shadish, W. (Eds.). (1997). Evaluation for the 21st century: A handbook. Thousand Oaks: Sage

Notes: A selection of the great and the good in evaluation explores questions around the development of evaluation methodologies at the turn of the century. Questions addressed include: the most useful methodological tools; new methodologies; changing evaluands in the 21st century; and the nature of evaluation in the 21st century. Topics discussed include: what makes evaluation different from other disciplines; the links and differences between the evaluation and the auditing professions; which activities have priority in evaluation; new methodological approaches to doing evaluation; the issues of advocacy versus truth in evaluation; and evaluating programmes versus empowering people to evaluate their own programmes. The book is a bit of a mixed bag, but still contains some useful nuggets.

Molund, S., & Schill, G. (2004). Looking Back, Moving Forward: Sida Evaluation Manual. Stockholm: Sida. Last viewed on 16 June, 2008. URL: http://www.oecd.org/dataoecd/51/26/35141712.pdf

Notes: This is my favourite general evaluation manual. This is a manual for evaluation of development interventions. It is intended primarily for Sida staff, but may also be useful to Sida's co-operation partners and independent evaluators engaged to evaluate activities supported by Sida. It consists of two main parts. The first deals with the concept of evaluation, roles and relationships in evaluation, and the evaluation criteria and standards of performance employed in development co-operation. The second is a step-by-step guide for Sida programme officers and others involved in the management of evaluations initiated by Sida or its partners. Although it can be read in its entirety all at once, most readers will probably use it in a piecemeal fashion. A reader who wants a rapid overview of basic concepts in the evaluation of development interventions should consult part one. A reader who is engaged in the management of an evaluation may turn directly to part two and return to part one as the need arises. Some familiarity with basic development co-operation terminology is helpful, but not essential. What is required, on the other hand, is a readiness to engage with new concepts, some of which may seem complicated at first glance. A certain tolerance of ambiguity is also useful. Although evaluation is a professional field of its own, many of its key terms do not have standard definitions. The manual adheres to the terminological conventions recommended by the OECD/DAC Glossary of Key Terms in Evaluation and Results-Based Management, which is included as an annex. Yet, in some cases it also takes notice of alternative usage. For Sida evaluation managers who must agree on conceptual matters with co-operation partners and professional evaluators, some knowledge of the variability of international evaluation terminology can be very useful.

Two limitations should be noted. One is that this is a handbook for evaluation managers rather than evaluators. While it deals extensively with conceptual and organisational matters, it has little to say about the technicalities of the evaluation research process. Still, evaluators may find it useful as a reference point for discussions about evaluation with Sida and its partners. A second limitation is that it focuses on matters that are common to all or most kinds of development evaluations. It does not deal with the peculiarities of the different sub-fields of development evaluation. Readers seeking information about matters that are unique to the evaluation of humanitarian assistance, research co-operation, programme support, and so on, must turn to other manuals.

Morra Imas, L. G., & Rist, R. C. (2009). The Road to Results. Washington: World Bank

Notes: This is an evaluation text written specifically for the evaluation of development interventions. It is based around the modules from the International Program in Development Evaluation Training (IPDET), but goes beyond these. IPDET is the leading forum for training development evaluators. Built as it is around a long-running development evaluation training programme, this is a solid text for development evaluators. The book starts with a discussion of the context and history of development evaluation. The drivers for development evaluation are then discussed. The authors repeatedly emphasise the need for results-based measurement, and favour using an evaluation approach built around a theory of change (a theory linking programme inputs to the desired change). The text contains lots of examples drawn from practice. A final chapter looks at future trends in development evaluation.

Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks: Sage

Notes: Both practical and theoretical, this key text shows how to conduct program evaluations and why to conduct them in the manner prescribed. This entirely rewritten edition offers readers a full-fledged evaluation text from identifying primary users of an evaluation to focusing the evaluation, making methods decisions, analyzing data, and presenting findings. This fourth edition: provides updated materials in all chapters; offers new developments in stakeholder analysis; includes a utilization-focused checklist; gives greater emphasis to mixed methods; details the explosion of international evaluation; and contains updated web references, examples, and citations as well as new cartoons.

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage

Notes: This is a best-seller in evaluation, with over 100,000 copies sold. This is the seventh edition, and it justifies its best-seller status. The text covers a good range of evaluation topics, including: setting evaluation questions; uncovering program theory; studying implementation; designing impact assessments; assessing program costs and benefits; and understanding the politics of evaluating. The authors draw on their decades of hands-on experience conducting evaluations to provide scores of examples to help readers understand how evaluators deal with various critical issues. This is probably the most comprehensive and authoritative general evaluation text available.

Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park: Sage


Notes: Written by one of the leaders in evaluation, this text provides readers with a quick analysis of the leading concepts, positions, acronyms, processes, techniques, and checklists in the field of evaluation. Containing nearly 2,000 entries, the thesaurus offers readers a guide for understanding the relation of evaluation to the doctrine of value-free social science, ways to integrate the parts of multi-dimensional evaluations into an overall rating, the realities of evaluation consulting, and techniques for the use of spreadsheets in qualitative evaluation. Topics new to this edition include: recent work in personnel evaluation and its relevance to program evaluation; new material on the evaluation of scientific theories; new ways to extend objective testing beyond multiple-choice items without loss of speed of correcting; new uses of computers in evaluation; and analyses of related concepts such as diagnosis, policy, etiology, risk assessment, focus groups, quality circles, and the economic realities of consulting, including conflict of interest, types of contract, and press releases.

Wholey, J., Hatry, H., & Newcomer, K. (2004). Handbook of practical program evaluation (2nd ed.). San Francisco: Jossey-Bass

Notes: This text offers managers, analysts, consultants, and educators in government, nonprofit, and private institutions an outline of methods for assessing program results and identifying ways to improve program performance. This is the second edition and has been thoroughly revised. New chapters include chapters on logic modelling and on evaluation applications for small non-profit organizations. The book covers both in-depth program evaluations and performance monitoring. It presents evaluation methods that will be useful at all levels of government and in non-profit organizations.

WK Kellogg Foundation. (1998). WK Kellogg Foundation Evaluation Handbook. Battle Creek: WK Kellogg Foundation. Last viewed on 06 July 2009. URL: http://www.wkkf.org/Pubs/Tools/Evaluation/Pub770.pdf

Notes: This handbook provides a fairly in-depth look at some basic questions, such as the limitations of scientific methods in evaluation. One of the key aims of evaluation for the Kellogg Foundation is to improve and strengthen the projects being evaluated. Alternative methods such as feminist, participatory, and theory-based evaluations are considered. The Handbook also provides background reading on logic models, for which there is a separate manual by the Foundation. The Handbook identifies four steps in planning an evaluation: identifying stakeholders; developing evaluation questions; budgeting for the evaluation; and selecting an evaluator. This is followed by three steps in implementation and two steps in utilisation.

GUIDELINES (18)

Baker, J. (2000). Evaluating the Impact of Development Projects on Poverty: A Handbook for Practitioners. World Bank Publications. Last viewed on 9 February 2009. URL: http://www.acehinstitute.org/Book-Evaluating%20the%20Impact%20of%20Poverty%20(World%20Bank).pdf

Notes: This handbook seeks to provide project managers and policy analysts with the tools needed for evaluating project impact. It is aimed at readers with a general knowledge of statistics. For some of the more in-depth statistical methods discussed, the reader is referred to the technical literature on the topic. Chapter 1 presents an overview of concepts and methods, Chapter 2 discusses key steps and related issues to consider in implementation, Chapter 3 illustrates various analytical techniques through a case study, and Chapter 4 includes a discussion of lessons learned from a rich set of "good practice" evaluations of poverty projects that have been reviewed for this handbook. The case studies, included in Annex I, were selected from a range of evaluations carried out by the World Bank, other donor agencies, research institutions, and private consulting firms. They were chosen for their methodological rigor, in an attempt to cover a broad mix of country settings, types of projects, and evaluation methodologies. Also included in the Annexes are samples of the main components that would be necessary in planning any impact evaluation: sample terms of reference, a budget, impact indicators, a log frame, and a matrix of analysis.

Beck, T. (2006). Evaluating humanitarian action using the OECD-DAC criteria. London: ALNAP. Last viewed on 8 June, 2008. URL: www.odi.org.uk/alnap/publications/eha_dac/pdfs/eha_2006.pdf

Notes: This clearly-written guide provides practical support on how to use the OECD Development Assistance Committee (OECD/DAC) criteria in evaluation of humanitarian action (EHA). It covers the following areas: 1) key themes and issues current in EHA, particularly lesson-learning, accountability and evaluation; 2) clear definitions for the OECD DAC criteria with explanation, issues to consider, and examples of good practice; 3) very brief guidelines for good practice in methods for the evaluation of humanitarian action. This short book provides the clearest definitions of the DAC evaluation criteria available anywhere.

Bond, S. L., Boyd, S. E., Rapp, K. A., Raphael, J. B., & Sizemore, B. A. (1997). Taking stock: A practical guide to evaluating your own programs. Chapel Hill: Horizons Research. Last viewed on 6 July 2009. URL: http://www.horizon-research.com/reports/1997/stock.pdf

Notes: It is a practical guide to program evaluation written for community-based organizations (CBOs). It provides information that you can put to use now to help improve your programs. This manual focuses on internal evaluation—that is, program evaluation conducted in-house by CBO staff. We have taken this approach for one simple reason: many CBOs cannot afford to hire someone outside the organization to evaluate their programs, but they still need the kinds of information that evaluation can provide. The information in this manual should better prepare you to design and carry out a program evaluation. And because the field of evaluation is now putting greater emphasis on participatory evaluation (a middle ground between internal and external evaluation), you will be able to apply your knowledge either within your own organization or in working with an external evaluator. This manual will also help you recognize when you might need an external evaluator and the advantages of using these services, should your CBO be able to afford them at some point.

Centers for Disease Control. (1999). Framework for Program Evaluation in Public Health (MMWR Recommendations and Reports RR-11). Atlanta: Centers for Disease Control. Last viewed on 12 July 2009. URL: ftp://ftp.cdc.gov/pub/Publications/mmwr/rr/rr4811.pdf

Notes: CDC developed the framework for program evaluation to ensure that CDC remains accountable and committed to achieving measurable health outcomes against a background of rapid change in the public health sector. The intention is to incorporate the principles of the Framework into all CDC program operations, to stimulate innovation toward outcome improvement, and to be better positioned to detect program effects. More efficient and timely detection of these effects will enhance CDC's ability to translate findings into practice. Findings from prevention research will lead to program plans that are clearer and more logical; stronger partnerships will allow collaborators to focus on achieving common goals; integrated information systems will support more systematic measurement; and lessons learned from evaluations will be used more effectively to guide changes in public health strategies. Because categorical strategies cannot succeed in isolation, public health professionals working across program areas must collaborate in evaluating their combined influence on health in the community. Only then will we be able to realize and demonstrate the success of CDC's vision: healthy people in a healthy world through prevention.

Centers for Disease Control. (2005). Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. Atlanta: Centers for Disease Control and Prevention. Last viewed on 12 July 2009. URL: http://www.cdc.gov/eval/evalguide.pdf

Notes: This document is a "how to" guide for planning and implementing evaluation activities. The manual is based on CDC's Framework for Program Evaluation in Public Health, and is intended to assist state, local, and community managers and staff of public health programs in planning, designing, implementing, and using the results of comprehensive evaluations in a practical way. The strategy presented in this manual will help assure that evaluations meet the diverse needs of internal and external stakeholders, including assessing and documenting program implementation, outcomes, efficiency, and cost-effectiveness of activities, and taking action based on evaluation results to increase the impact of programs.

Danida. (2006). Evaluation Guidelines. Copenhagen: Ministry of Foreign Affairs of Denmark. Last viewed on 11 July 2009. URL: http://www.um.dk/NR/rdonlyres/4BA486C7-994F-4C45-A084-085D42B0C70E/0/Guidelines2006.pdf

Notes: The primary purpose of the Evaluation Guidelines is to communicate to partners and external consultants Danida's expectations of the quality of evaluations. They constitute a framework built on principles, criteria, standards, good practices, and information about Danida's evaluation process. Particular emphasis has been put on defining the roles of various stakeholders in evaluations and on developing codes of conduct for these stakeholders. The guidelines are not intended as an evaluation manual. Danida requires that its external consultants have the competencies expected of evaluation professionals: a sound grasp of evaluation methodologies, the skills and abilities to carry out evaluations, as well as up-to-date knowledge of developments in the field of evaluation, in particular development evaluation.

ECB. (2007). The Good Enough Guide: Impact Measurement and Accountability in Emergencies. Oxford: Oxfam for the Emergency Capacity Building Project. Last viewed on 1 July 2009. URL: http://www.oxfam.org.uk/what_we_do/resources/downloads/Good_Enough_Guide.pdf

Notes: The Good Enough Guide helps busy field workers to know what difference they are making. It offers a set of basic guidelines on how to be accountable to local people and measure programme impact in emergency situations and contains a variety of tools on needs assessment and profiling. Its 'good enough' approach emphasises simple and practical solutions and encourages the user to choose tools that are safe, quick, and easy to implement. This pocket guide presents some tried and tested methods for putting impact measurement and accountability into practice throughout the life of a project. It is aimed at humanitarian practitioners, project officers and managers with some experience in the field, and draws on the work of field staff, NGOs, and inter-agency initiatives, including Sphere, ALNAP, HAP International, and People In Aid.

Groupe URD. (2005). Project Evaluation Companion Book. La Fontaine des Marins: Groupe URD. Last viewed on 07 July 2009. URL: http://www.projetqualite.org/compas/outil/uk/docs/project_evaluation_companion_book.doc

Notes: The Project Evaluation Companion Book presents an overview of the project evaluation section of the Quality COMPAS. Its format and content have been designed for practical use in the field and for facilitating group work. The Quality COMPAS is built around a quality reference framework, the compass rose, and the evaluation compass rose is presented on pages 4-5. The evaluation compass rose, composed of twelve criteria that define the quality of an evaluation, is centred on crisis-affected populations and their context. The Quality COMPAS method has two main sections: project management and project evaluation. The project evaluation function presented in this companion book is aimed at aid workers who are in charge of one or more phases of the evaluation process. Quality assurance principles have been applied to both the project management and the project evaluation functions of the Quality COMPAS. The twelve quality criteria that form the compass rose are broken down into key questions.

Groupe URD. (2005). Project Management Companion Book. La Fontaine des Marins: Groupe URD. Last viewed on 07 July 2009. URL: http://www.projetqualite.org/compas/outil/uk/docs/project_management_companion_book.doc

Notes: The Project Management Companion Book presents an overview of the project management section of the Quality COMPAS. The Quality COMPAS is built around a quality reference framework, the compass rose, which is presented on pages 4-5. The compass rose, composed of twelve criteria that define the quality of a humanitarian project, is centred on crisis-affected populations and their context. The Quality COMPAS method has two main sections: project management and project evaluation. The project management function presented in this companion book is aimed at aid workers who are in charge of one or more project cycle phases. Quality assurance principles have been applied to both the project management and the evaluation functions of the Quality COMPAS.

Hallam, A. (1998). Evaluating Humanitarian Assistance Programmes in Complex Emergencies (RRN Good Practice Review 7). London: Overseas Development Institute. Last viewed on 8 June, 2008. URL: http://www.odihpn.org/documents/gpr7.pdf

Notes: This Good Practice Review is the written output of an OECD/DAC project initiated to identify and disseminate best practice in the evaluation of humanitarian assistance programmes. The study was intended to: improve the consistency and quality of evaluation methodologies; enhance the accountability function of evaluation; contribute to institutionalising the lessons learned; identify better methods for monitoring performance of humanitarian aid operations. A slightly edited version was published by the OECD/DAC as Guidance for Evaluating Humanitarian Assistance in Complex Emergencies in the following year.

Jones, R., Young, V., & Stanley, C. (2004). CIDA Evaluation Guide. Ottawa: Canadian International Development Agency. Last viewed on 11 July 2009. URL: http://www.acdi-cida.gc.ca/INET/IMAGES.NSF/vLUImages/Performancereview5/$file/English_E_guide.pdf

Notes: This guide was prepared to ensure that CIDA's staff, consultants and partners are properly informed about how evaluations of CIDA's investments in development cooperation are to be carried out, and what they are expected to achieve. The Guide sets out to: 1) build awareness and understanding about the evaluation function, and the role of evaluators; 2) enhance the value realized from CIDA's investments in evaluations; 3) ensure compliance with Treasury Board Secretariat (TBS) requirements; and 4) promote effective, consistent management and work practices, both at headquarters and in the field.

OECD/DAC. (1999). Evaluating Country Programmes: Vienna Workshop, 1999 (Evaluation and aid effectiveness). Paris: Organisation for Economic Co-operation and Development, Development Assistance Committee. Last viewed on 16 June, 2008. URL: http://www.oecd.org/dataoecd/10/26/2667302.pdf

Notes: The increasingly widespread adoption of country assistance strategies has challenged evaluators to assess results at the country level. As documented in this publication, development agencies have employed a range of different approaches to carry out country programme evaluations. In order to review these experiences and to share lessons about strengths and weaknesses inherent in different approaches to country programme evaluation, the Working Party on Aid Evaluation organised a workshop in Vienna on 11 - 12 March 1999. This workshop principally focused on progress made since the first country programme evaluation workshop, also held in Vienna, in May 1994. This publication provides a unique overview of approaches and methods for country programme evaluation. It contains i) an analytical review of the workshop's main findings; ii) an overview of the state-of-the-art approaches and methodologies used for country programme evaluations; and iii) the case studies, which were presented and discussed at the workshop.

OECD/DAC. (1999). Guidance for Evaluating Humanitarian Assistance in Complex Emergencies (1999) (Evaluation and aid effectiveness: 1). Paris: Organisation for Economic Co-operation and Development, Development Assistance Committee. Last viewed on 09 July 2009. URL: http://www.oecd.org/dataoecd/9/50/2667294.pdf

Notes: This Guidance is aimed at those involved in the commissioning, design and management of evaluations of humanitarian assistance programmes principally within donor organisations, but is also likely to be of use to UN agencies, NGOs and other organisations involved in the provision of humanitarian assistance. It is not intended as an exhaustive guide, as specialised texts are available, but to complement the existing DAC Principles on Aid Evaluation by highlighting those areas which require special attention, the nature of the activities undertaken and the multi-actor, highly interconnected system by which the international community provides humanitarian assistance. The research for this publication was conducted by ODI, and an almost identical text was published by ODI as Hallam (1998).

OECD/DAC. (2000). Donor Support for Institutional Capacity Development in Environment: Lessons Learned (Evaluation and aid effectiveness: 3). Paris: Organisation for Economic Co-operation and Development, Development Assistance Committee. Last viewed on 09 July 2009. URL: http://www.oecd.org/dataoecd/10/27/2667310.pdf

Notes: This Summary document presents a review of donor support for Capacity Development in Environment (CDE). It is based on an assessment of a sample of approximately 70 evaluation and review reports provided by the DAC Members, relevant literature, analysis of 13 responses to a structured CDE questionnaire and selected DAC Member and institutional visits and interviews. A peer review of draft documents was undertaken by the International Institute for Environment and Development (IIED). The Summary and attendant Main Report constitute revisions of "Work in Progress" documents presented at earlier WP-EV meetings. The study has examined the functional objectives of CDE processes as a basis for prioritising areas where further efforts are required to improve DAC Members' environmental performance. The study had, of necessity, to simplify an extremely complex set of issues and concepts. There are, in addition, numerous difficulties in assessing outcomes in relation to themes such as "environment" or "capacity development", particularly in the contexts of the widespread policy and institutional reforms of both DAC Members and recipient governments during the 1990s. An important limitation of the study is that it relies disproportionately on donor agency documentation.

OECD/DAC. (2002). Glossary of key terms in evaluation and results based management (Evaluation and Aid Effectiveness: 6). Paris: OECD/DAC Working Party on Aid Evaluation. Last viewed on 21 January 2009. URL: http://www.oecd.org/dataoecd/29/21/2754804.pdf

Notes: The DAC Working Party on Aid Evaluation (WP-EV) has developed this glossary of key terms in evaluation and results-based management because of the need to clarify concepts and to reduce the terminological confusion frequently encountered in these areas. Evaluation is a field where development partners – often with widely differing linguistic backgrounds – work together and need to use a common vocabulary. Over the years, however, definitions evolved in such a way that they bristled with faux amis, ambivalence and ambiguity. It had become urgent to clarify and refine the language employed and to give it a harmonious, common basis. With this publication, the WP-EV hopes to facilitate and improve dialogue and understanding among all those who are involved in development activities and their evaluation, whether in partner countries, development agencies and banks, or non-governmental organisations. It should serve as a valuable reference guide in evaluation training and in practical development work.

OECD/DAC. (2008). Evaluating Conflict Prevention and Peacebuilding Activities: Working draft for application period: A joint project of the DAC Network on Conflict, Peace and Development Co-operation and the DAC Network on Development Evaluation. Paris: OECD/DAC. Last viewed on 6 July 2009. URL: http://www.oecd.org/secure/pdfDocument/0,2834,en_21571361_34047972_39774574_1_1_1_1,00.pdf

Notes: This working draft develops guidance on conducting effective evaluations of conflict prevention and peacebuilding work. The working draft is intended to be used for a one-year application phase through 2008. It is the result of an ongoing collaborative project by the OECD DAC Networks on Development Evaluation and on Conflict, Peace and Development Co-operation (CPDC). The two Networks began this collaboration in 2005, responding to the need expressed by CPDC members for greater clarity regarding techniques and issues of evaluation in their field. Given the complexity of work in this field and the need to address different audiences, evaluators and peacebuilding practitioners alike, this working draft has extensive annexes containing specific information to complement the shorter main text. The main text is divided into a general introduction, an outline of key planning and programming steps, and a description of the evaluation process itself. Individual readers may choose to focus on particular sections, according to their interest and needs.

WK Kellogg Foundation. (2004). Logic model development guide: Using Logic Models to Bring Together Planning, Evaluation, and Action. Battle Creek: WK Kellogg Foundation. Last viewed on 7 July 2009. URL: http://gametlibrary.worldbank.org/FILES/921_Kelloggs%20Program%20Logic%20Model.pdf

Notes: The program logic model is defined as a picture of how your organization does its work – the theory and assumptions underlying the program. A program logic model links outcomes (both short- and long-term) with program activities/processes and the theoretical assumptions/principles of the program. This guide, a companion publication to the Evaluation Handbook, focuses on the development and use of the program logic model. We have found the logic model and its processes facilitate thinking, planning, and communications about program objectives and actual accomplishments. Through this guide, we hope to provide an orientation to the underlying principles and language of the program logic model so it can be effectively used in program planning, implementation, and dissemination of results.
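
As a rough illustration of the chain a logic model captures, the stages can be sketched as a small data structure. This is only a sketch: the stage names and descriptions below are generic examples, not taken from the Kellogg guide.

```python
# Illustrative sketch of a program logic model: an ordered chain from the
# resources a program starts with to the long-term impact its theory assumes.
# Stage names and descriptions are generic examples, not from the guide.
LOGIC_MODEL = [
    ("resources", "funding, staff, and partners available to the program"),
    ("activities", "what the program does with those resources"),
    ("outputs", "direct products, e.g. number of trainings delivered"),
    ("outcomes", "short- and long-term changes for participants"),
    ("impact", "the lasting change the program's theory assumes will follow"),
]

def chain(model):
    """Render the model as a one-line causal chain for quick review."""
    return " -> ".join(stage for stage, _ in model)

print(chain(LOGIC_MODEL))
# resources -> activities -> outputs -> outcomes -> impact
```

Linking each stage explicitly to the next mirrors the guide's point that claimed outcomes should be traceable back through activities to the program's underlying assumptions.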

World Bank. (2007). Sourcebook for Evaluating Global and Regional Partnership Programs: Indicative Principles and Standards. Washington: The Independent Evaluation Group of the World Bank. Last viewed on 28 June 2008. URL: http://siteresources.worldbank.org/EXTGLOREGPARPRO/Resources/sourcebook.pdf

Notes: The first draft of this Sourcebook (dated August 31, 2006) was discussed and reviewed at a stakeholder consultative workshop held for this purpose in Paris on September 28–29, 2006. The present version, which incorporates the feedback received at the workshop, was presented to the Fifth Meeting of the DAC Evaluation Network in Paris on November 16–17, 2006. At its Fifth Meeting, the DAC Evaluation Network recommended a period of practical application, use, and review, rather than formal endorsement at this stage. It encourages the governing bodies and management units of GRPPs in which DAC members are involved to draw upon them in establishing their monitoring and evaluation policies and in conducting independent evaluations of their programs on a regular basis. It further encourages those who use this Sourcebook to provide feedback to IEG and the Network based on their experience, in order to inform and further improve the document for eventual formal endorsement by Network members.

EVALUATION POLICY (5)

DFID. (2009). Building the evidence to reduce poverty: The UK's policy on evaluation for international development. London: Department for International Development. Last viewed on 2 July 2009. URL: http://www.oecd.org/dataoecd/39/54/43183649.pdf

Notes: This document sets out the evaluation policy for the UK's development work (including humanitarian action). The guide sets out the general policy as well as eight evaluation criteria: Relevance; Effectiveness; Efficiency; Impact; Sustainability; Coverage; Coherence; and Coordination. The guide also sets out four principles for evaluation methods: (a) designing evaluation in at the beginning so the right data can be collected early; (b) a 'theory-driven approach', which seeks to establish how a policy or programme is supposed to work as part of a clear conceptual model which can be tested; (c) using both quantitative and qualitative methods; and (d) evaluating joint interventions jointly with other partners, with the aim of distinguishing DFID's contribution from overall effects.
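
The eight criteria listed above lend themselves to a simple checklist structure. The sketch below is illustrative only: the criteria names are those listed in the policy, but the 1-4 rating scale and the averaging function are hypothetical, not part of the DFID document.

```python
# The eight evaluation criteria from the DFID policy, held as a checklist.
# The 1-4 rating scale is a hypothetical example, not from the policy itself.
DFID_CRITERIA = [
    "Relevance", "Effectiveness", "Efficiency", "Impact",
    "Sustainability", "Coverage", "Coherence", "Coordination",
]

def average_rating(scores):
    """Average the ratings, insisting that every criterion is covered."""
    missing = [c for c in DFID_CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    return sum(scores[c] for c in DFID_CRITERIA) / len(DFID_CRITERIA)
```

Requiring a rating for every criterion before averaging reflects the way such criteria sets are meant to be applied: coverage of all eight, not a selection of the convenient ones.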

NORAD. (2006). Evaluation Policy 2006-2010: Part 1 Strategic priorities: Part 2 Evaluation Programme 2006-2008: Part 3 Guidelines for Evaluation of Norwegian Development Cooperation. Oslo: Norad. Last viewed on 14 November 2009. URL: http://www.norad.no/en/Tools+and+publications/Publications/Publication+Page?key=109574

Notes: The Evaluation Department in the Norwegian Agency for Development Cooperation (Norad) is responsible for the planning and implementation of evaluation of activities financed over the Norwegian development co-operation budget and for communicating the results to the decision-makers and public in general. The Evaluation Department also acts as the advisor to Norad and the Ministry of Foreign Affairs (MFA) in technical evaluation matters, and is Norway's representative in international evaluation work. This evaluation work is based on a mandate from the MFA. Activities are based on the political priorities set out by the Norwegian Parliament (Stortinget) and the Government of Norway, and derive from Norad's strategy.

Office of Evaluation. (2003). IFAD Evaluation Policy. Rome: International Fund for Agricultural Development. Last viewed on 14 November 2009. URL: http://www.ifad.org/pub/policy/oe.pdf

Notes: This document sets out IFAD's evaluation policy. The focus is on independent evaluation. Part One outlines the policy framework, which consists of the purpose of independent evaluation and its stakeholders, the evaluation principles and the operational policies to be used by IFAD in its independent evaluation work. Part Two details operational procedures, organizational measures and other arrangements that ensure OE's independence from IFAD management and enhance its effectiveness. Part Three presents the role of the Executive Board and its Evaluation Committee in relation to the independent evaluation function, and the terms of reference (TOR) of the OE Director. Part Four describes how the policy will become effective, including the staggered introduction of particular provisions. The annexes summarize the guidelines and provisions for policy formulation laid down by the consultation on the sixth replenishment of IFAD's resources, outline important milestones in the organization of monitoring and evaluation at IFAD, introduce the types of evaluation that OE undertakes, and recapitulate the current TOR of the Executive Board's Evaluation Committee.

Rijneveld, W. (2006). Evaluation Policy: 2007-2010. Gorinchem: Stichting Woord en Daad. Last viewed on 14 November 2009. URL: http://www.mande.co.uk/docs/060928.Evaluation%20Policy%202007-2010.WR.ENG.pdf

Notes: This document from Stichting Woord en Daad (the Word and Deed Foundation) sets out the organisation's evaluation policy. The policy uses a pyramidal model to link monitoring, evaluation, and learning. The base of the pyramid is formed by monitoring, the middle by project evaluation, and the top by programme evaluation. These are envisaged as linked to Swieringa and Wierdsma's learning model (1992, Becoming a Learning Organisation). Thus monitoring is linked with operational or single-stroke learning (improvement). Innovation is linked to project evaluation, with double-stroke learning at the level of objectives and approaches. The model links development with programme evaluation, with triple-stroke learning at the level of mission and strategy.
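
The pyramid described above can be summarised as a simple mapping from each monitoring-and-evaluation layer to the learning level it is linked with. The labels are paraphrased from the note above; the sketch is illustrative, not drawn from the policy document itself.

```python
# Illustrative mapping of the Woord en Daad pyramid: each M&E layer is
# paired with the learning level it is linked to in the policy's model.
# Labels are paraphrased from the policy summary, not quoted from it.
PYRAMID = {
    "monitoring": ("single-stroke", "improvement of operations"),
    "project evaluation": ("double-stroke", "objectives and approaches"),
    "programme evaluation": ("triple-stroke", "mission and strategy"),
}

def describe(layer):
    """Summarise which learning level a given M&E layer feeds."""
    stroke, level = PYRAMID[layer]
    return f"{layer} -> {stroke} learning ({level})"
```

For example, `describe("monitoring")` reports that monitoring feeds single-stroke learning, i.e. operational improvement.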

WFP. (2008). Draft Evaluation Policy. Rome: World Food Programme. Last viewed on 14 November 2009. URL: http://documents.wfp.org/stellent/groups/public/documents/resources/wfp177705.pdf

Notes: This document presents the revised evaluation policy at WFP. The revision was recommended by the peer review of the evaluation function at WFP. This policy seeks to safeguard the independence of evaluation at WFP through both structural and institutional means, and to address some weaknesses identified in the peer review.

INTERVIEW (14)

Atkinson, R. (2001). The Life Story Interview. In J. Gubrium & J. Holstein (Eds.), Handbook of interview research: Context & method (pp. 121-140). Thousand Oaks: Sage. Last viewed on 26 November 2009. URL: http://wiki.mediamind.org/images/c/c7/Atkinson_LifeStoryInterview.pdf

Notes: A description of the Life Story Interview method by Robert Atkinson. Telling the stories of our lives is so basic to our nature that we are largely unaware of its importance. We think in story form, speak in story form, and bring meaning to our lives through story. People everywhere are telling stories about some piece of their lives to friends and strangers alike. The stories we tell of our lives carry ageless, universal themes or motifs, and are always variations of one of the thousands of folk tales, myths, or legends, that have spoken to us for generations of our inner truths. Stories connect us to our roots.

Drever, E. (2003). Using semi-structured interviews in small-scale research: a teacher's guide (revised ed.). Glasgow: Scottish Council for Research in Education

Notes: This is an excellent little publication. It is a very short and clearly written introduction for those about to use semi-structured interviews for the first time. Chapters cover the whole gamut of using interviews: 1) the use of interviews; 2) different kinds of interviews; 3) the interview schedule (although evaluators should be warned that the examples of schedules are based on a very limited number of research questions in comparison to the number of questions that most evaluations are asked to answer); 4) planning and preparation; 5) doing the interview; 6) analysing the interview; and 7) reporting and communication. Although the book is intended for teachers doing small research projects, the advice offered is good for all of those conducting semi-structured interviews for the first time.

General Accounting Office. (1991). Using Structured Interviewing Techniques (Methodology Transfer Paper PEMD-10.1.5). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://www.gao.gov/special.pubs/pe1015.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. This guide for GAO evaluators on structured interviewing techniques focuses on: 1) how its evaluators should incorporate such techniques when conducting studies; 2) when such techniques should be used and what steps should be followed; and 3) techniques for designing a structured interview, pretesting, training interviewers, selecting and contacting interviewees, conducting the interviews, and analyzing the data.

Holstein, J. A., & Gubrium, J. F. (1995). The active interview (Sage University Papers Series on Qualitative Research Methods: 37). Thousand Oaks: Sage

Notes: Interviews were once thought to be the pipeline through which information was transmitted from a passive subject to an omniscient researcher. However, the new 'active interview' considers interviewers and interviewees as equal partners in constructing meaning around an interview. This interpretation changes a range of elements in the interview process - from the way of conceiving a sample to the ways in which the interview may be conducted and the results analyzed. In this guide, the authors outline the differences between active interviews and traditional interviews and give novice researchers clear guidelines on conducting a successful interview.

Kaiser, A. (1979). Questioning Techniques: A practical guide to better communication (N. M. Lechelt & U. Marten, Trans.). Farnham: Momenta Publishing

Notes: This book addresses how to ask better and clearer questions, how to relate to the answers you get, and how to build your discussions into useful dialogues. The book is written for anyone using questions in their daily work. It includes examples of conversations and the tools for analysis that the author advocates.

Kumar, K. (1989). Conducting key informant interviews in developing countries (AID program design and evaluation methodology report 13). Washington: Agency for International Development, Centre for Development Information and Evaluation. Last viewed on 7 December 2009. URL: http://pdf.usaid.gov/pdf_docs/PNAAX226.pdf


Notes: Although key informant interviews are widely conducted in development settings, the quality and nature of the information they generate remain suspect for a variety of reasons. Key informants are not carefully selected. Interview guides are not prepared in advance. Questions are inaptly worded and clumsily asked. The responses are not properly recorded and systematically analyzed. And above all, the findings are not satisfactorily verified. Thus, too often, this potentially useful and versatile method of data collection becomes a poorly planned activity generating information of dubious value and low credibility. To improve the quality of information for use in project and program design, implementation, and evaluation, this report outlines the steps involved in gathering and analyzing information through key informant interviews. It also discusses the advantages, limitations, and possible uses of such information.

Kvale, S., & Brinkmann, S. (2008). Interviews: An introduction to qualitative research interviewing (2nd ed.). Los Angeles: Sage

Notes: This book provides readers in a wide variety of disciplines with the whys and hows of research interviewing. The revised edition retains the seven-stage structure of the first edition, continuing to focus on the practical, epistemological, and ethical issues involved with interviewing. Authors Steinar Kvale and Svend Brinkmann also include coverage of newer developments in qualitative interviewing, discussion of interviewing as a craft, and a new chapter on linguistic modes of interview analysis. Practical and conceptual assignments, as well as new tool boxes, provide readers with the means to dig deeper into the material presented and achieve a more meaningful level of understanding.

McCracken, G. D. (1988). The long interview (Sage University Papers Series on Qualitative Research Methods: 13). Newbury Park, Calif.: Sage Publications

Notes: This monograph provides a systematic guide to the theory and methods of the long qualitative interview or intensive interviewing. It gives a clear explanation of one of the most powerful tools of the qualitative researcher. The volume begins with a general overview of the character and purpose of qualitative inquiry and a review of key issues. The author outlines the four steps of the long qualitative interview and how to judge quality. He then offers practical advice for those who commission and administer this research, including sample questionnaires and budgets to help readers design their own. The author introduces key theoretical and methodological issues, various research strategies, and a simple four stage model of inquiry, from the design of an open-ended questionnaire to the write up of results.

Merton, R. K., Fiske, M., & Kendall, P. L. (1990). The focused interview: A manual of problems and procedures. New York: Free Press

Notes: In 1956, the Free Press published a report of Columbia's Bureau of Applied Social Research - co-authored by Merton, Marjorie Fiske and Patricia L. Kendall - which outlined a set of techniques aimed at eliciting the specific responses of individuals and groups to particular events and situations. The book may be regarded as seminal within sociology, spawning a whole field of qualitative opinion research that has continued to evolve through half a century of inquiry. This is a 1990 reissue of the book, with a new preface by Merton, a select bibliography of writings on the focused interview and focus group research, and a new introduction that traces the diffusion of Merton's technique from sociology to other fields, including history, psychology, mass media and marketing research.

Rubin, H., & Rubin, I. (2005). Qualitative interviewing: The art of hearing data (2nd ed.). Thousand Oaks: Sage

Notes: This is a good text on qualitative interviewing. The two authors are very experienced and draw many of the examples from their own experience. While the text covers a broad range of qualitative interviews, it is of use also for evaluators. I found the chapters on the design of questions particularly useful. This is a very solid text that gives a good overview of qualitative interviews and covers all aspects from the selection of the topic to writing up.


Serrat, O. (2008). Conducting Exit Interviews (Knowledge Solutions 2). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Conducting-Exit-Interviews.pdf

Notes: Exit interviews provide feedback on why employees leave, what they liked about their job, and where the organization needs improvement. They are most effective when data is compiled and tracked over time. The concept has been revisited as a tool to capture knowledge from leavers. Exit interviews can be a win-win situation: the organization retains a portion of the leaver's knowledge and shares it; the departing employee articulates unique contributions and leaves a mark. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Asking Effective Questions (Knowledge Solutions 52). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/asking-effective-questions.pdf

Notes: Questioning is a vital tool of human thought and interactional life. Since questions serve a range of functions, depending on the context of the interaction, the art and science of questioning lies in knowing what question to ask when. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

USAID. (1996). Conducting Key Informant Interviews (Performance monitoring and evaluation tips: 2). Washington: USAID Center for Development Information and Evaluation. Last viewed on 20 September 2008. URL: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby233.pdf

Notes: USAID guidance for evaluators conducting key informant interviews. Key informant interviews are qualitative, in-depth interviews of 15 to 35 people selected for their first-hand knowledge about a topic of interest. The interviews are loosely structured, relying on a list of issues to be discussed. Key informant interviews resemble a conversation among acquaintances, allowing a free flow of ideas and information. Interviewers frame questions spontaneously, probe for information and take notes, which are elaborated on later.

Weiss, R. S. (1995). Learning from strangers: The art and method of qualitative interview studies. New York: Free Press

Notes: This book concentrates on in-depth or unstructured interviewing rather than the semi-structured approach more common in humanitarian evaluation. In-depth interviews are sometimes used in humanitarian evaluation to find out about the affected population's experience of disaster events, as for the 2001 DEC evaluation of the Mozambique floods, where Nelia Taimo conducted in-depth interviews. The book provides some guidance for the interviewer, including selecting interviewees and developing the interview guides. The concept of markers, where interviewees refer in passing to an important event or feeling state, is an important one that can be used by all interviewers to partially uncover the interviewee's world-view. The author illustrates his points with substantial excerpts from interview transcripts. The result is a how-to book that is also an interesting read. One word of caution: the author argues that in editing for publication it is not permissible to insert words or change words (other than to correct things like 'gotta' to 'got to'), but that it is acceptable to excise words or sections. He does this without indicating that the quotes have been changed. It is more correct to indicate when parts of quotes are excised by using ellipses.


FOCUS GROUPS (5)

Krueger, R., & Casey, M. (2009). Focus groups: A practical guide for applied research (4th ed.). Thousand Oaks: Sage

Notes: This is probably the standard text on running a focus group for sociological research. It is a superb, well-written, 'hands-on' guide which details all the practicalities of designing focus groups, running them successfully, and analysing the results effectively. It provides clear step-by-step guidance for the whole process, with practical advice based on experience with hundreds if not thousands of focus groups. It is clearly based on years of experience and offers a quick way to access the learning of two leaders in the use of focus groups. The techniques described are applicable to any type of focus group.

Krueger, R. A. (2002). Designing and Conducting Focus Group Interviews. St Paul: University of Minnesota. Last viewed on 20 September 2008. URL: http://www.shadac.umn.edu/img/assets/18528/FocGrp_Krueger_Oct02.pdf

Notes: This is a summary guide to designing and conducting focus group interviews. It is brief and to the point, with little discussion but with the main points highlighted. It also contains sample questions. It has a 17-item bibliography.

Kumar, K. (1987). Conducting group interviews in developing countries (AID program design and evaluation methodology report 8). Washington: Agency for International Development, Bureau for Program and Policy Coordination. Last viewed on 7 December 2009. URL: http://pdf.usaid.gov/pdf_docs/PNAAL088.pdf

Notes: The group interview is one of the rapid, cost-effective data collection methods. It involves the use of direct probing techniques to gather information from several individuals in a group situation. Although superficially the difference between the individual and group interviews is the number of participants, this difference contributes to major variations between the two with regard to planning, nature of interview guides, probing techniques, and analysis of information. Group interviews can serve a wide range of information collection purposes. They can provide background information and help to generate ideas and hypotheses for project and program design, provide feedback from beneficiaries, and help in assessing responses to recommended innovations. They are also useful for obtaining data for monitoring and evaluation purposes and for interpreting available quantitative data. There are two main types of group interviews -- focus group interviews and community interviews -- that have wide potential in developing countries. Both types should be carefully planned. The investigator should conduct a systematic review of the relevant documents, records, or studies and consult with a few key informants before venturing into the field. The main concepts should be clearly defined in order to avoid possible misunderstanding between the respondents and the interviewers.

Morgan, D. L. (1997). Focus groups as qualitative research (2nd ed. Sage University Papers Series on Qualitative Research Methods: 16). Thousand Oaks: Sage

Notes: This is an updated version of an earlier text. It is a very succinct guide to focus groups in qualitative research. It presents an excellent overview that is explained in a coherent and readable manner. The author uses a wider definition of focus groups than, for example, Krueger, and favours a less structured approach to focus groups, where the group may discuss a single issue only. This book covers all the essentials of focus groups and addresses design and implementation issues. It is a useful text for researchers who would like an overview of the topic. It is particularly useful where researchers are using focus groups with research themes that are not yet clearly defined, but can be expected to emerge from the groups using a grounded-theory or similar approach. Evaluators who want to run focus groups to answer specific evaluation questions may find the more structured approach presented in Focus Groups (Krueger et al. 2009) better fitted to their needs.

USAID. (1996). Conducting Focus Group Interviews (Performance monitoring and evaluation tips: 10). Washington: USAID Center for Development Information and Evaluation. Last viewed on 20 September 2008. URL: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby233.pdf


Notes: USAID guidance for evaluators conducting focus group interviews. A focus group interview is an inexpensive, rapid appraisal technique that can provide managers with a wealth of qualitative information on the performance of development activities, services, and products, or other issues. A facilitator guides 7 to 11 people in a discussion of their experiences, feelings, and preferences about a topic. The facilitator raises issues identified in a discussion guide and uses probing techniques to solicit views, ideas, and other information. Sessions typically last one to two hours.

CASE STUDY (4)

General Accounting Office. (1990). Case Study Evaluations (Methodology Transfer Paper PEMD-91-10.1.9). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://www.gao.gov/special.pubs/pe1019.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. This GAO guide presents information on the use of case study evaluations for GAO audit and evaluation work with a focus on: 1) the definition of a case study; 2) conditions under which a case study is an appropriate evaluation method for GAO work; and 3) distinguishing a good case study from a poor one. GAO also included information on: 1) various case study applications; and 2) case study design and strength assessment.

Stake, R. (1995). The Art of Case Study Research. Thousand Oaks: Sage Publications

Notes: The author draws from naturalistic, holistic, ethnographic, phenomenological, and biographic methods to present an exploration of case study methods. In his exploration, Stake uses and annotates an actual case, at Harper School, to demonstrate to readers how to resolve some of the major issues of case study research; for example, how to select the case (or cases) that will maximize learning, how to generalize what is learned from one case to another, and how to interpret what is learned from a case.

Yin, R. (2003). Applications of Case Study Research (2nd ed.). Thousand Oaks: Sage Publications

Notes: Written to augment the author's earlier extremely successful volume, Case Study Research: Design and Methods, the new edition of this applications book presents and discusses new case studies from a wide array of topics offering a variety of examples or applications of case study research methods. These applications demonstrate specific techniques or principles that are integral to the case study method. Through these practical applications, the reader is able to identify solutions to problems encountered during this type of research.

Yin, R. (2003). Case Study Research: Design and Methods (3rd ed. Applied Social Research Methods Series: 5). Thousand Oaks: Sage Publications Inc.

Notes: The third edition of the best-selling Case Study Research is a comprehensive presentation that covers all aspects of the case study method - from problem definition, design, and data collection, to data analysis, composition, and reporting. Yin also traces the uses and importance of case studies to a wide range of disciplines, from sociology, psychology and history to management, planning, social work, and education. New to the third edition are: additional examples of case study research; discussions of developments in related methods, including randomised field trials and computer-assisted coding techniques; added coverage of the strengths of multiple-case studies, case study screening, and the case study as a part of larger multi-method studies; and five major analytic techniques, including the use of logic models to guide analysis. This edition also includes references to examples of actual case studies in the companion volume Applications of Case Study Research.


AFTER ACTION REVIEW (3)

Robertson, S., & Brún, C. D. (2005, 12 January). National Library for Health: Knowledge Management Specialist Library: After Action Management Reviews. Retrieved 16 September 2008, from http://www.library.nhs.uk/KnowledgeManagement/ViewResource.aspx?resID=70306&tabID=290&summaries=true&resultsPerPage=10&sort=TITLE&catID=10403

Notes: This short article in the Knowledge Management Specialist Library of the National Library for Health describes After Action Reviews and how to conduct them. It emphasises that After Action Reviews are learning events rather than critiques.

Serrat, O. (2008). Conducting After-Action Reviews and Retrospects (Knowledge Solutions 3). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Conducting-After-Action-Reviews.pdf

Notes: Organizational learning calls for nonstop assessment of performance, of both its successes and failures. This makes sure that learning takes place and supports continuous improvement. After-action reviews and retrospects are tools that facilitate such assessments by bringing together a team to discuss an activity or project openly and honestly. This is a Knowledge Solution from the Asian Development Bank. Knowledge Solutions are handy, quick-reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff, but may also appeal to the development community and to people with an interest in knowledge and learning.

USAID. (2006). After-action review (Technical Guidance PN-ADF-360). Washington. Last viewed on 16 September 2008. URL: http://events.fcw.com/events/2007/KM/downloads/USAID_Wallace_The%20After%20Action%20Review.pdf

Notes: This handbook - the USAID guide on how to plan, prepare, and conduct an AAR - was developed by USAID Knowledge for Development (KfD) using the United States Army's TC (Training Circular) 25-20 as a guide. The Army developed the concept of AARs as an essential training methodology for soldiers preparing both for combat duty and for ongoing programs such as peacekeeping. An after-action review (AAR) is a professional discussion of an event that focuses on performance standards and enables development professionals and colleagues with similar or shared interests to discover for themselves what happened, why it happened, and how to sustain strengths and improve on weaknesses. The AAR tool affords leaders, staff, and partners an opportunity to gain maximum benefit from every program, activity, or task. An AAR answers four major questions: 1) What was expected to happen? 2) What actually occurred? 3) What went well, and why? 4) What can be improved, and how?

CONTENT ANALYSIS (5)

Benini, A. (2009). Text Analysis under Time Pressure: Tools for humanitarian and development workers. Washington: Aldo Benini. Last viewed on 24 October 2009. URL: http://mande.co.uk/blog/wp-content/uploads/2009/09/Benini_TextAnalysis_090721.pdf

Notes: The purpose of this paper is to add simple productivity tools for text analysis, by publicizing existing ones and by adding one that I created. "Simple" is a relative term. As the diagram in the Summary section suggests, the suitability of the tools depends on the skills and equipment level of the intending user. Also, I assume a kind of working environment that developing country organizations will not everywhere offer for computer-supported text analysis: that the analyst actually can acquire the documents digitally.
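The kind of quick, computer-supported text analysis Benini discusses can be as modest as counting keyword frequencies across digitally acquired documents. The sketch below is illustrative only, not drawn from the paper itself; the documents and keywords are hypothetical.

```python
# Minimal keyword-frequency count across a set of documents,
# using only the standard library. Documents and keywords are
# hypothetical examples.
import re
from collections import Counter

def keyword_counts(documents, keywords):
    """Count occurrences of each keyword across all documents."""
    counts = Counter({k: 0 for k in keywords})
    pattern = re.compile(r"[a-z']+")
    for doc in documents:
        words = pattern.findall(doc.lower())
        counts.update(w for w in words if w in keywords)
    return counts

docs = [
    "The water supply was restored, but shelter remains a gap.",
    "Shelter and water were the main priorities after the flood.",
]
print(keyword_counts(docs, {"water", "shelter", "food"}))
```

A real analysis would add stemming, phrase matching, and per-document tabulation, but even this level of counting can triage a large document set under time pressure.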

General Accounting Office. (1989). Content Analysis: A Methodology for Structuring and Analyzing Written Material (Methodology Transfer Paper PEMD-10.1.3). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://archive.gao.gov/d48t13/138426.pdf

Notes: This is the original version of a paper dating from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. This paper defines and describes the evaluation method called "content analysis". It is a set of procedures for transforming nonstructured information into a format that allows analysis. From reading this paper, GAO analysts should gain an understanding of the basic concepts and procedures used in content analysis and also an ability to recognize the appropriate circumstances for using this evaluation method in their jobs. Although we have focused on techniques that make quantitative analysis possible, this is not necessarily the objective of all content analyses. We have presented the techniques that are the most applicable to GAO's work. In chapter 1, we define content analysis and compare it to similar procedures already used in GAO. In chapter 2, we discuss the procedures for using content analysis. In chapter 3, we explain the advantages and disadvantages of content analysis and describe some of its potential applications in program evaluation.

General Accounting Office. (1996). Content Analysis: A Methodology for Structuring and Analyzing Written Material (Methodology Transfer Paper PEMD-10.3.1). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://archive.gao.gov/f0102/157490.pdf

Notes: This is an updated version of an original paper dating from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. This paper on content analysis describes how evaluators can use this methodology. It defines content analysis and details how to decide whether it is appropriate and, if so, how to develop an analysis plan. The paper also specifies how to code documents, analyze the data, and avoid pitfalls at each stage. Several software packages are described. It should be noted that while the general principles hold, there has been a lot of development in this type of software package since the mid-1990s.

Hodson, R. (1999). Analyzing Documentary Accounts (Quantitative Applications in the Social Sciences: 07-128). Thousand Oaks: Sage

Notes: This monograph sets out an approach to content analysis, drawing quantitative data from qualitative accounts. The main focus is on the quantitative analysis of ethnographic accounts. All kinds of human behavior are observed, written about, or documented. This book shows how to use documentary works as data sources and how qualitative and quantitative analysis techniques can be combined. It shows how care must be taken with sampling and coding and gives attention to reliability and validity issues, showing how to begin statistical exploration once the data are assembled.

Krippendorff, K. (2004). Content analysis: an introduction to its methodology (2nd ed.). Thousand Oaks: Sage

Notes: This is a definitive sourcebook of the history and core principles of content analysis as well as an essential resource for present and future studies. The book introduces readers to ways of analyzing meaningful matter such as texts, images, voices – that is, data whose physical manifestations are secondary to the meanings that a particular population of people brings to them. Organized into three parts, the book examines the conceptual and methodological aspects of content analysis and also traces several paths through content analysis protocols. The author has completely revised and updated the Second Edition, integrating new information on computer-aided text analysis. The book also includes a practical guide that incorporates experiences in teaching and how to advise academic and commercial researchers. In addition, Krippendorff clarifies the epistemology and logic of content analysis as well as the methods for achieving its aims. The author presents the three distinguishing characteristics of contemporary content analysis as being: 1) that it is fundamentally empirically grounded, exploratory in process, and predictive or inferential in intent; 2) that it transcends traditional notions of symbols, contents, and intents; and 3) that it has been forced to develop a methodology of its own, one that enables researchers to plan, execute, communicate, reproduce, and critically evaluate an analysis independent of the desirability of its results.
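A routine step in the kind of content analysis these texts describe is checking that independent coders assign the same categories. As a simple illustration (using Cohen's kappa, a more basic statistic than Krippendorff's own alpha), with hypothetical codes from two coders:

```python
# Intercoder reliability for content-analysis codes: percent agreement
# corrected for chance (Cohen's kappa). The category labels and coder
# assignments below are hypothetical.
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders over the same list of units."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    pa, pb = Counter(coder_a), Counter(coder_b)
    # Chance agreement from each coder's marginal category frequencies:
    expected = sum(pa[c] * pb[c] for c in set(coder_a) | set(coder_b)) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["pos", "neg", "pos", "neu", "pos", "neg", "pos", "neu"]
b = ["pos", "neg", "neu", "neu", "pos", "pos", "pos", "neu"]
print(cohen_kappa(a, b))  # 0.6 for this example
```

Kappa above roughly 0.8 is usually read as strong agreement; values near zero mean the coders agree no more often than chance, signalling that the coding scheme needs refinement.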


ETHNOGRAPHY (3)

Atkinson, P., Coffey, A., Delamont, S., Lofland, J., & Lofland, L. (2007). Handbook of ethnography. London: Sage

Notes: Ethnography is one of the chief research methods in sociology, anthropology and other cognate disciplines in the social sciences. This is a landmark work in ethnography, which draws on the expertise of an internationally renowned group of interdisciplinary scholars. It provides readers with a one-stop critical guide to the past, present and future of ethnography. This book provides an unparalleled, critical guide to its principles and practice. The volume is organized into three sections. The first systematically locates ethnography firmly in its relevant historical and intellectual contexts. The roots of ethnography are pinpointed and the pattern of its development is demonstrated. The second section examines the contribution of ethnography to major fields of substantive research. The impact and strengths and weaknesses of ethnographic method are dealt with authoritatively and accessibly. The third section moves on to examine key debates and issues in ethnography, from the conduct of research through to contemporary arguments.

Hammersley, M., & Atkinson, P. (2007). Ethnography : principles in practice (3rd ed.). London: Routledge

Notes: This is an exceptionally well written book that makes all aspects of ethnography very accessible to readers. The book offers a systematic introduction to ethnographic principles and practice. Points in the text are illustrated with well-chosen examples from ethnographies. New material covers the use of visual and virtual research methods, hypermedia software, and the issue of ethical regulation. There is also a new prologue and epilogue. The authors argue that ethnography is best understood as a reflexive process. What this means is that we must recognise that social research is part of the world that it studies. From an outline of the principle of reflexivity in Chapter One, the authors go on to discuss and exemplify the main features of ethnographic work: the selection and sampling of cases; the problems of research access; observation and interviewing; recording and filing data; and the process of data analysis and writing research reports. The book also considers the ethical issues surrounding ethnographic research. Throughout, the discussion draws on a wide range of illustrative material from classic and more recent studies within a global context.

Okely, J. (1983). The Traveller-Gypsies. Cambridge: Cambridge University Press

Notes: This is a very well-written ethnographic classic that examines the Gypsies. In this book Judith Okely challenges popular accounts of Gypsies which suggest that they were once isolated communities, enjoying an autonomous culture and economy now largely eroded by the processes of industrialisation and western capitalism. This is an excellent ethnography, and the author provides some examples of the methods she used. The Traveller-Gypsies is the first monograph to be published on Gypsies in Britain using the perspective of social anthropology. It examines the historical origins of the Gypsies, their economy, travelling patterns, self-ascription, kinship and political groupings, and their marriage choices, upbringing and gender divisions. A detailed analysis of pollution beliefs reveals an underlying system which expresses and reinforces the separation of Gypsies from non-Gypsies. Explanations for beliefs are sought in their contemporary meaning as opposed to their alleged Indian origin. None of these aspects are analysed independently of the wider society, its policies, beliefs, and practices.

SURVEYS AND STATISTICAL METHODS (14)

Clason, D., & Dormody, T. (1993). Analyzing Data Measured by Individual Likert-Type Items. Journal of Agricultural Education, 35(4), 31-35. Last viewed on 17 February 2009. URL: http://pubs.aged.tamu.edu/jae/pdf/Vol35/35-04-31.pdf

Notes: This review examines the appropriateness of using different statistical techniques to examine individual Likert items that comprise part of a Likert scale.
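The core caution in this literature can be seen with a toy example: for a single ordinal Likert item, order-based summaries such as the median and mode are generally more defensible than the mean, which treats the response categories as equally spaced. The responses below are hypothetical.

```python
# Summarising one Likert item (1 = strongly disagree ... 5 = strongly agree).
# The mean assumes interval-level data; the median and mode only assume
# ordinal data, which is what a single Likert item actually provides.
from statistics import mean, median, mode

responses = [5, 4, 4, 5, 2, 1, 5, 4, 5, 3]
print(mean(responses), median(responses), mode(responses))
```

Here the mean (3.8) sits between two response categories that no respondent can actually choose, while the median (4) and mode (5) stay on the ordinal scale; for comparing groups on a single item, rank-based tests are the analogous choice.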


DeVellis, R. (2003). Scale development: Theory and applications (2nd ed. Applied social research methods: 26). Thousand Oaks: Sage Publications

Notes: Written at a surprisingly accessible level for such a complex topic, this monograph guides the reader toward the identification of the latent variable, the generation of an item pool, the format for measurement and the optimization of the scale length. Using exercises to illustrate the concepts, the text also includes advice about factor analytic strategies. This book is a must-read for anyone developing new measures for survey research.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail and mixed-mode surveys: The tailored design method. (3rd ed.). Hoboken: Wiley

Notes: This is a well-written text that offers good, practical, and straightforward advice that works. It is immediately useful for anyone thinking of doing a survey. It is a complete start-to-finish guide for every researcher to successfully plan and conduct Internet, mail, and telephone surveys. This book presents a succinct review of survey research methods, equipping you to increase the validity and reliability, as well as response rates, of your surveys. It has been thoroughly updated and revised with information about all aspects of survey research, grounded in the most current research. The third edition provides practical how-to guidelines on optimally using the Internet, mail, and phone channels to your advantage. This edition includes expanded coverage of online surveys. The book addresses: strategies and tactics for determining the needs of a given survey, how to design it, and how to effectively administer it; how and when to use mail, telephone, and Internet surveys to your maximum advantage; proven techniques to increase response rates; guidance on how to obtain high-quality feedback from mail, electronic, and other self-administered surveys; direction on how to construct effective questionnaires, including considerations of layout; and the effects of sponsorship on the response rates of surveys.

Field, A. (2005). Discovering Statistics using SPSS (2nd ed.). London: Sage

Notes: This edition of Field's textbook provides students of statistical methods with everything they need to understand, use and report statistics - at every level. Written in Andy Field's vivid and entertaining style, and furnished with playful examples from everyday student life (among other places), the book forms an accessible gateway into the often intimidating world of statistics and a unique opportunity for students to ground their knowledge of statistics through the use of SPSS. The title of the book accurately reflects the approach taken. This is not simply a primer on how to use SPSS, but is a very good statistics text using SPSS as a vehicle for illustrating and expanding on the statistical content of the book. At the same time it also serves as a manual for SPSS, and has taught me things that I had not known about the software. One advantage of the text is that it is not tied specifically to the latest version of SPSS.

Fowler, F. (1995). Improving survey questions: Design and evaluation (Applied Social Research Methods Series: 38). Los Angeles: Sage Publications

Notes: This book addresses the issue of "what is a good question?". Although there are several potential sources for error in survey data, the validity of surveys is dependent upon the design of the question asked. This book shows how to word and format questions that will evoke the kind of answers for which they are designed and how to evaluate empirically survey questions. In addition, the book covers: how to write good questions aimed at collecting information about objective facts and events; measuring subjective phenomena; some alternative methods for attacking common measurement problems; how to evaluate the extent to which questions are consistently understood and administered; and how to evaluate the data resulting from a set of questions.

Fowler, F. (2009). Survey research methods (4th ed. Applied social research methods: 1). Los Angeles: Sage Publications

Notes: An excellent introduction to the topic. Survey Research Methods presents the current methodological knowledge on surveys. The book provides information for those who want to collect, analyze, or read about survey data with a sound basis for evaluating how each aspect of a survey can affect its precision, accuracy, and credibility. This edition, the fourth, has been updated in four primary ways: it much more prominently addresses the growth of the Internet for data collection and the subsequent rapid expansion of online survey usage; it addresses the precipitous drop in response rates for telephone surveys, particularly those based on random-digit dialling; it offers new and expanded coverage of the continued improvement in techniques for pre-survey evaluation of questions; and it addresses the growing role of individual cell phones in addition to - and often instead of - household landlines. Two new chapters, "The Nature of Error in Surveys" and "Issues in Analyzing Survey Data," further emphasize the importance of minimizing non-sampling errors through superior question design, quality interviewing, and high response rates.

Garson, G. D. (2009, 28 January). Reliability Analysis. Retrieved 17 February 2009, from http://faculty.chass.ncsu.edu/garson/PA765/reliab.htm

Notes: Researchers must demonstrate that instruments are reliable since, without reliability, research results using the instrument are not replicable, and replicability is fundamental to the scientific method. Reliability is the correlation of an item, scale, or instrument with a hypothetical one which truly measures what it is supposed to. The page discusses Cronbach's alpha and other measures of reliability.
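The internal-consistency statistic these pages centre on, Cronbach's alpha, is straightforward to compute directly from item scores. A minimal standard-library sketch, with hypothetical Likert-item data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# Each inner list holds one item's scores across all respondents; the scores
# below are hypothetical.

def cronbach_alpha(items):
    """items: list of equal-length lists, one list per scale item."""
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def variance(xs):       # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(it) for it in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Three hypothetical items answered by five respondents (1-5 scale):
scores = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 1],
]
print(round(cronbach_alpha(scores), 2))  # 0.92 for these scores
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, though alpha also rises mechanically with the number of items.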

General Accounting Office. (1992). Quantitative Data Analysis: An Introduction (Methodology Transfer Paper PEMD-10.1.11). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://www.gao.gov/special.pubs/pe10111.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. The basic thesis of this paper is that successful data analysis, whether quantitative or qualitative, requires: 1) understanding a variety of data analysis methods; 2) planning data analysis early in a project and making revisions in the plan as the work develops; 3) understanding which methods will best answer the study questions posed, given the data that have been collected; and 4) once the analysis is finished, recognizing how weaknesses in the data or the analysis affect the conclusions that can properly be drawn. The study questions govern the overall analysis, of course. But the form and quality of the data determine what analyses can be performed and what can be inferred from them. This implies that the evaluator should think about data analysis at four junctures: 1) when the study is in the design phase; 2) when detailed plans are being made for data collection; 3) after the data are collected; and 4) as the report is being written and reviewed.

General Accounting Office. (1992). Using Statistical Sampling (Methodology Transfer Paper PEMD-10.1.6). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://archive.gao.gov/t2pbat6/146859.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. This paper provides its readers with a background on sampling concepts and methods that will enable them to identify jobs that can benefit from statistical sampling, to know when to seek assistance from a statistical sampling specialist, and to work with the specialist to design and execute a sampling plan. The paper describes sample design, selection and estimation procedures, and the concepts of confidence and sampling precision. Two additional topics, treated more briefly, are special applications of sampling to auditing and evaluation and some relationships between sampling and data collection problems. Last, but not least, the strengths and limitations of statistical sampling are summarized. The original paper was authored by Harry Conley and Lou Fink in April 1986. This reissued version supersedes the earlier edition.
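The sampling concepts such guides cover (precision, confidence, finite populations) reduce to a short calculation in the common case of estimating a proportion. The sketch below uses the standard formula for a simple random sample at roughly 95% confidence; the figures are illustrative, not taken from the paper.

```python
# Sample size needed to estimate a proportion p within a given margin of
# error. z = 1.96 corresponds to ~95% confidence; p = 0.5 is the
# conservative (maximum-variance) assumption. All inputs are illustrative.
import math

def sample_size(p=0.5, margin=0.05, z=1.96, population=None):
    """Required n for a simple random sample; optional finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    if population:
        # Finite population correction shrinks n when the population is small.
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

print(sample_size())                  # 385 for a large population
print(sample_size(population=2000))   # 323 with the finite population correction
```

The design effect of cluster sampling, common in humanitarian surveys, would inflate these numbers, typically by a factor of 2 or more.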

General Accounting Office. (1993). Developing and Using Questionnaires (Methodology Transfer Paper PEMD-10.1.7). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://archive.gao.gov/t2pbat4/150366.pdf


Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Division. This paper provides evaluators with a background that is of sufficient depth to use questionnaires in their evaluations. Specifically, this paper provides rationales for determining when questionnaires should be used to accomplish assignment objectives. It also describes how to plan, design, and use a questionnaire in conducting a population survey. The aim of the paper is to make evaluators familiar enough with questionnaire design guidelines to plan and use a questionnaire; to make preliminary designs and assist in many development and testing tasks; to communicate the questionnaire requirements to the measurement, sampling, and statistical analysis experts; and to ensure the quality of the final questionnaire and the resulting data collection. This 1993 paper does not cover the rapid changes in surveying modes that have occurred since the early 1990s.

Germuth, A. A. (2007). Improving Survey Quality: Assessing and Increasing Survey Reliability and Validity (PowerPoint presentation). Durham: Compass Consulting Group. Last viewed on 23 March 2009. URL: http://www.compassconsultinggroup.org/resources/docs/whitepapers/Improving_Survey_Quality.ppt

Notes: This presentation is part of a workshop on how to improve survey quality. The presentation explains reliability and validity with respect to surveys and shows ways to improve each during the survey design phase for both factual and subjective/abstract surveys. The presentation demonstrates ways to use pilot test responses to assess the reliability of subjective/abstract survey constructs by conducting confirmatory factor analysis and calculating Cronbach's alpha.

Nichols, P. (1991). Social survey methods: a field-guide for development workers (Development Guidelines: 6). Oxford: Oxfam

Notes: This monograph guides the reader through using social surveys for research in a development context. While the book briefly refers to qualitative methods (which the book refers to as informal methods), the main emphasis is on quantitative surveys. The book is well written and provides simple but solid guidance for those implementing a quantitative social survey.

Sapsford, R. (2007). Survey research (2nd ed.). London: Sage

Notes: This book presents an informative and accessible account of survey research. It guides the reader through the main theoretical and practical aspects of the subject and illustrates the application of survey methods through examples. Although this edition has been revised and updated and has far fewer errata than the first edition, it still contains several annoying numerical errors that force the reader to check the data themselves. The book presents: concise and analytic coverage of multivariate analysis techniques; a chapter giving theoretical and practical advice on the stages involved in constructing scales to measure attitude or personality; and an account of using materials on the internet.

Swisher, L. L., Beckstead, J. W., & Bebeau, M. J. (2004). Factor Analysis as a Tool for Survey Analysis Using a Professional Role Orientation Inventory as an Example. Physical Therapy, 84(9), 784-799. Last viewed on 21 April 2009. URL: http://www.ptjournal.org/cgi/content/abstract/84/9/784

Notes: The purpose of this article is to illustrate how confirmatory factor analysis can be used to extend and clarify a researcher's insight into a survey instrument beyond that afforded through the typical exploratory factor analytic approach. The authors use as an example a survey instrument developed to measure individual differences in professional role orientation among physical therapists, the Professional Role Orientation Inventory for Physical Therapists (PROI-PT). Five hundred and three physical therapists responded to a mail survey instrument that was sent to a random sample of 2,000 American Physical Therapy Association members. An adapted version of the Professional Role Orientation Inventory, a 40-item Likert-scale instrument developed to assess professional role orientation on 4 dimensions (authority, responsibility, agency, and autonomy), was used. Exploratory and confirmatory factor analyses were used to examine the factorial validity of the PROI-PT. Exploratory factor analysis served as a starting point for examining the factor structure of the instrument. Confirmatory factor analysis then was used to test the hypothesized factor structure and to suggest refinements to the PROI-PT that would improve a psychometric property (internal consistency). Although further refinement of the PROI-PT is needed, an instrument that yields valid and reliable measurements of individual differences in professionalism among physical therapists could further our understanding of the psychosocial aspects of physical therapist practice. Exploratory and confirmatory factor analyses can be used by researchers who study various psychosocial constructs in physical therapy.

QUALITATIVE ANALYSIS (5)

Firestone, W. A. (1993). Alternative Arguments for Generalizing from Data as Applied to Qualitative Research. Educational Researcher, 22(4), 16-23. Last viewed on 12 August 2009. URL: http://www.jstor.org/stable/1177100

Notes: One criticism about qualitative research is that it is difficult to generalize findings to settings not studied. To explore this issue, I examine three broad arguments for generalizing from data: sample-to-population extrapolation, analytic generalization, and case-to-case transfer. Qualitative research often uses the last argument, but some efforts have been made to use the first two. I suggest that analytic generalization can be very helpful for qualitative researchers but that sample-to-population extrapolation is not likely to be.

Golafshani, N. (2003). Understanding Reliability and Validity in Qualitative Research. The Qualitative Report, 8(4), 597-607. Last viewed on 30 October 2008. URL: http://www.nova.edu/ssss/QR/QR8-4/golafshani.pdf

Notes: The concepts of reliability and validity are common in quantitative research and are now being reconsidered within the qualitative research paradigm. Since reliability and validity are rooted in a positivist perspective, they should be redefined for use in a naturalistic approach. Just as reliability and validity as used in quantitative research provide a springboard for examining what these two terms mean in the qualitative research paradigm, triangulation, as used in quantitative research to test reliability and validity, can also illuminate ways to test or maximize the validity and reliability of a qualitative study. Reliability, validity and triangulation, if they are to remain relevant research concepts from a qualitative point of view, therefore have to be redefined to reflect the multiple ways of establishing truth.

Lewins, A., & Silver, C. (2007). Using software in qualitative research: a step-by-step guide. Los Angeles: Sage

Notes: This book is a primer for the use of Computer Assisted Qualitative Data Analysis (CAQDAS). It combines several aspects of CAQDAS, helping the reader choose the most appropriate package for their specific needs and get the most out of the software once they are using it. The text considers tasks and processes, bringing them together to demystify qualitative software and encourage flexible and critical choices and uses of software in supporting analysis.

Silverman, D. (2001). Interpreting qualitative data: methods for analyzing talk, text and interaction (2nd ed.). London: Sage

Notes: This book is a basic qualitative methods text on the analysis of qualitative data. It is a companion volume to David Silverman's Doing Qualitative Research: A Practical Handbook, together with the associated reader Qualitative Research: Theory, Method & Practice, which provides further, more focused material that students require before contemplating their own qualitative research study.

Trochim, W. M. K. (2006, 20 October). Qualitative Validity. Research Methods Knowledge Base Retrieved 30 October 2008, from http://www.socialresearchmethods.net/kb/qualval.php

Notes: Depending on their philosophical perspectives, some qualitative researchers reject the framework of validity that is commonly accepted in more quantitative research in the social sciences. They reject the basic realist assumption that there is a reality external to our perception of it. Consequently, it doesn't make sense to be concerned with the "truth" or "falsity" of an observation with respect to an external reality (which is a primary concern of validity). These qualitative researchers argue for different standards for judging the quality of research.

PARTICIPATORY APPRAISAL (19)

Kumar, S. (2002). Methods for community participation: a complete guide for practitioners. Rugby: ITDG Publishing. Last viewed on 30 October 2009. URL: http://developmentbookshop.com/product_info.php?products_id=541

Notes: This book presents participatory rural appraisal (PRA) in a comprehensive manner. The author defines PRA as a growing body of methods to enable local people to share, enhance and analyse their knowledge of life and conditions in order to plan, act, monitor and evaluate their actions. The basic premise of PRA is that poor and marginalized people are capable of analysing their own realities and that they should be enabled to do so. The book provides examples from experiences, material with directions for use, as well as possibilities for innovation. It contains insight from actual practice in the field, and contains useful tips on the best practices which readers and practitioners should find valuable. The book is divided into four parts. Part 1 deals with the concept of participation and explores its multiple dimensions. Parts 2 and 3 deal with the methods of PRA. Each method is explained with an introduction, applications, examples, a process outlining the steps, the time and material required, and the advantages and limitations of using these methods. The final part provides a summary of the concept of PRA.

World Bank. (2007). 24-hour calendar (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/4_24_hour_calendar.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes a 24-hour calendar, a visual method of showing the way people allocate their time between different activities over a 24-hour period. Enables understanding of the impact of policy changes/implementation on daily schedules, workloads, and time use. Reveals differences in schedules and workloads between people from different social groups and at different times of year and can be used to look at the social impacts (for example, on health and education) of different workloads.

World Bank. (2007). Asset wheel (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/5_Asset_Wheel.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes an asset wheel, a visual method of showing different assets/resources and the linkages between them. It is useful for understanding differences in the asset bases of different social groups; establishing an asset baseline, which can be used to explore livelihood strategies/diversification and opportunities for and constraints to increasing asset holdings; and examining potential impacts of a policy change on the asset bases of different social groups.

World Bank. (2007). Causal flow diagram (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/8_Causal_flow_diagram.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes a causal flow diagram, a method of showing diagrammatically the causes, effects, and relationships between variables associated with policy change and poverty and social change. Traces differences in cause–effect relationships by different social groups. Reveals relationships between economic, political, social, and environmental factors.

World Bank. (2007). Community profile (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375678936/2_Community_profile.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes a community profile, an overview of a community containing information on a broad range of factors (such as environmental/natural features and management, sociodemographic characteristics, political and economic structures, local institutions, economic activities and livelihoods, basic household and community facilities, and social organization).

World Bank. (2007). Community resource mapping (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375678936/4_Community_resource_mapping.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes community resource mapping, a method of showing information regarding the occurrence, distribution, access to, and use of resources; topography; human settlements; and activities of a community from the perspective of community members. This method enables people to picture resources and features and to show graphically the significance attached to them.

World Bank. (2007). Entitlements matrix (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/7_Entitlements_matrix.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes an entitlements matrix, a method of representing socially differentiated perceptions of and actual rights and entitlements, and of understanding differences in the way they are applied to different groups of people (such as women and men, poorer households, different ethnic groups, and so on). Useful for identifying possible linkages between the capacity and resources to claim rights and people's capacity to deal with risk and vulnerability, as well as potential impacts of policy reform on rights and entitlements.

World Bank. (2007). Institutional mapping/ Venn diagramming (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375938992/1_Insti_mapping_Venn_diagramming.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes institutional mapping/Venn diagramming, a visual method of identifying and representing perceptions of key institutions (formal and informal) and individuals inside and outside a community, as well as their relationships and importance. Enables understanding of how different community members perceive institutions both within the community (in terms of decision making, accessibility, and services) and outside the community (in terms of participation, accessibility, and services).

World Bank. (2007). Institutional perception mapping (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375938992/2_Insti_perception_mapping.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes institutional perception mapping, a visual method of identifying and representing perceptions of key institutions (formal and informal) and individuals inside and outside a community, as well as their relationships and importance to different social groups. Good for understanding the sets of social relations that mediate the transmission of a policy change.

World Bank. (2007). Livelihood matrix scoring (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/6_Livelihood_matrix_scoring.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes livelihood matrix scoring, a method of investigating the preferred and prioritized livelihood options of population subgroups against specified criteria (rather than a description of current livelihood strategies). Contributes to an understanding of possible impacts of policy reform on livelihood options and preferences.

World Bank. (2007). Mobility mapping (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375938992/3_Mobility_mapping.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes mobility mapping, a visual representation of people's movements within and outside their community. Identifies issues and problems related to socially differentiated mobility and access to resources (such as land, water, health and education services, information, capital, decision making, and so on) and the consequences of socially differentiated mobility for different social groups, their households, and livelihoods. Socially differentiated mobility within and outside a community can indicate differing levels of freedom, wealth, empowerment, and rights.

World Bank. (2007). Risk indexing (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/2_Risk_indexing.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes risk indexing, a systematic approach to identify, classify, and order sources of risk and to examine differences in risk perception.

World Bank. (2007). Risk mapping (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/1_Risk_mapping.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes risk mapping, a tool for understanding the vulnerability context, delineating perceptions of risk at different levels, and examining the multiple risks (the most vulnerable will experience multiple risks) and concomitant vulnerabilities that result from a policy change. Risk mapping helps to identify the covariance of risk and the coincidence of (multiple) vulnerabilities that impact most severely on the poorest.

World Bank. (2007). Seasonal calendar (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375811087/3_Seasonal_calendar.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes a seasonal calendar, a visual method of showing the distribution of seasonally varying phenomena (such as economic activities, resources, production activities, problems, illness/disease, migration, natural events/ phenomena, climate, and so on) over time. Nuances analysis of impact of policy change by revealing the seasonal variations in vulnerability and access to assets and resources. Useful for understanding the relationship between seasonally varying phenomena and livelihood strategies.

World Bank. (2007). Social mapping (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375678936/3_Social_mapping.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes social mapping, a visual method of showing the relative location of households and the distribution of people of different types (such as male, female, adult, child, landed, landless, literate, illiterate, and so on), together with the social structure and institutions of an area.

World Bank. (2007). Time Line Life histories (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375678936/6_Time_line.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes time line life histories, a tool for identifying trends and changes in poverty over time; it is very important to triangulate this information with secondary review, interviews, and survey data.

World Bank. (2007). Tools for Institutional, Political, and Social Analysis of Policy Reform: A Sourcebook for Development Practitioners. Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/TIPs_Sourcebook_English.pdf

Notes: This sourcebook was developed from an earlier World Bank website on analytical tools. It provides a framework and tools for focusing policy analysis on political economy, power relations, and social dynamics. The tools presented can be used at the macro, meso, and micro levels. Tools at all levels are likely to be of interest to evaluators. The text is supported by a series of short notes by the World Bank on Participatory Appraisal tools.

World Bank. (2007). Transect walk (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375678936/1_Transect_walk.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes a transect walk, a tool for describing and showing the location and distribution of resources, features, the landscape, and main land uses along a given transect.

World Bank. (2007). Wealth ranking (Participatory Tools for Micro-Level Poverty and Social Impact Analysis). Washington: World Bank. Last viewed on 31 October 2009. URL: http://siteresources.worldbank.org/EXTTOPPSISOU/Resources/1424002-1185304794278/4026035-1185375653056/4028835-1185375678936/5_Wealth_ranking.pdf

Notes: This is a supplementary tool from the World Bank's Sourcebook on Tools for Institutional, Political and Social Analysis for Policy Reform. This describes wealth ranking, a method that involves ranking different individuals, households, or communities according to locally developed criteria of well-being. Performing such exercises for communities as well as households or individuals illustrates the significance of factors and assets that affect poverty at the community, group, or household level.

OTHER METHODS (28)

Bamberger, M. (2000). Integrating quantitative and qualitative research in development projects (Directions in Development). Washington: World Bank Publications

Notes: This report is based on a two-day workshop held in June 1998, where outside research specialists and World Bank staff discussed the importance of integrating quantitative and qualitative research methods. The participants reviewed experiences in the use of mixed-method approaches in Bank research and project design. This report is a result of those discussions. The report examines the need for integrated research approaches in social and economic development, presents case studies of integrated approaches in practice, and discusses lessons learned.

Beebe, J. (2001). Rapid assessment process: An introduction. Walnut Creek: AltaMira Press. Last viewed on 30 October 2009. URL: http://books.google.co.uk/books?id=NkSJ3aaaUDIC&printsec=frontcover&dq=Rapid+Assessment+Process:+An+Introduction&ei=6q7qSpbXMJDczQSK7ZGIDA#v=onepage&q=&f=false

Notes: The book presents an approach to the rapid assessment of problematic situations using intensive, team-based qualitative inquiry with triangulation, interactive data analysis and focused data collection to quickly develop a preliminary understanding of a situation from an insider's perspective. The approach is essentially one of rapid ethnography built around triangulation of the data collected.

Davies, R., & Dart, J. (2005). The 'Most Significant Change' (MSC) Technique: A Guide to Its Use. London: CARE International. Last viewed on 9 April 2009. URL: http://www.mande.co.uk/docs/MSCGuide.pdf

Notes: This guide is aimed at organisations, community groups, students and academics who wish to use the Most Significant Change (MSC) approach to help monitor and evaluate their social change programs and projects, or to learn more about how it can be used. The technique is applicable in many different sectors, including agriculture, education and health, and especially in development programs. It is also applicable to many different cultural contexts. The guide's introductory chapter provides a quick overview of MSC before presenting practical advice and troubleshooting in the next two chapters. Other chapters look at building the capacity to use the MSC approach and at using it in project cycle management. The final chapters examine the theory underlying MSC before addressing issues of validity and the epistemological position of the approach.

Freedman, D., Thornton, A., Camburn, D., Alwin, D., & Young-DeMarco, L. (1988). The Life History Calendar: A Technique for Collecting Retrospective Data. Sociological Methodology, 18, 37-68. Last viewed on 26 November 2009. URL: http://www.jstor.org/stable/271044

Notes: This paper details the authors' selection, design, and use of a life history calendar (LHC) to collect retrospective life course data. A sample of nine hundred 23-year-olds, originally interviewed in 1980, were asked about the incidence and timing of various life events in the nine years since their 15th birthday. The accuracy of the LHC retrospective data can be tested by comparing the 1980 reports about current activities with the 1985 LHC retrospective reports about those same activities during the 1980 interview month. The following aspects of the LHC are described: (a) the concept, uses, and advantages of the LHC, (b) the time units and domains used, (c) the mode of recording the responses and the decisions and problems involved, (d) interviewer training, and (e) coding. The following results attest to the accuracy of the LHC retrospective data: (a) only four of the calendars had missing data in any month; (b) the data obtained in 1980 about current work, school attendance, marriage, and children showed a remarkable correspondence to the retrospective 1985 LHC reports of these events; (c) the interviewers were positive about the LHC's ability to increase respondent recall.

Hutchby, I., & Wooffitt, R. (1998). Conversation analysis: Principles, practices and applications. Cambridge: Polity Press

Notes: This is the first edition of the book; a 2nd edition was published in 2008. The book provides a wide-ranging and readable introduction to the somewhat esoteric field of conversation analysis, making it more accessible. The authors use a number of concrete examples to illustrate the method.

Serrat, O. (2008). Appreciative Inquiry (Knowledge Solutions 21). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/appreciative-inquiry.pdf

Notes: Appreciative inquiry is the process of facilitating positive change in organizations. Its basic assumption is uncomplicated: every organization has something that works well. Appreciative inquiry is therefore an exciting generative approach to organizational development. At a higher level, it is also a way of being and seeing. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Building Communities of Practice (Knowledge Solutions 4). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Building-Communities-Practice.pdf

Notes: Communities of practice are groups of like-minded, interacting people who filter, analyze, invest and provide, convene, build, and learn and facilitate to ensure more effective creation and sharing of knowledge in their domain. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Conducting Peer Assists (Knowledge Solutions 1). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Conducting-Peer-Assists.pdf

Notes: Peer assists are events that bring individuals together to share their experiences, insights, and knowledge on an identified challenge or problem. They also promote collective learning and develop networks among those invited. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Conducting Successful Retreats (Knowledge Solutions 23). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Conducting-Successful-Retreats.pdf

Notes: A retreat is a meeting designed and organized to facilitate the ability of a group to step back from day-to-day activities for a period of concentrated discussion, dialogue, and strategic thinking about their organization's future or specific issues. Organizations will reap full benefits if they follow basic rules. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Creating and Running Partnerships (Knowledge Solutions 9). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Creating-Running-Partnerships.pdf

Notes: Partnerships have a crucial role to play in the development agenda. To reach the critical mass required to reduce poverty, there must be more concerted effort, greater collaboration, alignment of inputs, and a leveraging of resources and effort. Understanding the drivers of success and the drivers of failure helps efforts to create and run them. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Culture Theory (Knowledge Solutions 22). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Culture-Theory.pdf

Notes: Culture theory strengthens the expectation that markets work, not because they are comprised of autonomous individuals who are free of social sanctions but because they are powered by social beings and their distinctive ideas, beliefs, values, and knowledge. It can contribute to understanding and promoting development where group relationships predominate and individualism is tempered. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Focusing on Project Metrics (Knowledge Solutions 16). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Focusing-Project-Metrics.pdf

Notes: The need to ensure that scarce funding is applied to effective projects is a goal shared by all. Focusing on common parameters of project performance is a means to that end. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Outcome Mapping (Knowledge Solutions 17). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Outcome-Mapping.pdf

Notes: Development is about people—it is about how they relate to one another and their environment, and how they learn in doing so. Outcome mapping puts people and learning first and accepts unexpected change as a source of innovation. It shifts the focus from changes in state, viz. reduced poverty, to changes in behaviors, relationships, actions, and activities. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Output Accomplishment and the Design and Monitoring Framework (Knowledge Solutions 6). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Output-Accomplishment.pdf

Notes: The design and monitoring framework is a logic model for objectives-oriented planning that structures the main elements in a project, highlighting linkages between intended inputs, planned activities, and expected results. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Reading the Future (Knowledge Solutions 11). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Reading-Future.pdf

Notes: Scenario-building enables managers to invent and then consider in depth several varied stories of equally plausible futures. They can then make strategic decisions that will be sound for all plausible futures. No matter what future takes place, one is more likely to be ready for and influential in it if one has thought seriously about scenarios. Scenario planning challenges mental models about the world and lifts the blinders that limit our creativity and resourcefulness. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). The Reframing Matrix (Knowledge Solutions 20). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/The-Reframing-Matrix.pdf

Notes: Everyone sees things differently—knowledge often lies in the eye of the beholder. The reframing matrix enables different perspectives to be generated and used in management processes. It expands the number of options for solving a problem. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Storytelling (Knowledge Solutions 10). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Storytelling.pdf

Notes: Storytelling is the use of stories or narratives as a communication tool to value, share, and capitalize on the knowledge of individuals. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). The Sustainable Livelihoods Approach (Knowledge Solutions 15). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Sustainable-Livelihoods-Approach.pdf

Notes: The sustainable livelihoods approach improves understanding of the livelihoods of the poor. It organizes the factors that constrain or enhance livelihood opportunities, and shows how they relate. It can help plan development activities and assess the contribution that existing activities have made to sustaining livelihoods. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Assessing the Effectiveness of Assistance in Capacity Development (Knowledge Solutions 29). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/assessing-effectiveness-assistance-capacity-development.pdf

Notes: Feedback is the dynamic process of presenting and disseminating information to improve performance. Feedback mechanisms are increasingly being recognized as key elements of learning before, during, and after. Assessments by executing agencies of the effectiveness of assistance in capacity development are prominent among these. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Building Networks of Practice (Knowledge Solutions 34). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Building-Networks-of-Practice.pdf

Notes: Organizational boundaries have been stretched, morphed, and redesigned to a degree unimaginable 10 years ago. Networks of practice have come of age. The learning organization pays attention to their forms and functions, evolves principles of engagement, circumscribes and promotes success factors, and monitors and evaluates performance with knowledge performance metrics. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Conducting Effective Meetings (Knowledge Solutions 36). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Conducting-Effective-Meetings.pdf

Notes: Meetings bring people together to discuss a predetermined topic. However, too many are poorly planned and managed, and therefore fail to satisfy objectives when they do not simply waste time. The operating expenses of time wasted include related meeting expenditures, salaries, and opportunity costs. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Drawing Mind Maps (Knowledge Solutions 40). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/drawing-mind-maps.pdf

Notes: Mind maps are a visual means that represent, link, and arrange concepts, themes, or tasks, with connections usually extending radially from a central topic. They are used by individuals and groups (informally and intuitively) to generate, visualize, structure, and classify these. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). The Five Whys Technique (Knowledge Solutions 30). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/The-Five-Whys-Technique.pdf

Notes: When confronted with a problem, have you ever stopped and asked "why" five times? If you do not ask the right question, you will not get the right answer. The Five Whys is a simple question-asking technique that explores the cause-and-effect relationships underlying problems. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Growing Managers Not Bosses (Knowledge Solutions 38). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/growing-managers-not-bosses.pdf

Notes: In the 21st century, managers are responsible for the application and performance of knowledge at task, team, and individual levels. Their accountability is absolute and cannot be relinquished. In a changing world, successful organizations spend more time, integrity, and brainpower on selecting them than on anything else. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). The Most Significant Change Technique (Knowledge Solutions 25). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Most-Significant-Change.pdf

Notes: The Most Significant Change technique helps monitor and evaluate the performance of projects and programs. It involves the collection and systematic participatory interpretation of stories of significant change emanating from the field level—stories about who did what, when, and why, and the reasons why the event was important. It does not employ quantitative indicators. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). The SCAMPER Technique (Knowledge Solutions 31). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/The-SCAMPER-Technique.pdf

Notes: Ideas are not often plucked out of thin air. The SCAMPER brainstorming technique uses a set of directed questions to resolve a problem (or meet an opportunity). It can also turn a tired idea into something new and different. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Social Network Analysis (Knowledge Solutions 28). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Social-Network-Analysis.pdf

Notes: Power no longer resides exclusively (if at all) in states, institutions, or large corporations. It is located in the networks that structure society. Social network analysis seeks to understand networks and their participants and has two main focuses: the actors and the relationships between them in a specific social context. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Working in Teams (Knowledge Solutions 33). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/working-in-teams.pdf

Notes: Cooperative work by a team can produce remarkable results. The challenge is to move from the realm of the possible to the realm of practice. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

STANDARDS AND ETHICS (9)

Bebeau, M. J., Pimple, K. D., Muskavitch, K. M. T., Borden, S. L., & Smith, D. H. (1995). Moral Reasoning in Scientific Research: Cases for Teaching and Assessment. Bloomington: Poynter Center for the Study of Ethics and American Institutions. Last viewed on 24 July 2009. URL: http://poynter.indiana.edu/mr/mr.pdf

Notes: This booklet is a package of materials developed to help enhance the reader's ability to develop well-reasoned responses to the kinds of ethical problems that are likely to arise in the practice of science. It has been tailored for the training of graduate students in the biomedical sciences, but the materials may also work with researchers from a variety of backgrounds. The package contains six cases for discussion and instructions for students describing the rationale and procedures for this exercise. The booklet suggests that professionals, including professional scientists, have a particular responsibility to have well-developed skills of moral reasoning. The booklet briefly sets forth the following four criteria for evaluating the adequacy of a moral argument: 1) whether the response addresses each of the ethical issues and points of ethical conflict presented in the case or problem; 2) whether each interested party's legitimate expectations are considered; 3) whether the consequences of acting are recognized, specifically described (not just generally mentioned), and incorporated into the decision; 4) whether each of the obligations or duties of the protagonist is described.

Cassell, J., & Jacobs, S.-E. (Eds.). (2006). Handbook on Ethical Issues in Anthropology (Special publication of the American Anthropological Association: 23). Arlington: American Anthropological Association. Last viewed on 3 August 2009. URL: http://www.aaanet.org/committees/ethics/toc.htm

Notes: This handbook was sponsored by the Committee on Ethics of the American Anthropological Association (AAA) to stimulate discussion and reflection on ethical issues. Chapter 1 contains a brief review essay and an annotated bibliography. Chapter 2 presents the background to the formation of that Committee and the writing of the AAA's first code of ethics, the Principles of Professional Responsibility. This code, still in effect, has been revised substantially over a period of ten years. Chapters 3 and 4 contain a series of ethical dilemmas, first published in the Anthropology Newsletter. All were actual dilemmas. The solutions used by the anthropologists who provided the dilemmas were published the following month, with readers asked to comment on dilemmas and solutions. Chapter 3 and Chapter 4 contain dilemmas presented by Jacobs and Cassell respectively. The cases are presented in the order in which they were published, with a title assigned to each case.

Joint Committee on Standards for Educational Evaluation, & Sanders, J. (1994). The program evaluation standards (2nd ed.). Thousand Oaks: Sage Publications

Notes: This revised edition of the original standards presents 30 standards for evaluation. These are divided into four groups corresponding to the attributes of fair programme evaluation: utility, feasibility, propriety, and accuracy. This edition combines some of the original standards and adds others, with new case illustrations featuring applications of the standards to reform efforts in a diverse range of settings including schools, universities, law, medicine, nursing, business and social service agencies. Taken as a set, the Standards provide a working philosophy for evaluation which will lead to useful, feasible, ethical and sound programme evaluation. They also provide a possible tool for the assessment of evaluation quality.

OECD/DAC. (1991). Principles for evaluation of development assistance [OCDE/GD(91)208]. Paris: Development Assistance Committee of the Organisation for Economic Cooperation and Development. Last viewed on 16 June 2008. URL: http://www.oecd.org/dataoecd/31/12/2755284.pdf

Notes: The following set of principles state the views of DAC Members on the most important requirements of the evaluation process based on current policies and practices as well as donor agency experiences with evaluation and feedback of results. This report was endorsed at the DAC High-Level Meeting held on 3 and 4 December 1991. It is made available to the public on the responsibility of the Secretary-General of the OECD.

OECD/DAC NDE. (2006). DAC Evaluation Quality Standards (for test phase application). Paris: Network on Development Evaluation, Development Assistance Committee of the Organisation for Economic Cooperation and Development. Last viewed on 9 July 2009. URL: http://www.oecd.org/dataoecd/30/62/36596604.pdf

Notes: The DAC Evaluation Quality Standards identify the key pillars needed for a quality evaluation process and product. They have been prepared by DAC members in order to define member countries' expectations of evaluation processes and evaluation products. The Standards are not binding on member countries, but are a guide to good practice that aims to improve the quality of development intervention evaluations. They are intended to contribute to a harmonised approach to evaluation in line with the principles of the Paris Declaration on Aid Effectiveness. The quality standards cover ten areas of evaluation: rationale; scope; context; methodology; information sources; independence; ethics; quality assurance; relevance of findings; and completeness. A revised version was due to be circulated in mid-2009.

OECD/DAC NDE. (2008). Evaluating Development Cooperation: Summary of key norms and standards (Evaluation and aid effectiveness). Paris: Organisation for Economic Co-operation and Development, Development Assistance Committee Network on Development Evaluation. Last viewed on 09 July 2009. URL: http://www.oecd.org/dataoecd/12/56/41612905.pdf

Notes: This document provides a very brief and rapid overview of different OECD/DAC evaluation norms and standards, but does not refer to any other norms or standards. The OECD/DAC body of norms and standards is based on the experience of donor administrations, and evolves over time to fit the changing aid environment. The norms and standards summarised here should be applied discerningly and adapted carefully to fit the purpose, object and context of each evaluation. As this summary document is not an exhaustive evaluation manual readers are encouraged to refer to the complete texts available on the DAC website.

Stufflebeam, D. L., Goodyear, L., Marquart, J., & Johnson, E. (2005). Guiding Principles Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 6 July 2009. URL: http://www.wmich.edu/evalctr/checklists/guidingprinciples2005.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. It provides guidance for using the American Evaluation Association's Guiding Principles for Evaluators in planning, conducting, and evaluating evaluations. Checkpoints are organized around the principles of systematic inquiry, competence, integrity/honesty, respect for people, and responsibilities for general and public welfare. The checklist is formatted to facilitate its use for meta-evaluations.

UNEG. (2008). UNEG Code of Conduct for Evaluation in the UN System. New York: United Nations Evaluation Group. Last viewed on 12 November 2009. URL: http://www.unevaluation.org/documentdownload?doc_id=100&file_id=547

Notes: The conduct of evaluators in the UN system should be beyond reproach at all times. Any deficiency in their professional conduct may undermine the integrity of the evaluation, and more broadly evaluation in the UN or the UN itself, and raise doubts about the quality and validity of their evaluation work. This Code of Conduct applies to all evaluation staff and consultants in the UN system. The principles behind the Code of Conduct are fully consistent with the Standards of Conduct for the International Civil Service by which all UN staff are bound. UN staff are also subject to any UNEG member-specific staff rules and procedures for the procurement of services. The provisions of the UNEG Code of Conduct apply to all stages of the evaluation process, from the conception to the completion of an evaluation and the release and use of the evaluation results.

UNEG. (2008). UNEG Ethical Guidelines for Evaluation. New York: United Nations Evaluation Group. Last viewed on 12 November 2009. URL: http://www.uneval.org/documentdownload?doc_id=102&file_id=548

Notes: Based on commonly held and internationally recognized professional ideals such as those outlined in the Standards of Conduct set by the International Civil Service, the UNEG Ethical Guidelines expand on the UNEG Code of Conduct for Evaluation in the UN System.

FRAMEWORK FOR EVALUATION (10)

Borton, J. (Ed.). (1994). Code of conduct for the International Red Cross and Red Crescent Movement and NGOs in disaster relief: Network Paper 7 (Network paper - Relief and Rehabilitation Network: 7). London: Relief and Rehabilitation Network, Overseas Development Institute. Last viewed on 21 January 2009. URL: http://www.odihpn.org/documents/networkpaper07.pdf

Notes: The appearance of a Code of Conduct setting standards for the work of NGOs involved in the provision of humanitarian aid is a significant and welcome step - all the more so because it is a collaborative product of many of the largest non-governmental agencies within the International Relief System. The text of the Code provides surprisingly little information on its origins, the process by which it was developed and how it is expected to operate in practice. Such information provides the necessary context for any assessment of the Code's significance and value and also for any discussion over the actual text of the Code. This document is intended to provide such background information so as to give RRN members a sufficient basis upon which to make their assessment of the Code. It has been prepared by the RRN Coordinator drawing on discussions with some of the individuals involved in the preparation of the Code. The actual text and contents of the Code have not been commented upon.

Good Humanitarian Donorship. (2003). Principles and Good Practice of Humanitarian Donorship. Stockholm: Germany, Australia, Belgium, Canada, the European Commission, Denmark, the United States, Finland, France, Ireland, Japan, Luxembourg, Norway, the Netherlands, the United Kingdom, Sweden and Switzerland. Last viewed on 14 May 2009. URL: http://www.reliefweb.int/ghd/a%2023%20Principles%20EN-GHD19.10.04%20RED.doc

Notes: This one-page document presents the 23 principles and good practice of humanitarian donorship. This is sometimes referred to as the Good Humanitarian Donorship Initiative (GHDI). The GHD principles were endorsed in Stockholm on 17 June 2003 by seventeen major donors: Germany, Australia, Belgium, Canada, the European Commission, Denmark, the United States, Finland, France, Ireland, Japan, Luxembourg, Norway, the Netherlands, the United Kingdom, Sweden and Switzerland. The principles contain a useful definition of Humanitarian Action in Principle 1: The objectives of humanitarian action are to save lives, alleviate suffering and maintain human dignity during and in the aftermath of man-made crises and natural disasters, as well as to prevent and strengthen preparedness for the occurrence of such situations. The principles include three principles defining humanitarian action, seven general principles, four funding principles, six principles on promoting standards, and three principles on learning and accountability.

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Berrett-Koehler

Notes: This is an updated edition of the bestselling classic. In 1959 Donald Kirkpatrick developed a four-level model for evaluating training programs. Since then, the "Kirkpatrick Model" has become the most widely used approach to training evaluation in the corporate, government, and academic worlds. Evaluating Training Programs provided the first comprehensive guide to Kirkpatrick's Four Level Model, along with detailed case studies of how the model is being used successfully in a wide range of programs and institutions. Kirkpatrick's model focuses on four areas for a more comprehensive approach to evaluation: Evaluating Reaction, Evaluating Learning, Evaluating Behavior, and Evaluating Results. In the third edition of this classic bestseller, Kirkpatrick offers new forms and procedures for evaluating at all levels and several additional chapters about using balanced scorecards and "Managing Change Effectively." He also includes twelve new case studies from organizations that have been evaluated using one or more of the four levels.

Lautze, S., & Raven-Roberts, A. (2003). The Vulnerability Context: Is There Something Wrong With This Picture? Paper presented at the Food Security in Complex Emergencies: building policy frameworks to address longer-term programming challenges, Tivoli, 23-25 September 2003. Last viewed on 13 November 2009. URL: http://www.fao.org/crisisandhunger/root/pdf/lautze.pdf

Notes: This paper presents the livelihoods model developed by Lautze and Raven-Roberts. It explores the characteristics of complex emergencies and places a central focus on the role of violence as the key, singular and defining characteristic of a range of disasters categorized as "complex emergencies". The nature of violence and its implications for relief responders is briefly reviewed. The majority of the paper analyses the challenges facing livelihoods specialists working to analyze the impact of complex emergencies on livelihoods systems using presently available sustainable livelihoods frameworks. Specifically, the particular relationships between violence and: a) assets; b) processes, institutions and policies; and c) outcomes are explored. Further work is needed to complete the analysis, particularly with respect to the impact of violence on other elements of livelihoods frameworks, specifically: access and influence; and livelihoods strategies. This paper concludes that some aspects of sustainable livelihoods frameworks need to be modified in order to increase the utility and relevance of livelihoods frameworks in complex emergencies. Importantly, this entails a shift of focus from sustainable livelihoods to resilient livelihoods.

OCHA. (2004). Guiding Principles on Internal Displacement. Geneva: OCHA. Last viewed on 13 November 2009. URL: http://www3.brookings.edu/fp/projects/idp/resources/GPEnglish.pdf

Notes: The Guiding Principles, also known as the Deng Guiding Principles, seek to protect all internally displaced persons in internal conflict situations, natural disasters and other situations of forced displacement. They are one of the most important contributions of the mandate of the Representative of the Secretary-General, who presented them to the UN Commission on Human Rights in 1998. In unanimously adopted resolutions, the UN Commission and the General Assembly have taken note of the Principles, welcomed their use as an important tool and standard, and encouraged UN agencies, regional organizations, and NGOs to disseminate and apply them. Individual governments have begun to incorporate them in national policies and laws, international organizations and regional bodies have welcomed and endorsed them, and some national courts have begun to refer to them as relevant restatements of existing international law. In his 2005 report on UN reform (In Larger Freedom) the UN Secretary-General refers to the Principles as "the basic international norm for protection" of IDPs.

OCHA CMCS. (2007). Guidelines On The Use of Foreign Military and Civil Defence Assets In Disaster Relief: "Oslo Guidelines", Revision 1.1. Geneva: Civil-Military Coordination Section of the United Nations Office for the Coordination of Humanitarian Affairs. Last viewed on 8 June, 2008. URL: http://www.reliefweb.int/rw/lib.nsf/db900sid/AMMF-6VXJVG/$file/OCHA-Nov2006.pdf?openelement

Notes: The "Oslo Guidelines" were originally prepared over a period of two years beginning in 1992. They were the result of a collaborative effort that culminated in an international conference in Oslo, Norway, in January 1994 and were released in May 1994. The unprecedented deployment in 2005 of military forces and assets in support of humanitarian response to natural disasters, following an increasing trend over the past years, confirmed the need to update the 1994 "Oslo Guidelines". The Consultative Group on the Use of Military and Civil Defence Assets (MCDA), at its annual meeting in December 2005, tasked OCHA's Civil-Military Coordination Section (CMCS) with this revision, to reflect current terminology and organizational changes, following a layout similar to the 2003 "Guidelines on the Use of Military and Civil Defence Assets to Support United Nations Humanitarian Activities in Complex Emergencies" ("MCDA Guidelines").

People in Aid. (2003). Code of Good Practice in the management and support of aid personnel. London: People in Aid. Last viewed on 7 July 2009. URL: http://www.peopleinaid.org.uk/pool/files/publications/code.pdf

Notes: The first version of the Code of Best Practice was drawn up between 1995 and 1997 through extensive consultation. Although originally driven by agencies in the UK and Ireland, with funding from the UK government's then Overseas Development Administration, the input on best practice also came from the UN family, from the USA, from Continental Europe and from the human resources and field experience of a large number of individuals. This revised edition of the People In Aid Code reflects the changes in internal priorities and external influences that agencies now face. The Code sets out seven guiding principles on: human resources strategy; staff policies and practices; managing people; consultation and communication; recruitment and selection; learning, training and development; health, safety and security.

Smith, D. (2004). Towards a strategic framework for peacebuilding: getting their act together: Overview report of the Joint Utstein Study of Peacebuilding (Evaluation Report: 1/2004). Oslo: Royal Norwegian Ministry of Foreign Affairs. Last viewed on 23 July 2009. URL: http://www.regjeringen.no/upload/kilde/ud/rap/2004/0044/ddd/pdfv/210673-rapp104.pdf

Notes: The Joint Utstein Peacebuilding study was developed by the Evaluation Departments of the respective foreign and development cooperation ministries (Germany, the Netherlands, Norway and the UK), with Norway taking the lead, to carry out a survey of peacebuilding experience. The International Peace Research Institute, Oslo (PRIO) assisted in conceptualizing the study, and was then chosen as the lead consultant to manage the research. The research framework relied on each of the four departments to find research assistants to carry out the four surveys according to PRIO's instructions. It was then agreed that the research teams should also independently write country papers outlining and reviewing key policy issues, drawing on the material unearthed in the surveys and supplemented by interviews. This report draws on the four independent national studies to identify key findings for analysis and comparison. The overall findings of this report centre on the challenges presented in defining policy terms, articulating goals, key concepts and vocabulary in peacebuilding. A key finding is that a major strategic deficit exists between the articulation of policy and efforts to translate this policy into practice. The international comparison and the scale of the survey of activities combine to form a unique basis for this report.

Twigg, J. (2007). Characteristics of a Disaster-resilient Community: A Guidance Note: Version 1 (for field testing). London: DFID Disaster Risk Reduction Interagency Coordination Group. Last viewed on 25 September 2008. URL: http://www.benfieldhrc.org/disaster_studies/projects/communitydrrindicators/Characteristics_disaster_high_res.pdf

Notes: In 2006, as part of the British Overseas NGOs for Development (BOND) DFID DRR Subworking Group, Tearfund together with ActionAid, Christian Aid, Plan International, Practical Action, and the British Red Cross along with the International Federation of Red Cross and Red Crescent Societies commissioned the DRR consultant John Twigg to define a disaster-resilient community. This came about after it was recognised that there were no resources illustrating what a community empowered by the Hyogo Framework for Action looks like, even though governments, organisations and communities were being asked to embrace this global framework. The six commissioning agencies are now in the process of field testing the characteristics across their projects, with the aim of producing an updated version in time for the Global Platform in Geneva, 2009.


UNISDR. (2005). Hyogo Framework for Action 2005-2015: Building the Resilience of Nations and Communities to Disasters: Extract from the final report of the World Conference on Disaster Reduction: 18-22 January 2005, Kobe, Hyogo, Japan. Geneva: International Strategy for Disaster Reduction. Last viewed on 18 November 2008. URL: http://www.unisdr.org/eng/hfa/docs/Hyogo-framework-for-action-english.pdf

Notes: This is a short extract from the final report of the 2005 World Conference on Disaster Reduction, which established the Hyogo Framework for Action. The World Conference on Disaster Reduction was held from 18 to 22 January 2005 in Kobe, Hyogo, Japan, and adopted the Framework for Action 2005-2015: Building the Resilience of Nations and Communities to Disasters (hereafter referred to as the "Framework for Action"). The report stresses the importance of disaster risk reduction being underpinned by a more pro-active approach to informing, motivating and involving people in all aspects of disaster risk reduction in their own local communities. It also highlights the scarcity of resources allocated specifically from development budgets for the realization of risk reduction objectives, either at the national or the regional level or through international cooperation and financial mechanisms, while noting the significant potential to better exploit existing resources and established practices for more effective disaster risk reduction.

EVALUATION: EVALUABILITY (3)

European Union. (2003). Evaluating Socio Economic Development, Source 2: Methods & Techniques: Evaluability assessment [Electronic Version]. Retrieved 5 October 2009, from http://ec.europa.eu/regional_policy/sources/docgener/evaluation/evalsed/downloads/sb2_evaluability_assessment.doc

Notes: 'Evaluability assessment' is an assessment prior to commencing an evaluation to establish whether a programme or policy can be evaluated and what might be the barriers to its effective and useful evaluation. It requires a review of the coherence and logic of a programme, clarification of data availability, an assessment of the extent to which managers or stakeholders are likely to use evaluation findings given their interests, and the timing of any evaluation vis-à-vis future programme or policy decisions. In addition to assisting evaluators, 'evaluability assessment' has been acknowledged as useful for policy makers, programme managers and other stakeholders or partners. The process of undertaking an assessment early on can help clarify the logic of programmes and lead to fine-tuning or improvement before the programme has progressed too far.

Kaufman-Levy, D., & Poulin, M. (2003). Evaluability assessment: Examining the readiness of a program for evaluation. Washington: Justice Research and Statistics Association. Last viewed on 5 October 2009. URL: http://www.jrsainfo.org/pubs/juv-justice/evaluability-assessment.pdf

Notes: The purpose of this briefing is to introduce program managers to the concept of Evaluability Assessment (EA). Developed by Joseph Wholey in 1979, EA is a tool that can help an evaluator determine whether a program meets the criteria for a meaningful evaluation to take place. Even if a program does not undergo a formal EA, the concepts and ideas are nevertheless important for a program manager to understand and consider prior to having an evaluation conducted. Program managers should bear in mind how these concepts may affect the evaluation process and results.

Trevisan, M., & Huang, Y. (2003). Evaluability assessment: A primer [Electronic Version]. Practical Assessment, Research and Evaluation, 8. Retrieved 5 October 2009, from http://pareonline.net/getvn.asp?v=8&n=20

Notes: This paper has two purposes. The first is to increase awareness among policymakers and practitioners of the power and utility of evaluability assessment (EA), particularly at the state and local level. To this end, the background and rationale for this evaluation strategy is documented. The second is to support and promote its use. The article provides an outline of the procedures for conducting effective EA. While there are detailed resources available (e.g., Nay & Kay, 1982; Smith, 1989; Wholey, 1983), we provide a simplified, accessible presentation of the EA procedure and framework. An example is also provided. Issues and benefits are discussed.

JOINT EVALUATION (3)

Laegreid, T. (2009). Joint Evaluations and Learning in Complex Emergencies: Lessons from the Humanitarian and Development Domains (Security in Practice 7). Oslo: Norwegian Institute of International Affairs (NUPI). Last viewed on 12 December 2009. URL: http://www.nupi.no/content/download/11481/112059/version/4/file/SIP-7-NUPI+Report-L%C3%A6greid.pdf

Notes: This article argues that joint evaluations can be one tool useful for improving the coordination of very diverse actors. It examines some experiences of joint evaluations and learning processes undertaken mainly in the humanitarian domain, where valuable lessons have been learned over the past 15 years. This report examines the following specific cases: the joint donor evaluation of Rwanda (1995–96), the Tsunami Evaluation Coalition (TEC) 2005–2006 and the Inter-agency Real-time evaluation of Darfur (2004–2005).

OECD/DAC. (2000). Effective Practices in Conducting a Joint Multi-Donor Evaluation (Evaluation and aid effectiveness: 4). Paris: Organisation for Economic Co-operation and Development, Development Assistance Committee. Last viewed on 09 July 2009. URL: http://www.oecd.org/dataoecd/10/28/2667318.pdf

Notes: Joint multi-donor evaluations provide opportunities but also create challenges. The collaborative nature of such evaluations requires special attention and handling. This guide sets out the key steps to be taken when planning and implementing multi-donor evaluations. It draws upon experiences encountered in various types of multi-donor evaluations, some of which were conducted by the DAC Working Party on Aid Evaluation, while others were joint evaluations of multilateral agencies such as UNICEF, or more recently the European Commission's aid programmes and UNCDF. Insights have also been gained through joint evaluations by the Nordic countries.

OECD/DAC. (2006). Guidance for Managing Joint Evaluations. Paris: OECD/DAC. Last viewed on 6 July 2009. URL: http://www.oecd.org/dataoecd/29/28/37512030.pdf

Notes: This booklet is directed at the wider evaluation community and provides practical advice and tips for those involved in planning and implementing joint evaluations. The update and revision are based on both the earlier publication Effective Practices in Conducting a Joint Multi-Donor Evaluation (2000) and the new findings and recommendations detailed in the consultant report to the DAC Evaluation Network (2005).

REAL TIME EVALUATION (3)

Cosgrave, J., Ramalingam, B., & Beck, T. (2009). Real-time evaluations of humanitarian action: An ALNAP Guide: Pilot version. London: Active Learning Network for Accountability and Performance in Humanitarian Action. Last viewed on 3 March 2009. URL: http://www.alnap.org/publications/pdfs/RTEguide.pdf

Notes: Real-time evaluation (RTE) is probably one of the most demanding types of evaluation practice, requiring not only a wide range of skills from evaluators but also a tightly focused professional approach in order to meet the time demands of an RTE. However, RTEs are not about doing regular evaluation work faster. Approaches in RTEs must be different from those in regular evaluations because of the limited time available to make an evaluative judgement. This pilot guide is intended to help both evaluation managers and team leaders in commissioning, overseeing and conducting real-time evaluations of humanitarian operational responses. Drawing on a synthesis of existing good practices, it is intended as a flexible resource that can be adapted to a variety of contexts. This guide concentrates on RTEs undertaken in the first phase of an emergency response, where the RTE fieldwork takes place within a few months of the start of the response. This is because such RTEs pose particular problems for both the evaluation manager and the evaluation team. RTEs that take place later on in the response are closer to ex-post humanitarian evaluations, but this guide also addresses how such RTEs can feed into ongoing operations. The focus of this guide is therefore on what is distinctive about humanitarian RTEs. It does not offer advice on evaluation methodologies in general, but on specific aspects of methodology which make RTEs unique and different. Nevertheless some of the advice will apply to all evaluations and not just to RTEs. This is motivated partly by the authors' observations of areas of weakness in existing evaluations.

Sandison, P. (2003). Desk review of real-time evaluation experience (Evaluation Working Paper). New York: Unicef. Last viewed on 9 March 2009. URL: http://origin-www.unicef.org/evaldatabase/files/FINAL_Desk_Review_RTE.pdf

Notes: UNICEF commissioned this desk study with the objective of identifying key lessons on real-time evaluation (RTE) through drawing from the experience of other agencies and organisations in this area. The findings are intended to provide the basis for UNICEF to develop and test a real-time review or evaluation methodology as a component of its evaluation system. A number of real-time evaluations were examined from seven humanitarian agencies and interviews carried out with more than 20 individuals with experience of RTE.

WFP. (2004). Review of WFP's experience with real-time evaluation (Agenda item 2 for the Executive Board Second Regular Session, Rome, 27–28 May 2004; WFP/EB.2/2004/2-B). Rome: World Food Programme Office of Evaluation. Last viewed on 24 March 2009. URL: http://documents.wfp.org/stellent/groups/public/documents/eb/wfp029958.pdf

Notes: This document provides a summary of the review of WFP's first experience with real-time evaluation of emergency operations, as requested by the Office of Evaluation at the Third Regular Session of the Executive Board in 2003. Real-time evaluation differs from ordinary ex-post evaluation in that it is conducted simultaneously with implementation of the emergency operation. WFP used the real-time evaluation approach for the first time with EMOP 10200 for the southern Africa food crisis from July 2002 to June 2003. A review was commissioned to assess the usefulness to stakeholders of real-time evaluation and to inform any future use of the approach. The review studied the Office of Evaluation and Monitoring documents, correspondence and background materials relating to real-time evaluation in general and the real-time evaluation of EMOP 10200 in particular. Stakeholder groups were interviewed to obtain their perceptions of real-time evaluation and their views on its future use. Stakeholders see real-time evaluation as potentially useful in improving quality by (i) solving operational problems as they occur, (ii) enabling organizational learning for improvement of future EMOPs and (iii) providing an independent assessment of results. Stakeholders' suggested changes to real-time evaluation have been incorporated in the review's recommendations. WFP has accepted and will implement the recommendations.

IMPACT EVALUATION (10)

Bamberger, M. (2006). Conducting quality impact evaluations under budget, time and data constraints. Washington: Independent Evaluation Group of the World Bank. Last viewed on 16 June 2008. URL: http://www.oecd.org/dataoecd/54/35/37010607.pdf

Notes: This book is to some extent a short summary of the text RealWorld Evaluation by Michael J. Bamberger, Jim Rugh, and Linda Mabry. An extensive literature is now available on appropriate methodologies for evaluating the impacts of development projects and programs. This booklet applies these methodologies to the real-world situations and constraints faced by task managers and researchers. It is intended to complement other recent World Bank publications including Baker (2000), Operations Evaluation Department (2004), Ravallion (2001, 2005), White (2006), and the methodological guidelines and impact evaluation case studies on the Bank's Poverty Impact Analysis, Monitoring and Evaluation website.

Buttenheim, A. (2009). Impact evaluation in the post-disaster setting: a conceptual discussion in the context of the 2005 Pakistan earthquake (3ie Working Paper 5). Mumbai: International Initiative for Impact Evaluation. Last viewed on 16 December 2009. URL: http://www.3ieimpact.org/admin/pdfs_papers/Working%20Paper%205.pdf

Notes: There is growing interest in impact evaluation in both the humanitarian and the development sectors. Several recent reports have identified post-disaster impact evaluation (PDIE) as a particular challenge and have galvanized interest in pushing the field forward. This study reviews existing work, synthesizes a set of guiding principles and analytic frameworks for PDIE, and applies those to a design for the evaluation of relief and recovery programs following the 2005 Pakistan earthquake. The study contributes to ongoing discussions of impact assessment within the humanitarian sector while also introducing impact evaluation practitioners to the specific issues related to conducting quality impact evaluations in post-disaster settings.

Earl, S., Carden, F., & Smutylo, T. (2001). Outcome mapping: Building learning and reflection into development programs. Ottawa: International Development Research Centre. Last viewed on 9 November 2009. URL: http://www.idrc.ca/uploads/user-S/124230922130889369593.pdf

Notes: This manual is intended as an introduction to the theory and concepts of Outcome Mapping and as a guide to conducting an Outcome Mapping workshop. Although Outcome Mapping may be appropriate in various contexts, it has primarily been tested by development research organizations and programs working in Canada, Africa, Latin America, and Asia. This manual reflects that perspective and Outcome Mapping may have to be adapted to be used with groups other than our constituency of researchers, scientific organizations, government officials, policymakers, and NGOs (for example, communities).

IIIE. (2008). 3ie Impact Evaluation Practice: a guide for grantees. New Delhi: International Initiative for Impact Evaluation. Last viewed on 23 February 2009. URL: http://www.3ieimpact.org/doc/3ie%20impact%20evaluation%20practice.pdf

Notes: Sets out principles for Impact Evaluation of social and economic development programs. Rigorous impact evaluation studies are analyses that measure the net change in outcomes amongst a particular group, or groups, of people that can be attributed to a specific program using the best methodology available, feasible and appropriate to the evaluation question that is being investigated and to the specific context.

IIIE. (2008). Principles for Impact Evaluation. New Delhi: International Initiative for Impact Evaluation. Last viewed on 23 February 2009. URL: http://www.3ieimpact.org/doc/principles%20for%20impact%20evaluation.pdf

Notes: Impact evaluation embraces a range of evaluation methods and approaches to address the "evaluation gap", i.e. the lack of evidence to inform the decisions of developing country policy-makers in the design and implementation of large-scale social and economic development programs. This document sets out the principles for impact evaluation developed by the International Initiative for Impact Evaluation (3ie).

Jones, N., Jones, H., Steer, L., Datta, A., Walsh, C., Tincati, C., Caffell, A., Gisby, L., & Marsden, H. (2009). Improving impact evaluation production and use (ODI Working Paper). London: Overseas Development Institute. Last viewed on 12 December 2009. URL: http://www.odi.org.uk/resources/download/3177.pdf

Notes: The past five years have seen a proliferation of impact evaluations (IEs) by development agencies across the globe. This report was commissioned by the UK Department for International Development's (DFID's) Evaluation Department to inform discussions on impact evaluation production and use within the Network of Networks Impact Evaluation Initiative (NONIE). It builds on an initial scoping study prepared for DFID which made recommendations on improving IE production and use, focusing on clustering, coordination, knowledge management, capacity strengthening and communication and uptake. This report goes further by expanding both the literature review and the annotated database of IEs, as well as homing in on specific dynamics of IE production across sectors. Findings included that there is variability in the use of IE between sectors, and that sectors differ in how they view the appropriateness of IE. Interest and innovation in IE are strongest in those sectors where it is seen most favourably. All sectors recognise that impacts can take several years or longer to appear.

Lautze, S. (2009). Humanitarian Action, Livelihoods, and Socio-Cultural Dynamics in Uganda: An Exploration of Theoretical Considerations for Impact Evaluation. Camp Sherman: The Livelihoods Program. Last viewed on 12 December 2009. URL: http://www.alnap.org/pool/files/theoretical-considerations-for-impact-evaluation.pdf

Notes: This paper provides background and analysis to inform the research teams' approaches to phase one of a multi-stakeholder evaluation of 'Impact of humanitarian assistance on livelihoods affected by humanitarian crises in Uganda: An analysis of communities' and humanitarian actors' perspectives on socio-cultural dynamics.' The study is organised under the auspices of a multi-agency steering committee and led by the World Food Programme. The assessment of impact is understood as "the systematic analysis of lasting and significant changes – positive or negative, intended or not – in people's lives brought about by a given action or series of actions".

Roche, C. (1999). Impact assessment for development agencies: Learning to value change (Oxfam Development Guidelines). Oxford: Oxfam.

Notes: This book shows how and why impact assessment needs to be integrated into all stages of development programmes, from planning to evaluation. Its basic premise is that impact assessment should not refer to the immediate outputs of a project or programme, but to any lasting or significant changes that it brought about. From a theoretical overview, the book moves on to discuss the design of impact-assessment processes and a range of tools and methods, before illustrating its use in development, in emergencies and in advocacy work. It ends by exploring ways in which different organizations have attempted to institutionalize impact-assessment processes and the challenges they have faced in doing so. In-depth case studies by partner organizations of Oxfam and Novib, as well as by some Oxfam staff, show how a variety of approaches to impact assessment (qualitative, quantitative and participatory) have been applied in a range of situations, from large-scale integrated development programmes to projects involving only one community.

Savedoff, W. D., Levine, R., Birdsall, N., Bourguignon, F., Duflo, E., Gertler, P., Gueron, J., Gupta, I., Habicht, J., Jamison, D., Kress, D., Kuruneri, P., Levine, D. I., Manning, R., Quick, S., Sachs, B., Shah, R., Singh, S., Szekely, M., Victora, C., & Gottlieb, J. (2006). When Will We Ever Learn? Improving Lives through Impact Evaluation: Report of the Evaluation Gap Working Group. Washington: Center for Global Development. Last viewed on 13 July 2009. URL: http://www.cgdev.org/files/7973_file_WillWeEverLearn.pdf

Notes: This report argues for rigorous impact evaluation of social programmes in the developing world. Each year billions of dollars are spent on thousands of programs to improve health, education and other social sector outcomes in the developing world. But very few programs benefit from studies that could determine whether or not they actually made a difference. This absence of evidence is an urgent problem: it not only wastes money but denies poor people crucial support to improve their lives. This report suggests a solution to this problem. In 2004 the Center for Global Development convened the Evaluation Gap Working Group. The group was asked to investigate why rigorous impact evaluations of social development programs, whether financed directly by developing country governments or supported by international aid, are relatively rare. The Working Group was charged with developing proposals to stimulate more and better impact evaluations. This report, the final report of the working group, contains specific recommendations for addressing this urgent problem.

White, H. (2006). Impact Evaluation - The Experience of the Independent Evaluation Group of the World Bank. Washington: Independent Evaluation Group of the World Bank. Last viewed on 16 June, 2008. URL: http://lnweb18.worldbank.org/oed/oeddoclib.nsf/DocUNIDViewForJavaSearch/35BC420995BF58F8852571E00068C6BD/$file/impact_evaluation.pdf

Notes: Impact evaluation is an assessment of the impact of an intervention on final welfare outcomes. The results agenda has forced agencies to demonstrate that the money they spend is improving the lives of poor people, thereby increasing demand for impact evaluation. In the current environment, calls for increased aid spending are only credible if it can be shown that current spending is indeed contributing toward the attainment of the Millennium Development Goals. However, impact evaluation has taken on different meanings over time, and debates continue as to how it should be done. This introductory chapter has the following purposes. First, it puts forward the definition of impact evaluation as a 'counterfactual analysis of the impact of an intervention on final welfare outcomes.' Second, it discusses two sources of bias which can result in impact evaluation studies giving misleading results: (1) contagion, and (2) self-selection bias.

EVALUATION OTHER (7)

Alkin, M. C., & Christie, C. A. (2004). An evaluation theory tree. In M. C. Alkin (Ed.), Evaluation roots: Tracing theorists' views and influences (pp. 12-65). Thousand Oaks: Sage. Last viewed on 13 July 2009. URL: http://www.sagepub.com/upm-data/5074_Alkin_Chapter_2.pdf

Notes: This presents the original evaluation theory tree developed by Alkin and Christie. See their revised 2008 version in Christie, C. A., & Alkin, M. C. (2008). Evaluation theory tree re-examined. Studies in Educational Evaluation, 34(3), 131-135. This substantial chapter provides a good overview of who's who in evaluation theory.

Christie, C. A., & Alkin, M. C. (2008). Evaluation theory tree re-examined. Studies in Educational Evaluation, 34(3), 131-135. Last viewed on 13 July 2009. URL: http://www.sciencedirect.com/science/article/B6V9B-4T89378-2/2/719b2df1c4224902c28f1a5b6735366a

Notes: This presents a reworking of a model developed by the authors to characterise the development of evaluation theory. The model was originally presented in Alkin (Ed.) (2004), Evaluation Roots: Tracing Theorists' Views and Influences. The authors classify theories and theoreticians into three broad branches: methods; valuing; and utilisation. In this paper the authors suggest modifications to the theory tree presented in the Roots book, including a repositioning of a few theorists, the addition of theorists, and a reconceptualization of the valuing branch. The theory tree provides a useful framework for thinking about the development of evaluation theories.

General Accounting Office. (1991). Designing Evaluations (Methodology Transfer Paper PEMD-10.1.4). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://www.gao.gov/special.pubs/pe1014.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Department. It argues that to spend the time to develop a sound design is to invest time in building high quality into the evaluation effort. Devoting attention to evaluation design means that factors that will affect the quality of the results can be addressed. Not allowing the time that is necessary for this vital stage of the project is, in the end, self-defeating: skipping quickly through this step can be a crippling, if not fatal, blow to any evaluation. The pressure of wanting to get into the field as soon as possible has to be held in check while systematic planning takes place. The design is what guides the data collection and analysis. Having looked at why it is important to design evaluations well, the guide then turns to the various components and processes that are inherent in evaluation design in five major parts: 1) asking the right question; 2) adequately considering the constraints; 3) assessing the design; 4) settling on a strategy that considers strengths and weaknesses; and 5) rigorously monitoring the design and incorporating it into the management strategies of the persons who are responsible for the evaluation.

General Accounting Office. (1992). The Evaluation Synthesis (Methodology Transfer Paper PEMD-10.1.2). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://www.gao.gov/special.pubs/pe1012.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Department. This paper presents information on evaluation synthesis, a systematic procedure for organizing findings from several disparate evaluation studies, which enables evaluators to gather results from different evaluation reports and to ask questions about the group of reports. The guide presents information on: 1) defining evaluation synthesis and the steps in such synthesis; 2) strengths and limitations of evaluation synthesis; 3) techniques for developing a synthesis; 4) quantitative and nonquantitative approaches for performing the synthesis; 5) how evaluation synthesis can identify important interaction effects that single studies may not identify; and 6) study comparisons and problems.

General Accounting Office, & Datta, L.-E. (1990). Prospective Evaluation Methods: The Prospective Evaluation Synthesis (Methodology Transfer Paper PEMD-10.1.10). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office. Last viewed on 4 July 2009. URL: http://www.gao.gov/special.pubs/pe10110.pdf

Notes: This paper dates from Eleanor Chelimsky's time at the GAO's Program Evaluation and Methodology Department. It shows how the tools of evaluation methodology can be applied in order to provide the best possible information prospectively on the likely outcomes of proposed programs. A Prospective Evaluation Synthesis (PES) may be conducted through the comparison of policy or program alternatives, although it is also useful when focused on a single policy or program. It is easiest to perform when an adequate data base already exists. Fortunately, data bases concerning proposed programs frequently do exist, primarily because problems are rarely new. Often they have been addressed by past programs whose experiences can be drawn upon for the PES. The paper focuses on: 1) a systematic method for providing the best possible information to decisionmakers; and 2) a combination of techniques to assist evaluators in analyzing alternative proposals and various projections.

Williams, A. P., & Morris, J. C. (2009). The Development of Theory-Driven Evaluation in the Military: Theory on the Front Line. American Journal of Evaluation, 30(1), 62-79. Last viewed on 7 March 2009. URL: http://aje.sagepub.com/cgi/content/abstract/30/1/62

Notes: The use of theory-driven evaluation is an emerging practice in the military--an aspect generally unknown in the civilian evaluation community. First developed during the 1991 Gulf War and applied in both the Balkans and Afghanistan, these techniques are now being examined in the North Atlantic Treaty Organisation (NATO) as a means to evaluate the effects of military operations in complex, asymmetric conflict environments. In spite of these practices, theory-driven evaluation in the military is still in the developmental stages. This article traces the development to date of theory-driven evaluation in NATO and assesses its strengths and weaknesses in the military context. We conclude that a cross-pollination of ideas between military and civilian evaluators is urgently needed to improve the quality and effectiveness of military evaluation.

Wood, A., Apthorpe, R., & Borton, J. (Eds.). (2001). Evaluating international humanitarian action: reflections from practitioners. London: Zed Books

Notes: This book is a compilation of the experiences of practitioners engaged in humanitarian programme evaluations, and the lessons they have learned. The case studies cover the different kinds of humanitarian emergency characteristic of the past decade. The contributors, who are leading humanitarian evaluators, address the context in which evaluations of humanitarian assistance take place; the actual process of doing evaluations; and the lessons for improving how such evaluations might be better undertaken in future. This book is of particular interest to anyone commissioning, executing, or using evaluations of humanitarian action.


COMMUNICATING (16)

Becker, H., & Richards, P. (2007). Writing for social scientists: How to start and finish your thesis, book, or article (2nd ed.). Chicago: University of Chicago Press

Notes: This is an updated version of a classic book on how to conquer the pressures on writers to impress rather than to communicate. Becker's message is clear: in order to learn how to write, take a deep breath and then begin writing. Revise. Repeat. It is not always an easy process, as Becker wryly relates. Decades of teaching, researching, and writing have given him plenty of material, and Becker neatly exposes the foibles of academia and its "publish or perish" atmosphere. Wordiness, the passive voice, inserting a "the way in which" when a simple "how" will do: all these mechanisms are a part of the social structure of academic writing. Several reviewers have commented that this book is more about organising your work than about writing as such.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (1996). Writing ethnographic fieldnotes. Chicago: University of Chicago Press

Notes: In this book, the authors reveal how the ethnographer turns direct experience and observation into written fieldnotes upon which an ethnography is based. Drawing on years of teaching and field research experience, the authors develop a series of guidelines, suggestions and practical advice about how to write useful fieldnotes in a variety of settings, both cultural and institutional. Using actual, unfinished "working" notes as examples, they illustrate options for composing, reviewing and working fieldnotes into finished texts. They analyze the "processing" of fieldnotes - the practice of coding notes to identify themes and methods for selecting and weaving together fieldnote excerpts to write a polished ethnography. This book, however, is more than a "how-to" manual. The authors examine writing fieldnotes as an interactive and interpretive process in which the researcher's own commitments and relationships with those in the field inevitably shape the character and content of those fieldnotes. They explore the conscious and unconscious writing choices that produce fieldnote accounts. And they show how the character and content of these fieldnotes inevitably influence the arguments and analyses the ethnographer can make in the final ethnographic tale.

Gullickson, A., & Stufflebeam, D. L. (2001). Feedback Workshop Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/feedbackworkshop.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist lists actions that evaluators should take before, during, and after a feedback workshop. Feedback workshops help stakeholders and evaluators to (1) ensure consistency between the evaluation, stakeholder values, and program plans; (2) increase understanding of the evaluation and utility of the findings; (3) improve the accuracy and utility of the evaluation report; and (4) review and refine evaluation plans.

Mundy, P., Mathias, E., & Bekalo, I. (2006). Out of heads and onto paper. LEISA-(Low External Input and Sustainable Agriculture), 22(1), 26-27. Last viewed on 9 April 2009. URL: http://www.mamud.com/Docs/outofheads.pdf

Notes: This article describes the innovative "writeshop" process developed by the International Institute of Rural Reconstruction in the Philippines. A "writeshop" is an intensive, participatory workshop that aims to produce some kind of written output. This may be a set of extension brochures, a bound book, a set of leaflets, or a training manual. Participants may include scientists, researchers, government personnel, teachers, NGO staff, extension agents, farmers and other local people: anyone who has, in one way or another, been involved in the experiences to be documented. These participants are assisted by a team of facilitators, editors, computer operators, artists and logistics staff.

OECD/DAC. (2001). Evaluation feedback for Effective Learning and Accountability (Evaluation and aid effectiveness: 5). Paris: Organisation for Economic Co-operation and Development, Development Assistance Committee. Last viewed on 09 July 2009. URL: http://www.oecd.org/dataoecd/10/29/2667326.pdf


Notes: This publication is composed of two parts: The Workshop Report, based on the Tokyo meeting in September 2000, highlights the various issues raised, topics of discussion and different feedback systems, and outlines the areas identified by participants as most relevant for improving evaluation feedback. The Background Synthesis Report, intended as a starting point for discussion at the workshop, outlines the main concerns and challenges facing evaluation feedback and the means to address these. The report is based on an analysis of questionnaire results, and a review of previous initiatives in this area.

Richardson, L. (1990). Writing Strategies: Reaching Diverse Audiences (Sage University Papers Series on Qualitative Research Methods: 21). Thousand Oaks: Sage Publications Inc

Notes: This short book considers different rhetorical ways of crafting writing so that work can be published in various venues at various stages in the research process. The author uses her own experiences to provide a candid view of how authors can use different forms of encoding to present the same material for different audiences. The book is useful in the way it encourages the reader to think about the different forms of encoding we use automatically to present material for different audiences. Encouraging self-awareness in this way can lead to better writing.

Serrat, O. (2008). Identifying and Sharing Good Practices (Knowledge Solutions 14). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Identifying-Sharing-Good-Practices.pdf

Notes: Good practice is a process or methodology that has been shown to be effective in one part of the organization and might be effective in another too. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Posting Research Online (Knowledge Solutions 8). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Posting-Research-Online.pdf

Notes: Dissemination is an indispensable means of maximizing the impact of research. It is an intrinsic element of all good research practice that promotes the profile of research institutions and strengthens their capacities. The challenge is to ensure the physical availability of research material and to make it intelligible to those who access it. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Using Plain English (Knowledge Solutions 5). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Using-Plain-English.pdf

Notes: Many people write too much, bureaucratically, and obscurely. Using plain English will save time in writing, make writing far easier, and improve understanding. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Conducting Effective Presentations (Knowledge Solutions 27). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Conducting-Effective-Presentations.pdf

Notes: Simple planning and a little discipline can turn an ordinary presentation into a lively and engaging event. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Monthly Progress Notes (Knowledge Solutions 26). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Monthly-Progress-Notes.pdf

Notes: Feedback is the dynamic process of presenting and disseminating information to improve performance. Feedback mechanisms are increasingly recognized as key elements of learning before, during, and after. Monthly progress notes on project administration, which document accomplishments as well as bottlenecks, are prominent among these. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Staff Profile Pages (Knowledge Solutions 32). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Staff-Profile-Pages.pdf

Notes: Staff profile pages are dynamic, adaptive electronic directories that store information about the knowledge, skills, experience, and interests of people. They are a cornerstone of successful knowledge management and learning initiatives. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Writing Weblogs (Knowledge Solutions 35). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Writing-Weblogs.pdf

Notes: A weblog, in its various forms, is a web-based application on which dated entries of commentary, descriptions of events, or other material such as graphics or video are posted. A weblog enables groups of people to discuss electronically areas of interest and to review different opinions and information surrounding a topic. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Sprent, P. (1995). Getting into print: a guide for scientists and technologists. London: Spon Press

Notes: This is a practical guide to all aspects of writing about science and technology. It is crammed with useful hints on how to make each kind of writing more attractive to the target readership. It also includes detailed advice on how to approach publishers, publishers' contracts and requirements and the author's role at each stage of book production, including tips on presentation of manuscripts on disc or as camera-ready copy. There is clear guidance on the best way to use tables, graphs and diagrams and on how to present formulae and choose examples and exercises. Advice is given for overcoming the often neglected problem of catering for users with widely different technical backgrounds when writing instruction manuals.

The World Bank Group, Carleton University, & IOB/Netherlands Ministry of Foreign Affairs. (2002). International Program for Development Evaluation Training (IPDET): Building Skills to Evaluate Development Interventions: Module 9. Presenting Results. Washington: Operations Evaluation Department of the World Bank and Operations Evaluation Group of the International Finance Corporation. Last viewed on 28 June 2008. URL: http://www.insp.mx/bidimasp/documentos/1/IPDET%20modulo%209.pdf


Notes: The International Program for Development Evaluation Training (IPDET) program was initiated by the World Bank to meet the needs of evaluation and audit units of bilateral and multilateral development agencies and banks; developed and developing country governments; and evaluators working in development and nongovernmental organizations. The overall goal of this training program is to enhance the knowledge, skills, and abilities of participants in development evaluation. It is our intention that by the end of the training program, participants will: 1) understand the development evaluation process; 2) be familiar with evaluation concepts, techniques, and issues; 3) be able to weigh different options for planning development evaluations, including data collection, analysis, and reporting; 4) be able to design a development evaluation. The training program is organized into twelve modules as follows: Module 1 - Introduction to Development Evaluation; Module 2 - Evaluation Models; Module 3 - New Development Evaluation Approaches; Module 4 - Evaluation Questions; Module 5 - Impact, Descriptive, and Normative Evaluation Designs; Module 6 - Data Collection Methods; Module 7 - Sampling; Module 8 - Data Analysis and Interpretation; Module 9 - Presenting Results; Module 10 - Putting it all Together; Module 11 - Building a Performance-Based Monitoring and Evaluation System; Module 12 - Development Evaluation Issues.

Van Maanen, J. (1988). Tales of the Field: On Writing Ethnography. Chicago: University of Chicago Press

Notes: John Van Maanen, an experienced ethnographer of modern organizational structures, is one who believes that the real work begins when he returns to his office with cartons of notes and tapes. In Tales of the Field he offers readers a survey of the narrative conventions associated with writing about culture and an analysis of the strengths and weaknesses of various styles. He introduces first the matter-of-fact, realistic report of classical ethnography, then the self-absorbed confessional tale of the participant-observer, and finally the dramatic vignette of the new impressionistic style. He also considers, more briefly, literary tales, jointly told tales, and the theoretically focused formal and critical tales. Van Maanen illustrates his discussion of each style with excerpts from his own work on the police. Tales of the Field offers an informal, readable, and lighthearted treatment of the rhetorical devices used to present the results of fieldwork. Though Van Maanen argues ultimately for the validity of revealing the self while representing a culture, he is sensitive to the differing methods and aims of sociology and anthropology. His goal is not to establish one true way to write ethnography, but rather to make ethnographers of all varieties examine their assumptions about what constitutes a truthful cultural portrait and select consciously and carefully the voice most appropriate for their tales.

UTILISATION (6)

Alkin, M. C., Daillak, R., & White, P. (1979). Using evaluations (Sage Library of Social Research: 76). Beverly Hills: Sage

Notes: This book uses an analysis of five case studies of school program evaluations to investigate utilization. The bulk of this book presents descriptions of each of the program evaluations, which are followed by the brief statements of various reviewers. It was suggested that eight interacting categories of factors contributed to the utilization of evaluation: preexisting evaluation bounds; orientation of users; evaluator's approach; evaluator credibility; organizational factors; extra-organizational factors; information content and reporting; and administrator style. The book makes clear that definitions of utilization range from the very narrow to the very broad, and that the extent to which an evaluation is 'utilized' depends on the definition adopted.

Carlsson, J., Eriksson-Baaz, M., Fallenius, A. M., & Lövgren, E. (1999). Are Evaluations Useful? Cases from Swedish Development Co-operation. (Sida Studies in Evaluation 99/1). Stockholm: Swedish International Development Agency. Last viewed on 09 July 2009. URL: http://www2.sida.se/shared/jsp/download.jsp?f=STUD99-1.pdf&a=2355

Notes: This report forms part of the study "Using the Evaluation Tool", initiated by Sida's Department for Evaluation and Internal Audit. The purpose of the study is to analyse the evaluation process as it currently works in Swedish development co-operation. It attempts to broaden our knowledge of the way in which evaluations are initiated, produced and, finally, distributed and put to use.

Carlsson, J., Forss, K., Metell, K., Segnestam, L., & Strömberg, T. (1997). Using the evaluation tool: A survey of conventional wisdom and common practice at Sida (Sida Studies in Evaluation 97/01). Stockholm: Swedish International Development Agency. Last viewed on 09 July 2009. URL: http://www2.sida.se/shared/jsp/download.jsp?f=STUD97-1.pdf&a=2366

Notes: This study was conducted to map the use of evaluations at Sida, answering the following questions: 1) how and why are evaluations initiated? 2) how is the evaluation process managed, from the formulation of purpose to the decision to evaluate and the commissioning of a study? 3) how are the results of this process used? The study used two methods: interviews, and the application of a quality model to evaluation reports. The quality model applied was the second edition of the program evaluation standards of the Joint Committee on Standards for Educational Evaluation.

Patton, M. Q. (2002). Utilization-Focused Evaluation Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/ufe.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist provides guidance for conducting evaluations from a utilization-focused perspective, which is oriented around an evaluation's intended users and intended uses. The checklist outlines tasks and potential challenges organized around 12 evaluation stages: (1) Program/organizational readiness assessment, (2) Evaluator readiness and capability assessment, (3) Identification of primary intended users, (4) Situational analysis, (5) Identification of primary intended uses, (6) Focusing the evaluation, (7) Evaluation design, (8) Simulation of use, (9) Data collection, (10) Data analysis, (11) Facilitation of use, and (12) Metaevaluation.

Sandison, P. (2007). The utilisation of evaluations. In ALNAP Review of Humanitarian Action: Evaluation Utilisation (pp. 89-144). London: Active Learning Network on Accountability and Performance in Humanitarian Action. Last viewed on 21 September 2008. URL: http://www.alnap.org/publications/RHA2005/rha05_Ch3.pdf

Notes: This is a well-written summary of evaluation utilisation issues. The review opens with the observation that the credibility of evaluation will be undermined if its poor record of influencing humanitarian performance continues. However, the author notes that in some ways the findings indicate a less gloomy outlook: our pessimism partly results from a narrow perception of utilisation that does not do justice to the rich and often indirect use and influence of evaluation.

Serrat, O. (2008). Linking Research to Practice (Knowledge Solutions 7). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Linking-Research-Practice.pdf

Notes: The volume of research greatly exceeds its application in practice. Researchers must pay greater attention to the production of their research findings in a flexible range of formats in recognition of the varied needs of consumers. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

MONITORING AND EVALUATION (5)

De Coninck, J., Chaturvedi, K., Haagsma, B., Griffioen, H., & van der Glas, M. (2008). Planning, Monitoring and Evaluation in Development Organisations: Sharing Training and Facilitation Experiences. Los Angeles: Sage


Notes: Planning, monitoring and evaluation (PME) remains a challenge for many development organisations, increasingly faced with the rigours of designing and using a well-structured monitoring and evaluation system, and of linking this closely with their planning cycles. Effective PME is, nevertheless, essential for their organisational survival and to enable them to make an effective contribution to sustainable development. This book shares the 'real-life' experiences of 20 PME trainers and facilitators from Africa, Asia and Europe and offers some suggestions for effective support to PME processes. It focuses on civil society organisations, including NGOs, church-linked development offices, networks, and people's organisations.

Guijt, I., Woodhill, J., Salm, M., Wright, J., Dayal, R., Kamaté, C., Johnson, D., Mikkelsen, B., Rebien, C., Vela, G., & Zaki, E. (2002). Managing for Impact in Rural Development: A Guide for Project M&E. Rome: International Fund for Agricultural Development. Last viewed on 11 July 2009. URL: http://www.ifad.org/evaluation/guide/m_e_guide.zip

Notes: This guide has been developed together with its potential users through a consultative process lasting over a year. Its overriding goal is to improve the impact of IFAD-funded projects, through the introduction of effective M&E systems. It focuses on a learning approach to management that uses achievements and problems to improve decision-making and accountability. This requires creating an M&E system that helps primary stakeholders, implementing partners and project staff to learn together in order to improve their development interventions on a continual basis. As the ultimate objective is to ensure the maximum possible benefit for the rural poor, they are the ones best placed to assess project impact and must therefore be considered full partners in any future M&E. The guide also suggests ideas for implementing this and other forms of participatory M&E. The primary target audience is composed of staff from project management units, in particular project directors and M&E officers, together with their implementation partners, such as, public services, NGOs and CBOs. The guide is also aimed at technical consultants and supervisors from co-operating institutions. Because the effectiveness of M&E systems also depends on the decisions taken during project design, specific sections of the guide provide advice to project designers, including IFAD staff and their consultants.

IFRC. (2002). Handbook for monitoring and evaluation. Geneva: International Federation of Red Cross and Red Crescent Societies. Last viewed on 28 June 2008. URL: http://www.ifrc.org/docs/evaluations/handbook.pdf

Notes: This is the monitoring and evaluation (M&E) guidance produced by the IFRC M&E Division. While this handbook has been drafted for use by all stakeholders, it is particularly mindful of the role of M&E from a National Society perspective. The handbook contains useful M&E tools and is supported by some theoretical background. It is intended that this handbook will be complemented by a series of training and information sessions, either as a stand-alone module or incorporated into existing programmes like the leadership development programme or the Project Planning Process (PPP).

UNDP. (2002). The Handbook on Monitoring and Evaluating for Results. New York: United Nations Development Programme. Last viewed on 11 July 2009. URL: http://www.undp.org/eo/documents/HandBook/ME-HandBook.pdf

Notes: This publication addresses the monitoring and evaluation of development results. It is intended to support country offices in aligning their monitoring and evaluation systems with RBM methodology—specifically in tracking and measuring the performance of UNDP interventions and strategies and their contributions to outcomes. It aims to provide simple, flexible and forward-looking tools. While its primary audience is country office staff, the Handbook also will be of interest to others within UNDP who use information gained through monitoring and evaluation to report on results, improve interventions and make programme and policy decisions. It also will be of use to staff concerned with policy change and reform.

World Bank. (2004). Monitoring and Evaluation: Some Tools, Methods, and Approaches. Washington: World Bank. Last viewed on 11 July 2009. URL: http://lnweb90.worldbank.org/oed/oeddoclib.nsf/DocUNIDViewForJavaSearch/A5EFBB5D776B67D285256B1E0079C9A3/$file/MandE_tools_methods_approaches.pdf

Notes: This very short publication by the World Bank's Independent Evaluation Group presents an overview of a range of M&E tools, methods, and approaches, including their purpose and use; advantages and disadvantages; costs, skills, and time required; and key references. Those illustrated here include several data collection methods, analytical frameworks, and types of evaluation and review. The M&E Overview discusses:
- Performance indicators
- The logical framework approach
- Theory-based evaluation
- Formal surveys
- Rapid appraisal methods
- Participatory methods
- Public expenditure tracking surveys
- Impact evaluation
- Cost-benefit and cost-effectiveness analysis

RESEARCH TEXT (9)

Alasuutari, P., Bickman, L., & Brannen, J. (Eds.). (2008). The Sage handbook of social research methods (Paperback ed.). Los Angeles: Sage

Notes: The Handbook includes chapters on each phase of the research process: research design, methods of data collection, and the processes of analyzing and interpreting data. There is much more to research than learning skills and techniques; methodology involves the fit between theory, research questions, research design, and analysis. The book also includes several chapters that describe historical and current directions in social research, debating crucial subjects such as qualitative versus quantitative paradigms, how to judge the credibility of types of research, and the increasingly topical issue of research ethics.

Becker, H. (1998). Tricks of the Trade: How to Think about Your Research While You're Doing It. Chicago: University of Chicago Press

Notes: This book helps readers learn how to think about research projects. It can help readers reach a deeper understanding of their research data and look beyond the obvious. The tricks cover four broad areas of social science: the creation of the "imagery" to guide research; methods of "sampling" to generate maximum variety in the data; the development of "concepts" to summarise findings; and the use of "logical" methods to ensure that all aspects of issues are considered. The author provides examples to illustrate all the points he makes. Perhaps the most useful aspect of the book is the set of tools that help researchers look at their topics in broader and deeper ways. The tricks are not short-cuts so much as tools to help the researcher achieve good-quality work.

Hart, C. (1998). Doing a literature review: releasing the social science research imagination. London: Sage Publications

Notes: This is a text firmly focused on the doctoral and masters student, for whom it provides an excellent framework for what is often an experiential process - doing a literature review. It is a guide through the multidimensional sea of academic literature. It sets out a number of important dimensions involved in the process of literature review and, by clear signposting, diagrams, and examples, will help the student to carry out her or his review more systematically. This book makes explicit those dimensions which could remain implicit or even be missed by students as they wade through all those books, papers, articles, and print-outs. The book also offers some insights for conducting other types of document reviews, but this is not the main focus of the text.

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks: Sage

Notes: This methodological classic continues to provide practical, comprehensive and strategic guidance on qualitative design, purposeful sampling, interviewing, fieldwork, observation methods, and qualitative analysis and interpretation, while integrating the extensive qualitative literature of the last decade. New to this edition are primary strategic themes of qualitative inquiry to clarify readers' understanding of the different strands of qualitative research, and criteria-based frameworks for presenting and judging qualitative findings. The text identifies and contrasts sixteen different theoretical and philosophical approaches to qualitative inquiry. There is additional new coverage of: new issues in and approaches to fieldwork; in-depth treatment of emergent designs and purposeful sampling; detailed analytical guidelines, including software and computer-assisted options; strategies for enhancing the quality and credibility of qualitative findings, mixed methods, and triangulation; and a review and listing of the latest internet resources. The book examines and honours both the science and art of qualitative inquiry.

Potter, S. (2006). Doing postgraduate research (2nd ed.). London: Sage

Notes: This book is published in association with the Open University. It offers a practical and helpful guidebook both for students and supervisors. It is recommended for research students at the early stages of their research studies, because it provides a thought-provoking account of the different aspects of post-graduate research. Additionally, it can be a useful tool and resource pack for advanced research students who want to think about viva and career options after the completion of the PhD. This is a useful addition to the growing literature of books on post-graduate research. It provides not only a helpful reading but also a complete multimedia resource pack for new and more-advanced students and supervisors.

Riessman, C. K. (2008). Narrative methods for the human sciences. Los Angeles: Sage Publications

Notes: This well-regarded book focuses on four particular methods of narrative analysis and provides specific examples of good narrative research, as practiced in several social science and human service disciplines. It discusses the complexities between spoken language and any written transcript. The author presents several ways to think about credibility in narrative studies, contextualizing validity in relation to the epistemology and theoretical orientation of a study. The author clarifies distinctions between inductive thematic coding in grounded theory, other interpretive approaches, and narrative analysis. This text makes the approach accessible to readers not trained in social linguistics, in part by providing rich examples from a number of different disciplines in the social and behavioral sciences. The author takes narrative research beyond spoken or written texts by showing how exemplary researchers have connected participants' words and images made during the research process. She also discusses other research that incorporates "found" images (in archives) in a narrative inquiry.

Robson, C. (2002). Real world research: a resource for social scientists and practitioner-researchers (2nd ed.). Oxford: Blackwell Publishers

Notes: This is an excellent and clearly written introductory book about social research methods. The author communicates the main points well despite the breadth of coverage. It provides a good introduction to both qualitative and quantitative social research methods. The book is a must for anyone beginning to do research or studying research methods.

Webb, E. J., Campbell, D. T., Schwartz, R. D., & Sechrest, L. B. (1966). Unobtrusive measures: nonreactive research in the social sciences (Rand McNally sociology series). Chicago: Rand McNally

Notes: This is a delightful book that presents a number of examples of unobtrusive measures in the social world that allow measurement in a non-invasive way (such as using carpet wear to indicate which of a museum's exhibits is the most popular). The multiple methods presented here may do more than raise these questions for discussion. They may provide alternatives by which ethical criteria can be met without impinging on important interests of the research subjects. Some of the methods described here, such as the use of archival records and trace measures, may serve to avoid the problems of invasions of privacy by permitting the researcher to gain valuable information without ever identifying the individual actors or in any way manipulating them. If ethical considerations lead us to avoid participant observation, interviews, or eavesdropping in given circumstances, the novel methods described in this monograph may be of value not only in improving and supplementing our information but also in permitting ethically scrupulous social scientists to do their work effectively and to sleep better at night.

Yates, S. J. (2004). Doing Social Science Research. Los Angeles and Milton Keynes: Sage and the Open University

Notes: This text is a set text for the Open University's Social Science Postgraduate Foundation Module. It gives an introductory overview of the process of social research, from research design to data collection and analysis. It provides students and teachers with a mix of resources to help them to get to grips with the main methods of social research. These resources include a set of self-directed activities designed to get students using research methods, as well as a set of readings to support these activities. The readings also provide critical discussions and examples of the range of research methods. The text offers an introduction to the full range of research methods in the social sciences. The text clearly explains what these methods involve, as well as their basic relationship to arguments in the philosophy of social science. It will provide students and researchers with essential guidance on how to select the most appropriate method in their own research, as well as understanding and evaluating research conducted by others.

OTHER CHECKLIST (10)

Bichelmeyer, B. A. (2003). Checklist for formatting checklists. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/cfc.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist provides guidance for the design, development, and use of evaluation checklists. To be used after content design and prior to final development of an evaluation checklist.

Horn, J. (2001). Checklist for Developing and Evaluating Evaluation Budgets. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/evaluationbudgets.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist presents cost categories that should be considered when preparing a budget for an evaluation. Personnel, travel, supplies, communications, printing, equipment, consultants, in-kind services, and overhead are some of the major categories. Each checkpoint is supported by questions to prompt evaluators to think carefully and thoroughly about all the types of costs involved in doing evaluation.

Patton, M. Q. (2003). Qualitative Evaluation Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/qec.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist provides guidance in determining when qualitative methods are appropriate for an evaluative inquiry, and factors to consider (1) to select qualitative approaches that are particularly appropriate for a given evaluation's expected uses and answer the evaluation's questions, (2) to collect high quality and credible qualitative evaluation data, and (3) to analyze and report qualitative evaluation findings.

Shepard, L. A. (1977). Checklist for Evaluating Large-Scale Assessment Programs. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/assessment_eval.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist provides guidelines for evaluating large-scale student assessment programs. Checkpoints organized around goals and purposes, technical aspects, management, intended and unintended effects, and costs are presented to prompt evaluators to ask questions and seek effects in areas they might otherwise have missed.

Stufflebeam, D., & Shinkfield, A. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass

Notes: Recognizing that no single evaluation approach is always best, the authors assist evaluators to select approaches that best fit a particular evaluation assignment. A case drawn from housing and community development illustrates the application of six noteworthy approaches: experimental design, case study, Stufflebeam's CIPP Model, Scriven's consumer-oriented evaluation, Stake's responsive evaluation, and Patton's utilization-focused evaluation. This is both a textbook and a handbook, and its chapters can be accessed and used selectively. It includes down-to-earth procedures, checklists, and illustrations of how to carry out a sequence of essential evaluation tasks; identify and assess evaluation opportunities; prepare an institution to support a projected evaluation; design, budget, and contract evaluations; collect, analyze, and synthesize information; and report and facilitate use of findings. The book also addresses and illustrates metaevaluation, the fundamental process by which evaluators hold themselves accountable for delivering evaluation services that are useful, practical, ethical, and technically sound.

Stufflebeam, D. L. (1999). Evaluation Contracts Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/contracts.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist identifies key contractual issues that evaluators and clients should discuss and agree on when undertaking an evaluation. The checklist is geared toward preventing breakdown of the evaluation because of disputes, efforts to compromise the findings, and/or withdrawal of cooperation and funds.

Stufflebeam, D. L. (1999). Evaluation Plans and Operations Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/plans_operations.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist identifies important considerations when planning evaluations. Questions are presented to prompt evaluators to think proactively about an array of issues related to the conceptualization of the evaluation, sociopolitical factors, contractual arrangements, technical design, management plan, moral/ethical imperatives, and utility provisions. This checklist may be used as a planning and management tool, or for formative metaevaluation.

Stufflebeam, D. L. (2000). Guidelines for developing evaluation checklists: the Checklists Development Checklist (CDC). Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/guidelines_cdc.pdf

Notes: This is a checklist from the Evaluation Checklists Project at Western Michigan University. This checklist provides step-by-step guidelines for creating your own evaluation checklist. Checkpoints are supported by detailed explanations and rationales.

Stufflebeam, D. L. (2004). Evaluation Design Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 28 June 2008. URL: http://www.wmich.edu/evalctr/checklists/evaldesign.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. It identifies elements commonly included in evaluation designs and is intended as a generic guide to the decisions one typically needs at least to consider when planning and conducting an evaluation.

Stufflebeam, D. L. (2007). CIPP Evaluation Model Checklist. Kalamazoo: The Evaluation Center, Western Michigan University. Last viewed on 06 July 2009. URL: http://www.wmich.edu/evalctr/checklists/cippchecklist_mar07.pdf

Notes: This is a checklist from the Evaluation Checklists project at Western Michigan University. This checklist identifies activities that both evaluators and clients should conduct in evaluations guided by the CIPP Model, which covers context, input, process, and product. This checklist represents the latest incarnation of CIPP, which breaks product evaluation into effectiveness, sustainability, and transportability evaluation and also provides for metaevaluation and synthesis. This checklist is geared toward evaluations of programs designed to achieve long-term, sustainable improvements.

KNOWLEDGE MANAGEMENT (22)

Serrat, O. (2008). Auditing Knowledge (Knowledge Solutions 13). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Auditing-Knowledge.pdf

Notes: Knowledge audits help organizations identify their knowledge-based assets and develop strategies to manage them. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Managing Knowledge Workers (Knowledge Solutions 12). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Managing-Knowledge-Workers.pdf

Notes: A knowledge worker is someone who is employed because of his or her knowledge of a subject matter, rather than ability to perform manual labor. Knowledge workers perform best when empowered to make the most of their deepest skills. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Notions of Knowledge Management (Knowledge Solutions 18). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Notions-Knowledge-Management.pdf

Notes: Knowledge management is getting the right knowledge to the right people at the right time, and helping them (with incentives) to apply it in ways that strive to improve organizational performance. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2008). Picking Investments in Knowledge Management (Knowledge Solutions 24). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Picking-Investments.pdf

Notes: What can be measured is not necessarily important and what is important cannot always be measured. When prioritizing investments in knowledge management, common traps lie waiting. They are delaying rewards for quick wins, using too many metrics, implementing metrics that are hard to control, and focusing on metrics that tear people away from business goals. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Building a Learning Organization (Knowledge Solutions 46). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Building-a-Learning-Organization.pdf

Notes: Learning is the key to success—some would even say survival—in today's organizations. Knowledge should be continuously enriched through both internal and external learning. For this to happen, it is necessary to support and energize organization, people, knowledge, and technology for learning. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Building Institutional Capacity for Development (Knowledge Solutions 47). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Building-Institutional-Capacity-for-Development.pdf

Notes: The conditions of economic and social progress include participation, democratic processes, and the location of necessarily diverse organizational setups at the community, national, regional, and increasingly global levels. Access to and judicious use of information underpin all these. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Building Trust in the Workplace (Knowledge Solutions 57). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Building-Trust-in-the-Workplace.pdf

Notes: Workplace dynamics make a significant difference to people and the organizations they sustain. High-performance organizations earn, develop, and retain trust for superior results. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Distributing Leadership (Knowledge Solutions 64). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Distributing-Leadership.pdf

Notes: The prevailing view of leadership is that it is concentrated or focused. In organizations, this makes it an input to business processes and performance, dependent on the attributes, behaviors, experience, knowledge, skills, and potential of the individuals chosen to shape them. The theory of distributed leadership holds that leadership is best considered an outcome. Leadership is defined by what one does, not who one is. Leadership at all levels matters and must be drawn from, not just added to, individuals and groups in organizations. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Drawing Learning Charters (Knowledge Solutions 65). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Drawing-Learning-Charters.pdf

Notes: Despite competing demands, modern organizations should not forget that learning is the best way to meet the challenges of the time. Learning charters demonstrate commitment: they are a touchstone against which provision and practice can be tested and a waymark with which to guide, monitor, and evaluate progress. It is difficult to argue that what learning charters advocate is not worth striving for. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Enhancing Knowledge Management Strategies (Knowledge Solutions 58). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/enhancing-knowledge-management-strategies.pdf

Notes: Despite worldwide attention to strategic planning, the notion of strategic practice is surprisingly new. To draw a strategy is relatively easy but to execute it is difficult—strategy is both a macro and a micro phenomenon that depends on synchronization. One should systematically review, evaluate, prioritize, sequence, manage, redirect, and if necessary even cancel strategic initiatives. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Exercising Servant Leadership (Knowledge Solutions 63). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/exercising-servant-leadership.pdf

Notes: Servant leadership is now in the vocabulary of enlightened leadership. It is a practical, altruistic philosophy that supports people who choose to serve first, and then lead, as a way of expanding service to individuals and organizations. The sense of civil community that it advocates and engenders can facilitate and smooth successful and principled change. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). From Strategy to Practice (Knowledge Solutions 60). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/from-strategy-to-practice.pdf

Notes: Strategic reversals are quite commonly failures of execution. In many cases, a strategy is abandoned out of impatience or because of pressure for an instant payoff before it has had a chance to take root and yield results. Or its focal point is allowed to drift over time. To navigate a strategy, one must maintain a balance between strategizing and learning modes of thinking. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Glossary of Knowledge Management (Knowledge Solutions 39). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/glossary-of-knowledge-management.pdf

Notes: The knowledge management discipline can be cryptic. These Knowledge Solutions define its most common concepts in simple terms. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Harnessing Creativity and Innovation in the Workplace (Knowledge Solutions 61). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/harnessing-creativity-and-innovation-in-the-workplace.pdf

Notes: Creativity plays a critical role in the innovation process, and innovation that markets value is a creator and sustainer of performance and change. In organizations, stimulants and obstacles to creativity drive or impede enterprise. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Leading in the Workplace (Knowledge Solutions 59). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/leading-in-the-workplace.pdf

Notes: Theories of leadership are divided: some underscore the primacy of personal qualities; others stress that systems are all-important. Both interpretations are correct: a larger pool of leaders is desirable all the time (and superleaders are necessary on occasion) but its development must be part of systemic invigoration of leadership in organizations. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Learning and Development for Management (Knowledge Solutions 49). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Learning-and-Development-for-Management.pdf

Notes: The insights, attitudes, and skills that equip managers for their various responsibilities come from many sources outside formal education or training. To identify areas for improvement, it is first necessary to identify what these responsibilities are. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Learning in Strategic Alliances (Knowledge Solutions 63). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Learning-in-Strategic-Alliances.pdf

Notes: Strategic alliances that bring organizations together promise unique opportunities for partners. The reality is often otherwise. Successful strategic alliances manage the partnership, not just the agreement, for collaborative advantage. Above all, they also pay attention to learning priorities in alliance evolution. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Learning Lessons with Knowledge Audits (Knowledge Solutions 51). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Learning-Lessons-with-Knowledge-Audits.pdf

Notes: Knowledge from evaluations will not be used effectively if the specific organizational context, knowledge, and relationships of evaluation agencies, and the external environment they face, are not dealt with in an integrated and coherent manner. Knowledge management can shed light on this and related initiatives can catalyze and facilitate identification, creation, storage, sharing, and use of lessons. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). The Roots of an Emerging Discipline (Knowledge Solutions 56). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/Roots-Emerging-Discipline.pdf

Notes: Organizations must become information based: (i) Knowledge workers are not amenable to command and control; (ii) In the face of unremitting competition, it is vital to systematize innovation and entrepreneurship; (iii) In a knowledge-based economy, it is imperative to decide what information one needs to conduct one's affairs. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Understanding and Developing Emotional Intelligence (Knowledge Solutions 49). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/understanding-developing-emotional-intelligence.pdf

Notes: Emotional intelligence describes an ability, capacity, skill, or self-perceived ability to identify, assess, and manage the emotions of one's self, of others, and of groups. The theory is enjoying considerable support in the literature and has had successful applications in many domains. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Value Cycles for Development Outcomes (Knowledge Solutions 53). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/value-cycles-for-development-outcomes.pdf

Notes: Development work is a knowledge-intensive process that is fed by knowledge services and knowledge solutions. Projects are the primary mechanism by which strategic change is brought about. Value cycles can maximize their potential through delivery platforms. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Wearing Six Thinking Hats (Knowledge Solutions 50). Manila: Asian Development Bank. Last viewed on 01 November 2009. URL: http://www.adb.org/documents/information/knowledge-solutions/wearing-six-thinking-hats.pdf

Notes: The difference between poor and effective teams lies not so much in their collective mental equipment but in how well they use their abilities to think together. The Six Thinking Hats technique helps actualize the thinking potential of teams. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Training on the evaluation of Humanitarian Action: Bibliography. Channel Research / ALNAP

LEARNING (5)

Britton, B. (2002). Learning for Change: Principles and practices of learning organisations. Sundbyberg: Swedish Mission Council. Last viewed on 11 November 2009. URL: http://www.hivos.nl/eng/content/download/19179/119448/file/Learning%20for%20Change%20(Britton).pdf

Notes: This book was commissioned by Swedish Mission Council as a way of further developing an understanding of organisational learning in church-related organisations involved in international development. It builds on an earlier paper written by the author, Bruce Britton, in 1998 (Britton, 1998) and includes examples of how practice and thinking in the field of organisational learning have evolved since then.

Jarvis, P., Holford, J., & Griffin, C. (2003). The theory and practice of learning (2nd ed.). Abingdon: Routledge

Notes: An excellent, brief, lucid, and well-written introduction to learning. Learning is among the most basic of human activities, and the study of, and research into, learning is becoming a central part of educational studies. It is also of critical importance to evaluators, as a great deal of evaluation is formative, conducted to learn lessons. The contents cover: lifelong learning; the social background to learning; cognitivist theory; types of learning; learning using ICT; and philosophical reflections on learning.

Serrat, O. (2008). Action Learning (Knowledge Solutions 19). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Action-Learning.pdf

Notes: Action learning is a structured method that enables small groups to work regularly and collectively on complicated problems, take action, and learn as individuals and as a team while doing so. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Serrat, O. (2009). Learning from Evaluation (Knowledge Solutions 44). Manila: Asian Development Bank. Last viewed on 20 June 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/Learning-from-Evaluation.pdf

Notes: Evaluation serves two main purposes: accountability and learning. Development agencies have tended to prioritize the first, and given responsibility for that to centralized units. But evaluation for learning is the area where observers find the greatest need today and tomorrow. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Smutylo, T. (2001). Crouching impact, hidden attribution: overcoming threats to learning in development programs. Paper presented at the Block Island Workshop on Across Portfolio Learning, 22-24 May 2001. Last viewed on 9 November 2009. URL: http://www.idrc.ca/uploads/user-S/10905186681Crouching_Impact,_Hidden_Attribution.pd.pdf

Notes: This paper outlines a methodology developed by the International Development Research Centre (IDRC) for use in assessing its support of applied research in developing countries. Entitled "Outcome Mapping", this methodology can be used to create planning, monitoring and evaluation mechanisms enabling organizations to document, learn from, and report on their achievements. It is designed to assist in understanding an organization's results while recognizing that contributions by other actors are essential to achieving the kinds of sustainable, large-scale improvements in human and ecological well-being towards which the organization is working. The innovations introduced in Outcome Mapping offer ways of overcoming some of the barriers to learning faced by evaluators. Attribution and measuring downstream results are dealt with through a more direct focus on transformations in the actions of the main actors. The methodology has also shown promise for across-portfolio learning in that it facilitates standardization of indicators without losing the richness in each case's story.

MANAGEMENT (3)

Lindahl, C. (1998). The Management of Disaster Relief Evaluations: Lessons from a Sida evaluation of the complex emergency in Cambodia (Sida Studies in Evaluation 98/1). Stockholm: Swedish International Development Agency. Last viewed on 09 July 2009. URL: http://www2.sida.se/shared/jsp/download.jsp?f=STUD98-1.pdf&a=2363

Notes: This short paper is a review of disaster relief evaluation through the medium of a case study of the 1994 evaluation of Swedish emergency aid to Cambodia. The five lessons identified are on: 1) writing the terms of reference; 2) team composition and selection; 3) demanding an evaluation methodology; 4) allowing enough time for a good evaluation; and 5) treating evaluation as a continuous project and not an ad-hoc event.

Serrat, O. (2009). Managing By Walking Around (Knowledge Solutions 37). Manila: Asian Development Bank. Last viewed on 13 April 2009. URL: http://www.adb.org/Documents/Information/Knowledge-Solutions/managing-by-walking-around.pdf

Notes: Management by walking around emphasizes the importance of interpersonal contact, open appreciation, and recognition. It is one of the most important ways to build civility and performance in the workplace. This is a Knowledge Solution from the Asian Development Bank. These are handy, quick reference guides to tools, methods, and approaches that propel development forward and enhance its effects. They are offered as resources to ADB staff. They may also appeal to the development community and people having interest in knowledge and learning.

Van der Eyken, W. (1999). Managing evaluation (2nd ed.). London: Charities Evaluation Services

Notes: This is a very short introduction to evaluation for charity managers. At under 40 pages, it is almost a pamphlet rather than a book. Despite its brevity, it covers the main points of commissioning an evaluation. However, the brief coverage, and the fact that it addresses general charity evaluation rather than humanitarian or development evaluation specifically, make it less useful than, for example, the Sida evaluation guide.

EPISTEMOLOGY (3)

House, E., & Howe, K. (1999). Values in evaluation and social research. Thousand Oaks: Sage

Notes: This is a somewhat dense academic text that brings insights from philosophy to examine the question of the fact-value dichotomy in evaluation. The book concentrates on theory and is not an easy read. Some acquaintance with current epistemological debates is assumed. Not a book for the faint-hearted.

Oakley, A. (2000). Experiments in knowing: gender and method in the social sciences. New York: The New Press

Notes: This book ranges widely over epistemology (the philosophical theory of knowledge: what knowledge is and how we build it). The book concentrates on the way in which research methods have developed, seen through the lens of a feminist perspective. However, it also encourages the reader to consider more general epistemological questions.


Smith, M. (1998). Social Science in Question. London and Milton Keynes: Sage and the Open University

Notes: This is the course text for The Open University's Postgraduate Foundation Module in Social Science. It focuses on social science epistemology. The text takes the reader on an intellectual journey, starting with the story of modern science and the impact this has had on social scientific practice, and going on to outline and critically review the major approaches to social scientific inquiry, ranging from positivism to postmodernism. Throughout, readers are encouraged to think carefully about what it means to: study the social world in a scientific way; make connections between what they do and the everyday lives of the people they study; and look beyond their discipline and think in a postdisciplinary way.


Recommended