Page 1: Note to evaluator…

Note to evaluator…

The overall purpose of this presentation is to guide evaluators through the completion of step 7 of the UFE checklist and to “level the field” for the simulation (step 8). The main goal is to guide the primary user through the definition of intended outcomes and the selection of required data and appropriate methods to respond to the key evaluation questions. (PLEASE READ THE NOTES SECTION OF THE DIFFERENT SLIDES)

Please adapt this presentation to the context of the project that you are evaluating and to your facilitation style.

Page 2: Note to evaluator…

Facilitating UFE step-by-step:

a process guide for evaluators

Joaquín Navas & Ricardo Ramírez

February, 2010

Module 3: Step 7 of UFE checklist

Page 3: Note to evaluator…

Meeting’s Objectives

1. Review report of previous meeting(s) & validate preliminary KEQ analysis.

2. Define the program’s intended outcomes.

3. Define required data in order to respond to the KEQ.

4. Select appropriate methods for data collection (Step 7 of UFE).

Page 4: Note to evaluator…

Agenda

1. Discussion on report of previous meeting – validation of preliminary analysis of KEQ.

2. Identification of intended outcomes of the program.

3. Break.

4. Definition of required data.

5. Selection of appropriate data collection methods.

Page 5: Note to evaluator…

What we have accomplished so far…

1. First draft of KEQ that seems useful to guide the remainder of the evaluation process.

2. First 6 steps of the UFE checklist have been covered.

3. The process has been well documented up to this point.

Page 6: Note to evaluator…

Comments on Previous Report

Page 7: Note to evaluator…

Comments on Second Report

Does the report describe the process well?

Is it worth documenting the process in a very detailed manner?

Are you happy with the KEQ? Is the analysis presented in the report valid?

Page 8: Note to evaluator…

KEQ Validation Analysis

Key Evaluation Question | Related Primary Intended Use | KEQ Category | Does the KEQ comply with the desired KEQ features? | Related specific program objective
KEQ #1 |  |  |  |
KEQ #2 |  |  |  |
KEQ #3 |  |  |  |
KEQ #4 |  |  |  |

Page 9: Note to evaluator…

Project’s specific objectives

1. Objective #1.

2. Objective #2.

3. Objective #3.

This slide is only for reference, in case someone in the audience needs to look at the objectives to discuss the table on slide #8.

Page 10: Note to evaluator…

Categories of key evaluation questions

INPUT / RESOURCES

IMPACT

OUTCOMES

APPROACH / MODEL

PROCESS

QUALITY

COST-EFFECTIVENESS

This slide is only for reference, in case someone in the audience needs to look at the KEQ categories to discuss the table on slide #8.

(Adapted from Dart, 2007)

Page 11: Note to evaluator…

What makes good KEQs? (adapted from Dart, 2007)

• Specific enough to be useful in guiding you through the evaluation.

• Broad enough to be broken down - they are not the same as a question in a survey.

• Data (qualitative/quantitative) can be brought to bear on the KEQ.

• KEQs are open questions (they cannot be answered with a yes or no!).

• They have meaning for those developing the plan.

• They lead to a useful, credible evaluation.

• There aren’t too many of them (2-4 is enough).

This slide is only for reference, in case someone in the audience needs to look at the desired KEQ features to discuss the table on slide #8.

Page 12: Note to evaluator…

Utilization-Focused Outcomes Framework as roadmap

Elements of the framework: participant target group; desired outcomes for the target group; outcome indicators; performance targets; details of data collection; how results will be used; KEQ.

Adapted from Patton (2008: 243-251): Utilization-Focused Outcomes Framework

DO NOT SHOW THIS SLIDE

Page 13: Note to evaluator…

The trajectory of change…

INPUT / RESOURCES → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACT / RESULTS

CONTROL & PREDICTION?

Page 14: Note to evaluator…

Focusing on outcomes (1/17)

DESIRED/EXPECTED OUTCOMES

Desired or expected outcomes that would result from the program that is the subject of this evaluation.

What are you trying to achieve with your program?

What type of changes do you want to see in the program participants in terms of behaviour, attitude, knowledge, skills, status, etc.?

Page 15: Note to evaluator…

Focusing on outcomes (2/17)

DESIRED/EXPECTED OUTCOMES

Specific Objectives | OUTCOMES: What do you want to achieve? | Type of change
Project objective #1 | Outcome #1 | X
Project objective #2 | Outcome #2 | Y
Project objective #3 | Outcome #3 | X, Y, Z

Page 16: Note to evaluator…

BREAK

Page 17: Note to evaluator…

Focusing on outcomes (3/17)

DETAILS OF DATA COLLECTION

What data do you need in order to answer the KEQs?

Page 18: Note to evaluator…

Focusing on outcomes (4/17)

DETAILS OF DATA COLLECTION

Key Evaluation Questions | Required data | Other considerations for the evaluation
KEQ #1 |  |
KEQ #2 |  |
KEQ #3 |  |
KEQ #4 |  |

Page 19: Note to evaluator…

Focusing on outcomes (5/17)

DETAILS OF DATA COLLECTION

What methods could be used to collect the required data?

Page 20: Note to evaluator…

Focusing on outcomes (6/17)

DETAILS OF DATA COLLECTION

1. There is no magic key to tell you the most appropriate method to answer your KEQ.

2. All methods have limitations, so try using a combination of methods.

3. Each type of question suits specific approaches/methods – so let them guide you. Other factors to consider: time, cost, resources, knowledge.

4. Primary users should be the ones to determine what constitutes credible evidence. The primary user should feel comfortable with the selected methods and the collected data.

Adapted from Dart, 2007.

Page 21: Note to evaluator…

Focusing on outcomes (7/17)

DETAILS OF DATA COLLECTION

COMPATIBILITY BETWEEN METHODS AND QUESTION CATEGORIES

Impact: Contribution Analysis / Data trawl & expert panel / GEM.

Outcomes: OM / MSC / GEM.

Approach/Model: Comparative studies of different approaches.

Process: Evaluation study: interview process, focus groups.

Quality: Audit against standards, peer review.

Cost-effectiveness: Economic modeling

Adapted from Dart, 2007.

Page 22: Note to evaluator…

Focusing on outcomes (8/17)

DETAILS OF DATA COLLECTION – METHODS SUMMARY (1/3)

Contribution Analysis: Seeks evidence of links between a given activity and an outcome, in order to show change trends that have resulted from an intervention. Does not intend to show linear causality.

Data Trawl: Data search and analysis across dispersed literature in order to identify relationships between activities and outcomes. http://www.kimointernational.org/DataTrawl.aspx

GEM (Gender Evaluation Methodology): Links gender and ICT through relevant indicators. Read more: http://www.apcwomen.org/gem/

Page 23: Note to evaluator…

Focusing on outcomes (9/17)

DETAILS OF DATA COLLECTION – METHODS SUMMARY (2/3)

Outcome Mapping: Focuses on mid-term outcomes, suggesting that in the best-case scenario these outcomes will lead to long-term impact in a non-linear way. Read more: http://www.outcomemapping.ca

Most Significant Change: Seeks to identify the most significant changes based on participants’ stories. Read more: http://www.kstoolkit.org/Most+Significant+Change

Expert panels: A group of experts is invited to comment on and analyze outcomes and how they relate to possible impacts. Read more: http://www.ljmu.ac.uk/EIUA/reda/

Page 24: Note to evaluator…

Focusing on outcomes (10/17)

DETAILS OF DATA COLLECTION – METHODS SUMMARY (3/3)

Comparative studies of different approaches: Self-explanatory.

Interview process: Interviews on how participants experienced the process of the project that is the subject of the evaluation.

Focus Groups: Self-explanatory.

Audit against standards: This might refer to a comparative analysis against specific standards.

Peer reviews: Self-explanatory.

Economic Modeling: Not sure what this method refers to.

Page 25: Note to evaluator…

Focusing on outcomes (11/17)

DETAILS OF DATA COLLECTION

Given the primary intended USES of the evaluation, do you think that the results that will be obtained with these methods will be:

Credible (accurate)?

Reliable (consistent)?

Valid (true, believable and correct)?

Page 26: Note to evaluator…

Focusing on outcomes (12/17)

DETAILS OF DATA COLLECTION

Do you think that these methods are:

Cost-effective?

Practical?

Ethical?

Page 27: Note to evaluator…

Focusing on outcomes (13/17)

DETAILS OF DATA COLLECTION

Do you think that you will be able to use the results obtained with the selected methods according to the purposes and intended uses that you defined earlier in the process?

Page 28: Note to evaluator…

Focusing on outcomes (14/17)

DETAILS OF DATA COLLECTION

Evaluation purposes | Findings’ primary intended uses
Formative improvement and learning | To improve the program that is the subject of the evaluation.
Knowledge generation | To identify patterns of effectiveness.
Program development | To adapt interventions to emerging conditions.

This is just an example; please adapt it to your particular scenario.

Page 29: Note to evaluator…

Focusing on outcomes (15/17)

DETAILS OF DATA COLLECTION

Who will do the data collection? How will you, as primary users, be involved in the data collection?

Page 30: Note to evaluator…

Focusing on outcomes (16/17)

DETAILS OF DATA COLLECTION

Will the data collection be based on a sample? How do you think the sampling should be done? Who will do it?

Page 31: Note to evaluator…

Focusing on outcomes (17/17)

DETAILS OF DATA COLLECTION

Who will manage and analyze collected data? How will you, as primary users, be involved in data management and analysis?

Page 32: Note to evaluator…

Conclusions and next steps

Page 33: Note to evaluator…

Conclusions and next steps (for the evaluator only)

Page 34: Note to evaluator…

References

Patton, M.Q. (2008). Utilization-Focused Evaluation, 4th edition. Sage.

Dart, J. (2007). “Key evaluation questions”. Presentation at the Evaluation in Practice Workshop, Kuala Lumpur, December. http://evaluationinpractice.files.wordpress.com/2008/01/keyquestionschoices.pdf

