Page 1

Outline – Part I

1. Definitions

2. Pseudo-evaluation vs. legitimate evaluation

3. Formative vs. summative evaluation

4. Necessary skills

5. Planning an evaluation

a. Planning a Formative evaluation

b. Planning a Summative evaluation

6. Cost benefit analysis

7. An experimenting society

8. Words of warning

Page 2

Definition (1)

Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object.

-- William Trochim (Cornell University)

Page 3


Pseudo-evaluation vs. legitimate evaluation

Evaluation usually occurs in a political context, and it is important to distinguish pseudo-evaluations from legitimate evaluations.

Be careful not to engage in the first type (pseudo-evaluation):

Doing so may facilitate inappropriate decisions.

It will also damage your professional reputation.

Page 4

Pseudo-evaluations – a taxonomy

Here are some procedures to watch out for (E.A. Suchman, 1967):

Eyewash – emphasis on surface appearances

Whitewash – attempts to cover up known failures

Page 5

Pseudo-evaluations – a taxonomy

Submarine – political use of evaluation to destroy a programme

Posture – ritualistic evaluation to satisfy a funding requirement, without real interest in, or intention to use, its findings

Postponement – using the need for evaluation to delay action

Page 6

Legitimate Evaluations – Four Criteria

Here are four criteria to help you recognize a legitimate evaluation:

1. Utility

2. Feasibility

3. Propriety

4. Technical adequacy

Page 7

Four criteria for legitimate evaluations

1. Utility – will someone be able to use it?

As Robson says, “the purpose of an evaluation is not to prove, but to improve.” (2002, p. 209)

Page 8

Four criteria for legitimate evaluations

2. Feasibility – will you have the resources, time, and co-operation you need? If not, don’t do the evaluation.

It won’t achieve anything useful.

It may damage your professional reputation.

This is especially an issue in formative evaluation, where results may be needed for program planning.

Remember the engineer’s maxim: “Good, fast, cheap. Pick any two.”

Page 9

Four criteria for legitimate evaluations

3. Propriety – only do an evaluation if you can do it fairly and ethically.

No ‘submarines.’

Use acceptable outcome measures.

Say ‘no’ if you believe the course of action has already been decided on, and a decision maker just wants ‘cover.’

Page 10

Four criteria for legitimate evaluations

4. Technical adequacy – if you are satisfied on the first three issues, carry out the evaluation with technical skill and sensitivity.

How can you tell whether you have the technical skill? What do you have to think about in planning? What are the relevant skills?

We’ll consider these issues below…

Page 11

What to think about in planning

Reasons for evaluating

Why is the evaluation being done? Who should have access to the information obtained?

What value will results have?

Will action be taken? Will someone not want results published?

Page 12

What to think about in planning

Interpretation

Is the nature of the evaluation agreed upon by those involved?

Outcome measures

What type of change counts as good, and what counts as bad?

Page 13

What to think about in planning

Subject

What kind of information do you need?

Evaluators

Who will gather the information?

Who will analyze the data and write the report?

Page 14

What to think about in planning

Methods

What method is appropriate given the questions?

Can you develop your method in the time allowed?

Is your method acceptable to those involved? (Service providers and consumers.)

Page 15

What to think about in planning

Time

What time is available? Is this sufficient?

Permissions and control

Necessary permissions obtained? Is participation voluntary? Who decides what goes into the report?

Page 16

What to think about in planning

Use

Who decides how the evaluation will be used?

Will those involved (providers, consumers) see the report in a modifiable draft version?

Is the form of the report appropriate for the intended audience (style, length, stats)?

Page 17

An evaluation culture

These ideas are based on Donald Campbell’s (1969) concept of an experimenting society, and Trochim’s related concept of an evaluation culture

To learn more about Trochim’s ideas, see: http://www.socialresearchmethods.net/kb/evalcult.php

Page 18

An evaluation culture

An evaluation culture works and improves things because it is:

Action-oriented

Teaching-oriented

Diverse, inclusive, participatory, responsive, and fundamentally non-hierarchical

Humble and self-critical

Page 19

An evaluation culture

An evaluation culture works and improves things because it is:

Interdisciplinary

Truth-seeking and forward-looking

Ethical and democratic

Page 20

Words of warning

Keep it simple

Avoid complex designs and data analysis

Think defensively

Anything that can go wrong, will go wrong. Try to anticipate potential problems and plan how you will deal with them.

Page 21

Words of warning

Change will always have sponsors and critics. People’s lives may be radically changed on the basis of your findings:

Jobs may be on the line.

Careers may be advanced or slowed.

A program may be expanded or cut back.

Page 22

Words of warning

There will be many stakeholders – politicians, administrators, deliverers, targets, unions, taxpayers.

It is unlikely that the interests of all these groups will coincide.

Page 23

Outline – Part II

1. Formative & Summative evaluation defined

2. Elements of a Formative evaluation

3. Elements of a Summative evaluation

4. Evaluation strategies

A. Scientific-experimental paradigms

B. Management-oriented systems models

C. Qualitative-anthropological models

D. Participant-oriented models

5. Necessary Skills

Page 24

Two Types of Evaluation

Formative evaluation

Helps in the development of a program or service.

Summative evaluation

Assesses the effects and effectiveness of the program

Covers all effects, not just those intended

Page 25

Formative Evaluation - Elements

Questions about the process being evaluated:

1. Structured conceptualization

2. Logic model

3. Process evaluation

4. Implementation evaluation

Page 26

Formative Evaluation – elements

1. Structured conceptualization – helps stakeholders define program, targets, and desired outcomes.

Stakeholders – who are they? Outcomes – how do you plan to measure them?

Page 27

Formative Evaluation – elements

2. A logic model makes explicit the steps that are expected to produce the desired change. It is often shown as a flow chart or map.

A good logic model may reveal hidden assumptions about how the intervention will work.
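As an illustration only (nothing like this appears in the original slides), a logic model can be written down as an ordered set of stages, each expected to lead to the next. Below is a minimal sketch in Python; the job-training programme, the stage names, and every entry are hypothetical.

```python
# Hypothetical logic model for an imaginary job-training programme,
# represented as ordered stages. Reading it from inputs to impact makes
# explicit the assumptions about how the intervention is supposed to work.
logic_model = {
    "inputs":     ["funding", "trainers", "training space"],
    "activities": ["weekly skills workshops", "one-to-one coaching"],
    "outputs":    ["40 participants complete the course"],
    "outcomes":   ["participants gain job-search skills"],
    "impact":     ["higher employment rate among participants"],
}

# Show the assumed causal chain, then the detail at each stage.
print(" -> ".join(logic_model))
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```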

Page 28

Formative Evaluation – elements

3. Process evaluation – What alternative procedures are available for delivery of the program?

4. Implementation evaluation – Is the program being delivered the way it is supposed to be? Are there unexpected consequences?

Page 29

Summative Evaluation

Outcome evaluation – did the program cause demonstrable effects on predefined outcome measures?

Impact evaluation – broader; assesses the overall effects of a program, intended and unintended.

Page 30

Summative Evaluation – Cost-benefit analysis

Asks questions about efficiency.

Standardizes outcomes in terms of dollar costs and dollar benefits.

Important when you have to choose how to spend limited amounts of money.

Page 31

Cost-Benefit Analysis

To do a cost-benefit analysis you need to know, in addition to the program’s cost:

(a) the magnitude of the benefits the program produces, and

(b) that the program actually produced those benefits.

These things can only be learned through an experimental design.
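The slides do not prescribe a particular analysis, but a minimal sketch of the idea, with every figure, name, and dollar conversion invented for illustration, might look like this: estimate the per-person benefit as the difference in mean outcomes between a treatment group and a control group (the experimental comparison), convert it to dollars, and set the total against the program’s cost.

```python
# Hypothetical sketch: estimating a programme's benefit from an experiment
# and comparing it with its cost. All numbers are made up for illustration.

# Outcome per participant, e.g. months employed in the follow-up year.
treatment_outcomes = [8, 7, 9, 6, 8, 7]   # received the programme
control_outcomes   = [5, 6, 4, 5, 6, 5]   # comparable group, did not

def mean(xs):
    return sum(xs) / len(xs)

# The experimental contrast is what lets us attribute the difference
# to the programme rather than to other causes.
effect_per_person = mean(treatment_outcomes) - mean(control_outcomes)

dollar_value_per_unit = 2_000    # assumed value of one extra unit of outcome
participants = 100               # assumed programme size
program_cost = 250_000           # assumed total cost

total_benefit = effect_per_person * dollar_value_per_unit * participants
net_benefit = total_benefit - program_cost
benefit_cost_ratio = total_benefit / program_cost

print(f"Effect per person:   {effect_per_person:.2f}")
print(f"Total benefit:       ${total_benefit:,.0f}")
print(f"Net benefit:         ${net_benefit:,.0f}")
print(f"Benefit-cost ratio:  {benefit_cost_ratio:.2f}")
```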

Page 32

Cost-Benefit Analysis

Some issues to consider before you do CBA…

1. Opportunity cost

2. Present value of money

3. Fairness

4. Complexity

Page 33

CBA and Opportunity Cost

CBA expresses values in dollars. This reveals opportunity cost – if you do X with your money, you cannot do Y with the same money.

Some values are difficult to express in dollars. E.g., what is the value of having mail delivery in rural areas?

How do you express non-market values in dollars?

Page 34

CBA & Present Value

CBA works with the Present Value (PV) of money.

Future outcomes are uncertain, and inflation alters the value of money – e.g., the PV of $1m received 50 years from now, at 5% inflation, is about $87,000.
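As a quick check on that figure, assuming simple annual compounding at 5% (the slide does not spell out the discounting method), the calculation can be sketched as:

```python
# Present value of a future amount under a constant annual rate
# (simple discrete compounding, assumed here for illustration).
def present_value(future_value: float, rate: float, years: int) -> float:
    return future_value / (1 + rate) ** years

# $1 million received 50 years from now, discounted at 5% per year:
print(round(present_value(1_000_000, 0.05, 50)))   # ~87204, i.e. roughly $87,000

# $100 of benefit five years from now, at the same rate:
print(round(present_value(100, 0.05, 5), 2))        # ~78.35, worth less than $100 today
```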

Page 35

CBA & Present Value

$100 of benefit today is worth more in Present Value than $100 of benefit 5 years from now.

This makes sense, but biases program evaluation away from long-term outcomes

Page 36

CBA & Fairness

CBA compares benefits and costs without regard to who benefits and who pays costs. Is that fair? Is it unavoidable?

For example, people who live in the city subsidize mail delivery to people who live in the country. Is that fair? CBA doesn’t answer that question.

Page 37

CBA & Complexity

Most social problems, and many problems in the private sector, are complex.

They have many interacting causes, so establishing cause may be difficult.

Any program is likely to make only a small difference.

But it still makes sense to quantify the value of a program, to see if we could spend our money to better effect.

Page 38

The relevant skills? (Robson, 2002)

Writing a proposal

Clarifying the purposes of an evaluation

Identifying, organizing and working with an evaluation team

Choosing design & data-collection techniques

Interviewing

Questionnaire construction and use

Page 39

What are the relevant skills?

Observation

Management of complex information systems

Data analysis

Report-writing

Encouraging people to use the findings

Sensitivity to political concerns

