Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice


Cornell University
Cornell Office for Research on Evaluation (CORE)

Monica Hargraves, PhD

Manager of Evaluation for Extension and Outreach

Cornell University

October 2009
mjh51@cornell.edu

Brief outline

1. What we do (The “Evaluation Partnership” approach)

2. Key steps in the training (with stories from extension program partners)

3. “Swimming against the tide…”

4. Making it feasible and sustainable

Evaluation Partnerships

– CORE provides training, brings evaluation expertise
– Partners bring experience, expertise in their programs, their communities, their “systems”

The Planning Phase is a one-year commitment, with intentions and clarity of roles captured in a Memorandum of Understanding (MOU).

What the Evaluation Partnership entails, in the “planning year”

Stages:
• Preparation for Partnership (Jan – March)
• Modeling (intensive!) (April – June)
• Evaluation Planning (July – Oct/Nov)

Formats this year:
• Two in-person, full-day training meetings
• Web-conferences
• Listserv, e-mail, and phone support

History of the Project within Cornell Cooperative Extension (CCE)

2006: NYC

2007: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, Ulster

2009: Chemung, Chenango, Clinton, Cortland, Franklin, Fulton & Montgomery, Genesee, Jefferson, Madison, Monroe, Oneida, Ontario, Oswego, Rensselaer, Saratoga, Seneca, Tioga, Tompkins, Ulster, Wayne

“Systems Evaluation Protocol”: Planning Phase

Stakeholder Analysis

CCE-Jefferson 4-H Dairy Program: Stakeholder Map

Stakeholders identified: Dairy Program Parents, FFA Teachers, Youth, Jefferson County Legislature, 4-H Members, National Dairy Industry, Jefferson County Fair Board, Funders, CCE Staff, Breed Associations, Local Ag Businesses, Surrounding County Youth, NYS 4-H, SUNY Morrisville, SUNY Cobleskill, Other Youth Programs, JCADCA, Jefferson County Dairy Producers, Media, Volunteers, CCE Board of Directors, State Fair, NYS Jr. Holstein Association, Local School Districts, Cornell University, Taxpayers

Developing Stakeholder charts

Stakeholder Analysis … why it matters

Logic Model Development

Quick “poll” on formal modeling …

Think of programs you are evaluating, or wish to evaluate.

How many of those have a written-down model (Logic Model, or something similar)?

A – all
B – many
C – some
D – few
E – none

Focus on Activities, Outputs, and Outcomes

Make connections (create “links”)
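One way to picture the modeling step is as building a small directed graph: activities, outputs, and outcomes are the nodes, and the “links” are the edges. The sketch below is purely illustrative (Python, not the Netway), with made-up node names; it shows how such a structure can be walked to find every outcome an activity is expected to influence.

```python
# Illustrative sketch only: a pathway model as a tiny directed graph.
# Node names are hypothetical; this is not the Netway.
from collections import defaultdict

class PathwayModel:
    """Activities, outputs, and outcomes connected by directed links."""

    def __init__(self):
        self.node_type = {}            # name -> "activity" | "output" | "outcome"
        self.links = defaultdict(set)  # name -> set of downstream node names

    def add_node(self, name, kind):
        self.node_type[name] = kind

    def add_link(self, upstream, downstream):
        self.links[upstream].add(downstream)

    def downstream_outcomes(self, name, seen=None):
        """Follow links to find every outcome a node eventually feeds into."""
        seen = seen if seen is not None else set()
        found = set()
        for nxt in self.links[name]:
            if nxt in seen:
                continue
            seen.add(nxt)
            if self.node_type.get(nxt) == "outcome":
                found.add(nxt)
            found |= self.downstream_outcomes(nxt, seen)
        return found

# Hypothetical example
model = PathwayModel()
model.add_node("weekly science sessions", "activity")
model.add_node("youth complete experiments", "output")
model.add_node("increased interest in science", "outcome")
model.add_link("weekly science sessions", "youth complete experiments")
model.add_link("youth complete experiments", "increased interest in science")
print(model.downstream_outcomes("weekly science sessions"))
# {'increased interest in science'}
```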

Pathway Model Development

4-H “SET-To-Go” (an after-school science program), CCE-Cortland County

Pathway Model, October 2009

“Mining the Model”

Comments from an Evaluation Partner…

Shawn Smith

4-H Issue Area Leader & Evaluation Project Manager

CCE – Cortland County (CCECC)

CCECC Advisory Committee Input

Internal Stakeholder Analyses

CCECC Pathway Model, “final” to date

Note: You can do Pathway model visuals without the Netway!

Program Life Cycle (figure): program impact over time, moving through initiation, growth, maturity, and transformation.

Source: Program Leadership Certification, “Accountability and Evaluation” PowerPoint, Michael Duttweiler

Quick Poll on Program Lifecycles

Think about a program you are evaluating or are going to be evaluating

What lifecycle stage is it in?

A – early development, pilot
B – still revising/tweaking
C – implemented consistently
D – consistent across sites/facilitators and documented
E – well-established, stable, candidate for replication

Program & Evaluation Alignment

Program Lifecycle stages (Initiation, Development, Maturity, Dissemination) align with Evaluation Lifecycle phases (Phase I: Process & Response; Phase II: Change; Phase III: Comparison & Control; Phase IV: Generalizability). Each evaluation phase has an A and a B sub-phase:

Phase IA: Process assessment and post-only evaluation of participant reactions and satisfaction.

Phase IB: Post-only assessment of outcomes, implementation assessment, outcome measurement development, and assessment of internal consistency (reliability).

Phase IIA: Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement.

Phase IIB: Matched pretest and posttest of outcomes. Verify reliability and validity of change. Human subjects review.

Phase IIIA: Controls and comparisons (control groups, control variables, or statistical controls).

Phase IIIB: Controlled experiments or quasi-experiments (randomized experiment; regression-discontinuity) for assessing program effectiveness.

Phase IVA: Multi-site analysis of integrated large data sets over multiple waves of program implementation.

Phase IVB: Formal assessment across multiple program implementations that enables general assertions about the program in a wide variety of contexts (e.g., meta-analysis).

(The original slide also marks a region of designs as “Evaluation Special Projects.”)
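As a purely illustrative aside (not part of the SEP materials), the alignment above can be written down as two small lookup tables: program lifecycle stage → evaluation phase, and evaluation phase → recommended designs. The labels follow the slide text; the data structure itself is an assumption made for the sketch.

```python
# Purely illustrative lookup tables restating the alignment above.
# The data structure is an assumption; the labels follow the slide text.
LIFECYCLE_TO_PHASE = {
    "Initiation": "Phase I (Process & Response)",
    "Development": "Phase II (Change)",
    "Maturity": "Phase III (Comparison & Control)",
    "Dissemination": "Phase IV (Generalizability)",
}

RECOMMENDED_DESIGNS = {
    "IA": "Process assessment; post-only evaluation of reactions and satisfaction",
    "IB": "Post-only outcome assessment; implementation assessment; measure development and reliability",
    "IIA": "Unmatched pre/post; qualitative assessment of change; reliability and validity of measures",
    "IIB": "Matched pre/post; verify reliability and validity of change; human subjects review",
    "IIIA": "Controls and comparisons (control groups, control variables, statistical controls)",
    "IIIB": "Controlled experiments or quasi-experiments (randomized, regression-discontinuity)",
    "IVA": "Multi-site analysis of integrated data sets over multiple waves of implementation",
    "IVB": "Formal assessment across many implementations (e.g., meta-analysis)",
}

print(LIFECYCLE_TO_PHASE["Development"])
print(RECOMMENDED_DESIGNS["IIA"])
```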

A flowchart of lifecycle questions identifies the program’s evaluation phase:

• Is the program in its initial implementation(s)? → Phase IA
• Is the program in revision or re-implementation? → Phase IB
• Is the program being implemented consistently? → Phase IIA
• Does the program have formal written procedures/protocols? → Phase IIB
• Is the program associated with change in outcomes? → Phase IIIA
• Does the program have evidence of effectiveness? → Phase IIIB
• Is the effective program being implemented in multiple sites? → Phase IVA
• Is the evidence-based program being widely distributed? → Phase IVB
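The flowchart above can also be read as an ordered checklist. The sketch below is an illustrative interpretation, not the official SEP worksheet: the field names are hypothetical, and the rule that the latest question answered “yes” determines the phase is an assumption about how the branching works.

```python
# Illustrative interpretation of the phase-determination questions above.
# Field names are hypothetical; the "latest 'yes' wins" rule is an assumption.
from dataclasses import dataclass

@dataclass
class ProgramStatus:
    in_revision: bool = False
    implemented_consistently: bool = False
    has_written_protocol: bool = False
    outcomes_show_change: bool = False
    evidence_of_effectiveness: bool = False
    effective_in_multiple_sites: bool = False
    widely_distributed: bool = False

def evaluation_phase(p: ProgramStatus) -> str:
    """Return the latest evaluation phase the program's lifecycle supports."""
    if p.widely_distributed:
        return "Phase IVB"
    if p.effective_in_multiple_sites:
        return "Phase IVA"
    if p.evidence_of_effectiveness:
        return "Phase IIIB"
    if p.outcomes_show_change:
        return "Phase IIIA"
    if p.has_written_protocol:
        return "Phase IIB"
    if p.implemented_consistently:
        return "Phase IIA"
    if p.in_revision:
        return "Phase IB"
    return "Phase IA"  # still in initial implementation(s)

print(evaluation_phase(ProgramStatus(implemented_consistently=True)))  # Phase IIA
```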

Determining Evaluation Scope

It’s all about making GOOD CHOICES…

• What kind of evaluation is appropriate for the program lifecycle stage?

• What are the key outcomes this program should be attaining?

• What do important stakeholders care most about?

• What will “work best” in this kind of program?

• What kind of evaluation is feasible for this year? What should wait until a future year?

Determining Evaluation Scope (pathway model diagram): a generic pathway model of Activities → Outputs → Short-Term, Middle-Term, and Long-Term Outcomes, annotated with stakeholder priority rankings (1–3) and internal priorities, used to identify key outcomes, key components, key links, the key pathway, and the evaluation scope.
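To make the scoping idea concrete, here is a small illustrative sketch (not an SEP tool) that combines stakeholder rankings with internal priorities to shortlist outcomes for this year’s evaluation. The outcome names, rankings, and selection rule are all hypothetical.

```python
# Illustrative sketch only: shortlisting outcomes for this year's evaluation scope.
# Outcome names, rankings, and the selection rule are hypothetical.
stakeholder_rank = {      # from stakeholder analysis; 1 = highest priority
    "belonging": 1,
    "science interest": 2,
    "public speaking": 3,
}
internal_priorities = {"belonging", "science interest"}  # program staff priorities

def evaluation_scope(max_outcomes=2):
    """Pick outcomes to measure this year: internal priorities first, then by stakeholder rank."""
    ordered = sorted(
        stakeholder_rank,
        key=lambda o: (o not in internal_priorities, stakeholder_rank[o]),
    )
    return ordered[:max_outcomes]

print(evaluation_scope())  # ['belonging', 'science interest']
```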

Comments from another Evaluation Partner…

Linda Schoffel

Rural Youth Services Program Coordinator

CCE – Tompkins County (CCETC)

Using the Pathway Model for making evaluation choices – RYS Rocketry Program

Evaluation should “fit” the program … (lifecycle, stakeholders, context, etc.)

RYS Rocketry Program, CCE-Tompkins County: Pathway Model, October 2009

Do youth who participate in RYS Rocketry feel like they are part of the group? (belonging)

“Swimming against the Tide”

The most frequently cited challenge to program evaluation is lack of time.

The systems approach involves spending a lot of time before you even get to the point of choosing measures…

Programs often face significant pressure for more evaluation, and for evidence of “impact” …

The systems approach argues, essentially, that “less is more” if the evaluation truly “fits” the program.

Making it feasible for the long-term

Key ingredients that help:

• focusing on the most valuable elements (choosing well)

• identifying interim benefits of the process
• integrating with other needs
• building on others’ progress
• sharing resources

Wrapping Up …

Thank you!

Any questions for any of us, before returning to Bill…?

For follow-up questions later, Monica Hargraves: mjh51@cornell.edu

Shawn Smith: scs239@cornell.edu
Linda Schoffel: ljs48@cornell.edu

Also see our website at http://core.human.cornell.edu/