
Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Transcript
Page 1: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Cornell University
Cornell Office for Research on Evaluation (CORE)

Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Monica Hargraves, PhD

Manager of Evaluation for Extension and Outreach

Cornell University

October 2009
[email protected]

Page 2: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Brief outline

1. What we do (The “Evaluation Partnership” approach)

2. Key steps in the training (with stories from extension program partners)

3. “Swimming against the tide…”

4. Making it feasible and sustainable

Page 3: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Evaluation Partnerships

– CORE provides training, brings evaluation expertise
– Partners bring experience, expertise in their programs, their communities, their “systems”

Planning Phase is a one-year commitment, with intentions and clarity of roles captured in an MOU

Page 4: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

What the EP entails, in the “planning year”

Stages:
• Preparation for Partnership (Jan – March)
• Modeling (intensive!) (April – June)
• Evaluation Planning (July – Oct/Nov)

Formats this year:
• Two in-person, full-day training meetings
• Web-conferences
• Listserv, e-mail, and phone support

Page 5: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

2006: NYC

History of the Project within Cornell Cooperative Extension (CCE)

Page 6: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

2007: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, Ulster

History of the Project within Cornell Cooperative Extension (CCE)

Page 7: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

2009: Chemung, Chenango, Clinton, Cortland, Franklin, Fulton & Montgomery, Genesee, Jefferson, Madison, Monroe, Oneida, Ontario, Oswego, Rensselaer, Saratoga, Seneca, Tioga, Tompkins, Ulster, Wayne

History of the Project within Cornell Cooperative Extension (CCE)

Page 9: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

“Systems Evaluation Protocol”: Planning Phase

Page 10: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Stakeholder Analysis

CCE-Jefferson 4-H Dairy Program: Stakeholder Map

Stakeholders on the map include: Parents, FFA, Teachers, Youth, Jefferson County Legislature, 4-H Members, National Dairy Industry, Jefferson County Fair Board, Funders, CCE Staff, Breed Associations, Local Ag Businesses, Surrounding County Youth, NYS 4-H, SUNY Morrisville, SUNY Cobleskill, Other Youth Programs, JCADCA, Jefferson County Dairy Producers, Media, Volunteers, CCE Board of Directors, State Fair, NYS Jr. Holstein Association, Local School Districts, Cornell University, Taxpayers

Page 11: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Developing Stakeholder charts

Page 12: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Stakeholder Analysis … why it matters:

Page 13: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Logic Model Development

Page 14: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Quick “poll” on formal modeling …

Think of programs you are evaluating, or wish to evaluate.

How many of those have a written-down model (Logic Model, or something similar)?

A – all
B – many
C – some
D – few
E – none

Page 15: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Focus on Activities, Outputs, and Outcomes

Page 16: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Make connections (create “links”)

Page 17: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Pathway Model Development

4-H “SET-To-Go” (an after-school science program), CCE-Cortland County

Pathway Model, October 2009

Page 18: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

“Mining the Model”
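
A pathway model is, in effect, a directed graph: activities link to outputs, which link to short-, middle-, and long-term outcomes, and “mining the model” means tracing complete pathways through those links. As a rough illustration only (not from the original slides or the Netway software; the program elements below are invented placeholders), a model like this could be represented and mined in a few lines of Python:

```python
# Illustrative sketch only: a pathway model as a directed graph.
# Node names are hypothetical placeholders, not from the CCE programs.
from collections import defaultdict

links = defaultdict(list)  # model element -> list of downstream elements

def add_link(source, target):
    """Create a link from one model element to a downstream element."""
    links[source].append(target)

# Activities feed outputs, outputs feed outcomes (invented example entries)
add_link("Hands-on science sessions", "Youth attend 8 weekly sessions")
add_link("Youth attend 8 weekly sessions", "Youth gain science skills (short-term)")
add_link("Youth gain science skills (short-term)", "Youth pursue science courses (middle-term)")
add_link("Youth pursue science courses (middle-term)", "Youth enter STEM careers (long-term)")

def pathways(start, end, path=()):
    """'Mine the model': yield every complete pathway from start to end."""
    path = path + (start,)
    if start == end:
        yield path
    for nxt in links[start]:
        yield from pathways(nxt, end, path)

for p in pathways("Hands-on science sessions", "Youth enter STEM careers (long-term)"):
    print(" -> ".join(p))
```

Listing the pathways this way makes it easier to see which links an evaluation question would actually test.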

Page 19: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Comments from an Evaluation Partner…

Shawn Smith

4-H Issue Area Leader & Evaluation Project Manager

CCE – Cortland County (CCECC)

Page 20: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

CCECC Advisory Committee Input

Page 21: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Internal Stakeholder Analyses

Page 22: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

CCECC Pathway Model: “final” to date

Page 23: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Note: You can do Pathway model visuals without the Netway!

Page 24: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Program Life Cycle

[Chart: program Impact over Time, with stages initiation, growth, maturity, and transformation.]

Source: Program Leadership Certification, “Accountability and Evaluation” PowerPoint, Michael Duttweiler

Page 25: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Quick Poll on Program Lifecycles

Think about a program you are evaluating or are going to be evaluating

What lifecycle stage is it in?

A – early development, pilot
B – still revising/tweaking
C – implemented consistently
D – consistent across sites/facilitators and documented
E – well-established, stable, candidate for replication

Page 26: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Program & Evaluation Alignment

Program lifecycle stages: Initiation, Development, Maturity, Dissemination
Evaluation lifecycle stages: Process & Response, Change, Comparison & Control, Generalizability
(The original chart also marks a band of “Evaluation Special Projects.”)

Phase IA: Is the program in its initial implementation(s)?
Evaluation: Process assessment and post-only evaluation of participant reactions and satisfaction.

Phase IB: Is the program in revision or reimplementation?
Evaluation: Post-only assessment of outcomes, implementation assessment, outcome measurement development, and assessment of internal consistency (reliability).

Phase IIA: Is the program being implemented consistently?
Evaluation: Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement.

Phase IIB: Does the program have formal written procedures/protocols?
Evaluation: Matched pretest and posttest of outcomes; verify reliability and validity of change; human subjects review.

Phase IIIA: Is the program associated with change in outcomes?
Evaluation: Controls and comparisons (control groups, control variables, or statistical controls).

Phase IIIB: Does the program have evidence of effectiveness?
Evaluation: Controlled experiments or quasi-experiments (randomized experiments; regression-discontinuity) for assessing program effectiveness.

Phase IVA: Is the effective program being implemented in multiple sites?
Evaluation: Multi-site analysis of integrated large data sets over multiple waves of program implementation.

Phase IVB: Is the evidence-based program being widely distributed?
Evaluation: Formal assessment across multiple program implementations that enables general assertions about the program in a wide variety of contexts (e.g., meta-analysis).
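
One way to read the alignment chart above is as a simple lookup from program lifecycle phase to the evaluation designs appropriate at that point. The sketch below is only an illustration of that reading (the phase labels and descriptions paraphrase the chart; the function and data structure are not part of the protocol materials):

```python
# Illustrative lookup from program lifecycle phase to the evaluation designs
# aligned with that phase, paraphrasing the alignment chart above.
EVALUATION_BY_PHASE = {
    "IA":   "Process assessment; post-only evaluation of participant reactions and satisfaction",
    "IB":   "Post-only assessment of outcomes; implementation assessment; measure development and reliability",
    "IIA":  "Unmatched pre/post of outcomes; qualitative assessment of change; reliability and validity checks",
    "IIB":  "Matched pre/post of outcomes; verify reliability and validity of change; human subjects review",
    "IIIA": "Controls and comparisons (control groups, control variables, statistical controls)",
    "IIIB": "Controlled or quasi-experiments (randomized experiment, regression-discontinuity)",
    "IVA":  "Multi-site analysis of integrated large data sets over multiple implementation waves",
    "IVB":  "Formal assessment across many implementations (e.g., meta-analysis)",
}

def recommended_evaluation(phase: str) -> str:
    """Return the evaluation designs aligned with a program lifecycle phase."""
    return EVALUATION_BY_PHASE.get(phase.upper(), "Unknown phase")

print(recommended_evaluation("IIa"))
```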

Page 27: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Determining Evaluation Scope

It’s all about making GOOD CHOICES…

• What kind of evaluation is appropriate for the program lifecycle stage?

• What are the key outcomes this program should be attaining?

• What do important stakeholders care most about?

• What will “work best” in this kind of program?

• What kind of evaluation is feasible for this year? What should wait until a future year?

Page 28: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Determining Evaluation Scope

[Schematic: a generic pathway model with Activities feeding Outputs, which feed Short-Term, Middle-Term, and Long-Term Outcomes. The evaluation scope is drawn by marking key outcomes, components, key links, and the key pathway, informed by stakeholder priorities (ranked 1, 2, 3) and internal priorities.]
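
The scope diagram combines stakeholder rankings with internal priorities to single out the key outcomes and pathways worth evaluating this year. Purely as an illustration of that weighting idea (the outcome names, rankings, and scoring rule below are invented, not taken from the slides or the protocol), one could tally priorities like this:

```python
# Illustrative sketch: combine stakeholder rankings (1 = highest priority)
# with an internal priority to shortlist outcomes for this year's evaluation scope.
# All outcome names and rankings are hypothetical.
stakeholder_rankings = {
    "Youth feel they belong to the group": [1, 2, 1],  # rankings from three stakeholder groups
    "Youth gain science skills":           [2, 1, 3],
    "Youth enter STEM careers":            [3, 3, 2],
}
internal_priority = {
    "Youth feel they belong to the group": 1,
    "Youth gain science skills":           2,
    "Youth enter STEM careers":            3,
}

def scope_score(outcome: str) -> float:
    """Lower score = higher combined priority (simple average of all rankings)."""
    ranks = stakeholder_rankings[outcome] + [internal_priority[outcome]]
    return sum(ranks) / len(ranks)

# Outcomes listed from highest to lowest combined priority
for outcome in sorted(stakeholder_rankings, key=scope_score):
    print(f"{scope_score(outcome):.2f}  {outcome}")
```

In practice these choices are judgment calls made with partners, but writing the rankings down side by side makes the trade-offs explicit.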

Page 29: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Comments from another Evaluation Partner…

Linda Schoffel

Rural Youth Services Program Coordinator

CCE – Tompkins County (CCETC)

Page 30: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Using the Pathway Model for making evaluation choices – RYS Rocketry Program

Page 31: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Evaluation should “fit” the program …(lifecycle, stakeholders, context, etc.)

RYS Rocketry Program, CCE-Tompkins County
Pathway Model, October 2009

Do youth who participate in RYS Rocketry feel like they are part of the group? (belonging)

Page 32: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

“Swimming against the Tide”

The most frequently cited challenge to program evaluation is lack of time.

The systems approach involves spending a lot of time before you even get to the point of choosing measures…

Programs often face significant pressure for more evaluation, and for evidence of “impact” …

The systems approach argues, essentially, that “less is more” if the evaluation truly “fits” the program

Page 33: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Making it feasible for the long-term

Key ingredients that help:

• focusing on the most valuable elements (choosing well)

• identifying interim benefits of the process
• integrating with other needs
• building on others’ progress
• sharing resources

Page 34: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Wrapping Up …

Thank you!

Any questions for any of us, before returning to Bill…?

For follow-up questions later, Monica Hargraves: [email protected]

Shawn Smith: [email protected]
Linda Schoffel: [email protected]

Also see our website at http://core.human.cornell.edu/

