Cornell University, Cornell Office for Research on Evaluation (CORE)
Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice
Monica Hargraves, PhD, Manager of Evaluation for Extension and Outreach, Cornell University
October 2009, [email protected]
Page 1:

Cornell University
Cornell Office for Research on Evaluation (CORE)

Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice

Monica Hargraves, PhD

Manager of Evaluation for Extension and Outreach

Cornell University

October 2009
[email protected]

Page 2:

Brief outline

1. What we do (The “Evaluation Partnership” approach)

2. Key steps in the training (with stories from extension program partners)

3. “Swimming against the tide…”

4. Making it feasible and sustainable

Page 3:

Evaluation Partnerships

Evaluation Partnerships
– CORE provides training, brings evaluation expertise
– Partners bring experience, expertise in their programs, their communities, their “systems”

Planning Phase is a one-year commitment, with intentions and clarity of roles captured in an MOU

Page 4:

What the EP entails, in the “planning year”

Stages:
• Preparation for Partnership (Jan – March)

• Modeling (intensive!) (April – June)

• Evaluation Planning (July – Oct/Nov)

Formats this year:
• Two in-person, full-day training meetings
• Web-conferences
• Listserve, e-mail, phone support

Page 5:

2006: NYC

History of the Project within Cornell Cooperative Extension (CCE)

Page 6:

2007: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, Ulster

History of the Project within Cornell Cooperative Extension (CCE)

Page 7:

2009: Chemung, Chenango, Clinton, Cortland, Franklin, Fulton & Montgomery, Genesee, Jefferson, Madison, Monroe, Oneida, Ontario, Oswego, Rensselaer, Saratoga, Seneca, Tioga, Tompkins, Ulster, Wayne

History of the Project within Cornell Cooperative Extension (CCE)

Page 9:

“Systems Evaluation Protocol”: Planning Phase

Page 10:

Stakeholder Analysis

CCE-Jefferson 4-H Dairy Program: Stakeholder Map

Dairy Program Parents, FFA Teachers, Youth, Jefferson County Legislatures, 4-H Members, National Dairy Industry, Jefferson County Fair Board, Funders, CCE Staff, Breed Associations, Local Ag Businesses, Surrounding County Youth, NYS 4-H, SUNY Morrisville / Cobleskill, Other Youth Programs, JCADCA, Jefferson County Dairy Producers, Media, Volunteers, CCE Board of Directors, State Fair, NYS Jr. Holstein Association, Local School Districts, Cornell University, Taxpayers

Page 11:

Developing Stakeholder charts

Page 12:

Stakeholder Analysis … why it matters:

Page 13:

Logic Model Development

Page 14:

Quick “poll” on formal modeling …

Think of programs you are evaluating, or wish to evaluate.

How many of those have a written-down model (Logic Model, or something similar)?

A – all
B – many
C – some
D – few
E – none

Page 15:

Focus on Activities, Outputs, and Outcomes

Page 16:

Make connections (create “links”)

Page 17:

Pathway Model Development

4-H “SET-To-Go” (an after-school science program), CCE-Cortland County

Pathway Model, October 2009

Page 18:

“Mining the Model”
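For those who think in data structures: a pathway model like the SET-To-Go one referenced on the previous slide is essentially a directed graph running from activities through outputs to short-, middle-, and long-term outcomes, and “mining” it can be as simple as tracing every route that reaches a key outcome. A minimal sketch in Python (not part of the SEP materials or the Netway tool; the node names are invented for illustration, with ST/MT/LT marking short-, middle-, and long-term outcomes):

```python
# Minimal sketch: a pathway model as a directed graph, with a helper that
# "mines" it by listing every route from an activity to a chosen outcome.
# Node names are illustrative only, not from an actual CCE program model.

from collections import defaultdict

# links: source node -> list of nodes it leads to
links = defaultdict(list)
for src, dst in [
    ("rocket-building sessions", "rockets built"),            # activity -> output
    ("rockets built", "interest in science (ST)"),             # output -> short-term outcome
    ("interest in science (ST)", "science coursework (MT)"),   # short-term -> middle-term
    ("science coursework (MT)", "STEM career path (LT)"),      # middle-term -> long-term
    ("group launches", "sense of belonging (ST)"),
    ("sense of belonging (ST)", "sustained participation (MT)"),
    ("sustained participation (MT)", "STEM career path (LT)"),
]:
    links[src].append(dst)

def pathways_to(target, node, path=()):
    """Yield every pathway from `node` that ends at `target`."""
    path = path + (node,)
    if node == target:
        yield path
    for nxt in links.get(node, []):
        yield from pathways_to(target, nxt, path)

for start in ("rocket-building sessions", "group launches"):
    for p in pathways_to("STEM career path (LT)", start):
        print(" -> ".join(p))
```

Each printed pathway is a candidate evaluation question: does the earlier node in the chain actually lead to the later one?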

Page 19:

Comments from an Evaluation Partner…

Shawn Smith

4-H Issue Area Leader & Evaluation Project Manager

CCE – Cortland County (CCECC)

Page 20:

CCECC Advisory Committee Input

Page 21:

Internal Stakeholder Analyses

Page 22:

CCECC Pathway Model: “Final” to date

Page 23:

Note: You can do Pathway model visuals without the Netway!

Page 24:

Program Life Cycle (diagram): Impact versus Time, across the stages initiation, growth, maturity, and transformation.

Source: Program Leadership Certification, “Accountability and Evaluation” PowerPoint, Michael Duttweiler

Page 25:

Quick Poll on Program Lifecycles

Think about a program you are evaluating or are going to be evaluating

What lifecycle stage is it in?

A – early development, pilot
B – still revising/tweaking
C – implemented consistently
D – consistent across sites/facilitators and documented
E – well-established, stable, candidate for replication

Page 26:

Program & Evaluation Alignment

Program Lifecycle stages: Initiation, Development, Maturity, Dissemination. Evaluation Lifecycle phases: Phase I (Process & Response), Phase II (Change), Phase III (Comparison & Control), Phase IV (Generalizability), plus Evaluation Special Projects.

Phase IA – Is program in initial implementation(s)? Process assessment and post-only evaluation of participant reactions and satisfaction.

Phase IB – Is program in revision or reimplementation? Post-only assessment of outcomes, implementation assessment, outcome measurement development and assessment of internal consistency (reliability).

Phase IIA – Is program being implemented consistently? Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement.

Phase IIB – Does program have formal written procedures/protocol? Matched pretest and posttest of outcomes; verify reliability and validity of change; human subjects review.

Phase IIIA – Is program associated with change in outcomes? Controls and comparisons (control groups, control variables, or statistical controls).

Phase IIIB – Is effective program being implemented in multiple sites? Controlled experiments or quasi-experiments (randomized experiment; regression-discontinuity) for assessing program effectiveness.

Phase IVA – Does program have evidence of effectiveness? Multi-site analysis of integrated large data sets over multiple waves of program implementation.

Phase IVB – Is evidence-based program being widely distributed? Formal assessment across multiple program implementations that enables general assertions about this program in a wide variety of contexts (e.g., meta-analysis).
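Read as data, the alignment above is an ordered list of phase, guiding question, and typical evaluation approach, and one simple way to use it is to take the latest question a program team can answer “yes” to. A minimal sketch (an illustration only, not an official SEP instrument; the pairings follow the ordering on this slide, and the “latest yes” rule is a simplification for illustration):

```python
# Sketch: the alignment table as data, plus one simplistic way to read it.
# The latest lifecycle question answered "yes" suggests the phase and the
# evaluation scope to match. Illustration only, not the SEP's own instrument.

PHASES = [
    ("IA",   "Is program in initial implementation(s)?",
             "Process assessment; post-only evaluation of participant reactions and satisfaction."),
    ("IB",   "Is program in revision or reimplementation?",
             "Post-only assessment of outcomes; implementation assessment; measure development and reliability."),
    ("IIA",  "Is program being implemented consistently?",
             "Unmatched pre/post of outcomes; qualitative assessment of change; reliability and validity of measures."),
    ("IIB",  "Does program have formal written procedures/protocol?",
             "Matched pre/post of outcomes; verify reliability and validity of change; human subjects review."),
    ("IIIA", "Is program associated with change in outcomes?",
             "Controls and comparisons (control groups, control variables, or statistical controls)."),
    ("IIIB", "Is effective program being implemented in multiple sites?",
             "Controlled experiments or quasi-experiments (randomized; regression-discontinuity)."),
    ("IVA",  "Does program have evidence of effectiveness?",
             "Multi-site analysis of integrated large data sets over multiple implementation waves."),
    ("IVB",  "Is evidence-based program being widely distributed?",
             "Formal assessment across many implementations (e.g., meta-analysis)."),
]

def suggested_alignment(yes_answers):
    """Return (phase, suggested approach) for the latest question answered 'yes'."""
    phase, approach = "IA", PHASES[0][2]          # default: a brand-new program
    for code, _question, scope in PHASES:
        if code in yes_answers:
            phase, approach = code, scope
    return phase, approach

# Example: implemented consistently and written up, but no outcome evidence yet.
print(suggested_alignment({"IIA", "IIB"}))        # -> ('IIB', 'Matched pre/post ...')
```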

Page 27:

Determining Evaluation Scope

It’s all about making GOOD CHOICES…

• What kind of evaluation is appropriate for the program lifecycle stage?

• What are the key outcomes this program should be attaining?

• What do important stakeholders care most about?

• What will “work best” in this kind of program?

• What kind of evaluation is feasible for this year? What should wait until a future year?

Page 28:

Determining Evaluation Scope (pathway model diagram): activities and outputs feeding short-, middle-, and long-term outcomes, with stakeholder and internal priority rankings (1–3) used to identify the key outcomes, key links, and key pathway that define the evaluation scope.
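One way to picture that scoping step: overlay the priority rankings on the model’s outcomes and keep the best-ranked ones for this year, then evaluate the links and pathways that feed them. A minimal sketch, with invented outcome names, rankings, and cutoff (none of these come from the slides, and the real choice also weighs lifecycle stage and feasibility, per the questions on the previous slide):

```python
# Sketch: combine stakeholder and internal priority rankings (1 = highest
# priority) to pick "key outcomes" for this year's evaluation scope.
# Outcome names, rankings, and the cutoff are invented for illustration.

stakeholder_rank = {"sense of belonging (ST)": 1,
                    "interest in science (ST)": 2,
                    "STEM career path (LT)": 3}
internal_rank    = {"sense of belonging (ST)": 2,
                    "interest in science (ST)": 1,
                    "STEM career path (LT)": 3}

def key_outcomes(max_combined_rank=3):
    """Outcomes whose combined rank is low enough to evaluate this year."""
    combined = {o: stakeholder_rank[o] + internal_rank[o] for o in stakeholder_rank}
    keep = [o for o, r in combined.items() if r <= max_combined_rank]
    return sorted(keep, key=lambda o: combined[o])

print(key_outcomes())
# -> ['sense of belonging (ST)', 'interest in science (ST)']
```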

Page 29:

Comments from another Evaluation Partner…

Linda Schoffel

Rural Youth Services Program Coordinator

CCE – Tompkins County (CCETC)

Page 30:

Using the Pathway Model for making evaluation choices – RYS Rocketry Program

Page 31:

Evaluation should “fit” the program …(lifecycle, stakeholders, context, etc.)

RYS Rocketry Program, CCE-Tompkins County
Pathway Model, October 2009

Do youth who participate in RYS Rocketry feel like they are part of the group? (belonging)

Page 32:

“Swimming against the Tide”

The most frequently cited challenge to program evaluation is lack of time.

The systems approach involves spending a lot of time before you even get to the point of choosing measures…

Programs often face significant pressure for more evaluation, and for evidence of “impact” …

The systems approach argues, essentially, that “less is more” if the evaluation truly “fits” the program

Page 33:

Making it feasible for the long-term

Key ingredients that help:

• focusing on the most valuable elements (choosing well)

• identifying interim benefits of the process
• integrating with other needs
• building on others’ progress
• sharing resources

Page 34:

Wrapping Up …

Thank you!

Any questions for any of us, before returning to Bill…?

For follow-up questions later, Monica Hargraves: [email protected]

Shawn Smith: [email protected]
Linda Schoffel: [email protected]

Also see our website at http://core.human.cornell.edu/

