Page 1

EPIC Evaluation: Measuring Community Change

Cynthia Matthias, Program Evaluator, College of Education and Human Development, University of Minnesota

Mike Greco, Director, Resilient Communities Project, University of Minnesota

Page 2

Overview

Ripple Effects Mapping (REM): what it is, why it’s useful in this context

Resilient Communities Project (RCP) evaluation findings

Using evaluation findings to inform the evolution of RCP

Page 3

Why did RCP do this?

We wanted to answer some questions about the program:

What project-specific and community-wide changes result from RCP partnerships?

What’s working well from partner perspectives?

What could we do to improve the partnership experience for communities?

Is the program making good on its promises to partner communities?

What goals is the program achieving?

Are these the goals we want to achieve?

Page 4

Evaluation Approach & Rationale

Interrogating our program model using:

● Bottom-up approach: partners tell us what benefits/outcomes they realize

● Reflection: are these the outcomes we want to see?

Developmental evaluation approach: incorporate ongoing feedback into program processes in real time

● Observe, propose, test, repeat

● Not formative or summative evaluation

● What are the standards for “measuring merit, value, and worth”?

Page 5

Methods: Surfacing Community Outcomes

Method: Ripple Effects Mapping (reflecting on work, collecting data)

● Participatory group process with 6-14 people in a room

● Pair-share using a short interview, report back to the group

● Facilitators frantically record, move comments around, organize

● “But for…” principle

Approach: Appreciative Inquiry

● Pair-share interview questions focus on outcomes

● Used seven community capitals as a heuristic device

● Did some projects make leaps forward?

● Also asked participants to air grievances

Page 6

Overall Data Collection Process

REM Sessions (worked with REM expert from UMN Extension)

● One session for each partner community (4 total, 6-8 attendees each)

● Communities chosen based on how much time had passed since their partnership

Post-Session Follow-Ups

● Emailed interview questions to people who couldn’t attend

● Follow-up interviews to get more depth

Generate a map during the REM session

Note: We used XMind software, but a low-tech version with Post-its and flipcharts would also work (a digital sketch follows below).
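A ripple-effect map is essentially a tree: core themes branch into first-order effects, which branch into further ripples. The Python sketch below is a minimal illustration of that structure, not the tooling we actually used (that was XMind), and the theme and ripple labels are hypothetical placeholders rather than RCP data; it builds a nested map and writes it out as a Graphviz DOT file for later rendering.

```python
# Minimal sketch: a ripple-effect map as a nested dict, exported to Graphviz DOT.
# Render afterwards with: dot -Tpng map.dot -o map.png
# All theme/ripple labels are hypothetical placeholders, not RCP data.

ripple_map = {
    "New stormwater policy": {                            # core theme from pair-share
        "Council adopted ordinance": {                     # first-order ripple
            "Staff time budgeted for implementation": {},  # downstream ripple
        },
    },
    "Community engagement events": {
        "New resident advisory group": {},
    },
}

def to_dot(tree, parent=None, lines=None):
    """Walk the nested dict and emit one DOT edge per parent-child link."""
    if lines is None:
        lines = ["digraph ripple_map {", '  rankdir="LR";']
    for node, children in tree.items():
        if parent is not None:
            lines.append(f'  "{parent}" -> "{node}";')
        to_dot(children, node, lines)
    if parent is None:
        lines.append("}")
    return "\n".join(lines)

with open("map.dot", "w") as f:
    f.write(to_dot(ripple_map))
```

The same nesting can just as easily be kept on Post-its during the session and transcribed afterward.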

Page 7
Page 8
Page 9

Making Sense of Outcomes for RCP

Recontextualizing REM data for RCP program improvement

● Read REM responses and thought about categories of outcomes

● Generated (as a team) new themes for organizing outcomes

● Used an iterative process to name outcomes

● Mapped outcomes backward to program activities

● Generated an “outcome map” to visualize how RCP works
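As one illustration of the "mapped outcomes backward to program activities" step, outcome themes and the activities participants tied them to can be kept in a simple lookup table and then inverted to see which activities appear to feed which outcomes. The snippet below is a hypothetical sketch: the outcome labels echo themes named later in this deck, but the activity labels are invented for illustration and are not drawn from the RCP data.

```python
# Hypothetical sketch of tracing outcome themes back to program activities.
# Activity labels are invented for illustration; they are not RCP evaluation data.
from collections import defaultdict

# Each outcome theme lists the program activities participants connected it to.
outcome_to_activities = {
    "Strengthened relationships": ["project scoping meetings", "student presentations"],
    "Reframed issues": ["faculty consultations", "final student reports"],
    "Saved money / cost-effective": ["final student reports"],
}

# Invert the table: for each activity, which outcome themes does it appear to feed?
activity_to_outcomes = defaultdict(list)
for outcome, activities in outcome_to_activities.items():
    for activity in activities:
        activity_to_outcomes[activity].append(outcome)

for activity, outcomes in sorted(activity_to_outcomes.items()):
    print(f"{activity}: {', '.join(outcomes)}")
```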

Page 10
Page 11

“Pros” of the Process

Group discussion surfaces lots of information

Structure helps participants think about outcomes systematically

Participants connected ideas and events they hadn’t before

Sessions produced a tangible reminder of the partnership

⇢ validation of all the work partner communities did

Sessions give partner communities a chance to “air grievances”

Helped RCP recognize consistent challenges across communities

Generated metadata about projects

Page 12

“Cons” of the Process

It’s hard to get a bunch of busy people in the room at the same time

You only hear about the projects that are represented in the room

Time is short, and revelations are incomplete

Staff dynamics have a strong impact on the success of these sessions

Page 13

What We Learned

Page 14

Impacted Municipal Programs, Policies, & Processes

Adopting new policies

Undertaking new programs and initiatives

Incorporating new design ideas

Hiring new staff

Page 15

Strengthened Relationships

Establishing ongoing relationships between city/county staff and the University of Minnesota

Strengthening interagency relationships

Improving communication with residents and businesses

Page 16

Changed Staff Attitudes and Perceptions

Working across silos

New perspectives and ideas bring new energy and motivation

Shifting focus to devote time to neglected issues

Page 17

Reframed Issues

Rethinking problems and strategies

Refocusing community goals

Validating existing community initiatives

Building momentum for ongoing projects

Page 18

Created Space to Consider New Ideas

Floating unpopular or “radical” ideas

Raising awareness of an issue

Creating opportunities to engage with the public

Page 19

Saved Money/Was Cost-Effective

Monetary savings from process efficiencies and cost-beneficial solutions

Laying groundwork to engage professionals more effectively/efficiently

Providing professional-level assistance more cheaply (sometimes at the expense of consultants)

Ultimately, a good return on investment

Page 20

Challenges

Workload associated with RCP projects

Projects that lead to “less useful” results

Projects assigned to unwilling/unmotivated staff

Staff turnover resulting in lack of continuity/follow-through

Information overload

Page 21

What We Learned

Validated many of the “benefits” claimed for the EPIC model (...and highlighted contingencies that can intervene)

A lot of impactful program work happens before students begin work on projects (the pre-work can make or break a partnership)

Many outcomes involve changes in how staff see, think about, and do things

Some staff found the demands of the partnership overwhelming

Some partners were unable to fully capitalize on the partnership once the formal relationship ended (free kittens vs. free beer)

Page 22

Building the Plane While Flying It:

Responding to What We Learned

Page 23

Potential Program Changes

Scale and length of partnerships (“deep immersion”)

Pre-partnership preparation

● Community involvement (listening session? charrette? advisory committee?)

● "Resiliency” assessment

● Asset mapping

● Project lead orientation/training (job descriptions?)

Non-course-based assistance

Page 24

Potential Program Changes

Program efficiencies (survey example)

Post-partnership processing

● Distill and aggregate findings

● Focused follow-up work on selected projects

● Prioritize projects for further action (longitudinal implementation plan)

Page 25

Assessment and Evaluation Changes

● Document feedback from project leads on the heels of presentations

○ Reaction to presentation and key takeaways

○ Next steps

○ Resources to assist (poster, project brief, community presentation…)

● Redesigned and streamlined end-of-semester surveys for students, faculty, and staff (what, so what, now what)

● Two-year follow-up with partners using REM/interviews

● Two-year retrospective survey to students and faculty to assess outcomes

● Regularly incorporate feedback to inform program operation/strategic plan

Page 26

Resources

Ripple Effects Mapping

Emery, M., Higgins, L., Chazdon, S., & Hansen, D. (2015). Using Ripple Effect Mapping to Evaluate Program Impact: Choosing or Combining the Methods That Work Best for You. Journal of Extension, 53(2), n2.

Kollock, D. H., Flage, L., Chazdon, S., Paine, N., & Higgins, L. (2012). Ripple effect mapping: A "radiant" way to capture program impacts. Journal of Extension, 50(5), 1-5.

Chazdon, S., Emery, M., Hansen, D., Higgins, L., & Sero, R. (2017). A Field Guide to Ripple Effects Mapping. https://conservancy.umn.edu/handle/11299/190639

Page 27

EPIC-N Evaluation and Assessment Resources

Elise Amel, Faculty Director, Sustainable Communities Partnership, University of St. Thomas

Marshall Curry, EPIC-N Program Associate, University of Oregon

Page 28

www.rcp.umn.edu

@RCPumn

[email protected]

www.epicn.org

@epicn.org

@EPICNtweet

Questions?

