Wilder Research
Evaluation 101 Workshop
With Nicole MartinRogers, Ph.D.
Great Lakes Culture Keepers conference
@ the Mille Lacs Indian Museum and Trading Post
April 28, 2015
Introductions and framing questions
Evaluation overview
Logic models
(lunch break)
Evaluation plans
Collecting data
Using evaluation results to improve programs
Evaluation in the real world
Agenda
Introduce yourself, the organization you are from, and answer any or all of the following:
What are your biggest concerns about evaluation?
What is your best evaluation experience?
What is one thing you hope to learn from today’s session?
Framing questions
Evaluation Overview
Why we evaluate
What it is
The evaluation process
Terminology and approaches
Considerations and limitations
Because I said so!
Why evaluate?
Ongoing learning and program improvement
Guide programming decisions
Assess effectiveness, identify best practices
Demonstrate program impact
Meet funder requirements and seek funding
Generate support for the program
Recognize a job well done
Why evaluate?
Systematic process
Collects information about questions/issues
Improves knowledge and decision-making
Asks questions about issues that come from your everyday practices
What is evaluation?
Developmental evaluation
Program implementation (process)
Satisfaction
Program impact (outcomes)
Approaches to evaluation
Approaches to evaluation (cont.)
Initiative is innovating and in development: Exploring, Creating, Emerging
Initiative is forming and under refinement: Improving, Enhancing, Standardizing
Initiative is stabilizing and well-established: Established, Mature, Predictable
Implementers are experimenting with different approaches and activities.
There is a degree of uncertainty about what will work, where, and with whom.
New questions, challenges, opportunities, successes, and activities continue to emerge.
Try Developmental Evaluation
Core elements of the initiative are taking shape.
Implementers are refining their approach and activities.
Outcomes are becoming more predictable.
The context is increasingly well known and understood.
Try Formative or Process Evaluation
The initiative’s activities are definable and established, and do not change significantly as time passes.
Implementers have significant experience with (and an increasing sense of certainty) about what works.
The initiative is ready for a determination of merit, worth, value, or significance.
Try Summative or Outcomes Evaluation
Engaging the community
Cultural metaphors
Ways of knowing
Core cultural values
Telling a story (and knowing your audience)
Responsive information gathering
Looking to our gifts (“strengths-based”)
Interpreting and sharing the information
*Adapted from LaFrance & Nichols (2009)
An Indigenous framework for evaluation*
Evaluation process
Stakeholders are:
– Organizations and individuals who care about the program and/or the evaluation findings
– In general, anyone who has something to gain or lose from the program
Not everyone can be or has to be at the table
– Stakeholders can be engaged in different ways and at different levels
Engaging stakeholders in evaluation
Logic models
Program theory
Terminology
Uses
Cultural metaphors
Activity
Fill in the Blanks
Program theory
You are here.
You need to be here.
What needs to happen to get from here to there?
• IF the activity/program is provided THEN what should be the result (impact) for participants?
• What ACTIVITIES need to happen, and in what INTENSITY and DURATION, for participants to experience the desired OUTCOME?
• What EVIDENCE do you have that this activity/program will lead to the desired result?
Build consensus and clarity about essential program activities and outcomes
Identify opportunities for program improvement
Be clear about beliefs and assumptions that underlie program design
Promote evidence-based thinking
Avoid scope creep or mission drift
Evaluate your impact
Logic models help you…
Inputs: any resources or materials used by the program to provide its activities
Activities: any services or programs provided by the program
Outputs: any quantifiable documentation of the activities of a program
Elements of a logic model
Outcomes: any characteristics of the participants that, according to the theory and goals of the services, can be reasonably expected to change as a result of the participant’s receiving services
A common progression is:
– Short-term outcomes: Changes in knowledge or awareness
– Intermediate outcomes: Behavioral changes
– Long-term outcomes: Sustained behavior change, individual impacts, community impacts
Elements of a logic model (cont.)
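For anyone who wants to organize a logic model digitally, the elements above can be sketched as a simple data structure. This is an illustrative example only; the entries are loosely based on the Maple Sap Harvest Workshop used later in this training, not an actual program plan:

```python
# A minimal sketch of a logic model as a Python dict.
# All entries are hypothetical examples for illustration.
logic_model = {
    "inputs": ["2 instructors", "workshop space", "sap harvesting equipment"],
    "activities": ["hands-on maple sap harvest workshop"],
    "outputs": ["number of workshops held", "number of participants"],
    "outcomes": {
        "short_term": ["increased knowledge of sap harvesting"],
        "intermediate": ["participants harvest sap on their own"],
        "long_term": ["sustained cultural practice in the community"],
    },
}

# Print the model as a plain-language outline.
for element, content in logic_model.items():
    print(element.upper())
    if isinstance(content, dict):
        for timeframe, items in content.items():
            print(f"  {timeframe}: {', '.join(items)}")
    else:
        for item in content:
            print(f"  - {item}")
```

Writing the model out this way makes the inputs-to-outcomes chain explicit and easy to revise during team working sessions.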
To what extent are the activities likely to create change given dosage?
Which users are likely to be impacted?
What other factors may influence whether change occurs?
A logic model should provide a realistic statement of outcomes that are likely to occur given the inputs and activities, as well as the circumstances of participants and the context within which the program occurs.
Outcomes: consider the following
Program staff: state in plain language how and why you are doing what you are doing and what you expect to come from it
– Working sessions to identify key inputs, activities, and outcomes can be great team-building exercises
Participants: ask them what they got out of the program and how it could be improved
Literature and other programs: the field and prior research can tell you what outcomes can be expected from your program (and strengthen your model)
How to build a logic model
Logic models
A few examples
Lunch break!
Enjoy your lunch
See you back here at 1:00!!
Logic model activity
Maple Sap Harvest Workshop
Communicate to key stakeholders:
– Board
– Funders
– Staff and volunteers
– Participants
Share it on your website, in your annual report, on social media, in evaluation reports, in funding requests
Convert your logic model to narrative using evaluation language (“outcomes”)
Design your evaluation plan (part 2 of this training)
Use your logic model to…
Evaluation plans
Evaluation questions
Methods
Community engagement
Activity
The evaluation plan is where you document your evaluation questions and your specific plan for gathering, analyzing, and reporting the data to answer those questions.
Documentation of these steps is critical for accountability and to avoid evaluation scope creep.
Why you need an evaluation plan
What is it you want to learn about your program?
“To what extent does (what we do) affect/change (a behavior or characteristic)?”
Evaluation questions
Was the program implemented as intended? Were the targeted participants served (number and type)? Which aspects of the program worked well? Which aspects were problematic? Why?
** Consider setting specific output targets for the activities in your logic model (some funders may require this)**
Process issues – what happened?
Prioritize your process evaluation questions based on how much the answer to the question will…
Influence participant outcomes or satisfaction
Concern staff members or other key stakeholders
Help with planning or improvement decisions
Add contextual understanding of the program
Process issues (cont.)
Do elements of participant satisfaction make a difference in positive outcomes?
Will you be able to do anything with your satisfaction results? Or is it beyond your resources or control?
Are there key stakeholders whose satisfaction will influence your program’s sustainability?
Satisfaction – what do people think?
Did the program achieve the desired outcomes?
** Use your logic model to determine which outcomes to measure in which timeframe AND to demonstrate which outcomes you do not need to measure because the link has already been demonstrated in the literature **
Outcomes – what changed?
Prioritize your evaluation questions based on which outcomes will be the most . . .
Important to participants
Important to other stakeholders, including funders
Useful in understanding success and guiding improvements
Achievable
Feasible to measure (last but not least!)
Outcomes (cont.)
Evaluation plan activity
Developing evaluation questions
Valid (measuring what you want to measure)
Reliable (consistent measurement)
Culturally responsive questions and approach (give gifts)
Ethical and legal
Useful for multiple purposes
Sensitive to change
Easy to use
Accessible
Relevant
Focused
Measure things right!
Qualitative data (word information):
– Interviews
– Focus groups
Analyzed by:
– Grouping data by key themes
Quantitative data (numerical information):
– Closed-ended surveys
– Administrative data
Analyzed by:
– Calculating/computing data
Methods
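The two analysis styles can be sketched in a few lines of code. This is a rough illustration with made-up data; the ratings, comments, and theme keywords are all hypothetical, and real qualitative coding is far more careful than keyword matching:

```python
from collections import Counter

# Quantitative: closed-ended satisfaction ratings on a 1-5 scale (made-up data).
ratings = [5, 4, 4, 3, 5, 4]
average = sum(ratings) / len(ratings)
print(f"Average satisfaction: {average:.1f} (n={len(ratings)})")

# Qualitative: open-ended comments grouped by hypothetical key themes.
comments = [
    "Loved the hands-on practice",
    "Wish we had more time for practice",
    "The cultural teachings were the highlight",
]
themes = {"hands-on practice": "practice", "cultural teachings": "cultural"}
theme_counts = Counter(
    theme
    for comment in comments
    for theme, keyword in themes.items()
    if keyword in comment.lower()
)
print(theme_counts)
```

The quantitative half reduces numbers to summary statistics; the qualitative half groups word data under themes, mirroring the "grouping data by key themes" approach above.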
Primary data sources – information collected specifically for your evaluation:
– Surveys
– Interviews
– Focus groups
– Administrative/program data
Secondary data sources – data that have already been collected about the target population:
– Existing population-level data sets
– Previous research and program evaluations
Data sources
Surveys
– Paper, Web, Phone
Interviews
– Informal, Semi-structured, Structured
Focus groups and talking circles
Observation
Interactive observation
Administrative/program records
Data collection
Interactive observation example – Wordle
Survey Monkey and other online survey tools
– But most of these tools won’t help you design a good survey; they just make it easier to administer
– Check out: https://www.surveymonkey.com/mp/survey-guidelines/
Focus groups: – http://www.eiu.edu/~ihec/Krueger-FocusGroupInterviews.pdf
Wordle:
– http://www.wordle.net/
Data collection resources
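A Wordle-style word cloud is, at its core, just a word-frequency count of open-ended responses. A minimal sketch, with illustrative responses and a made-up stop-word list:

```python
import re
from collections import Counter

# Hypothetical open-ended responses gathered during interactive observation.
responses = [
    "The workshop helped me understand evaluation",
    "Great workshop, I learned about logic models",
    "Evaluation feels less intimidating now",
]

# Common words to exclude so meaningful terms stand out (illustrative list).
stop_words = {"the", "me", "i", "about", "now", "feels", "less"}

words = [
    w
    for response in responses
    for w in re.findall(r"[a-z]+", response.lower())
    if w not in stop_words
]

# In a word cloud, the most frequent words would appear largest.
print(Counter(words).most_common(3))
```

Tools like Wordle do this counting for you and render the result visually; the value for evaluation is seeing at a glance which ideas participants mention most.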
Using evaluation results to improve your program
Data-based decision-making
Continuous quality improvement and a learning culture
What are the most important findings?
How should the program be improved, if at all?
What things about the program worked really well? How great are the outcomes? How strong is the evidence?
Do the results lead to additional questions about the program?
Who should see the results? In what format?
*Helpful hint: You need to take time to do these things as part of your evaluation plan
Using evaluation for program improvement
Evaluation use activity
Analyzing, interpreting, and using evaluation data
Evaluation in the real world
Considerations for doing evaluation in-house
Working with professional evaluators
Evaluation resources
What skills do you and your staff have?
What additional skills might you need?
What is your budget for evaluation?
Do you and your staff have time to spend on the evaluation work in addition to your other tasks?
How can you use volunteers in your evaluation?
How can you incorporate evaluation data collection into standard operating procedures?
Considerations
What specific skills or expertise will add value?
– Topical/program expertise?
– Experience with similar grants?
– Cultural knowledge?
What capacity does the contractor have?
– Do you want an independent contractor or a larger evaluation firm?
What is your budget for evaluation?
Who will manage and guide the work of the contractor?
Working with professional evaluators
www.wilderresearch.org – find evaluation tip sheets, evaluation reports on a variety of topics, and more!
LaFrance, J., & Nichols, R. (2009). Indigenous Evaluation Framework: Telling Our Story in Our Place and Time. American Indian Higher Education Consortium.
Visitor Studies Association (VSA): http://visitorstudies.org/resources
Institute of Museum and Library Services (IMLS): http://www.imls.gov/research/evaluation_resources.aspx
Evaluation resources
Wrap-up
Evaluation & feedback on this session
Next steps
What do you think of when you think of evaluation? What has changed after today’s session?
What do you hope to get out of today? Is there anything that was not covered that you still want to learn about evaluation?
What went well today?
Was there anything that could have been improved?
Full circle: follow-up questions
Which of these tools, if any, do you plan to use in your work?
What additional support would you need to be able to use these tools?
What additional evaluation training or resources do you plan to seek out in the future?
Anything else?
Next steps
Chi’miigwech!
Nicole MartinRogers, Ph.D.
Senior Research Manager
WILDER RESEARCH
Follow me on Twitter @NMartinRogers
www.wilderresearch.org