Planning Evaluation: Setting the Course

Source: Thompson & McClintock (1998). Demonstrating your program’s worth. Atlanta: CDC/National Center for Injury Prevention & Control. Online at www.cdc.gov/ncipc/pub-res/dypw/dypw.pdf?file=%2F2%2Fmodules%2Fmodule01.swf
Transcript
Page 1

Planning Evaluation: Setting the Course

Source: Thompson & McClintock (1998). Demonstrating your program’s worth. Atlanta: CDC/National Center for Injury Prevention & Control. Online at www.cdc.gov/ncipc/pub-res/dypw/dypw.pdf?file=%2F2%2Fmodules%2Fmodule01.swf

Page 2

What’s going on?

If these youth were part of a 4-H program, how would you show evidence for program quality and outcomes?

What would they (or their parents, teachers, or peers) tell you about their experience?

Page 3

Why Evaluate?

• Brainstorm reasons for evaluating programs

Page 4

Reasons to Evaluate

• Prove (scientists “show evidence”)

– Program impact (school/college/career success)

– Program outcomes (knowledge, attitudes, skills, aspirations)

– Program quality (best practices)

• Improve: guidance to reach audience

• Approve: feedback for staff

Page 5

Rationale for Evaluation

• Demonstrate solid evidence for success

• Allow other programs to learn

• Monitor ongoing quality and outcomes

Page 6

Summing up evaluation

“…the process of determining whether a program or certain aspects of a program are appropriate, adequate, effective, or efficient, and if not, how to make them so.”

Source: Thompson & McClintock (1998). Demonstrating your program’s worth. Atlanta: CDC/National Center for Injury Prevention & Control. Online at www.cdc.gov/ncipc/pub-res/dypw/dypw.pdf?file=%2F2%2Fmodules%2Fmodule01.swf

Page 7

Evaluation may bring more than you expected

• People talk…and feel good that you listen

• You talk…stakeholders and media listen

• Problems become opportunities

• Programs are sometimes ‘better than expected’

Page 8

Begin with the end in mind

• Clear and definite objectives

• Distinctive target population

• Straightforward indicators of success

• Evaluation integrated with programming

• Appropriate, well-tested methods and tools

• Comparison data (population, control)

• Information about process and quality

Page 9

Shakespeare evaluates

• Stage 1: Formative (Implementation). Is it in place?

• Stage 2: Formative (Process/Progress). Is it serving the target audience?

• Stage 3: Summative (Outcome). Is it getting results?

• Stage 4: Summative (Impact). Is it building results?

Page 10

Formative: Implementation

Is the project being implemented according to plan? (e.g., participant selection and involvement; activities and strategies; adjustments matching the program plan; capable staff members hired, trained, and well managed; materials and equipment ready; timelines maintained; appropriateness of personnel; and development and fulfillment of the management plan)

Page 11

Formative: Progress

Is the project progressing toward planned results? (e.g., participant progress on key indicators; activities and strategies fostering that progress)

Page 12

Program Fidelity

• How can you say that changes in youth knowledge, attitudes, skills, or aspirations result from your program rather than some external factor?

Page 13

Program Fidelity Keys

• Document pre- and post-project scores (see the sketch below)

• Monitor best practices and youth progress via

– External observers

– Youth participant feedback
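
As a rough illustration of the first key, here is a minimal sketch of how documented pre- and post-project scores could be summarized next to a comparison group. The participants, scores, field names, and group labels are all hypothetical, not taken from the source deck; the idea is simply that a gain among program youth beyond the comparison group’s gain supports attributing change to the program rather than to outside factors.

```python
# Hypothetical pre/post scores on one indicator (e.g., reading attitude, 1-10 scale).
# Every name and number here is illustrative only.
participants = [
    {"id": "P1", "group": "program",    "pre": 4, "post": 7},
    {"id": "P2", "group": "program",    "pre": 5, "post": 8},
    {"id": "P3", "group": "program",    "pre": 6, "post": 6},
    {"id": "C1", "group": "comparison", "pre": 5, "post": 5},
    {"id": "C2", "group": "comparison", "pre": 4, "post": 5},
]

def mean_change(records, group):
    """Average post-minus-pre change for one group."""
    changes = [r["post"] - r["pre"] for r in records if r["group"] == group]
    return sum(changes) / len(changes)

program_gain = mean_change(participants, "program")
comparison_gain = mean_change(participants, "comparison")

# A program gain noticeably larger than the comparison gain supports
# attributing the change to the program rather than to external factors.
print(f"Program gain:    {program_gain:.2f}")
print(f"Comparison gain: {comparison_gain:.2f}")
print(f"Difference:      {program_gain - comparison_gain:.2f}")
```

In practice the scores would come from the program’s own pre/post instruments, and the comparison group corresponds to the “comparison data (population, control)” item on the earlier “Begin with the end in mind” slide.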

Page 14

Summative: (Short-term) Outcomes

At the completion of each/all “units,” how have participants changed? (e.g., knowledge, attitudes, skills, aspirations)

Page 15

Summative: (Long-term) Impacts

As a result of program participation, what profound changes occurred in a youth (family, community)? (e.g., behavior, application of program lessons)

Page 16

Outcome Expectations

• What kinds of changes are significant?

• How much change is enough?

• What if some participants don’t change?

• How long will changes “stick”?

Page 17

Answers on Outcome Expectations

• It depends

Page 18

Clarifying Expectations

• What kinds of changes are significant?

– Depends on the factor (e.g., attitude toward reading vs. reading comprehension)

– Depends on audience (e.g., competent readers vs. struggling readers)

– Depends on program (e.g., one-time/short-term vs. all year/all summer)

– Depends on context (e.g., stage/pace-appropriate vs. constrained or chaotic)

Page 19

Clarifying Expectations

• How much change is enough?

– Depends on the above (reality, research)

– Depends on funder expectations

…often a critical first step, or progress toward a goal, is a key indicator of continued success (think about staying up on your first bike).

Page 20

Clarifying Expectations

• What if some participants don’t change?

– See the above (clarify expectations first)

– Critically examine threshold criteria (e.g., minimal health, safety, and education goals vs. substantial or optimal improvement)

– Critically examine program potential (e.g., relative benefit for specific participants)

Page 21

Clarifying Expectations

• How long will changes “stick”?

– See the above (check research and reason)

– Depends on the nature of the change

• Interest in science or practice of healthy eating sustained through life (turning point)

• Increasing involvement and growth in ongoing programming (cumulative benefits)

Page 22

So where do we begin?

• Create a “logic model” that describes what results you want and how to get to them (see the sketch after this list)

• Check the research to see what others have learned

• Get to know your audience so that you know what results are relevant for them
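
Since the deck stops at “create a logic model,” here is a minimal sketch of what writing one down in a structured form might look like, using a hypothetical 4-H summer reading program. The categories (inputs, activities, outputs, outcomes, impacts) follow common logic-model practice; every entry and indicator below is a placeholder for illustration, not content from the source.

```python
# A hypothetical logic model for an imaginary 4-H summer reading program.
# The structure (inputs -> activities -> outputs -> outcomes -> impacts) is the
# point; every entry is a placeholder example, not prescribed content.
logic_model = {
    "inputs":     ["trained volunteers", "book sets", "meeting space"],
    "activities": ["weekly reading circles", "family read-at-home challenge"],
    "outputs":    ["10 sessions delivered", "25 youth enrolled"],
    "outcomes":   ["improved attitude toward reading",   # knowledge/attitudes/skills/aspirations
                   "more minutes read per week"],
    "impacts":    ["sustained reading habit", "school success"],
}

# Each planned outcome should map to at least one indicator you can actually
# measure (the "straightforward indicators of success" from the earlier slide).
indicators = {
    "improved attitude toward reading": "pre/post attitude survey score",
    "more minutes read per week": "weekly reading log",
}

for outcome in logic_model["outcomes"]:
    measure = indicators.get(outcome, "NO INDICATOR YET - revisit the evaluation plan")
    print(f"{outcome} -> {measure}")
```

Laying the model out this way makes gaps visible early: any outcome without an indicator is a sign that the evaluation is not yet integrated with the programming.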

