Page 1

Evaluation Theory and Practice Framework:

UCSF EPI246 Translating Evidence Into Practice:

Individual-Centered Implementation Strategies

March 10, 2011

Judith M. Ottoson, Ed.D., M.P.H.

Page 2

Evaluation Questions (evaluation practice)

• What is it?
• Of what use is this information?
• Is it good or bad? How do you know?

Page 3

Public Health Core Functions and Essential Services

Page 4

Evaluation is…

…the systematic assessment of the operation and/or outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy.

(Weiss, 1998, p. 4)

Page 5

Sources of Evidence: Evaluation & Research

Evaluation:
– intended for use
– program-derived questions
– judgmental quality
– action setting
– role conflicts
– not published
– multiple allegiances
– guts

Research:
– production of knowledge
– researcher-derived questions
– paradigm stance, e.g., neutral, curious, advocate
– more controlled setting
– clearer role
– published
– clearer allegiance

Source: Weiss, 1998

Page 6

Steps:
• Engage stakeholders
• Describe the program
• Focus the evaluation design
• Gather credible evidence
• Justify conclusions
• Ensure use and share lessons learned

Standards:
• Utility
• Feasibility
• Propriety
• Accuracy

Source: Framework for Program Evaluation in Public Health, MMWR, 1999

Page 7

The Linchpin

The fundamental purpose of evaluation theory is to specify feasible practices that evaluators can use to construct knowledge of the value of social programs that can be used to ameliorate the social problem to which programs are relevant.

Shadish, Cook, and Leviton, 1991, p. 36

Page 8

Theories of program evaluation (five components, framed by the standards of Utility & accountability, Feasibility, Propriety, and Accuracy):

Practice
– Engage the stakeholders
– Describe the program
– Ask questions
– Make values transparent
– Focus the design
– Gather credible evidence
– Justify conclusions
– Ensure use & lessons
– Manage the evaluation

Program
– Need/problem
– Structure
– Context
– Change process

Valuing
– Criteria of success
– Standards of success
– Who decides?

Knowledge
– "Real" knowledge
– Design: who? when?
– Methods
– Analysis

Use
– Kinds of use
– By when & who
– Reporting
– Dissemination

Adapted from: Shadish, Cook, & Leviton, Foundations of Program Evaluation, 1991

Page 9

(Repeat of the roadmap slide from Page 8: theories of program evaluation, with the components Practice, Program, Valuing, Knowledge, and Use framed by the standards.)

Page 10

Program as Evaluand

• What is the problem or need?
• What is the "program"?
  – policy, program, project, component, element
  – before, during, after
• Internal process and structure
• External context
• Change process – levers of change

Page 11

Types of Program Failure

• Successful program: the program set in motion the "causal process," which led to the desired effect.
• Program (theory) failure: the program set in motion the "causal process," which did not lead to the desired effect.
• Implementation failure: the program did not set in motion the "causal process," which would have led to the desired effect.

Source: Weiss, 1972

Page 12

Basic Logic Model

• Input: resources and/or barriers that enable or limit program effectiveness, e.g., funds, people, supplies, equipment
• Activities/process: activities, techniques, tools, events, actions, technology, e.g., products, services, infrastructure
• Output: size & scope of services; dose, e.g., # materials, participation rate, # hours
• Outcome (short-term): changes in attitudes, behavior, knowledge, skills, status, level of functioning
• Outcome (long-term): changes in organization, community, and/or system, e.g., improved condition, capacity, policy changes

Adapted from: W.K. Kellogg Foundation, Logic Model Development Guide: http://www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf
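The five columns of the basic logic model form an ordered chain from resources to long-term change. The slides contain no code; purely as an illustrative sketch, the hypothetical Python snippet below (all field names and example entries are invented) stores one list of entries per column and prints the chain in order.

from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    # One list of free-text entries per column of the basic logic model.
    inputs: List[str] = field(default_factory=list)              # resources and/or barriers
    activities: List[str] = field(default_factory=list)          # what the program does
    outputs: List[str] = field(default_factory=list)             # size & scope of services ("dose")
    short_term_outcomes: List[str] = field(default_factory=list) # changes in people
    long_term_outcomes: List[str] = field(default_factory=list)  # changes in organizations/systems

    def describe(self) -> str:
        # Render the model as a left-to-right chain, one column per line.
        columns = [
            ("Input", self.inputs),
            ("Activities/process", self.activities),
            ("Output", self.outputs),
            ("Outcome (short-term)", self.short_term_outcomes),
            ("Outcome (long-term)", self.long_term_outcomes),
        ]
        return "\n".join(f"{name}: {'; '.join(items) or '(none)'}" for name, items in columns)

# Invented example entries, loosely echoing the teaching slide above.
model = LogicModel(
    inputs=["funds", "trained staff", "teaching materials"],
    activities=["small-group education sessions"],
    outputs=["# sessions held", "participation rate"],
    short_term_outcomes=["improved self-care knowledge"],
    long_term_outcomes=["reduced length of stay"],
)
print(model.describe())

Keeping each column as a plain list mirrors how logic models are usually drafted with stakeholders: columns are filled in before any causal arrows are argued over.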

Page 13

Development Strategies for Logic Models

• "But how?" questions: reverse logic
  – Start with distal effects
  – Ask: how to generate that effect
  – Create downstream of proximal effects & activities connected to needs
• "But why?" questions: forward logic
  – Start with needs
  – Ask: so what happens next to create distal effects
  – Create upstream of activities and proximal effects connected to distal effects
• "But how?" & "But why?" questions: middle-road logic
  – Start with activities or outputs
  – Ask: what will create downstream activities that connect to needs
  – Ask: but how will these activities or outputs create upstream distal effects
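As a purely illustrative sketch of the "reverse logic" questioning strategy (not from the original slides), the hypothetical Python snippet below starts from a distal effect and repeatedly applies stakeholder answers to "But how?" until the chain bottoms out in an activity; reading the result in the opposite direction gives the forward ("But why?") chain. All answers and names are invented.

from typing import Dict, List

# Invented stakeholder answers to repeated "But how?" questions:
# each effect is mapped back to the thing that is supposed to produce it.
but_how_answers: Dict[str, str] = {
    "reduced length of stay": "earlier patient mobilization",
    "earlier patient mobilization": "pre-surgery patient education",
    "pre-surgery patient education": "small-group teaching sessions",
}

def reverse_logic(distal_effect: str, answers: Dict[str, str]) -> List[str]:
    # Start with the distal effect and keep asking "But how?" until no answer remains.
    # Assumes the answers contain no cycles.
    chain = [distal_effect]
    while chain[-1] in answers:
        chain.append(answers[chain[-1]])
    return chain

chain = reverse_logic("reduced length of stay", but_how_answers)
print(" <- ".join(chain))            # distal effect traced back to an activity
print(" -> ".join(reversed(chain)))  # the same chain read as forward ("But why?") logic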

Page 14

Example Logic Model: Rapid Access Disc Herniation (RADH) Program

Needs:
• Patient: common medical problem with great impact, resulting in ↓ quality of life (QOL) and loss of work
• Provincial Health Authority (PHA): save $$, ↓ length of stay (LOS), ↓ waiting list

Inputs:
• Teaching resources: CD, pamphlet, helpline/contact number; small-group discussion; paramedical interview and teaching (nurses, PT)
• Practice: empowerment opportunity with involvement; planning care and activities after surgery
• Resources: OR suited for minimally invasive surgery (MIS); day surgery unit; help line
• Surgeons: knowledge, training, expertise with MIS
• Administration: support/attitude, resources ($$, equipment), facility
• Referral from GPs/specialists

Outputs:
• # packages distributed, # procedures performed, # interviews with provider
• # patients participating, # surveys, # follow-ups, # discharges

Outcomes:
• Patient: ↑ QOL (pain reduction, improved activities); early return to work (RTW)
• PHA: ↓ LOS, ↓ waiting list, ↓ $$

Impact:
• Reallocation of hospital resources; Center of Excellence; raise standard of disc treatment to national level

Page 15

(Repeat of the roadmap slide from Page 8: theories of program evaluation, with the components Practice, Program, Valuing, Knowledge, and Use framed by the standards.)

Page 16

Focusing the Evaluation by Asking Questions

• Ask guiding (key) questions about the evaluand
  – Based on the logic model
  – Program level, not individual level
• Questions are something multiple stakeholders can engage with, unlike developing "measures," writing objectives, or designing the study
• Questions become the guide to measures, analysis, & reporting
• One question may cover multiple measures: group evaluation results by the forest (questions), not the trees (measures)
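One practical way to keep the "forest" (guiding questions) ahead of the "trees" (measures) is to index every measure by the question it answers and report in that grouping. A minimal, hypothetical Python sketch follows; both the questions and the measures are invented for illustration.

# Invented guiding questions, each mapped to the measures that answer it.
evaluation_plan = {
    "Was the program delivered as intended?": [
        "# teaching packages distributed",
        "% of eligible patients attending the group session",
    ],
    "Did patients' short-term outcomes improve?": [
        "change in pain score at 6 weeks",
        "days to return to work",
    ],
}

# Report grouped by question (the forest), not measure by measure (the trees).
for question, measures in evaluation_plan.items():
    print(question)
    for measure in measures:
        print(f"  - {measure}")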

Page 17

Determining the Value of a Program or Policy

• What is valued?
• Who decides?
• Prescribe or describe values
• Making values transparent
• Valuing logic
  – dimensions (criteria) of worth/merit
  – standards of worth/merit
  – performance

Page 18

The logic of valuing

• Determine criteria of "success": the dimensions of the evaluand on which stakeholders…
  – …have questions
  – …identify as key, core, or essential
  – …are willing to hang evaluand success
  – Dimensions include: input, process, output, outcomes
  – Process examples: diversity, enrollees, materials, activities
  – Outcome examples: knowledge, behavior, jobs, scores, health status
• Set standards of "success"
  – how good performance on the criteria must be
  – examples: #, %, increase or decrease, spread
• Measure performance

(The criteria are the bar; the standard is how high the bar is set.)
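To make the criteria/standard/performance logic concrete, the hypothetical Python sketch below pairs each criterion (the bar) with a standard (how high) expressed as a threshold rule, then judges invented performance figures against it. None of the criteria, thresholds, or numbers come from the slides; they are illustrative only.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Criterion:
    name: str                                 # dimension of the evaluand ("the bar")
    standard: str                             # human-readable statement of "how high"
    meets_standard: Callable[[float], bool]   # rule applied to measured performance

criteria = [
    Criterion("participation rate (%)", "at least 75% of eligible patients enroll", lambda x: x >= 75),
    Criterion("mean pain reduction (points)", "at least a 2-point drop on a 10-point scale", lambda x: x >= 2),
    Criterion("mean length of stay (days)", "no more than 2 days", lambda x: x <= 2),
]

# Invented performance figures for each criterion.
performance = {
    "participation rate (%)": 81.0,
    "mean pain reduction (points)": 1.6,
    "mean length of stay (days)": 1.8,
}

for c in criteria:
    observed = performance[c.name]
    verdict = "standard met" if c.meets_standard(observed) else "standard not met"
    print(f"{c.name}: observed {observed}; standard: {c.standard}; {verdict}")

Separating the criterion from its standard makes the "who decides?" question visible: stakeholders can agree on the bar while still debating how high it should be set.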

Page 19

(Repeat of the roadmap slide from Page 8: theories of program evaluation, with the components Practice, Program, Valuing, Knowledge, and Use framed by the standards.)

Page 20

Knowledge construction

• Is evaluation knowledge special?

• What counts as “real” evaluation knowledge to you? To others?

• What are feasible, ethical, useful, and accurate ways to construct knowledge?

Page 21

Evaluation Design Basics

• Who? (sample)

• When? (timing)

• What? (answer the questions)

• How? (data collection)
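The four design questions can be captured in a single planning record so that none of them is left implicit. A minimal, hypothetical Python sketch (every field value is invented):

from dataclasses import dataclass

@dataclass
class EvaluationDesign:
    who: str    # sample: whom will the data come from?
    when: str   # timing: before, during, after, or repeated waves?
    what: str   # the guiding question(s) the design must answer
    how: str    # data collection: survey, record review, interview, ...

design = EvaluationDesign(
    who="all patients referred to the program over six months",
    when="baseline and six-week follow-up",
    what="Did short-term patient outcomes improve?",
    how="self-report survey plus chart review",
)
print(design)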

Page 22

(Repeat of the roadmap slide from Page 8: theories of program evaluation, with the components Practice, Program, Valuing, Knowledge, and Use framed by the standards.)

Page 23

Evaluation Use

• Kinds of use
  – instrumental
  – conceptual
• Who uses & when
• Facilitators and obstacles to use

Page 24

The Program Evaluation Standards

• Key features
  – Standards identify and define evaluation quality
  – They guide evaluators and evaluation users in the pursuit of evaluation quality
  – Not "laws," but voluntary, consensus standards
• Revised 2011
  – Clarifications
  – Now a fifth standard: evaluation accountability
• Trade-offs among standards

Page 25

Standards of Program Evaluation (number of individual standards in each group shown in parentheses)

• Utility -- The utility standards support high quality evaluation use through attention to all aspects of an evaluation (8)

• Feasibility -- The feasibility standards encourage evaluation to be effective and efficient. (4)

• Propriety -- The propriety standards are intended to ensure that an evaluation will be proper, fair, legal, right, acceptable, and just. (7)

• Accuracy -- Accuracy is the truthfulness of evaluation representations, propositions, and findings, which is achieved through sound theory, methods, designs, and reasoning. (8)

• Evaluation Accountability -- Documenting and improving evaluation accountability requires similar efforts to those required for program accountability, i.e., an evaluation of the evaluation (metaevaluation) (3)

Page 26

Evaluation Standards: Describing the Program

• U2: attention to stakeholders

• F2: Practical procedures

• F3: Contextual viability

• P1: Responsive & inclusive orientation

• P5: Transparency & disclosure

• P6: Conflicts of interest

• A2: Valid information

• A3: Reliable information

• A4: Explicit program & context descriptions

• A7: Explicit evaluation reasoning

• A8: Communication & reporting

• E1: Evaluation documentation

Source: The Program Evaluation Standards, 2011

Page 27

The Evaluation Debates

Page 28

Evaluation Practice Debates

• Whether to evaluate? Who wants the evaluation and why?

• Do evaluators need to be content experts and/or evaluation experts?

• What are the advantages and disadvantages of being an internal vs external evaluator?

• Who counts as a stakeholder for you? Do you hold bias towards any stakeholders?

• What debates arise over evaluation design, data collection, analysis, and reporting?

• Are evaluation and research the same?

Page 29

Program Debates

• What counts as the evaluand, e.g., process, structure, context, people, planning?

• How do programs relate to policy (landscape) and components (portrait)?

• What assumptions guide the program? How is it supposed to work?

• What is the relationship between type of evaluation & stage of program development/ stability?

• What factors influence incremental vs radical program change? What are the levers of change?

Page 30

Valuing Debates

• What kinds of values determine program value, e.g., instrumental, terminal?

• Where are the values embedded, e.g. needs, goals, problems?

• How might the use of prescriptive or descriptive values influence the evaluation?

• Who decides on evaluation criteria and standards, e.g. evaluator? stakeholders?

• Should evaluators synthesize or describe findings?

Page 31

Knowledge Construction Debates

• What counts as “real” knowledge to you? To other stakeholders?

• What is acceptable evidence of program success? To whom?

• Is the quantitative / qualitative debate over for everybody?

• Is it possible to combine these different understandings of knowledge construction?

Page 32

Evaluation Use Debates

• What counts as use of an evaluation: conceptual vs instrumental use?

• How much? By when? With what fidelity should evaluation findings be used?

• Who uses evaluation results?

• What are the evaluator’s responsibilities towards use?

Page 33

Key Evaluation Resources

• American Evaluation Association – www.eval.org
  – November 2-5, 2011, Anaheim
• San Francisco Bay Area Evaluators – www.sfbae.org/
• The Evaluators Institute – http://tei.gwu.edu
  – April 4-16, 2011

Page 34

References

• Centers for Disease Control and Prevention. (1999). Framework for program evaluation in public health. MMWR, 48(No. RR-11). http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
• Ottoson, J.M., & Martinez, D. (2010). An ecological understanding of evaluation use: A case study of the Active for Life evaluation. The Robert Wood Johnson Foundation. www.rwjf.org/pr/product.jsp?id=71148
• Ottoson, J.M., & Hawe, P. (Eds.). (Winter 2009). Knowledge utilization, diffusion, implementation, transfer, and translation: Implications for evaluation. New Directions for Evaluation, 124. San Francisco: Jossey-Bass.
• Shadish, W.R., Cook, T.D., & Leviton, L.C. (1991). Good theory for social program evaluation. In Foundations of program evaluation. Newbury Park: Sage.
• The Joint Committee on Standards for Educational Evaluation. (2011). The program evaluation standards (3rd ed.). Thousand Oaks: Sage.
• Weiss, C.H. (1998). Evaluation (2nd ed.). New Jersey: Prentice-Hall.
• W.K. Kellogg Foundation. (2004). Logic Model Development Guide. http://www.wkkf.org/knowledge-center/resources/2010/Logic-Model-Development-Guide.aspx

