Page 1: Program Evaluation

Program Evaluation

Page 2: Program Evaluation

Program evaluation

Applies the methodological techniques of the social sciences to social policy and public welfare administration.

Page 3: Program Evaluation

Evaluation

Formative – helps form the program

Ongoing assessment to improve implementation

Outcome – assesses effects after the fact

Page 4: Program Evaluation

Needs Assessment

Program Theory Assessment

Process Evaluation

Outcome Evaluation

Efficiency Assessment

Page 5: Program Evaluation

Needs assessment

Who needs the program?

How great is the need?

What might work to meet the need?

What resources are available?

Page 6: Program Evaluation

“Evaluability” assessment

Is an evaluation feasible?

How can stakeholders shape its usefulness?

Page 7: Program Evaluation

Structured Conceptualization

Define the program or technology.

Define the target population.

Define possible outcomes.

Page 8: Program Evaluation

Process Evaluation

Investigates the process of delivery and alternatives.

Summative – summarizes the effects

Page 9: Program Evaluation

Implementation evaluation

Monitors the fidelity of delivery

Page 10: Program Evaluation

Outcome Evaluations

Demonstrable effects on defined targets.

Page 11: Program Evaluation

Impact evaluation

Assesses the net effects, intended and unintended, of the program as a whole

Page 12: Program Evaluation

Cost-effectiveness / cost-benefit analysis

Examines efficiency by standardizing outcomes in dollar costs and values.

Page 13: Program Evaluation

Secondary analysis

Examines existing data to address new questions or apply different methods.

Page 14: Program Evaluation

Meta analysis

Integrates outcomes across studies to reach a summary judgment.

Page 15: Program Evaluation

Meta-analysis

Analysis of analyses

Summarizes a body of work

Replication is good, but replications can yield inconsistent results

Page 16: Program Evaluation

Useful for:

1) Clarifying inconsistencies

2) Program evaluation

3) Review work

4) Broadly framed questions

Page 17: Program Evaluation

Replication    Treatment    Control    Diff
Exp 1          22           19          3
Exp 2          20           18          2
Exp 3          23           17          6
Exp 4          15           16         -1

Page 18: Program Evaluation

• Sampling error

• Error in measurement

• Systematic error

• Vote counting: "3 in 4 studies show…"

• Or averaging: mean difference = (3 + 2 + 6 - 1) / 4 = 2.5 (computed in the sketch below)

• (averaging across replications tends to cancel out experimental errors)
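
Both summaries are easy to compute. A minimal Python sketch using the diffs from the table on Page 17 (standard library only):

```python
# Treatment-minus-control differences from the four replications above.
diffs = [3, 2, 6, -1]

# Vote counting: how many replications favor the treatment?
favorable = sum(1 for d in diffs if d > 0)
print(f"{favorable} in {len(diffs)} studies show a positive effect")  # 3 in 4

# Averaging: the unweighted mean difference across replications.
mean_diff = sum(diffs) / len(diffs)
print(f"Mean difference = {mean_diff}")  # (3 + 2 + 6 - 1) / 4 = 2.5
```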

Page 19: Program Evaluation

Replication         Treatment    Control    Diff
Exp 1 (n = 10)      22           19          3
Exp 2 (n = 10)      20           18          2
Exp 3 (n = 15)      23           17          6
Exp 4 (n = 1000)    15           16         -1

Page 20: Program Evaluation

Replication    Treatment    Control    Diff    Significance
Exp 1          22           19          3      p < 0.05
Exp 2          20           18          2      p < 0.05
Exp 3          23           17          6      p < 0.05
Exp 4          15           16         -1      p < 0.001

Page 21: Program Evaluation

• Pooled raw data: the 35 people in Exps 1–3 are swamped by the 1000 in Exp 4…

• A single large study can overpower the pooled data

• Statistics based on large N tend to be more reliable – but only if the study is valid

• Meta-analysis tends to decrease random and systematic errors

Page 22: Program Evaluation

• What if studies are not replications but variations on a theme…

• Exp 1 uses a scale from 1-5
• Exp 2 uses a scale from 1-100

               Treatment    Control    Difference
Exp 1          500          400        100
Exp 2          24           22         2

Average difference = 51 ???

Page 23: Program Evaluation

• Average difference = 51 ???

               Treatment    Control    Difference    Effect size (d)
Exp 1          500          400        100           0.50
Exp 2          24           22         2             0.67

Average d = 0.58 (see the sketch below)
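
Effect sizes do the rescaling. A minimal sketch of Cohen's d, where the pooled standard deviations (200 and 3) are hypothetical values chosen so the output reproduces the d column above:

```python
def cohens_d(mean_treatment: float, mean_control: float, pooled_sd: float) -> float:
    """Standardized mean difference: the raw difference rescaled by the pooled SD."""
    return (mean_treatment - mean_control) / pooled_sd

# Hypothetical pooled SDs chosen to reproduce the d values on this slide.
d1 = cohens_d(500, 400, pooled_sd=200)  # 100 / 200 = 0.50
d2 = cohens_d(24, 22, pooled_sd=3)      # 2 / 3 ≈ 0.67
print(f"d1 = {d1:.2f}, d2 = {d2:.2f}, average d = {(d1 + d2) / 2:.2f}")  # 0.58
```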

Page 24: Program Evaluation

What is summarized?

1) Count studies for and against: does not give magnitude and has low power

2) Combine significance levels (illustrated in the sketch after this list)

3) Combine effect sizes (an effect size gives the magnitude of the relationship between two variables)

Advantages:
a) Increases sample size and power
b) Increases internal validity: soundness of conclusions about the relationship
c) Increases external validity: generalizability to other places, people, etc.
d) Shows an effect, even a small one, if it is consistent
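
Option 2 can be illustrated with Stouffer's method, one standard way of combining significance levels. A sketch assuming SciPy is available, with hypothetical one-tailed p-values:

```python
from scipy.stats import norm

def stouffer_combined_p(p_values):
    """Stouffer's method: convert one-tailed p-values to z-scores,
    sum them, rescale by sqrt(k), and convert back to a p-value."""
    z_scores = [norm.isf(p) for p in p_values]  # isf(p): z with upper-tail area p
    z_combined = sum(z_scores) / len(z_scores) ** 0.5
    return norm.sf(z_combined)

# Hypothetical one-tailed p-values from four replications.
print(stouffer_combined_p([0.04, 0.03, 0.01, 0.20]))
```

Note that the combined p-value still says nothing about magnitude, which is why combining effect sizes (option 3) is the usual choice.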

Page 25: Program Evaluation

• Synthesis is a better estimate of effect size

• If effect is real and consistent it will be detected

• BUT it is limited by the original studies

Page 26: Program Evaluation

Steps in meta-analysis

1) Formulate the question

2) Collect previous studies

3) Evaluate and code

4) Analyze and interpret

5) Present the results

Page 27: Program Evaluation

Data Sources

Study Selection

Data Abstraction

Statistical Analysis

Page 28: Program Evaluation

Data Sources

1. Computer searches

2. Cross-referencing

3. Hand-searching

4. Expert(s) to review list

Page 29: Program Evaluation

Study Selection

1. Study designs

2. Subjects

3. Publication types

4. Languages

5. Interventions

6. Time frame

Page 30: Program Evaluation

• Need to establish criteria for inclusion

• E.g., if evaluating a school reading program, maybe it is effective only for younger children…

• Determine an acceptable age cut-off.

• Or run separate analyses for the two groups

• Or use age as a moderating factor

Page 31: Program Evaluation

Data Abstraction

1. Number of items coded

2. Inter-coder bias

3. Items coded

Page 32: Program Evaluation

Coding…

Are all studies the same?

One has N = 10, another has N = 1000…

Different DV scales: a 1-5 scale vs. a 500-point scale

How flawed is acceptable? Do we include a study if we suspect it has a confound?

Publication bias…

Page 33: Program Evaluation

Statistical Analysis

1. Choice of metric

2. Choice of model / heterogeneity

3. Publication bias

4. Study quality

5. Moderator analysis

Page 34: Program Evaluation

Choice of Metric

Original metric

Standardized mean difference (mean difference / standard deviation)

Choice of Model / Heterogeneity

Fixed effects – assumes the current group of studies is all there is to explain (a pooling sketch follows below)

Random effects – assumes this group is a random sample from all possible studies
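
A sketch of how a fixed-effect model pools studies: inverse-variance weighting, with Cochran's Q as the usual heterogeneity check. The effect sizes and variances here are hypothetical:

```python
# Hypothetical per-study standardized mean differences and their sampling variances.
effects = [0.50, 0.67, 0.40]
variances = [0.04, 0.09, 0.02]

# Fixed effect: weight each study by the inverse of its variance,
# so precise (large-n) studies count for more.
weights = [1 / v for v in variances]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se_pooled = (1 / sum(weights)) ** 0.5

# Cochran's Q: large values signal heterogeneity, pointing to a random-effects model.
q = sum(w * (d - pooled) ** 2 for w, d in zip(weights, effects))

print(f"pooled d = {pooled:.3f} (SE = {se_pooled:.3f}), Q = {q:.2f}")
```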

Page 35: Program Evaluation

Publication Bias

Graphical methods (e.g., funnel plots; a sketch follows below)

Quantitative methods

Study Quality

a. Difficult to assess

b. Interpret with caution

c. Numerous scales and checklists available
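
The usual graphical method is a funnel plot: each study's effect size against its standard error, with asymmetry suggesting that small unfavorable studies went unpublished. A minimal matplotlib sketch with hypothetical data:

```python
import matplotlib.pyplot as plt

# Hypothetical study-level results: effect sizes and their standard errors.
effect_sizes    = [0.62, 0.55, 0.48, 0.41, 0.70, 0.30, 0.52]
standard_errors = [0.05, 0.08, 0.12, 0.15, 0.20, 0.22, 0.10]

plt.scatter(effect_sizes, standard_errors)
plt.gca().invert_yaxis()           # precise studies plotted at the top, by convention
plt.axvline(0.50, linestyle="--")  # reference line at a (hypothetical) pooled estimate
plt.xlabel("Effect size (d)")
plt.ylabel("Standard error")
plt.title("Funnel plot")
plt.show()
```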

Page 36: Program Evaluation

Moderator Analysis

a. Categorical analysis

b. Regression analysis (a sketch follows below)

Allows for explanation of effects
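
Regression-based moderator analysis is typically a meta-regression: regress the effect sizes on the moderator, weighting each study by the inverse of its variance. A sketch using statsmodels; every number here is hypothetical:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-study data: effect size, its sampling variance,
# and a candidate moderator (e.g., mean age of the sample).
d   = np.array([0.60, 0.55, 0.40, 0.25, 0.20])
var = np.array([0.02, 0.03, 0.02, 0.04, 0.03])
age = np.array([8, 10, 12, 15, 17])

# Weighted least squares with inverse-variance weights: does the moderator
# explain variation in effect sizes across studies?
X = sm.add_constant(age)
result = sm.WLS(d, X, weights=1 / var).fit()
print(result.params)  # intercept and slope: change in d per year of age
```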

Page 37: Program Evaluation

• Meta-analysis compared to a narrative review

• Which is objective, and which is subjective?

Page 38: Program Evaluation

The Contingent Smile: A Meta-Analysis of Sex Differences in Smiling

M. LaFrance, M. A. Hecht, & E. Levy Paluck

Psychological Bulletin, 2003, Vol. 129, No. 2, 305–334

Page 39: Program Evaluation

Based on 20 published studies, the effect size (d) she [J. A. Hall] reported was a moderate 0.63. In a follow-up report, J. A. Hall and Halberstadt (1986) added seven new cases and reported a somewhat lower weighted effect size of 0.42.

Page 40: Program Evaluation

We included in our meta-analysis unpublished studies such as conference papers and theses, as well as previously unanalyzed data that were not included in their prior meta-analysis.

Second, we explored the influence of several moderators derived from work in other areas of sex difference research.

Page 41: Program Evaluation

The third goal for the present meta-analysis was to conduct a more fine-grained analysis of several moderators previously considered by J. A. Hall and Halberstadt (1986).

Page 42: Program Evaluation

Method

• Retrieval of Studies

• We searched the empirical literature for studies that documented a quantitative relationship between sex and smiling, even if that relationship was not the central one of the investigation.

• Along with published articles, unpublished materials such as conference papers, theses, dissertations, and other unpublished papers were included. This was done to counter the publication bias toward positive results.

