Practical Meta-Analysis

David B. Wilson

Evaluators’ Institute

July 16-17, 2010


Overview of the Workshop

Topics covered will include:
- Review of the basic methods
  - Problem definition
  - Document retrieval
  - Coding
  - Effect sizes and computation
  - Analysis of effect sizes
  - Publication bias
- Cutting-edge issues
- Interpretation of results
- Evaluating the quality of a meta-analysis


Forest Plot from a Meta-Analysis of Correctional Boot-Camps

[Figure: forest plot of odds-ratios from roughly 40 boot-camp evaluations (e.g., Harer & Klein-Saffran, 1996; Jones & Ross, 1997; Mackenzie & Souryal, 1994; Fl. Dept. of JJ, 1996-1997; NY DCS, 2000; T3 Associates, 2000), with the overall mean odds-ratio in the bottom row. The horizontal axis is labeled "Favors Comparison" on the left and "Favors Bootcamp" on the right.]
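A display like this is straightforward to produce in code. The sketch below draws a minimal forest plot with matplotlib; the study names, odds-ratios, and confidence limits are hypothetical stand-ins, not the boot-camp data.

    import matplotlib.pyplot as plt
    import numpy as np

    # Hypothetical study results: odds-ratio with 95% CI bounds
    # (illustrative values only, not the boot-camp data above).
    studies = ["Study A", "Study B", "Study C", "Overall mean"]
    odds    = [1.40, 0.85, 1.10, 1.08]
    lo      = [0.90, 0.60, 0.70, 0.95]
    hi      = [2.20, 1.20, 1.75, 1.25]

    y = np.arange(len(studies))[::-1]  # plot top-to-bottom
    plt.errorbar(odds, y,
                 xerr=[np.subtract(odds, lo), np.subtract(hi, odds)],
                 fmt="s", color="black", capsize=3)
    plt.axvline(1.0, linestyle="--", color="gray")  # line of no effect
    plt.xscale("log")  # log scale centers OR = 1 between the two directions
    plt.yticks(y, studies)
    plt.xlabel("Odds-ratio (log scale): Favors Comparison | Favors Bootcamp")
    plt.tight_layout()
    plt.show()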


The Great Debate

1952: Hans J. Eysenck concluded that there were no favorable effects of psychotherapy, starting a raging debate

20 years of evaluation research and hundreds of studies failed to resolve the debate

1978: To prove Eysenck wrong, Gene V. Glass statistically aggregated the findings of 375 psychotherapy outcome studies

Glass (and colleague Smith) concluded that psychotherapy did indeed work

Glass called his method “meta-analysis”


The Emergence of Meta-analysis

Ideas behind meta-analysis predate Glass’ work by several decades.

Karl Pearson (1904)
- Averaged correlations for studies of the effectiveness of inoculation for typhoid fever

R. A. Fisher (1944)
- “When a number of quite independent tests of significance have been made, it sometimes happens that although few or none can be claimed individually as significant, yet the aggregate gives an impression that the probabilities are on the whole lower than would often have been obtained by chance” (p. 99).
- Source of the idea of cumulating probability values
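That idea survives today as Fisher’s method for combining p-values: X² = -2 Σ ln(pᵢ), compared against a chi-square distribution with 2k degrees of freedom. A minimal sketch, using hypothetical p-values from five independent tests:

    from math import log
    from scipy.stats import chi2

    # Hypothetical p-values: none individually significant at .05,
    # yet the aggregate is, exactly as Fisher described.
    p_values = [0.09, 0.12, 0.07, 0.20, 0.11]
    statistic = -2 * sum(log(p) for p in p_values)
    combined_p = chi2.sf(statistic, df=2 * len(p_values))
    print(f"X2 = {statistic:.2f}, combined p = {combined_p:.4f}")  # p ~ .015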


The Emergence of Meta-analysis (continued)

Ideas behind meta-analysis predate Glass’ work by several decades.

W. G. Cochran (1953)
- Discussed a method of averaging means across independent studies
- Laid out much of the statistical foundation that modern meta-analysis is built upon (e.g., inverse-variance weighting and homogeneity testing), as sketched below
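A minimal sketch of those two building blocks, applied to hypothetical effect sizes and sampling variances:

    import numpy as np
    from scipy.stats import chi2

    # Hypothetical effect sizes and their sampling variances.
    es = np.array([0.30, 0.10, 0.45, 0.25])
    v  = np.array([0.02, 0.05, 0.04, 0.03])

    w = 1.0 / v                           # inverse-variance weights
    mean_es = np.sum(w * es) / np.sum(w)  # weighted mean effect size
    se_mean = np.sqrt(1.0 / np.sum(w))    # standard error of the mean
    Q = np.sum(w * (es - mean_es) ** 2)   # homogeneity statistic
    df = len(es) - 1
    p_homog = chi2.sf(Q, df)              # Q ~ chi-square(k-1) if homogeneous

    print(f"Weighted mean = {mean_es:.3f} (SE = {se_mean:.3f})")
    print(f"Q = {Q:.2f}, df = {df}, p = {p_homog:.3f}")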


The Logic of Meta-analysis

Traditional methods of review focus on statistical significance testing.

Significance testing is not well suited to this task:
- It is highly dependent on sample size (see the sketch below)
- A null finding does not carry the same “weight” as a significant finding: a significant effect is a strong conclusion, while a nonsignificant effect is a weak conclusion

Meta-analysis focuses on the direction and magnitude of the effects across studies, not statistical significance:
- Isn’t this what we are interested in anyway?
- Direction and magnitude are represented by the effect size
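The sample-size dependence is easy to demonstrate. The sketch below tests the same hypothetical correlation (r = .25) at two sample sizes via the usual t transformation:

    from math import sqrt
    from scipy.stats import t as t_dist

    def correlation_p_value(r, n):
        """Two-tailed p-value for H0: rho = 0, via t = r*sqrt(n-2)/sqrt(1-r^2)."""
        t = r * sqrt(n - 2) / sqrt(1 - r ** 2)
        return 2 * t_dist.sf(abs(t), df=n - 2)

    for n in (30, 200):
        print(f"r = 0.25, N = {n}: p = {correlation_p_value(0.25, n):.4f}")
    # Same effect size, opposite conclusions:
    # N = 30  -> p ~ .18   (not significant)
    # N = 200 -> p ~ .0004 (highly significant)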


Illustration

Simulated data from 21 validity studies, taken from: Schmidt, F. L. (1996). Statistical significance testing and cumulative knowledge in psychology: Implications for training of researchers. Psychological Methods, 1, 115-129.

Table 1: 21 Validity Studies, N = 68 for Each

Study   Observed validity coefficient
  1     0.04
  2     0.14
  3     0.31 *
  4     0.12
  5     0.38 *
  6     0.27 *
  7     0.15
  8     0.36 *
  9     0.20
 10     0.02
 11     0.23
 12     0.11
 13     0.21
 14     0.37 *
 15     0.14
 16     0.29 *
 17     0.26 *
 18     0.17
 19     0.39 *
 20     0.22
 21     0.21

* p < .05 (two-tailed).
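The stars can be verified directly: with N = 68, a correlation is significant at the two-tailed .05 level only at roughly r ≥ .24, so exactly the eight coefficients of .26 or larger are starred. A sketch of the check, using the same t transformation as above:

    from math import sqrt
    from scipy.stats import t as t_dist

    def is_significant(r, n=68, alpha=0.05):
        """Two-tailed test of H0: rho = 0 for an observed correlation r."""
        t = r * sqrt(n - 2) / sqrt(1 - r ** 2)
        return 2 * t_dist.sf(abs(t), df=n - 2) < alpha

    coefficients = [0.04, 0.14, 0.31, 0.12, 0.38, 0.27, 0.15, 0.36, 0.20,
                    0.02, 0.23, 0.11, 0.21, 0.37, 0.14, 0.29, 0.26, 0.17,
                    0.39, 0.22, 0.21]
    for study, r in enumerate(coefficients, start=1):
        flag = "*" if is_significant(r) else ""
        print(f"Study {study:2d}: r = {r:.2f} {flag}")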


Illustration (Continued)

Table 2: 95% Confidence Intervals for Correlations From Table 1, N = 68 for Each

Study   Observed validity coefficient   95% CI lower   95% CI upper
  1     0.39                             0.19           0.59
  2     0.38                             0.18           0.58
  3     0.37                             0.16           0.58
  4     0.36                             0.15           0.57
  5     0.31                             0.09           0.53
  6     0.29                             0.07           0.51
  7     0.27                             0.05           0.49
  8     0.26                             0.04           0.48
  9     0.23                             0.00           0.46
 10     0.22                            -0.01           0.45
 11     0.21                            -0.02           0.44
 12     0.21                            -0.02           0.44
 13     0.20                            -0.03           0.43
 14     0.17                            -0.06           0.40
 15     0.15                            -0.08           0.38
 16     0.14                            -0.09           0.37
 17     0.14                            -0.09           0.37
 18     0.12                            -0.12           0.36
 19     0.11                            -0.13           0.35
 20     0.04                            -0.20           0.28
 21     0.02                            -0.22           0.26

(In Table 2 the studies are sorted and re-numbered by the size of the coefficient, so the study numbers do not correspond to those in Table 1.)
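The tabled bounds appear to follow the simple normal approximation SE = (1 - r²)/√(N - 1) rather than a Fisher z-based interval; under that assumption, the sketch below reproduces them:

    from math import sqrt

    def ci_95(r, n=68):
        """95% CI for r via the normal approximation SE = (1 - r^2)/sqrt(n - 1).
        (This matches Table 2; a Fisher z interval would differ slightly.)"""
        se = (1 - r ** 2) / sqrt(n - 1)
        return r - 1.96 * se, r + 1.96 * se

    lo, hi = ci_95(0.39)
    print(f"r = 0.39: [{lo:.2f}, {hi:.2f}]")  # -> [0.19, 0.59], as in Table 2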


When Can You Do Meta-analysis?

Meta-analysis is applicable to collections of research that:
- Are empirical, rather than theoretical
- Produce quantitative results, rather than qualitative findings
- Examine the same constructs and relationships
- Have findings that can be configured in a comparable statistical form (e.g., as effect sizes, correlation coefficients, odds-ratios, proportions)
- Are “comparable” given the question at hand


Forms of Research Findings Suitable to Meta-analysis

Central tendency research
- Prevalence rates

Pre-post contrasts
- Growth rates

Group contrasts
- Experimentally created groups
  - Comparison of outcomes between treatment and comparison groups
- Naturally occurring groups
  - Comparison of spatial abilities between boys and girls
  - Rates of morbidity among high and low risk groups


Forms of Research Findings Suitable to Meta-analysis (continued)

Association between variables
- Measurement research
  - Validity generalization
- Individual differences research
  - Correlation between personality constructs


Effect Size: The Key to Meta-analysis

The effect size makes meta-analysis possible:
- It is the “dependent variable”
- It standardizes findings across studies such that they can be directly compared


Effect Size: The Key to Meta-analysis (continued)

Any standardized index can be an “effect size” (e.g., standardized mean difference, correlation coefficient, odds-ratio) as long as it:
- Is comparable across studies (generally requires standardization)
- Represents the magnitude and direction of the relationship of interest
- Is independent of sample size

Different meta-analyses may use different effect size indices.
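As one concrete instance, the sketch below computes a standardized mean difference (Cohen’s d) and its approximate sampling variance from hypothetical group summaries. Note that d itself satisfies the criteria above: it is standardized, signed, and not a function of how large the study was, while its variance (the precision) is.

    from math import sqrt

    def cohens_d(m1, sd1, n1, m2, sd2, n2):
        """Standardized mean difference and its approximate sampling variance."""
        # Pool the two group standard deviations.
        sp = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp
        # Large-sample variance of d (used for inverse-variance weighting).
        v = (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))
        return d, v

    # Hypothetical treatment vs. comparison group summaries.
    d, v = cohens_d(m1=105.0, sd1=14.0, n1=50, m2=100.0, sd2=15.0, n2=50)
    print(f"d = {d:.3f}, variance = {v:.4f}")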


The Replication Continuum

Pure Replications <------------------------> Conceptual Replications

You must be able to argue that the collection of studies you are meta-analyzing examines the same relationship. This may be at a broad level of abstraction, such as the relationship between criminal justice interventions and recidivism or between school-based prevention programs and problem behavior. Alternatively, it may be at a narrow level of abstraction and represent pure replications.

The closer your collection of studies is to pure replications, the easier it is to argue comparability.


Which Studies to Include?

It is critical to have explicit inclusion and exclusion criteria (see pages 20-21).
- The broader the research domain, the more detailed the criteria tend to become
- Refine the criteria as you interact with the literature

Components of detailed criteria:
- Distinguishing features
- Research respondents
- Key variables
- Research methods
- Cultural and linguistic range
- Time frame
- Publication types


Methodological Quality Dilemma

Include or exclude low quality studies?
- The findings of all studies are potentially in error (methodological quality is a continuum, not a dichotomy)
- Being too restrictive may limit the ability to generalize
- Being too inclusive may weaken the confidence that can be placed in the findings
- Methodological quality is often in the “eye of the beholder”
- You must strike a balance that is appropriate to your research question


Searching Far and Wide

The “we only included published studies because they have been peer-reviewed” argument is risky: significant findings are more likely to be published than nonsignificant findings.

It is critical to try to identify and retrieve all studies that meet your eligibility criteria.


Searching Far and Wide (continued)

Potential sources for identification of documents:
- Computerized bibliographic databases
- “Google” internet search engine
- Authors working in the research domain (email a relevant Listserv?)
- Conference programs
- Dissertations
- Review articles
- Hand searching relevant journals
- Government reports, bibliographies, clearinghouses


A Note About Computerized Bibliographies

- Rapidly changing area
- Get to know your local librarian!
- Searching one or two databases is generally inadequate
- Use “wild cards” (e.g., random? will find random, randomization, and randomize; see the sketch below)
- Throw a wide net; filter down with a manual reading of the abstracts
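Wildcard syntax differs across database vendors, so check each interface’s help pages. For records you have already downloaded, a regular expression gives the same truncation behavior; a minimal sketch with hypothetical abstracts:

    import re

    # Truncation search equivalent to the slide's "random?" example:
    # match "random" plus any word characters that follow it.
    pattern = re.compile(r"\brandom\w*", re.IGNORECASE)

    abstracts = [
        "Participants were randomized to treatment or control.",
        "A random sample of schools was drawn.",
        "The study used a matched (nonrandom) comparison group.",  # no match
    ]
    for text in abstracts:
        if pattern.search(text):
            print(pattern.findall(text), "->", text)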


Strengths of Meta-analysis

- Imposes a discipline on the process of summing up research findings
- Represents findings in a more differentiated and sophisticated manner than conventional reviews
- Capable of finding relationships across studies that are obscured in other approaches
- Protects against over-interpreting differences across studies
- Can handle a large number of studies (which would overwhelm traditional approaches to review)


Weaknesses of Meta-analysis

- Requires a good deal of effort
- Mechanical aspects don’t lend themselves to capturing more qualitative distinctions between studies
- “Apples and oranges” criticism
- Most meta-analyses include “blemished” studies to one degree or another (e.g., a randomized design with attrition)
- Selection bias poses a continual threat:
  - Negative and null-finding studies that you were unable to find
  - Outcomes for which there were negative or null findings that were not reported
- Analysis of between-study differences is fundamentally correlational

