Berkeley Policy Associates
Lessons Learned About Random Assignment Evaluation Implementation
in Educational Settings
SREE Conference, March 4, 2010
Raquel Sanchez and Fannie Tseng
Introduction
Hands-on overview of our experiences implementing random assignment evaluations in the classroom
Extension of the list of lessons discussed in the past literature
Brief description of the random assignment evaluations from which our experiences are drawn
Discussion of difficulties encountered when implementing random assignment in classroom settings
Discussion of lessons learned
Past Literature on Random Assignment Implementation
Gueron (2002)
Ritter and Holley (2008)
Raudenbush (2005)
Burghardt and Jackson (2007)
Overview of Our Random Assignment Evaluation Studies
Two school-level random assignment studies of the effectiveness of professional development programs that focus on developing the reading comprehension skills of English language learners (ELLs)
One center-level random assignment study of a professional development program targeting caregivers of children ages 0-3
One student-level random assignment study of a curriculum that combines explicit and implicit instructional approaches to increase the literacy skills of adult ESL students
Challenges to Implementing RCTs in Educational Settings
Threats to integrity of random assignment
– Crossovers and contamination
Recruiting and obtaining buy-in
Dilution of intervention effectiveness
– Lack of teacher buy-in
– Effect of crossovers and contamination
Documenting treatment dosage
Conflicting interventions on the ground
Local conditions and circumstances
Lessons Learned
Perform in-person recruiting visits at all levels of school administration to maximize buy-in
Foster good communication among school staff, program developers, and the research team
Follow-up data collection requires persistence, patience, and adequate funding
– Keep in touch with assessment data administrators, even in the off-months of the study
– If possible, retain local research staff
Use conservative statistical power calculations to factor in potential implementation challenges
Statistical Power Example 1
Table 1: Statistical Power Implications of Weaker than Expected Implementation

                        Pre-Implementation   Post-Implementation
Expected Effect Size    0.164                0.146
Statistical Power       80.0%                71.2%
Number of Schools       50                   50
Students Per School     1,000                1,000

Note: These calculations also assume that the intraclass correlation coefficient is .05, the significance level is .05, and the R² is .25.
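
For readers who want to reproduce the flavor of this calculation, below is a minimal Python sketch of a standard two-level power formula for school-randomized designs (normal approximation, balanced assignment). The slides do not report the exact formula or software used, so the function name, its parameters, and its outputs are our own illustrative assumptions; they track the table's figures only roughly.

from scipy.stats import norm

def cluster_rct_power(delta, n_schools, n_students, icc=0.05,
                      r_squared=0.25, alpha=0.05, p_treated=0.5):
    # Variance of the estimated standardized effect in a balanced
    # two-level design: a between-school plus a within-school component,
    # each deflated by the covariate R-squared.
    denom = p_treated * (1 - p_treated) * n_schools
    var = (icc * (1 - r_squared) / denom
           + (1 - icc) * (1 - r_squared) / (denom * n_students))
    se = var ** 0.5
    z_crit = norm.ppf(1 - alpha / 2)           # two-tailed critical value
    return norm.cdf(abs(delta) / se - z_crit)  # normal approximation

# Table 1 scenario: implementation weaker than expected shrinks the
# realized effect size while the sample stays fixed.
print(cluster_rct_power(delta=0.164, n_schools=50, n_students=1000))
print(cluster_rct_power(delta=0.146, n_schools=50, n_students=1000))

The table's point survives the approximation: holding the sample fixed, a smaller realized effect size pushes power below the 80% planning target.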
Statistical Power Example 2
Table 2: Statistical Power Implications of Schools Dropping Out

                        Pre-Implementation   Post-Implementation
Expected Effect Size    0.164                0.164
Statistical Power       80.0%                75.7%
Number of Schools       50                   45
Students Per School     1,000                1,000

Note: These calculations also assume that the intraclass correlation coefficient is .05, the significance level is .05, and the R² is .25.
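
Using the same hypothetical cluster_rct_power function sketched above, the Table 2 scenario holds the effect size fixed and shrinks the number of randomized schools:

# Table 2 scenario: five schools drop out after randomization.
print(cluster_rct_power(delta=0.164, n_schools=50, n_students=1000))
print(cluster_rct_power(delta=0.164, n_schools=45, n_students=1000))

Losing clusters hurts power more than losing the same share of students would, because with 1,000 students per school the between-school variance term dominates the standard error.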
Lessons Learned (2)
Throughout the study, maintain an identity separate from the program you are evaluating
– Hand out separate evaluation study materials with research organization’s logo
– Gift cards help
Be proactive about developing plans for documenting treatment dosage
– If possible, use more than one source of data
Pair impact studies with qualitative studies
Contact Us
Raquel Sanchez, [email protected]
Fannie Tseng, [email protected]
Berkeley Policy Associates
440 Grand Ave., Suite 500
Oakland, CA 94610-5085
Ph: 510-465-7884
Fax: 510-465-7885
www.berkeleypolicyassociates.com