
Lessons Learned About Random
Assignment Evaluation Implementation
in Educational Settings
SREE Conference
March 4, 2010
Raquel Sanchez and Fannie Tseng
Berkeley Policy Associates
Introduction

Hands-on overview of our experiences implementing
random assignment evaluations in the classroom.

Extending the list of lessons discussed in past literature.

Brief description of the random assignment
evaluations on which our experiences are based.

Discuss difficulties with implementing random
assignment in classroom settings.

Discussion of lessons learned
Past Literature on Random
Assignment Implementation

Gueron (2002)

Ritter and Holley (2008)

Raudenbush (2005)

Burghardt and Jackson (2007)
Overview of Our Random
Assignment Evaluation Studies

Two school-level random assignment studies of the
effectiveness of professional development programs
that focus on developing the reading comprehension
skills of English language learners (ELLs)

One center-level random assignment study of a
professional development program targeting
caregivers of children ages 0-3

One student-level random assignment study of a
curriculum that combines explicit and implicit
instructional approaches to increase the literacy
skills of adult ESL students
Challenges to Implementing
RCTs in Educational Settings

Threats to integrity of random assignment
– Crossovers and contamination

Recruiting and obtaining buy-in

Dilution of intervention effectiveness
– Lack of teacher buy-in
– Effect of crossovers and contamination

Documenting treatment dosage

Conflicting interventions on the ground

Local conditions and circumstances
Lessons Learned



Perform in-person recruiting visits at all levels of
school administration to maximize buy-in

Foster good communication between school staff,
program developers, and the research team

Follow-up data collection requires persistence,
patience, and adequate funding
– Keep in touch with assessment data administrators,
even in off-months of the study
– If possible, retain local research staff

Use conservative statistical power calculations to
factor in potential implementation challenges
Statistical Power Example 1
Table 1: Statistical Power Implications of Weaker than Expected Implementation

                          Pre-Implementation    Post-Implementation
                          Power Calculations    Power Calculations
Number of Schools                 50                     50
Students Per School            1,000                  1,000
Expected Effect Size           0.164                  0.146
Statistical Power              80.0%                  71.2%

Note: These calculations also assume that the correlation coefficient is .05,
the significance level is .05, and the R² is .25.
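These figures come from power calculations for a balanced, school-level (cluster) random assignment design. As a rough, hypothetical illustration only, the sketch below uses a standard normal-approximation formula; the helper name cluster_rct_power is an assumption, the .05 "correlation coefficient" in the note is treated here as the intraclass correlation, and the results will not match the table exactly, since the original calculations may use different software, degrees-of-freedom corrections, or covariate adjustments.

```python
from scipy.stats import norm

def cluster_rct_power(n_schools, students_per_school, effect_size,
                      icc=0.05, r2=0.25, alpha=0.05):
    """Approximate power for a balanced two-arm, school-level RCT.

    Normal-approximation sketch: half the schools are treated, `icc` is
    the intraclass correlation, and `r2` is the share of variance
    explained by covariates (applied at both levels here).
    """
    # Variance of the standardized impact estimate for a balanced design:
    #   Var = (4 / J) * [icc * (1 - R^2) + (1 - icc) * (1 - R^2) / n]
    var = (4.0 / n_schools) * (icc * (1 - r2)
                               + (1 - icc) * (1 - r2) / students_per_school)
    se = var ** 0.5
    z_crit = norm.ppf(1 - alpha / 2)            # two-sided test at alpha
    return norm.cdf(effect_size / se - z_crit)  # power, ignoring the far tail

# Table 1 scenario: weaker-than-expected implementation dilutes the
# expected effect size from 0.164 to roughly 0.146.
print(cluster_rct_power(50, 1000, 0.164))  # planned design
print(cluster_rct_power(50, 1000, 0.146))  # diluted effect size
```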
Statistical Power Example 2
Table 2: Statistical Power Implications of Dropout Schools

                          Pre-Implementation    Post-Implementation
                          Power Calculations    Power Calculations
Number of Schools                 50                     45
Students Per School            1,000                  1,000
Expected Effect Size           0.164                  0.164
Statistical Power              80.0%                  75.7%

Note: These calculations also assume that the correlation coefficient is .05,
the significance level is .05, and the R² is .25.
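Using the same hypothetical cluster_rct_power sketch from the previous slide, the dropout scenario changes only the number of randomized schools; again, this is an illustration rather than the original calculation.

```python
# Table 2 scenario: the expected effect size stays at 0.164,
# but 5 of the 50 randomized schools drop out of the study.
print(cluster_rct_power(50, 1000, 0.164))  # as planned
print(cluster_rct_power(45, 1000, 0.164))  # after school attrition
```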
Lessons Learned (2)

Throughout the course of the study, establish a
separate identity from the program you are evaluating
– Hand out separate evaluation study materials with
research organization’s logo
– Gift cards help

Be proactive about developing plans for documenting
treatment dosage
– If possible, use more than one source of data

Importance of qualitative studies to accompany
impact studies
Contact Us

Raquel Sanchez, Ph.D.
[email protected]

Fannie Tseng, Ph.D.
[email protected]
Berkeley Policy Associates
440 Grand Ave., Suite 500
Oakland, CA 94610-5085
Ph: 510-465-7884
Fax: 510-465-7885
www.berkeleypolicyassociates.com