The Struggle to Pass Algebra I in Urban High Schools

The Struggle to Pass Algebra I in Urban High Schools:
Online vs. Face-to-Face Credit Recovery for At-Risk Students
Association for Public Policy Analysis and Management
Annual Meeting
November 2013
Jessica Heppen1, Kirk Walters1, Elaine Allensworth2, Nicholas Sorensen1, Amber Pareja2, Suzanne Stachel1, Takako Nomi3

1 American Institutes for Research; 2 University of Chicago Consortium on Chicago School Research; 3 St. Louis University

Funding for this study has been provided by the U.S. Department of Education’s Institute of Education Sciences, grant R305A110149.
The consequences of failing core academic courses during the first year of high school
are dire. More students fail courses in ninth grade than in any other grade, and a disproportionate
number of these students subsequently drop out (Herlihy, 2007). As shown in Chicago and
elsewhere, academic performance in core courses during the first year of high school is the
strongest predictor of eventual graduation (Allensworth & Easton, 2005). Spearheaded by
research from Chicago and other large U.S. districts, the use of “early warning” data systems to
identify students at risk of academic failure and then appropriately intervene is now widely
recommended (Dynarski et al., 2009; Heppen & Therriault, 2008; Jerald, 2006) and gaining
momentum around the country. Identification is a critical first step, but it is only the first step.
There is a lack of critical information about the types of interventions that can, in fact, get off-track students back on track for graduation and improve schools’ graduation rates.
Algebra failure is of particular concern in high schools across the country. It is
considered a key gatekeeper for higher-level mathematics course-taking in high school and for
college enrollment (Adelman, 2006; Gamoran & Hannigan, 2000). Yet, pass rates are
consistently low in many places. For example, at least 20% of ninth graders in Michigan fail
Algebra I (Higgins, 2008). Six years after the implementation of an initiative to increase access
to algebra, failure rates for freshmen in Milwaukee were 47% (Ham & Walker, 1999). In Los
Angeles, 44% of ninth graders failed Algebra I (Helfand, 2006). In the Chicago Public Schools
(CPS), only 13% of students who fail both semesters of Algebra I in 9th grade graduate in 4
years, and the largest share of 9th grade algebra failures occurs in the second semester of the
course. Identifying ways that students can get back on track is of utmost policy importance.
Offering credit recovery options is one strategy to deal with high failure rates. The
primary goal of credit recovery programs is to give students an opportunity to retake classes that
they failed in an effort to get them back on track and keep them in school (Watson & Gemin,
Student populations specifically targeted by credit recovery programs include under-credited older students (11th and 12th graders) and students who were initially unsuccessful in a
course (Blackboard K–12, 2009). Credit recovery exists in numerous forms. Students can take
standard classes during the school day; attend after-school, evening, weekend, or summer
courses; or enroll in alternative programs, such as student-teacher correspondence courses
(Watson & Gemin, 2008).
Most recently, as schools across the nation struggle to keep students on track and re-engage students who are off track, online learning has emerged as a promising and increasingly
popular strategy for credit recovery. Results from a recent national survey of K–12 districts
indicate that 75% of U.S. districts have students enrolled in online courses and that the number
of K–12 students engaged in online courses in 2007–08 was over 1 million (Picciano & Seaman,
2009). Credit recovery is one of the most common applications of online courses; more than half
of respondents from another national survey of administrators from 2,500 school districts
reported using online learning in their schools for credit recovery, with just over a fifth (22%)
reporting “wide use” of online learning for this purpose (Greaves & Hayes, 2008).
The increasing use of online courses for credit recovery signals general agreement in the
field that expanding credit recovery options through the offering of online courses may help
more students get back on track toward graduation (e.g., Hammond et al., 2007). However, no
rigorous evidence currently exists about the efficacy of online credit recovery courses. With
support from the Institute of Education Sciences’ National Center for Education Research,
researchers from the American Institutes for Research (AIR) and the University of Chicago Consortium on Chicago School Research (CCSR) partnered with the
Chicago Public Schools to investigate the effects of offering Algebra I as an online summer
credit recovery option for at-risk ninth graders.
At the core of this project is an efficacy study that used a student-level random
assignment design to test the impact of online Algebra I for credit recovery against the standard
face-to-face (f2f) version of the course. The study targeted at-risk students who failed the course
during their freshman year. To examine the short- and long-term outcomes of students who took
online and f2f credit recovery classes in algebra, the study is analyzing outcomes for two cohorts
of ninth grade students – students who took either online or f2f algebra for credit recovery during
the summers of 2011 and 2012 – through the end of the 2014-15 school year.
This project was also designed to address broader questions regarding the general
effectiveness of expanding credit recovery options through the offering of online and f2f summer
classes in Algebra I. The study is designed to ultimately inform education decisionmakers about
the extent to which expanding credit recovery helps students get back on track for graduation,
and whether and how recovering credits via online and/or f2f summer courses leads otherwise
high-risk students to resemble lower-risk students who passed Algebra I during their freshman
year.
This paper provides an overview of the project’s background, design, participants,
methods, and findings to date regarding the efficacy of online relative to f2f algebra credit
recovery. This paper is organized into four sections. The first section describes the rationale for
the study—Why study credit recovery for ninth graders, and why a particular focus on Algebra
I? The second section provides an overview of the online Algebra I course implemented for the
study, the theory of action, and a description of how the intervention was implemented in this
research. The third section describes the study design for the focal comparison of online vs. f2f
credit recovery, including the random assignment strategy, participating schools and students,
key measures and analytic approach. The fourth section provides results of impact analyses to
date, contextualized by brief descriptive analyses of implementation. The paper concludes with a
discussion of the most salient lessons learned so far, and next steps for further analysis.
I. STUDY RATIONALE
The high school dropout problem has been called a national crisis. The graduation rate
across the United States, based on 2007–08 data, is 74.9% (Stillwell, 2010)—a quarter of all high
school students leave the public school system before graduating. The problem is particularly
severe among students of color and students with disabilities (Greene & Winters, 2005; Stillwell,
2010; U.S. Department of Education, 2006). The collective and individual costs of the dropout
problem are staggering. According to multiple estimates, a single cohort of dropouts costs the
nation over $300 billion in lost wages and taxes (Alliance for Excellent Education, 2007; Rouse,
2005) and billions more in costs to public health, crime and justice, and public assistance (Levin,
Belfield, Muennig, & Rouse, 2007). High school dropouts earn, on average, $9,200 less per year
than high school graduates, and their lifetime earnings are $1 million less than those of college
graduates (Bridgeland, DiIulio, & Morison, 2006).
The reasons that students drop out of high school are varied and multifaceted. However,
researchers generally agree that dropout is typically a cumulative process of increased
disengagement with school (Fine, 1991; Orfield, 2004). Indicators that students are at risk for
dropping out are often displayed early in high school, in middle school or even earlier; these
indicators are primarily related to student attendance and performance in school, especially in
core academic subjects such as English language arts and mathematics (Allensworth & Easton,
2005, 2007; Balfanz & Legters, 2004; Neild & Balfanz, 2006; Neild & Farley, 2004; Roderick,
1993; see reviews by Jerald, 2006, and Heppen & Therriault, 2008).
Research is particularly clear that ninth grade is a critical transition year. More students
fail ninth grade than any other grade, and a disproportionate number of students who are held
back in ninth grade subsequently drop out (Herlihy, 2007). As shown in Chicago and elsewhere,
academic performance in core courses during the first year of high school is the strongest
predictor of eventual graduation (Allensworth & Easton, 2005). Many factors drive freshman
year course performance (see Allensworth & Easton, 2007), but the bottom line is clear: failure
during the ninth grade has dire consequences for the successful completion of high school.
Spearheaded by research from Chicago and other large U.S. districts, using “early
warning” data systems to identify students at risk of academic failure and then appropriately
intervene is now widely recommended (Dynarski et al., 2009; Heppen & Therriault, 2008;
Jerald, 2006) and gaining momentum. Identification is a critical first step, but it is only the first
step. Data about the types of interventions that can get off-track students back on track for
graduation are lacking. Credit recovery is a promising mechanism for responding to early
warning indicators, but there is not yet strong evidence about the degree to which credit recovery
itself reduces dropout rates, increases graduation rates, and otherwise improves outcomes for
students who are extremely at-risk for academic failure.
Use of Online Courses for Credit Recovery. Online learning is expanding rapidly in
secondary schools around the country, and credit recovery is one of the fastest-growing areas of
K–12 online education (Greaves & Hayes, 2008; Picciano & Seaman, 2009). For students who
have failed courses, an online course offers another chance to learn the content of the course in a
different format and with a more individualized approach (Archambault et al., 2010). Offered
during the summer and, increasingly, before and after school during the school year, online
courses for credit recovery can be a flexible and cost-effective option for schools and districts
(Picciano & Seaman, 2009; Blackboard K–12, 2009).
Online courses are delivered in varying formats. Some are fully online and completely
self-paced; others are hybrid or blended models that combine online learning with f2f teacher
support for students (Picciano & Seaman, 2009; Tucker, 2007; Watson & Ryan, 2006). Survey
research suggests that for credit recovery, hybrid or blended models that include an f2f teacher or
mentor component are most promising because struggling students who have failed a course
previously tend to need additional support and individual attention to succeed (Picciano &
Seaman, 2009).
The promise of online courses for credit recovery may lie in features that make them
seem new to students or different from the f2f course they failed. For example, online courses
can use technology to engage students in content with animations, simulations, video, and other
interactive content. Students receive immediate feedback on activities and assessments, and the
pacing of course content can be flexible and individualized (Archambault et al., 2010;
Blackboard K–12, 2009; U.S. Department of Education, 2009).
While the technology evolves and the uptake of online courses for credit recovery
continues to grow, rigorous evidence about their effectiveness is lacking. A recent meta-analysis
reviewed 99 studies of online learning and found that on average, online instruction yields
positive effects relative to f2f instruction (U.S. Department of Education, 2009). However, this
finding is based almost completely on postsecondary research. At the K–12 level, the authors
found just five published studies comparing online with f2f instruction that met their strong criteria for rigor, and none of the five examined the use of online learning for credit recovery. This study addresses this gap and thereby provides actionable information to education decision
makers faced with the challenge of improving outcomes and raising graduation rates for
struggling students.
Focus on Algebra I. Algebra I is considered a key gatekeeper for higher-level
mathematics course-taking in high school and for college enrollment (Adelman, 2006; Gamoran
& Hannigan, 2000). Historically, Algebra I differentiated college-bound students from other
students, but in recent years, the standards movement and other reform efforts, combined with
results on international assessments showing U.S. students lagging behind many nations, led to a
call for “Algebra for All” in hopes that requiring algebra for graduation will increase access to
more-advanced courses and future college success (Lee & Ready, 2009). Increased graduation
requirements and elimination of low-level mathematics courses have become common in states
and school districts across the country (Council of Chief State School Officers [CCSSO], 2009).
These policies have immediate effects on the number of students taking algebra as well as
higher-level mathematics classes (Cavalluzzo et al., 2003; Everson & Dunham, 1996; Ham &
Walker, 1999; Kim et al., 2001).
Yet Algebra I is a common hurdle for ninth graders across the country. For example, at
least 20% of ninth graders in Michigan fail Algebra I (Higgins, 2008), and the percentages are
consistently high in urban districts. Six years after the implementation of an initiative to increase
access to algebra, failure rates for freshmen in Milwaukee were 47% (Ham & Walker, 1999). In
Los Angeles, 44% of ninth graders failed Algebra I (Helfand, 2006). Because passing the course
and subsequent courses like Geometry are requirements for graduation, these students are
significantly less likely to graduate and are more likely to drop out (Legters & Kerr, 2001).
CPS Policy Context. In Chicago, district policy implemented in 1997 mandates that all
high school students enroll in a college-preparatory curriculum. The policy raised graduation
requirements and eliminated the previously available remedial courses so that all freshmen take
Algebra I or a higher course in the mathematics sequence (Geometry, Algebra II) in ninth grade.
Ten years later, CCSR examined the effects of requiring students to begin high school with
Algebra I instead of remedial mathematics. Using an interrupted time series design, Allensworth,
Nomi, Montgomery, and Lee (2009) examined changes in the extent to which students received
credit for Algebra I in ninth grade, their grades, test scores, and credits in higher-level
mathematics in later years. The findings showed that following implementation of the new high
school math course-taking policy, more freshmen did enroll in Algebra I, as expected. However,
their grades and pass rates were lower than those of freshmen prior to the policy and they were
no more likely to take advanced mathematics courses. As of 2009, 27% of first-time freshmen
failed one or both semesters of Algebra I and failure rates for the second semester (Algebra IB)
are higher than for the first (calculated by CPS, using data on students who were in ninth grade in 2008–09). Typically, even students who fail Algebra I can enroll in Geometry,
the next mathematics course in the sequence, in 10th grade, but they are still required to pass
Algebra I to graduate. Therefore, to earn a diploma, they must eventually recover the Algebra I
credit in their high school careers.
However, the rates of recovery are low in the district. For many students, the provision to
make up Algebra I credits later in high school is not effective. Students can recover credits in the
summer after ninth grade—most feasible is recovery of a ½ credit in the summer. Summer
school with standard f2f offerings is available in most high schools, and CPS has recently begun
to use online courses for credit recovery in hopes that offering potentially more-engaging courses
will increase enrollments and success rates. However, prior to this study the district estimated
that less than 20% of CPS students who fail algebra in the spring (but not in the fall) actually
make up the credit during the summer. For these reasons, CPS was an ideal policy context to
study the efficacy of online credit recovery.
II. ONLINE COURSE DESCRIPTION
In this section, we describe the theory of action that specifies the mechanisms through
which an online credit recovery course in Algebra I could, theoretically, change outcomes for students who fail traditional algebra courses in their first year of high school. We then describe the selection of the online course used in this study and provide an overview of how it was
implemented in the schools that participated.
Theory of Action. The theory of action behind this study is shown in Exhibit 1. We start
with the supposition that students fail algebra because they are poorly engaged in the class and
put in little effort—the strongest predictors of ninth grade course failure are students’ attendance
and work effort. For example, in CPS, background factors such as prior test scores, gender, race,
SES, age, and mobility together account for 12% of the variation in course failures. Absences
and self-reported study behaviors bring the prediction up to 73% (Allensworth & Easton, 2007).
Low engagement leads students to learn little and to subsequently fail. Because they lack an
understanding of algebra, they struggle in subsequent classes, particularly in mathematics and
science. Failure in these classes, combined with failure in algebra, leads students to have
insufficient credits to graduate. As the likelihood of obtaining sufficient credits diminishes,
students eventually drop out. The relationship between credit attainment and graduation is so
strong that each semester course failure in ninth grade is associated with a 15 percentage point
decline in 4-year graduation rates (Allensworth & Easton, 2007).
Exhibit 1. Theory of Action Behind Summer Online Algebra Credit Recovery

[Path diagram: poor engagement in spring semester algebra leads to ninth grade algebra failure and insufficient algebra skills, which lead to Geometry and Chemistry failure, insufficient credits, and dropout. The online summer course (personal support, engagement) disrupts this path through enhanced algebra learning and recovery of the algebra credit. Dashed lines indicate a negative relationship or disruption in the path.]
Expanded credit recovery through online algebra interrupts this process in two ways.
First, delivery through online courses allows a more individualized, interactive experience.
Furthermore, students receive personal support and monitoring from on-site mentors. These
characteristics—individualization, interactive pedagogy, and personal support—have all been
associated with greater engagement and learning (Archambault et al., 2010; Lee & Smith, 1999;
Newmann et al., 1996; Slavin & Madden, 1989). Students should be more engaged in online
algebra and more likely to persist in the course, thus more likely to learn algebra content and receive course credit. These short-term outcomes should lead to improvements in other short-term achievement outcomes, including scores on the mathematics exam (which includes an algebra portion) taken in the fall of 10th grade. Better algebra skills should also make students more
likely to pass their 10th and 11th grade mathematics and science classes (including Geometry,
Chemistry, and Algebra II). Attainment of the credit in algebra, plus improvement in
performance in subsequent mathematics and science classes, will help them make progress
toward graduation. Thus, we should see improvements in long-term outcomes, including final
mathematics and science GPAs, ACT math scores, dropout rates in the 2nd and 3rd years of high
school, and 4-year graduation rates.
Ultimately, this study was designed to provide important causal information to schools
about whether improving credit recovery has a substantial effect on improving graduation rates
and should be an area in which to invest increasingly limited resources.
Selection of Online Course for the Study. A number of providers offer online courses for
credit recovery, including commercial providers and courses offered as part of district or state
virtual schools programs (Archambault et al., 2010). In the years prior to the launch of this study,
CPS had piloted several different online courses and providers, including those offered by
Aventa Learning (now K12). According to Aventa/K12, their approach to credit recovery
assumes that one reason students fail traditional courses is that the individual learning needs of
at-risk students are poorly addressed in traditional settings. For example, if an at-risk student gets
behind in a traditional course—especially a course like Algebra I, where topics build on each
other sequentially—it can become so difficult to catch up that the student often just gives up. The
Aventa Algebra I course purportedly counters this problem by personalizing instruction for at-risk students. The course includes both an online teacher, who communicates individually with students through email and class message boards, and an in-class mentor, who provides in-person assistance and support. Students who have questions as they proceed through the
activities have access to two sources of help. This may be particularly helpful for students who
prefer private to public feedback—these students have direct access to an online teacher and, as a
result, may be more inclined to engage in the course material.
Aventa also targets its instruction for at-risk Algebra I students by allowing them to
demonstrate mastery of concepts that they had already learned in the course that they failed.
Students can then spend more time on the topics they need to master and receive a potential
boost in self-confidence as they realize they are not starting “from scratch.” The Aventa Algebra
I course has several types of instructional supports for at-risk students, such as lowered reading
level of the content, shorter topics, an audio “read aloud” function, targeted vocabulary
instruction, and formative and summative assessments. The reading support is intended to
increase the likelihood that students will comprehend the material and therefore be able to
progress through the course. Small content “chunks” are designed to increase students’ retention
and expand assessment opportunities. The assessments allow students to get quick feedback on
their learning.
Aventa/K12 operates online courses in every U.S. state and their Algebra I course had
been implemented widely for credit recovery—in an estimated 500 schools around the country in
addition to its recent expansion in CPS. Aventa had conducted several district case studies that
illustrate the feasibility of implementation and participants’ satisfaction with the program. Given
its widespread use, a delivery model that appears particularly appropriate for the purpose of
credit recovery, and promising evidence of effectiveness in CPS and other places, the Aventa
online Algebra I course was selected as the intervention under test in this study of the efficacy of
online credit recovery.
Planned Implementation of Online Algebra I Course. The online course provided by
Aventa/K12 (as well as a traditional f2f version of the course) was offered in each participating
school to first-time freshmen who needed to recover credit for the second semester (Algebra IB)
in summer 2011 and 2012. The online course included Aventa’s complete Algebra IB
curriculum, the web-based course software, an online teacher who was certified in mathematics
and trained to teach online, and an on-site mentor responsible for supporting the students taking
the online course. The Aventa/K12 online course addresses topics typically included in the first
and second semesters of Algebra I, and all of the topics are aligned to state and local standards.
The lessons are designed to be interactive and allow students to regulate and receive
immediate feedback on what they are learning. Because students progress at their own pace,
students working in the same computer lab could focus on different parts of the course and/or
different lessons within each part of the course. All lessons include avatars, flash technology,
animations, and interactive games to promote student engagement with the content. Lessons are
linked to a learning management system, which allows students to take exams, upload
assignments, and monitor progress. Aventa’s online teachers were licensed in mathematics and
were said to receive ongoing professional development in online instruction, including training
on the key characteristics of the online learner, best practices in online instruction, strategies to
stimulate and sustain student engagement, and the use of tools and assessments that are integral
to the course. When delivering the course, online teachers could communicate with students
through the learning management system, online chats, and online “whiteboard” demonstrations.
The online course also has a “live elluminate” platform that facilitates teacher-to-student and
student-to-student dialogue.
All online classes in the study were required to have a site-based mentor to support
students. The in-person mentors were certified teachers, though not necessarily licensed in
mathematics, and were responsible for helping students navigate the curriculum, proctoring
online assessments, troubleshooting technological issues, and communicating with online
teachers about students’ progress. Mentors were also responsible for communicating and
coordinating with online teachers to form a team of support for each student. Prior to summer
school in June 2011 and June 2012, all of the mentors designated by the participating schools
were trained on how to use the online course, including how to monitor student progress and
success and communicate with the online teacher.
Planned Implementation of the F2F Algebra I Courses. The control condition was the
typical f2f Algebra IB course offered in schools participating in the study. The course followed
the standard CPS Algebra I curriculum and was taught by licensed CPS mathematics teachers
chosen by the participating schools. The course used whatever technology is normally used in f2f
instruction (e.g., calculators, computer presentation systems and/or software). Unlike the online
course, in which students progress through the material at their own pace and are grouped with
students who are at different places in the curriculum, the f2f courses generally focus on the
same topics at the same time for all students.3 We anticipated that the content coverage, rigor,
and quality of the f2f courses would vary. This natural variation represents a policy-relevant and
appropriate control condition for the study.
3 Some summer school teachers may, in fact, differentiate instruction to allow certain groups of students to progress through the material more quickly than others. However, given that the majority of credit recovery students are at similarly low ability levels and given conversations with CPS regarding how these courses are typically taught, we did not anticipate that f2f classes would have different groups of students learning at their own pace.
The key characteristics of the two study conditions are summarized in Table 1.
Table 1. Characteristics of the Treatment (Online) and Control (F2F) Conditions

Characteristic                                                                 Online Course   F2F Course
                                                                               (Treatment)     (Control)
Content is one semester of Algebra I per 3-4 week summer session                    X              X
Content is aligned to CPS content standards for Algebra I                           X              X
Students progress through the course at their own pace                              X
Students must demonstrate mastery at different time points to progress
  through (and ultimately complete) the course*                                     X
Students have instant access to assessments and grades                              X
Teacher is certified and licensed to teach mathematics                              X              X
Trained mentor supports students and communicates with mathematics teacher          X

* Aventa’s online course is structured so that students must demonstrate mastery—typically 70% or higher—on periodic assessments in order to move through the curriculum. While teachers in the f2f condition may have similar mastery requirements, they are likely not as highly structured or central as these assessments are to the Aventa course.

III. STUDY DESIGN
The study design utilizes within-school randomization of first-time freshmen who failed
second-semester Algebra I to one of two conditions: online Algebra I (treatment) or standard f2f
Algebra I (control). This powerful and simple design allows us to estimate the impact of online
vs. f2f credit recovery by comparing outcomes for students who took the previously described
Aventa online course with those for students who took a “typical” f2f course that would
otherwise be offered by the participating schools.
The study was implemented in summers 2011 and 2012, with two separate cohorts of
freshmen in need of algebra credit recovery in public high schools in CPS, and each summer
cohort included two 60-hour summer sessions. Fifteen schools participated in the first year
(summer 2011) and 13 schools participated in the second year (summer 2012). These schools
were selected and recruited for participation because they had the largest number of students who
failed the second semester of Algebra I in the 2009-2010 school year, did not otherwise have
existing expanded summer credit recovery programs in place, and were open for summer school
(e.g., not undergoing renovation). The characteristics of the participating schools are described
in the following section on the Study Sample.
The efficacy trial was designed to test a hypothesis that first-time freshmen who failed
the second semester of Algebra I and attempted to recover the credit with an online course would
exhibit more positive outcomes than those who attempted to recover the credit with a standard
f2f course. Per the theory of action described above, we proposed this hypothesis because the
online credit recovery course is designed to engage at-risk students and move them through key
course content in ways that encourage persistence and course completion, relative to retaking the
traditional course in which these students were already unsuccessful. Specifically, we
hypothesized that students who enrolled in online Algebra I for credit recovery, in comparison to
students who retook traditional Algebra I for credit recovery, would have better short-term and
long-term academic achievement, including:
• Higher scores on end-of-course assessments of algebra learning and on high school mathematics achievement tests (10th grade PLAN algebra subtest, 11th grade ACT)

• Better mathematics course performance (higher probability of earning credits in subsequent mathematics and science courses, and earning passing grades in those courses)

• Lower probability of dropping out of high school over the 2nd, 3rd and 4th years of high school, and higher probability of graduating (using a 4-year cohort rate, calculated by following individual students’ administrative records).
The study was designed to test this primary hypothesis by comparing these highly policy-relevant outcomes for students in the online and f2f conditions. In this paper, we provide
preliminary results regarding the impact of online vs. f2f algebra credit recovery on the short-term outcomes listed above—an end-of-course algebra assessment, 10th grade PLAN scores, 10th grade mathematics course performance, and 10th grade enrollment status. The findings to date
also include results of analyses comparing grades earned in the online and f2f credit recovery
courses and the likelihood of recovering credit by condition.
Although this paper is focused on the design and results for the head-to-head comparison
of online vs. f2f algebra credit recovery at the heart of this efficacy trial, it is important to note
that surrounding the experimental efficacy design is a broader set of questions about the general
potential benefits of expanding access to credit recovery—both online and f2f alternatives. In
particular, because the available data include those for all students in CPS, this study was
designed to gauge the extent to which online and f2f credit recovery opportunities can help at-risk students get back on track, relative to students who passed Algebra I in 9th grade, and
relative to students who failed the course but did not attempt credit recovery the summer after
their freshman year. Moreover, the study was designed to estimate the overall impact of
expanding credit recovery options in schools with relatively high Algebra I failure rates. In
general, this study is addressing key questions about what it means to “get back on track” for
different types of students with varying risk for academic success and failure.
Random Assignment Strategy. Within each school, the study team randomly assigned
eligible students to condition on site, on the first day of summer school. Students were eligible
for random assignment if they failed second semester Algebra I, were willing to enroll in
summer credit recovery, and showed up for class on one of the first two days of the term. Prior to
the first day of the summer session in each school, the study team collected lists of students who
had failed Algebra IB, including information for each student about their gender and whether they had passed or failed first semester algebra (Algebra IA). This information was included in order to allow us to block students by gender and Algebra IA pass/fail status for random
assignment. On the first day of each summer session, study team members randomly assigned
the eligible students to take either the online or f2f version of the course.
The study’s random assignment approach was intended to explicitly protect against the
threat to internal validity of “no-shows.” The study has an intent-to-treat (ITT) framework for
analysis, meaning that students who are randomly assigned to condition are always considered
part of the student sample with treatment status according to original assignment. In CPS, like
many other districts around the country, a significant proportion of students who fail courses
during the academic year and can make up credits in the summer do not show up for summer
school. (The district estimated in 2010 that approximately one third of the students who are eligible for summer school actually show up; however, this number cannot be verified because of the record keeping practices in the schools, wherein students are dropped from the rosters if they miss more than two days of summer school.) If random assignment had taken place prior to the start of summer school, using the
schools’ lists of those who had failed Algebra IB, the study would always have to account for a
substantial number of no-shows. Therefore, students were randomly assigned on-site and in
person, on the first two days of summer school.
This on-site random assignment procedure involved setting up “check in” stations in each
of the schools. Students who needed to recover Algebra IB credits were asked to complete an
intake form. Study team members received the intake forms from each student and then located
the student on the school-provided list of eligible students. The student’s name was then entered
onto a list for their own gender and Algebra IA pass/fail status block (that is, girls who failed
Algebra IA; girls who passed Algebra IA; boys who failed Algebra IA; boys who passed Algebra
IA). In some cases, it was unknown whether the student had passed Algebra IA, due to difficulty in obtaining student records at some of the schools. Each row on the list had a pre-set assignment to online (treatment) or f2f (control) based
on a random number generator. Study team members then told each student their assignment and
worked with school staff to direct them to the respective classrooms.
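To make the mechanics of this procedure concrete, the sketch below illustrates one way that pre-set, block-specific assignment lists could be generated and filled as students check in. It is a simplified Python illustration, not the study team's actual tooling; the block labels, function names, and balanced-shuffle approach are our own assumptions.

    import random

    # Blocks used for random assignment: gender x Algebra IA pass/fail status.
    BLOCKS = [
        ("female", "passed_IA"), ("female", "failed_IA"),
        ("male", "passed_IA"), ("male", "failed_IA"),
    ]

    def make_assignment_lists(rows_per_block=40, seed=2011):
        """Pre-generate, for each block, an ordered list of condition labels.

        Each row is a pre-set assignment to 'online' (treatment) or 'f2f'
        (control); arriving students are written into the next open row of
        their block's list, so assignments are fixed before check-in begins.
        Here the rows alternate between conditions and are shuffled, which
        keeps the two groups balanced within each block.
        """
        rng = random.Random(seed)
        lists = {}
        for block in BLOCKS:
            rows = ["online", "f2f"] * (rows_per_block // 2)
            rng.shuffle(rows)
            lists[block] = rows
        return lists

    def check_in(block, assignment_lists, next_open_row):
        """Return the condition for the next student arriving in a block."""
        row = next_open_row[block]
        next_open_row[block] += 1
        return assignment_lists[block][row]

    # Example: two girls who failed Algebra IA check in on the first day.
    lists = make_assignment_lists()
    next_open_row = {block: 0 for block in BLOCKS}
    print(check_in(("female", "failed_IA"), lists, next_open_row))
    print(check_in(("female", "failed_IA"), lists, next_open_row))

Because each row's condition is fixed before any student arrives, the procedure cannot be influenced by who shows up on a given day, which is the property that protects against no-show bias described above.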
Statistical Power. Based on power analyses conducted prior to the implementation of the
study, we sought to randomly assign an average of 40 students per school, in a total of 16-20
schools per year, into online and f2f classes of 20 students each, for a total of 640-800 students
per cohort. Minimum detectable effect sizes (MDESs) for impacts on student achievement and
other outcomes range from 0.14 to 0.24, for analyses conducted separately by cohort. These
MDESs are reasonable given our theory of action (Figure 1) that hypothesizes the online course
is more engaging for at-risk students, who are thus more likely to persist and complete than
students who retake the traditional course in which they were previously unsuccessful. These
MDESs are also reasonable based on previous research that finds the average effect of online
learning relative to f2f is about 0.24 standard deviations (U.S. Department of Education, 2009).
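As a rough cross-check on this range, the MDES for an individually randomized design can be approximated with a standard formula (a multiplier of roughly 2.8 corresponding to 80% power and a two-sided 5% test). The calculation below is our own back-of-the-envelope sketch in Python, not the study team's power analysis; it ignores blocking, school fixed effects, and attrition, so exact figures will differ.

    import math

    def approximate_mdes(n_students, prop_treatment=0.5, r_squared=0.0, multiplier=2.8):
        """Rough minimum detectable effect size for individual random assignment.

        multiplier ~ 2.8 corresponds to 80% power with a two-sided .05 test;
        r_squared is the share of outcome variance explained by baseline covariates.
        Blocking, clustering, and attrition adjustments are ignored in this sketch.
        """
        p = prop_treatment
        return multiplier * math.sqrt((1.0 - r_squared) / (p * (1.0 - p) * n_students))

    # With 640-800 randomized students per cohort and no covariates, the MDES is
    # roughly 0.20-0.22; a strong baseline covariate (R^2 = 0.5) brings it near 0.14-0.16.
    for n in (640, 800):
        print(n, round(approximate_mdes(n), 2), round(approximate_mdes(n, r_squared=0.5), 2))

Under these simplified assumptions the approximation brackets the 0.14 to 0.24 range reported above.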
Sample
Participating Schools. In the first year (summer 2011), 15 schools participated in the
study. These schools were selected and recruited for participation because they had the largest
number of students who failed the second semester of Algebra I in the 2009-2010 school year,
did not otherwise have existing expanded summer credit recovery programs in place, and were
open for summer school (e.g., not undergoing renovation). Because eligibility was primarily
determined by the total number of students who failed algebra, eligible schools tended to be
larger—the average number of students in study high schools was 1,785 students, compared to
729 students in non-study high schools—with higher algebra failure rates, than typical schools.
In other ways, they were similar to other schools in the district. One exception is that the study
schools, on average, served a significantly higher proportion of Hispanic students and students whose primary home language was not English. Table 2 provides descriptive information
about the schools that participated in Year 1 and the district overall.
Table 2. Characteristics of CPS High Schools Participating in Credit Recovery Study in Summer 2011 and District High Schools Overall, as of 2010

                                             2011 Study Schools                All CPS High Schools
Characteristics                              Average Number   Average Percent   Average Percent
Female                                             906             50.7%             49.6%
Race/Ethnicity
  White                                            136              6.0%              4.7%
  African American                                 628             42.4%             59.4%
  Hispanic                                         957             48.0%             32.5%
  Asian                                              4              0.2%              0.2%
  Native American                                    8              0.4%              0.2%
  Other Race                                        17              0.9%              1.2%
Eligible for free or reduced-price lunch          1468             91.5%             91.3%
Home language not English                          975             48.3%             31.2%
Eligible for special education services            269             15.4%             18.9%

Note: Number of study schools is 15. Averages are calculated from all students in grades 9-12 active during the fall semester, 2010. District averages include all schools with students in grades 9-12 (total school N=150).
During the first summer of implementation, we established a total of 36 sections (18
online and 18 f2f) in the 15 participating schools, and randomly assigned about 600 students to
these sections. These students comprise the cohort 1 sample, described in the participating
students section below.
For summer 2012, we wanted to retain as many of these 15 schools as possible and to open up recruitment to a few additional schools. In an effort to keep the
participating schools engaged in the project during the 2011-2012 school year, we reported back
to them lessons learned during Year 1 and provided implementation reports customized for each
school. These reports included information about the characteristics of participating schools and
students, the implementation of the online course, and high-level findings (across conditions
within schools, not separately by condition) about student engagement and credit recovery rates.
Discussions with schools about eligibility to participate in Year 2 focused primarily on
the number of students who were likely to fail Algebra IB in spring 2012, the number of those
who were likely to show up for summer school, and whether the school would be open for
summer school in 2012. Some schools that participated in Year 1 had changing circumstances in
2012; among those, a total of four schools were not able to participate in the second year. For
three of those four schools, the reason for not participating in Year 2 was that they were
scheduled for construction/renovation and would be closed for the summer. (One school that was scheduled for closure during summer 2012 arranged to offer just the Algebra IB credit recovery sections for the study at another school so it could participate again in summer 2012.) The fourth school
had lower overall enrollment numbers and fewer students who failed Algebra IB in spring 2012
than it did the previous year, and so was not able to participate in Year 2. Table 3 provides
descriptive information about the participating schools and the district overall.
Table 3. Characteristics of CPS High Schools Participating in Credit Recovery Study During Summer 2012 and District High Schools Overall, as of 2011

                                             2012 Study Schools                All CPS High Schools
Characteristics                              Average Number   Average Percent   Average Percent
Female                                             907             49.6%             49.4%
Race/Ethnicity
  White                                            179              8.2%              4.7%
  African American                                 497             33.3%             58.4%
  Hispanic                                        1076             54.5%             33.1%
  Asian                                              2              0.1%              0.2%
  Native American                                    8              0.4%              0.2%
  Other Race                                        22              1.4%              1.3%
Eligible for free or reduced-price lunch          1572             91.0%             91.6%
Home language not English                         1074             53.4%             31.4%
Eligible for special education services            287             16.0%             19.2%

Note: Number of study schools is 13. Averages are calculated from all students in grades 9-12 active during the fall semester, 2011. District averages include all schools with students in grades 9-12 (total school N=150).
Across the 13 schools, we established 20 pairs of online and f2f Algebra IB course
sections. Seven high schools offered one pair of sections, five schools offered two pairs, and one school offered three pairs, across the two summer school sessions.
Student Outreach in Participating Schools. One of the key features of this study is the
fact that the grant provides resources for schools to offer additional sections of Algebra I credit
recovery courses, beyond what they would typically be able to offer in business-as-usual summer school. Although CPS high schools, particularly those that participated in the study, have many students who could potentially benefit from summer school, they do not typically have to
turn students away due to overfull courses (at least with freshman courses that are required for
graduation but do not need to be passed in order to move on to the next course in the sequence).
Thus the study plan included an emphasis on working with schools and students to expand
enrollments—that is, to attract more students in need of credit recovery to summer school than
otherwise would likely attend.
To attract students to summer school, the study team, in collaboration with the district
and the participating schools, implemented a systematic set of outreach activities to encourage
the target students to enroll in summer credit recovery. Based on early indicators of risk for
failure of Algebra IB, including 3rd quarter grades and partial 4th quarter performance, we
prepared for recruitment of students beginning in May 2011 for cohort 1, and March 2012 for
cohort 2.
The primary strategy for this outreach was to communicate to students and parents that
the consequences of failing Algebra I in the 9th grade are dire and that their chances of
graduating from high school, much less going to college, are significantly reduced. In some schools,
study team members had the opportunity to communicate directly with first-time freshmen with
failing grades in Algebra IB; in other schools, assistant principals or other school leaders and
staff conducted these communications. School staff and the study team communicated to
students and parents that the school was participating in this project in order to expand options
for credit recovery and create more opportunities for students to recover credit required for
graduation. Outreach also included mailings with flyers about the study and letters from the
school. The purpose of this outreach was to ensure sufficient enrollment and uptake of the online
and f2f classes, as well as to ensure sufficient contrast between schools participating in the study
that have “expanded credit recovery options” and other schools that did not have expanded credit
recovery options. Due to the start date of the project (March 2011), the planning and
implementation of the outreach to students occurred on a tight timeline with Cohort 1. In Year 2,
there was more time to implement the outreach strategies, resulting in a larger number of total
students in Cohort 2 (N=792) than in Cohort 1 (N=591).
Participating Students. In Year 1 (summer 2011) we randomly assigned a total of 304
students to treatment (online course) and 287 to control (f2f course). The number and percent of
students within each condition by block is shown in the top portion of Table 4. In Year 2
(summer 2012), we randomly assigned a total of 395 students to the online course and 397 to the
f2f course. The number and percent of students within each condition by block is shown in the
bottom portion of Table 4.
Table 4: Number and Percentage of Students per Condition by Block

                           Passed Algebra IA     Failed Algebra IA     Algebra IA Status Unknown
Condition   Gender         Number   Percent      Number   Percent      Number   Percent           Total
Cohort 1 – Summer 2011
F2F         Female            45      16%           27       9%           30      10%               102
            Male              70      24%           59      21%           56      20%               185
            Total            115      40%           86      30%           86      30%               287
Online      Female            44      14%           36      12%           35      12%               115
            Male              73      24%           61      20%           55      18%               189
            Total            117      38%           97      32%           90      30%               304
Cohort 2 – Summer 2012
F2F         Female            56      14%           52      13%           41      10%               149
            Male              83      21%           95      24%           70      18%               248
            Total            139      35%          147      37%          111      28%               397
Online      Female            53      13%           55      14%           44      11%               152
            Male              81      21%           93      24%           69      17%               243
            Total            134      34%          148      37%          113      29%               395

Source. Study records.
In Year 1, we had a total of 36 sections in the 15 participating schools, 18 online and 18
f2f. With a total of 591 students, there was an average of 16.4 students per section. In Year 2, we
had a total of 40 sections in the 13 participating schools, 20 online and 20 f2f. With a total of
792 students, there was an average of 19.8 students per section.
We used student-level records from the district and blocking information gathered for
random assignment to examine the characteristics of the students who participated in the study
in 2011 and 2012, overall and by condition. We have also conducted tests for differences in
student characteristics by condition, modeling schools and summer school session as fixed
effects to account for the clustering of students within schools within summer school sessions.
Tables 5 and 6 show the results of these analyses for cohorts 1 and 2, respectively.
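A minimal sketch of how such a baseline-equivalence test could be specified is shown below (Python with statsmodels; the column names, the school-by-session cell construction, and the use of OLS are our assumptions, not the study team's actual estimation code).

    import pandas as pd
    import statsmodels.formula.api as smf

    def test_baseline_difference(df: pd.DataFrame, characteristic: str):
        """Regress a baseline characteristic on an online-assignment indicator,
        with a fixed effect for each school-by-summer-session cell.

        Assumes (hypothetical) columns: `online` (1 = assigned online, 0 = f2f),
        `school_id`, `session`, and the named baseline characteristic.
        The coefficient and p-value on `online` indicate whether the randomly
        assigned groups differ on that characteristic.
        """
        data = df.copy()
        data["cell"] = data["school_id"].astype(str) + "_" + data["session"].astype(str)
        model = smf.ols(f"{characteristic} ~ online + C(cell)", data=data).fit()
        return model.params["online"], model.pvalues["online"]

    # Example usage with a hypothetical cohort 1 data frame:
    # diff, p_value = test_baseline_difference(cohort1, "explore_math_score")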
The results in Tables 5 and 6 show that there were no significant differences by condition
in any of the student characteristics we examined, suggesting that the random assignment
procedure did, as intended, produce two groups of students that did not differ on any measured characteristics. In addition, nearly all of the students participating in cohorts 1 and 2 were first-time freshmen – our target population for the study. Approximately 10% were eligible for special education services and 78% were eligible for free or reduced-price lunch. The cohort 1 sample
was 58% Latino, 36% African American, 6% other race/ethnicities and 37% female. Similarly,
the cohort 2 sample was 58% Latino, 29% African American, 12% other race/ethnicities and
38% female.
Table 5. Baseline Characteristics of Cohort 1 (Summer 2011)

Characteristic                                                      Online           F2F           p-value
Mean spring 2010 Explore math scaled score                        13.45 (2.92)    13.25 (2.96)      0.193
Mean concentrated poverty (2009 ACS) a                             0.13 (0.75)     0.12 (0.74)      0.912
Mean social status (2009 ACS) b                                   -0.57 (0.87)    -0.54 (0.85)      0.743
Mean number of unexcused absences (2010-2011 school year)         32.05 (23.48)   30.49 (23.32)     0.289
Percent first-time freshman                                            88              91           0.194
Percent special education                                              10               7           0.216
Percent African American                                               38              35           0.226
Percent Latino                                                         56              59           0.253
Percent Other Race (non-Latino, non-African American)                   6               6           0.821
Percent Suspended (2010-2011 school year)                              46              46           0.830
Percent Moved Schools (2010-2011 school year)                           5               5           0.801
Percent Female (blocking variable)                                     38              36           0.629
Percent Passed Algebra 1A (blocking variable)                          39              40           0.574
Percent Failed Algebra 1A (blocking variable)                          32              30           0.575
Percent Unknown Pass/Fail in Algebra 1A (blocking variable)            30              30           0.989

Note: Sample includes 15 schools; 591 students (304 Online, 287 F2F). Values represent unadjusted means. Differences in characteristics by condition were tested using a model with schools and summer school session as fixed effects to account for the clustering of students within schools and summer school sessions. Figures in parentheses are standard deviations.
a. Concentration of poverty is a standardized measure of poverty for the census block group in which the student lives. A large positive number indicates a high level of poverty concentration; a large negative number indicates a low level of poverty concentration. This measure is calculated from Census data (the percent of adult males employed and the percent of families with incomes above the poverty line), and is standardized such that a “0” value is the mean value for census block groups in Chicago.
b. Social status is a standardized measure of educational attainment/employment status for the census block group in which the student lives. A large positive number indicates a high social status; a large negative number indicates a low social status. This measure is calculated from Census data (mean level of education of adults and the percentage of employed persons who work as managers or professionals), and is standardized such that a “0” value is the mean value for census block groups in Chicago.
Source: Chicago Public Schools (CPS) Administrative Data
Table 6. Baseline Characteristics of Cohort 2 (Summer 2012)

Characteristic                                                      Online           F2F           p-value
Mean spring 2011 Explore math scaled score                        13.64 (2.83)    13.78 (2.88)      0.354
Mean concentrated poverty (2009 ACS) a                            -0.03 (0.79)     0.01 (0.76)      0.574
Mean social status (2009 ACS) b                                   -0.40 (0.86)    -0.45 (0.87)      0.475
Mean number of unexcused absences (2011-2012 school year)         24.03 (20.85)   25.86 (21.51)     0.246
Percent first-time freshman                                            87              88           0.586
Percent special education                                               9              10           0.521
Percent African American                                               31              28           0.107
Percent Latino                                                         58              59           0.533
Percent Other Race (non-Latino, non-African American)                  12              13           0.511
Percent Suspended (2011-2012 school year)                              34              37           0.391
Percent Moved Schools (2011-2012 school year)                           5               6           0.437
Percent Female (blocking variable)                                     39              38           0.740
Percent Passed Algebra 1A (blocking variable)                          34              35           0.688
Percent Failed Algebra 1A (blocking variable)                          38              37           0.790
Percent Unknown Pass/Fail in Algebra 1A (blocking variable)            29              28           0.868

Note: Sample includes 13 schools; 792 students (395 Online, 397 F2F). Values represent unadjusted means. Differences in characteristics by condition were tested using a model with schools and summer school session as fixed effects to account for the clustering of students within schools and summer school sessions. Figures in parentheses are standard deviations.
a. Concentration of poverty is a standardized measure of poverty for the census block group in which the student lives. A large positive number indicates a high level of poverty concentration; a large negative number indicates a low level of poverty concentration. This measure is calculated from Census data (the percent of adult males employed and the percent of families with incomes above the poverty line), and is standardized such that a “0” value is the mean value for census block groups in Chicago.
b. Social status is a standardized measure of educational attainment/employment status for the census block group in which the student lives. A large positive number indicates a high social status; a large negative number indicates a low social status. This measure is calculated from Census data (mean level of education of adults and the percentage of employed persons who work as managers or professionals), and is standardized such that a “0” value is the mean value for census block groups in Chicago.
Source: Chicago Public Schools (CPS) Administrative Data
Cohorts 1 and 2 constitute the student samples that we will track over the four-year
project. We will collect outcomes for both cohorts through Spring 2014, when on-time (4-year)
graduation would occur for the students in Cohort 1.
Characteristics of the F2F Teachers, Mentors, and Online Teachers. The study schools
were responsible for identifying staff to serve as the f2f algebra teachers and online mentors.
Aventa selected the online teachers for the study from among their pool of algebra credit
recovery instructors. In both implementation years (2011 and 2012), all of the f2f algebra teachers and Aventa online teachers were certified to teach mathematics, compared to about half of the online mentors (47% in 2011 and 53% in 2012), who were not required to be certified in mathematics to serve in this role. (These percentages are based on survey data from 2011 and 2012, with the exception of the 2012 Aventa online teacher data, which were reported by the Aventa staff member who selected the online teachers for the study.) In terms of experience, the f2f teachers averaged 14.8 and 12.2 total years of teaching experience in 2011 and 2012, respectively; the online mentors had 13.5 and 11.0 years of teaching experience in these corresponding summers; the Aventa online teachers averaged 5.3 years of total teaching experience in 2011. (The Aventa online teachers had not yet completed the survey at the time of this report.)
Measures
To capture the extent to which participation in credit recovery courses affects current and
future performance of students in high school, we are collecting achievement data, course-taking
data, dropout/enrollment status, and ultimately (for Cohort 1) graduation status.
Short-Term Outcome Measures. The focus of this paper is on findings to date on short-term measures of academic success related to algebra credit recovery. These measures include
(1) scores on an end-of-course assessment composed of NAEP algebra items administered by the study team,

(2) grades in the credit recovery courses,

(3) whether or not the credit was successfully recovered,

(4) scores on the PLAN assessment (the “pre-ACT”), including the algebra strand, which is taken in October of grade 10 by CPS students, and

(5) mathematics classes taken during the year following the summer credit recovery course, and likelihood of passing those classes.
In this paper, we report results of analyses of all five of these outcomes for Cohort 1, and
the first three for Cohort 2.
In addition, the analyses reported in this paper compare students assigned to online vs. f2f
algebra credit recovery courses on measures of classroom/instructional experience that are key to
the hypothesized theory of action for online credit recovery. These measures include
engagement, classroom personalism, academic demand (teacher expectations, class difficulty),
and self-efficacy in mathematics (beliefs about the usefulness of mathematics, liking of and
confidence in mathematics).
Longer-Term Outcome Measures. Longer-term achievement and course-taking
measures include ACT scores (taken in spring of grade 11), and performance in math and science
courses in grades 10, 11, and 12. We will also examine dropout/enrollment status for all students
in both cohorts and, in spring 2014, graduation status for Cohort 1. Future analyses will focus on
these outcomes as the student samples move through high school and the data become available.
Implementation Measures. The study team also collected a host of implementation data,
including classroom observations of the online and f2f algebra courses, archival data generated
by the online course (e.g., the number and type of online interactions between online teachers
and students, number of chapters completed, etc.), f2f classroom materials (e.g., syllabi, pacing
guides), student and teacher surveys, and daily activity logs for the online mentors. These data
allow us to describe the treatment and control conditions and to address key research questions related to the conditions under which the credit recovery courses (online and/or f2f) are
most effective.
DATA COLLECTION PROCEDURES
Data collected to date include the extant data obtained from CPS central office and
primary data collected on-site during the implementation of the credit recovery courses. As
agreed with CPS, we obtained written documentation of consent for students to participate in the
collection of primary data during summer school. After students were assigned to the online or
f2f classes, a study team member visited each classroom to introduce the study, describe its
goals, and pass out consent forms. Staff hired to collect study data visited the schools on a
regular basis throughout the summer session to pick up signed consent forms and distribute replacements for lost forms. Using this process, we obtained consent forms from 84% of the students in Year 1 and of
those, 90% indicated agreement with the terms of the study. (Thus the overall rate of consent for
the full Cohort 1 sample, across conditions, was 76%.) The rates were similar for Cohort 2, from
whom we obtained 89% of the consent forms and of those, 90% indicated agreement with the
terms of the study. (Thus the overall rate of consent for the full Cohort 2 sample, across
conditions, was 80%.)
Data collected during the summer school sessions included classroom observations,
archived online course data, online classroom mentor logs, student end-of-course assessments,
and student and teacher surveys. District administrative records are collected at regular intervals,
as available, and used to provide baseline information and all outcome data following the end of
the summer school session (e.g., test scores, course grades, school enrollments).
A summary of response rates for each type of new data collected for the study (i.e., not
including administrative records) is shown in Table 7. More detail about each data source
follows.
Table 7. Data Collection Summary, by Cohort (Summers 2011 and 2012)
Entries show the number collected out of the total N, with the response/data collection rate in parentheses.
Consent Returned: Cohort 1, 499 of 591 (84%); Cohort 2, 708 of 792 (89%)
Consent Affirmed: Cohort 1, 446 of 591 (75%); Cohort 2, 634 of 792 (80%)
Classroom Observations: Cohort 1, 36 of 36 (100%); Cohort 2, 39 of 40 (98%)
Student Posttest: Cohort 1, 391 of 591 (66%); Cohort 2, 555 of 792 (70%)
Student Survey: Cohort 1, 391 of 591 (66%); Cohort 2, 555 of 792 (70%)
Teacher Survey: Cohort 1, 17 of 18 (94%); Cohort 2, 17 of 19 (89%)
Mentor Survey: Cohort 1, 16 of 16 (100%); Cohort 2, 17 of 19 (89%)
• Classroom Observations. Each of the online and f2f classrooms (36 total in 2011 and 40 in 2012) was observed once during the summer school session. The exception was one online classroom in 2012, which was not observed because it was combined
with the f2f class after the first week of summer school.
• Archived Online Course Data. We collected archival data from the Aventa course
system to quantify the number, types, and quality of online interactions in the course.
The course system records the amount of time students are logged in, information
about quiz attempts and grades per quiz attempt (percent correct), unit exam grades
(percent correct) and, in Year 2 only, cumulative and final grades. The archival data
were cleaner in Year 2 than they were in Year 1 due to changes in the way that
students were entered into the online course system in summer 2012.
• Classroom Materials. We collected classroom materials from the f2f teachers to
describe the proportion of time spent on specific algebra topics in these classes. The
materials included annotated tables of contents from the algebra textbook, detailed
syllabi, and collections of materials assembled and/or generated by teachers.
• Online Classroom Mentor Logs. On a daily basis, the online class mentors were asked to complete a log indicating the amount of time they had spent that day doing
different activities, such as proctoring quizzes or tests, communicating with the online
teacher, and answering students’ questions about mathematics. On average, mentors
completed 92% of the logs in both summers 2011 and 2012.10
10 The online class mentor at the one school in 2012 that moved the entire online class to a f2f class is not included in the calculation of this percentage.
• Student End-of-Course Assessments. During the last week of summer session, study
staff administered the end-of-course algebra assessment to students in the online and
f2f course sections. In most cases, the test was administered on the second-to-last day
of summer session, in order to have time to administer make-ups to students who
were absent on the scheduled testing day. The 28-item assessment was administered
on paper in both conditions. Students had 50 minutes to complete the assessment, and
then were asked to complete the student survey. Of Cohort 1 students who had
parental consent to participate, 88% completed the posttest (66% of the entire
sample). Of Cohort 2 students who had parental consent to participate, 88%
completed the posttest (70% of the entire sample).
• Student Surveys. Following completion of their end-of-course assessments, students
were asked to complete a survey that asked about their attitudes toward mathematics,
aspirations for future education, engagement, and satisfaction with their summer
Algebra IB course. Of Cohort 1 students who had parental consent to participate,
88% completed the student survey (66% of the entire sample). Of Cohort 2 students
who had parental consent to participate, 88% completed the student survey (70% of
the entire sample).
• Teacher Surveys. While project staff was present in the schools to administer the
student posttest and survey, teachers were also asked to complete a survey with items
about their background characteristics, qualifications, perceptions of student
engagement in their summer Algebra IB class, opinions about teaching or serving as a
mentor as part of the study, and grading criteria. In summer 2011, 97% of the teachers
and mentors completed the survey. In summer 2012, 89% of teachers and
mentors (17 out of 19 in both groups) completed the survey.
Analytic Strategy
To test the impact of taking an online vs. f2f Algebra I credit recovery course, we
modeled schools and summer session (1st or 2nd) as fixed effects to account for the clustering of
students within schools and within summer school sessions. In addition to modeling schools and
summer session as fixed effects, we also included student-level characteristics for residual
covariate adjustment. Specifically, all impact models include the student-level characteristics
highlighted in Tables 5 and 6 above, and all predictors except the treatment indicator are centered around the grand mean. Analyses of continuous outcomes employed fixed-effects linear regression models, while analyses of binary outcomes (e.g., credit recovery) employed fixed-effects logistic regression models.
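To make this specification concrete, the sketch below shows one way such fixed-effects impact models could be set up with pandas and statsmodels. It is an illustration only: the data frame df and all variable names (online, school, session, posttest, recovered_credit, and the covariate list) are hypothetical placeholders, not the study's actual code or variables.

```python
# Illustrative sketch only: fixed-effects impact models of the kind described above.
# Assumes a hypothetical student-level DataFrame `df` with a treatment indicator
# (online = 1/0), school and session identifiers, baseline covariates, and outcomes.
import statsmodels.formula.api as smf

covariates = ["prior_math_score", "age", "female"]  # placeholder student characteristics

# Grand-mean center every predictor except the treatment indicator.
for cov in covariates:
    df[cov] = df[cov] - df[cov].mean()

rhs = "online + " + " + ".join(covariates) + " + C(school) + C(session)"

# Continuous outcome (e.g., end-of-course assessment score): fixed-effects linear regression.
linear_fit = smf.ols(f"posttest ~ {rhs}", data=df).fit()

# Binary outcome (e.g., whether credit was recovered): fixed-effects logistic regression.
logit_fit = smf.logit(f"recovered_credit ~ {rhs}", data=df).fit()

print(linear_fit.params["online"])  # adjusted impact estimate (unstandardized beta)
print(logit_fit.params["online"])   # impact estimate on the log-odds scale
```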
As a preliminary strategy to minimize bias resulting from missing data, we used dummy
variable adjustment (see Puma, Olsen, Bell & Price, 2009) and imputed the mean of each
covariate for cells with missing data. Impact models additionally include dummy indicators for
missing student characteristics data. Although this approach to missing data eliminates the
deletion of cases due to missing covariate data, it employs listwise deletion for outcome data. As
a result, missing outcome data may reduce statistical power and introduce bias. Future
sensitivity analyses will test the extent to which results are sensitive to the missing data
approach. Specifically, we will compare the outcomes presented below with what is observed
using listwise deletion and multiple imputation.
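Purely as an illustration of this adjustment (the covariate names are hypothetical), the preprocessing step could look like the following sketch; note that outcomes are left untouched, so cases missing an outcome are still dropped, as stated above.

```python
# Illustrative sketch of the dummy-variable adjustment for missing covariate data:
# flag missingness with an indicator, then mean-impute the covariate so that no case
# is dropped from the impact model because of a missing baseline characteristic.
import pandas as pd

def add_missing_dummies(df: pd.DataFrame, covariates: list) -> pd.DataFrame:
    out = df.copy()
    for cov in covariates:
        out[cov + "_missing"] = out[cov].isna().astype(int)  # indicator entered as a covariate
        out[cov] = out[cov].fillna(out[cov].mean())          # impute the covariate's mean
    return out
```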
IV.
FINDINGS TO DATE
This section summarizes findings from implementation analyses followed by impact
findings to date. The implementation analyses first examine the implementation of the online
course and students’ progression through it. Next they compare the online and f2f algebra
courses in terms of rigor, grading policies, and instructional experience. The impact findings
include comparisons of key outcomes to date for students in the online and f2f courses,
separately by cohort.
IMPLEMENTATION FINDINGS
Implementation of the credit recovery courses went as planned during summer 2011.
There were two distinct summer sessions—the first beginning in mid-June and the second
beginning in mid-July—and most schools offered study-related algebra credit recovery courses
in either the first session or the second session. The only exceptions were two schools that
offered pairs of courses during both first and second sessions and one school that spread out
summer school across the two sessions. The start and end dates of the sessions were set by the
schools and varied only within a few days, and in most cases the sessions ran 3.5–4 weeks. As
noted above, there were 36 course sections (18 online and 18 f2f) in summer 2011. The
logistical implementation of both the f2f and online courses went as planned without any notable
technology issues with the online course sections.11
In summer 2012, the district set standard dates for two 3-week sessions of summer
school—the first ran from 6/19/12 – 7/9/12 and the second ran from 7/10/12 – 7/27/12. Five of
the 13 participating schools offered algebra credit recovery courses as part of the study during
the first summer session in 2012, and 9 schools offered study-related courses during the second
session. (One school offered courses as part of this study during both first and second sessions.)
As noted earlier, there were a total of 40 course sections in summer 2012 (5 pairs of online and
f2f courses during session 1, and 15 pairs of sections during session 2).
In 2012, the online course provider changed the process for registering students into the
course. In session 1, the revised registration process delayed students' activation in the online course by several hours, resulting in the loss of one instructional day for most students. Thereafter, there were no technology problems during session 1.
In session 2, a somewhat shorter activation delay of a few hours was followed by a period of system instability caused by a system migration that the online course provider conducted the weekend before the start of the second summer session. In some schools, the period of system instability lasted a few days; in other schools, it ran the entire first week and
into the second week of the 3-week session. The problems were more severe in some schools than others; commonly cited problems experienced by the mentors and students included: students' log-in information not working; students being kicked out of the system; students not being able to save quiz responses; students not being able to select answers when taking a test; and online mentors not being able to access assessments or review student grades. According to the online course provider, some of the problems were experienced system-wide (affecting all of their online courses) and some were concentrated within particular schools. By the beginning of the second week of session 2, the online course system problems had subsided for many but not all of the schools. By the middle of the second week, the problems were resolved in all schools.
11 In summer 2011, prior to the first day of summer school, Aventa provided each online mentor a list of temporary student usernames and IDs. This temporary login information allowed students to begin working on the course within the first hour of the first day of summer school; without these temporary usernames and IDs, it would have taken a minimum of 24 hours for students to become activated in the Aventa system and begin using the course. Given the compressed nature of summer school (one day of instruction in summer school ranged from 4 to 6 hours), the study team and Aventa implemented the temporary login approach. Although students were able to log in quickly on the first day of summer school in 2011, some of the in-class mentors found the registration process confusing and cumbersome.
The online class mentors responded in various ways to troubleshoot the technical
challenges during the start of session 2, and some provided algebra instruction to students who
were unable to move through the course content. All schools but one (a total of 14 out of 15
online course sections during session 2) kept students in the online course despite the system
problems.12 These problems were evident in some of the implementation data we collected. For
example, in observations of the online classes, more than 25% of session 2 online class
observations noted students having difficulty navigating the online course (versus less than 10% of
session 1 online class observations). In their daily activity logs, online mentors reported spending
25% of their time dealing with technology problems at the beginning of session 2.
Online Course Progression. In both summers (2011 and 2012), students navigated the
course mostly on their own (as expected). According to the mentor logs, mentors spent about
20% of their time answering students' questions about the course and 80% of their time on administrative tasks, including proctoring quizzes, resolving technical issues, and grading.
12 The study team communicated frequently with all schools during this time and, by the third day of the summer session, formally indicated to participating schools that although they were participating in the study, they were not obligated to keep students in the online course. The one school that chose to take students out of the online course and provide them a f2f course instead remained in the study and participated in data collection activities.
We used archival course data to examine student progression through the online course.
These archived data show that in both summers, students were active in the course for less than
half of the 60-hour summer “seat time” requirement. The average number of hours spent by
students in the online course system is shown in Table 8.
Table 8. Average Number of Hours Students Spent in the Online Course, by Cohort and Session
Session 1: Cohort 1 M = 24.16 (SD = 10.96); Cohort 2 M = 28.80 (SD = 10.23)
Session 2: Cohort 1 M = 26.81 (SD = 10.65); Cohort 2 M = 25.57 (SD = 10.93)
Overall: Cohort 1 M = 25.88 (SD = 10.81); Cohort 2 M = 26.41 (SD = 10.83)
We calculated course progression and completion in two ways, first based on the percent
of quizzes and exams attempted (out of a total of 29), and second based on the percent of quizzes
and exams passed with a score of 60% or higher. Online course progress for Cohorts 1 and 2 is shown in Exhibits 2 and 3, respectively. In both years, we observed that students attempted
between two-thirds and three-quarters of the quizzes and tests that made up the course. On
average, they passed less than half of the course assessments.
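For illustration, the two progression measures described above could be computed from the archival course records roughly as follows; the record layout and column names are assumptions for the sketch, not the Aventa system's actual export format.

```python
# Illustrative sketch: percent of the 29 quizzes/exams attempted, and percent passed at
# 60% correct or higher. Assumes one row per student per assessment, with percent_correct
# missing (NaN) when the assessment was never attempted. Column names are hypothetical.
import pandas as pd

TOTAL_ASSESSMENTS = 29
PASS_CUTOFF = 60.0

def progression(records: pd.DataFrame) -> pd.DataFrame:
    scores = records.groupby("student_id")["percent_correct"]
    pct_attempted = scores.count() / TOTAL_ASSESSMENTS * 100          # non-missing = attempted
    pct_passed = scores.apply(lambda s: (s >= PASS_CUTOFF).sum()) / TOTAL_ASSESSMENTS * 100
    return pd.DataFrame({"pct_attempted": pct_attempted, "pct_passed": pct_passed})
```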
Exhibit 2. Online Course Progress for Cohort 1 Students, by Session
Session 1: 68% of tests taken; 46% of tests passed
Session 2: 75% of tests taken; 47% of tests passed
Overall: 72% of tests taken; 46% of tests passed
Exhibit 3. Online Course Progress for Cohort 2 Students, by Session
Session 1: 68% of tests taken; 49% of tests passed
Session 2: 72% of tests taken; 43% of tests passed
Overall: 71% of tests taken; 45% of tests passed
Focusing specifically on unit exams, we observed that both Cohort 1 and Cohort 2
students took 2.8 out of 5 unit exams on average. Again defining passing as earning a score of
60% correct or higher on the unit exams, Cohort 1 students passed an average of 1.7 of the 5 unit
exams and Cohort 2 students passed an average of 1.8 exams. Exam taking and passing by
session are shown in Exhibits 4 and 5, for Cohorts 1 and 2, respectively.
Exhibit 4. Average Number of Unit Exams Taken and Passed by Cohort 1 Students (of 5 unit exams)
Session 1: 2.4 unit exams taken; 1.7 passed
Session 2: 3.1 unit exams taken; 1.8 passed
Overall: 2.8 unit exams taken; 1.8 passed
Exhibit 5. Average Number of Unit Exams Taken and Passed by Cohort 2 Students (of 5 unit exams)
Session 1: 2.6 unit exams taken; 1.7 passed
Session 2: 2.9 unit exams taken; 1.6 passed
Overall: 2.8 unit exams taken; 1.7 passed
For Cohort 2, course averages and quiz-taking rates appeared similar by session.
However, we noted that about twice as many session 1 students as session 2 students passed the first unit of the course, the point in the course at which the technology problems were most severe in session 2. We also
observed that about twice as many session 2 students took exams in the last two units of the
course compared to session 1.
Comparison of Online and F2F Courses – Rigor, Grading Policies, and Instructional
Experience. For both Cohorts 1 and 2, we coded the math content for the online and f2f Algebra
I credit recovery courses. The sources of information for this coding were the online course
curriculum and the classroom materials (syllabi, textbooks, annotated tables of contents, etc.)
from the f2f teachers. We found that the online course, as expected, focused exclusively on
second semester Algebra I topics, presented sequentially. The f2f courses in both summers
focused on a mix of first- and second-semester algebra topics that were not necessarily presented
sequentially. Specifically, across both summers, the f2f courses focused on second-semester
Algebra I topics about 58% of the time. Furthermore, more than one quarter of the f2f courses
presented topics incoherently and out of sequence, and nearly all of the f2f course materials
submitted were procedural exercises, rather than tasks that addressed algebraic concepts or
problem solving.
In addition to possible differences in the type of content covered in the two conditions,
we were particularly interested in whether and how the grading policies and practices differed for
credit recovery students in online and f2f classes. To inform this question, we collected grading
criteria from f2f teachers and online class mentors both summers. We found that both the online
class mentors and f2f teachers based students’ final grades on academic criteria (tests, quizzes)
and non-academic criteria (attendance, effort). One difference in grading criteria between the two
conditions was that the online mentors had access to students’ online course average, which was
based entirely on quizzes and exam scores from the online course.
As presented in Exhibit 6, for both summer cohorts, the f2f teachers based about half of
students’ grades on tests and quizzes—49% for the 2011 cohort and 57% for the 2012 cohort. In
summer 2011, the online mentors based students’ grades primarily on tests and quizzes (77%).
However, in summer 2012, the proportion of students’ grades in the online course based on tests
and quizzes was only 45%. This difference in grading criteria for the online classes between the
two summers may have been in response to the system instability experienced in summer 2012.
That is, the online mentors appeared to rely less on the data they received from the online course
to determine grades for cohort 2 students, when the online course was unstable.
Exhibit 6. Percentage of Students' Grades Based on Tests/Quizzes in the F2F Classes (As Reported by F2F Teachers) and in the Online Classes (As Reported by Online Mentors)
F2F Teachers: Cohort 1 (Summer 2011), 49%; Cohort 2 (Summer 2012), 57%
Online Mentors: Cohort 1 (Summer 2011), 77%; Cohort 2 (Summer 2012), 45%
We also collected data on the degree to which students appeared to be engaged in
learning algebra in both conditions. In both the online and f2f classes during both summers, data
from in-person observations indicated that students were generally cooperative—i.e., they were
on task and followed directions—but they rarely appeared excited by what they were learning.
About 90% of students in both conditions and cohorts were on task most of the time; about 85%
of students were cooperative and attentive; but less than 10% of students appeared excited in the
observed lessons. These descriptive analyses of the implementation of the online and f2f courses
set the context for the findings from impact analyses conducted thus far.
IMPACT FINDINGS
This section presents impact findings on key outcomes to date for students in the online
and f2f courses. Table 9 shows which outcomes are included in this paper, by cohort, for impact analysis.
Table 9. Outcomes Included in Impact Analyses to Date
Student attitudes (engagement, classroom personalism, academic demand, self-efficacy in math): Cohort 1, Cohort 2
End-of-course Algebra assessment: Cohort 1, Cohort 2
Credit recovery course grades: Cohort 1, Cohort 2
Credit recovery rates: Cohort 1, Cohort 2
Grade 10 PLAN assessment scores: Cohort 1
Grade 10 course-taking (enrollment and credit earned in Geometry): Cohort 1
Impacts on Cohort 1. In this section, we present the results of impact analyses for Cohort
1, including results from the survey that students took at the end of the course, the end-of-course
posttest, grades in the summer course, credit recovery rates, the results on the 10th grade PLAN
assessment (overall composite score and subtest scores in algebra and mathematics), and
enrollment and credits earned in Geometry (or a more advanced math course) in the 2011-12
school year for Cohort 1 students who had been first-time freshmen the prior year. As detailed
below, using an alpha of 0.05, we observed statistically significant impacts on students' perceived difficulty of the course, on summer course grades, and on credit recovery rates, but not on the other outcomes tested for Cohort 1.
(1) Student Survey Outcomes. At the end of the summer course, students were asked to complete a survey that included measures of student engagement (8 items; α = 0.75; 4-point scale, 0 = strongly disagree, 3 = strongly agree)13, classroom personalism (7 items; α = 0.88; 4-point scale, 0 = strongly disagree, 3 = strongly agree)14, the usefulness of mathematics (5 items; α = 0.78; 4-point scale, 0 = strongly disagree, 3 = strongly agree)15, liking of and confidence in mathematics (7 items; α = 0.90; 4-point scale, 0 = strongly disagree, 3 = strongly agree)16, teacher expectations (4 items; α = 0.83; 4-point scale, 0 = strongly disagree, 3 = strongly agree)17, and perceived difficulty of the class (4 items; α = 0.74; 4-point scale, 0 = never, 3 = all the time)18. Online students were asked about teacher expectations and classroom personalism with regard to both their online teacher and their classroom mentor. For each item within these measures, we used the higher of the two scores provided (for the online teacher vs. mentor) and calculated the average score across items for comparison with f2f students, who only completed these measures for their in-class teacher. Across conditions, 390 students completed the survey (223 online, 167 F2F).
We found no significant differences by condition in students' level of engagement, sense of classroom personalism, usefulness of mathematics, liking of and confidence in mathematics, or teacher expectations. However, students in the online course found their class significantly more difficult than students in the f2f classes did. These results are shown in Table 10.
13 Engagement included the following items: "The topics we studied were interesting and challenging."; "I worked hard to do my best in this class."; "Sometimes I got so interested in my work I did not want to stop."; "This class really made me think."; "No student wasted time in this class."; "I usually looked forward to this class."; "I was usually bored with what we studied in this class" (reverse-coded); and "I often counted the minutes until class ended" (reverse-coded).
14 Classroom personalism included the following items: "The teacher really listened to what I have to say"; "The teacher believed I can do well in school"; "The teacher was willing to give extra help on work if I need it."; "The teacher helped me catch up if I was behind."; "The teacher noticed if I have trouble learning something."; "The teacher gave me specific suggestions about how I could improve my work in this class."; and "The teacher explained things in a different way if I did not understand something in class."
15 Usefulness of mathematics included the following items: "I think learning mathematics will help me in my daily life."; "I need mathematics to learn other school subjects."; "I need to do well in mathematics to get to the college or university of my choice."; and "I would like to get a job that involves using mathematics."
16 Liking/confidence in mathematics included the following items: "I usually do well in mathematics."; "I would like to take more math courses in school."; "Mathematics is easier for me than for many of my classmates."; "I enjoy learning mathematics."; "When I don't understand a new math topic right away, I know that I will eventually get it."; "Mathematics is one of my strengths."; and "I learn things quickly in mathematics."
17 Teacher expectations included the following items: "The teacher expected me to do my best all the time."; "The teacher expected us to become better thinkers, not just memorize things."; "The teacher did not let me get away with being lazy."; and "The teacher expected everyone to work hard."
18 Class difficulty included the following items: "I found the work difficult."; "I found the work challenging."; "The teacher asked difficult questions on the test."; and "I had to work hard to do well in this class."
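The scoring rule described above for online students (taking the higher of the online-teacher and mentor ratings on each teacher-related item, then averaging across items) could be implemented along the lines of the following sketch; the item column names are hypothetical.

```python
# Illustrative sketch of scale construction for online students: for each teacher-related
# survey item, use the higher of the rating given about the online teacher and the rating
# given about the in-class mentor, then average across items (items use a 0-3 scale).
import pandas as pd

def online_scale_score(df: pd.DataFrame, teacher_items: list, mentor_items: list) -> pd.Series:
    item_scores = []
    for t_col, m_col in zip(teacher_items, mentor_items):
        item_scores.append(df[[t_col, m_col]].max(axis=1))  # higher of the two ratings per item
    return pd.concat(item_scores, axis=1).mean(axis=1)      # scale score = mean across items
```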
Table 10. Impact of Online vs. F2F Algebra I Credit Recovery on Student Survey Outcomes (Cohort 1)
Engagement: Online M = 1.52 (SD = 0.48); F2F M = 1.54 (SD = 0.55); β = -0.04 (SE = 0.05); t = -0.73; p = 0.464; d = -0.08
Classroom Personalism: Online M = 2.13 (SD = 0.50); F2F M = 2.07 (SD = 0.57); β = 0.04 (SE = 0.05); t = 0.82; p = 0.411; d = 0.11
Usefulness of Mathematics: Online M = 1.92 (SD = 0.60); F2F M = 1.84 (SD = 0.65); β = 0.08 (SE = 0.06); t = 1.21; p = 0.228; d = 0.11
Liking/Confidence in Mathematics: Online M = 1.41 (SD = 0.70); F2F M = 1.46 (SD = 0.67); β = -0.05 (SE = 0.07); t = -1.77; p = 0.441; d = -0.07
Academic Press: Teacher Expectations: Online M = 2.26 (SD = 0.49); F2F M = 2.24 (SD = 0.58); β = 0.01 (SE = 0.05); t = 0.10; p = 0.918; d = 0.02
Academic Press: Class Difficulty: Online M = 1.79 (SD = 0.59); F2F M = 1.49 (SD = 0.55); β = 0.29 (SE = 0.06); t = 4.98; p < 0.001; d = 0.49
Notes: Sample includes 15 schools; 390 students (223 Online, 167 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Study Administered Student Survey
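For reference, the effect size reported in this and the following tables is the adjusted (unstandardized) impact estimate expressed in pooled standard deviation units. Assuming the conventional pooled standard deviation (the exact pooling formula is not stated in the paper), this corresponds to:

\[
d = \frac{\beta}{s_{pooled}}, \qquad
s_{pooled} = \sqrt{\frac{(n_{online} - 1)\, s_{online}^{2} + (n_{f2f} - 1)\, s_{f2f}^{2}}{n_{online} + n_{f2f} - 2}}
\]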
(2) End-of-Course Study Posttest. The study-administered posttest included 28 algebra-related NAEP items from grades 8 and 12. Students' item-level accuracy was IRT-scaled using an average of 276 and a standard deviation of 35 (the average and standard deviation of
NAEP Algebra scores in CPS). Average percent of items answered correctly was 38.6% for
students in the f2f classes and 37.8% for students in the online course. There was not a
significant difference in percent correct or in IRT-scaled scores between students in the online
and f2f courses. Results are shown in Table 11.
Table 11. Impact of Online vs. F2F Algebra I Credit Recovery on Posttest (Cohort 1)
Online M = 274.93 (SD = 36.17); F2F M = 277.44 (SD = 33.42); β = -1.50 (SE = 3.14); t = -0.48; p = 0.632; d = -0.05
Note: Sample includes 15 schools; 391 students (224 Online, 167 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Study Administered Posttest
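The placement of scores on the reporting metric (mean 276, SD 35) could, in principle, be a simple linear rescaling of the IRT ability estimates, as in the hypothetical sketch below; the paper does not specify the IRT model or scaling procedure, so this is an assumption for illustration only.

```python
# Hypothetical sketch: put IRT ability estimates (theta) on a reporting scale with mean 276
# and SD 35 via a linear transformation. The study's actual IRT model and linking procedure
# are not described in the text, so this is illustrative only.
import numpy as np

def to_reporting_scale(theta, target_mean=276.0, target_sd=35.0):
    theta = np.asarray(theta, dtype=float)
    z = (theta - theta.mean()) / theta.std(ddof=1)  # standardize the ability estimates
    return target_mean + target_sd * z
```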
(3) Course Grades. Students’ final grades from their online or f2f summer credit
recovery courses were obtained from CPS administrative records. An ‘F’ was imputed for
students who were missing grade data in the district database, under the assumption that the student had dropped and failed to recover credit in the course.19 Across conditions, 6% of students
received an ‘A,’ 13% received a ‘B,’ 18% received a ‘C,’ 24% earned a ‘D’ and 39% earned an
‘F,’ or failing grade. Table 12 shows the distribution of credit recovery course grades by
condition. It shows that nearly 30% of students in the online course earned a grade of C or
higher, versus over 44% of students in the f2f condition.
Table 12. Summer Credit Recovery Course Grades by Condition (Cohort 1)
A: Online 6 (2.0%); F2F 31 (10.8%)
B: Online 24 (7.9%); F2F 53 (18.5%)
C: Online 67 (20.0%); F2F 43 (15.0%)
D: Online 79 (26.0%); F2F 60 (20.9%)
F: Online 128 (42.1%); F2F 100 (34.8%)
Note: Sample includes 15 schools; 591 students (304 Online, 287 F2F). Values represent unadjusted frequencies and
percentages.
Source: Chicago Public Schools (CPS) Administrative Data
Students’ course grades were recoded numerically (A=4, B=3, C=2, D=1, F=0) and
treated as a continuous measure to conduct an exploratory test for significant difference by
condition. We found that overall, grades for students in the f2f courses were higher than grades
for students who took the online course, as shown in Table 13.20 The means by condition show
that the average grade in the online course was about a D and in the f2f course was between a D and a C.
19 Students who did not receive a summer credit recovery grade in the district records were assumed to have dropped the course early or to have missed too many days to complete the course. Most CPS schools do not allow students to recover credit if they miss more than one day of class.
20 Future analyses will also examine this effect using a multinomial logit model, given that the analytic approach employed in these preliminary analyses forces linearity onto a non-linear, non-continuous outcome.
Table 13. Impact of Online vs. F2F Algebra I Credit Recovery on End-of-Course Grade (Cohort 1)
Online M = 1.01 (SD = 1.07); F2F M = 1.49 (SD = 1.40); β = -0.50 (SE = 0.10); t = -5.10; p < 0.001; d = -0.39
Note: Sample includes 15 schools; 591 students (304 Online, 287 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Chicago Public Schools (CPS) Administrative Data
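The two recodings of summer grades used here and in the credit recovery analysis that follows can be expressed in a few lines; the sketch below is illustrative, with a hypothetical grade column name.

```python
# Illustrative sketch of the grade recodings described in the text: a 0-4 numeric scale
# (A=4 ... F=0), with missing grades treated as F (assumed course drop), and a binary
# credit-recovery indicator equal to 1 for a grade of D or higher.
import pandas as pd

GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def recode_grades(df: pd.DataFrame, grade_col: str = "summer_grade") -> pd.DataFrame:
    out = df.copy()
    out[grade_col] = out[grade_col].fillna("F")               # missing grade counted as F
    out["grade_points"] = out[grade_col].map(GRADE_POINTS)
    out["recovered_credit"] = (out["grade_points"] >= 1).astype(int)
    return out
```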
(4) Credit Recovery Rates. Students’ final grades were also recoded as a binary
indicator for whether they successfully recovered credit in the course (F=0, D or higher = 1).
Across conditions, more than half (61%) of all students who were randomly assigned to
condition on the first days of summer school successfully recovered their Algebra 1B credit. By
condition, we observed that credit was successfully recovered by 58% of all students randomly
assigned to the online course and 65% of all students randomly assigned to the f2f course (see
Table 14). This difference in recovery rates by condition was statistically significant, indicating
that students assigned to a f2f credit recovery course for second-semester algebra were more
likely to successfully recover the credit than those assigned to an online version of the course.
Table 14. Impact of Online vs. F2F Algebra I Credit Recovery on Students' Recovery of Algebra IB Credit (Cohort 1)
Online: 57.89%; F2F: 65.16%; Odds Ratio = 0.67 (SE = 0.13); z = -2.13; p = 0.033
Note: Sample includes 15 schools; 591 students (304 Online, 287 F2F). Values represent unadjusted percentages.
Source: Chicago Public Schools (CPS) Administrative Data
As noted earlier, students who were missing summer grades in the district records were
coded as having not recovered the credit for the purpose of the analysis shown in Table 14. The
students with missing summer grades who had been randomly assigned can safely be assumed to have dropped their summer credit recovery course or to have been dropped by their school for missing more than one day. Overall, this was the case for 32% of Cohort 1 students (34.5% of the students in the online course and 30% in the f2f course). The proportion of students who dropped the course was not significantly different by condition (OR = 1.36, SE = 0.27; z = 1.55, p = 0.122).
We also removed these students to descriptively examine credit recovery rates by
condition for those students in the study who were issued Algebra IB summer grades in district
records. A total of 400 students received grades, and of those, 363 (91%) passed their class: 187 of 201 f2f students (93%) and 176 of 199 online students (88%). These data also show that among students who completed their summer course, 7% of those in a f2f class and 12% of those in an online class earned a failing grade.
(5) Grade 10 PLAN Assessment. Many but not all students in Cohort 1 took the "pre-ACT" PLAN assessment in the fall of 2011, as 10th graders. Their composite and subtest or
“strand” scores were obtained from CPS administrative data. The PLAN composite and subtest
scores range from 1-32 nationally. Across conditions, composite scaled scores were low,
averaging 14.20 overall. However, there were PLAN scores from fall 2011 for only 300 of 591
students in the cohort 1 sample. Thus, it is important to exercise caution in interpreting the
findings detailed below given limited statistical power and increased bias due to missing data.
Neither students’ overall composite scores nor their algebra strand or mathematics PLAN scores
differed significantly by condition; see Table 15.
Table 15. Impact of Online vs. F2F Algebra I Credit Recovery on Grade 10 PLAN Assessment Scores (Cohort 1)
Composite: Online M = 14.19 (SD = 2.34); F2F M = 14.20 (SD = 2.36); β = 0.13 (SE = 0.23); t = 0.55; p = 0.585; d = 0.05
Algebra: Online M = 5.60 (SD = 2.32); F2F M = 5.36 (SD = 2.27); β = 0.33 (SE = 0.22); t = 1.47; p = 0.143; d = 0.14
Mathematics: Online M = 14.30 (SD = 2.98); F2F M = 13.96 (SD = 3.24); β = 0.38 (SE = 0.32); t = 1.21; p = 0.236; d = 0.12
Note: Sample includes 15 schools; 300 students (159 Online, 141 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Chicago Public Schools (CPS) Administrative Data
(6) Grade 10 Math Courses Taken and Credits Earned. Students in CPS typically
take Geometry in 10th grade. As noted earlier in this paper, this is the case even for students who
failed Algebra I in their freshman year, although they will need to recover the failed algebra
credit in order to graduate. There is variation, however, in the math courses taken by students
who failed Algebra I in 9th grade. For an initial analysis of coursetaking patterns of students in
the Cohort 1 study sample, we first examined the percent of students overall and by condition
who attempted Geometry in school year 2011-2012 (the year after the summer school session).
Out of the total sample of 591 Cohort 1 students, there were a total of 517 first-time freshmen. Of these 517 students, we were missing coursetaking data for 53 students (10.3%), reducing our total sample of first-time freshmen with coursetaking data to 464. Out of these 464
students, 413 students (89%) took Geometry or a more advanced course in 2011-2012 (196 of
223 F2F students—88%; 217 of 241 Online students—90%).
Table 16 displays results of an impact analysis testing the difference in the odds of
enrolling in Geometry (or higher) by condition. The findings show that there were no significant
differences; however, it is important to note that this analysis is likely underpowered for two
reasons: (1) we have a reduced sample of only 464 students with coursetaking data, and (2) not
taking Geometry (or higher) was a low-incidence outcome (only 11% of the 464 students did not
take the course).
Table 16. Impact of Online vs. F2F Algebra I Credit Recovery on Students' Likelihood of Taking Geometry or a More Advanced Course in 2011-2012 (Cohort 1)
Online: 90.04%; F2F: 87.89%; Odds Ratio = 1.82 (SE = 0.63); z = 1.74; p = 0.083
Note: Sample includes 15 schools; 464 students (241 Online, 223 F2F). Values represent unadjusted percentages.
Source: Chicago Public Schools (CPS) Administrative Data
We next examined the likelihood of earning credit in Geometry among students in the
study sample, by condition. Of the 413 students who took Geometry, 168 (41%) earned credit in
the course (83 of 196 F2F students—42%; 85 of 217 Online students—39%). Table 17 shows the
results of an impact analysis testing the difference in the odds of earning credit in Geometry,
conditional on taking the course in 2011-2012. These results show that students assigned to the
online credit recovery course were no more or less likely than students assigned to a f2f credit
recovery class to earn credit in Geometry during their second year in high school.
Table 17. Impact of Online vs. F2F Algebra I Credit Recovery on Students' Likelihood of Earning Credit in Geometry in 2011-2012 (Cohort 1)
Online: 39.17%; F2F: 42.35%; Odds Ratio = 0.88 (SE = 0.21); z = -0.55; p = 0.582
Note: Sample includes 15 schools; 413 students (217 Online, 196 F2F). Values represent unadjusted percentages.
Source: Chicago Public Schools (CPS) Administrative Data
We also examined the difference by condition in the likelihood of earning credit in
Geometry for all students with coursetaking data (not conditional on taking Geometry) and found
a similar result: 35.27 percent of students in online algebra credit recovery and 37.22 percent of
students in f2f algebra credit recovery earned credit in Geometry the following year. The
difference between the two groups was not statistically significant (OR = 0.96; SE = 0.21, z =
-0.17, p = 0.864).
Impacts on Cohort 2. For Cohort 2, we present the results of impact analyses from the survey that
students took at the end of the course, the end-of-course posttest, course grades, and credit
recovery rates.
(1) Student Survey Outcomes. The results for the survey measures we examined for
Cohort 2 are similar to those for Cohort 1 (see Table 18). There were no significant differences
in students’ level of engagement or their sense of classroom personalism, usefulness of
mathematics, or teacher expectations by condition. As with Cohort 1, students in the online
course found their class significantly more difficult than students in the f2f classes. Unlike
Cohort 1 students, however, Cohort 2 students in the online class also reported significantly less
liking/confidence in mathematics than their counterparts in the f2f classes.
Table 18. Impact of Online vs. F2F Algebra I Credit Recovery on Student Survey Outcomes (Cohort 2)
Engagement: Online M = 1.53 (SD = 0.43); F2F M = 1.53 (SD = 0.49); β = -0.01 (SE = 0.04); t = -0.35; p = 0.723; d = -0.03
Classroom Personalism: Online M = 2.15 (SD = 0.54); F2F M = 2.10 (SD = 0.54); β = 0.04 (SE = 0.04); t = 0.82; p = 0.414; d = 0.07
Usefulness of Mathematics: Online M = 1.81 (SD = 0.61); F2F M = 1.79 (SD = 0.70); β = -0.00 (SE = 0.06); t = -0.01; p = 0.995; d = 0.00
Liking/Confidence in Mathematics: Online M = 1.41 (SD = 0.70); F2F M = 1.46 (SD = 0.70); β = -0.15 (SE = 0.06); t = -2.54; p = 0.011; d = -0.21
Academic Press: Teacher Expectations: Online M = 2.28 (SD = 0.53); F2F M = 2.28 (SD = 0.57); β = 0.01 (SE = 0.05); t = 0.14; p = 0.893; d = 0.01
Academic Press: Class Difficulty: Online M = 1.80 (SD = 0.60); F2F M = 1.48 (SD = 0.57); β = 0.35 (SE = 0.05); t = 7.02; p < 0.001; d = 0.59
Note: Sample includes 15 schools; 555 students (282 Online, 273 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Study Administered Student Survey.
In light of the technology challenges that occurred during the second session of summer 2012, we disaggregated the means by condition and session. We found that none of the reported impacts on survey measures differed significantly between the first and second summer sessions, including the difference in liking/confidence in mathematics (Session 1: online M = 1.39, f2f M = 1.48; Session 2: online M = 1.42, f2f M = 1.45).
(2) End-of-Course Study Posttest. As described above, the study-administered posttest
was composed of 28 algebra-related NAEP items from grades 8 and 12. As for the first cohort,
students' item-level accuracy was IRT-scaled using an average of 276 and a standard deviation of 35 (again, the average and standard deviation of NAEP Algebra scores in Chicago Public Schools). As with Cohort 1, overall accuracy on the test was low (39% overall). Average
percent of items answered correctly was 40.52% for students in the f2f classes and 37.61% for
students in the online course. In contrast to Cohort 1 where there was no significant difference by
condition, results from Cohort 2 demonstrate that students in the f2f course had significantly
higher posttest scores than their counterparts in the online course (see Table 19).
Table 19. Impact of Online vs. F2F Algebra I Credit Recovery on Posttest (Cohort 2)
Online M = 272.37 (SD = 35.48); F2F M = 279.75 (SD = 34.16); β = -6.30 (SE = 2.67); t = -2.36; p = 0.019; d = -0.18
Note: Sample includes 13 schools; 555 students (282 Online, 273 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Study Administered Posttest
Again, because of the technology challenges with the online course in the second summer session, students' accuracy rates and scaled scores are presented descriptively by summer session and by condition in Tables 20 and 21.
Table 20. Percent Correct on Posttest by Condition and Session (Cohort 2)
Session 1: Online M = 40.49 (SD = 12.59); F2F M = 42.65 (SD = 14.17)
Session 2: Online M = 36.65 (SD = 14.68); F2F M = 39.78 (SD = 13.96)
Note: Sample includes 13 schools; 141 students in summer session 1 (71 Online, 70 F2F) and 414 students in summer session 2
(211 Online, 203 F2F). Values represent unadjusted means and standard deviations.
Source: Study Administered Posttest
Table 21. Posttest Scores by Condition and Session (Cohort 2)
Session 1: Online M = 280.15 (SD = 29.60); F2F M = 284.28 (SD = 35.35)
Session 2: Online M = 269.76 (SD = 36.94); F2F M = 278.18 (SD = 33.69)
Note: Sample includes 13 schools; 141 students in summer session 1 (71 Online, 70 F2F) and 414 students in summer session 2
(211 Online, 203 F2F). Values represent unadjusted means and standard deviations.
Source: Study Administered Posttest
Although these results suggest that students in both conditions performed more poorly on the posttest in Session 2 than in Session 1, we did not find a significant interaction between condition and summer session.
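The interaction test referred to here could be specified by adding a treatment-by-session term to the same kind of fixed-effects model sketched earlier; the snippet below is illustrative, and the data frame and variable names remain hypothetical.

```python
# Illustrative sketch: test whether the impact of the online condition differed by summer
# session by adding an online x session interaction to the fixed-effects posttest model.
# Assumes the hypothetical student-level DataFrame `df` used in the earlier sketch.
import statsmodels.formula.api as smf

interaction_fit = smf.ols(
    "posttest ~ online * C(session) + prior_math_score + C(school)",
    data=df,
).fit()
print(interaction_fit.pvalues.filter(like="online:"))  # p-value(s) for the interaction term(s)
```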
(3) Course Grades. As for Cohort 1, we obtained students’ final grades from their online
or f2f summer credit recovery courses from CPS administrative records. Table 22 shows the
distribution of credit recovery course grades by condition. It shows that about one-third
(34%) of students in the online course earned a grade of C or higher, versus nearly 60% of
students in the f2f condition.
Table 22. Summer Credit Recovery Course Grades by Condition (Cohort 2)
A: Online 19 (4.8%); F2F 56 (14.1%)
B: Online 39 (9.9%); F2F 86 (21.7%)
C: Online 75 (19.0%); F2F 92 (23.2%)
D: Online 152 (38.5%); F2F 90 (22.7%)
F: Online 110 (27.8%); F2F 73 (18.4%)
Note: Sample includes 13 schools; 792 students (395 Online, 397 F2F). Values represent unadjusted frequencies and percentages.
Source: Chicago Public Schools (CPS) Administrative Data
Students’ course grades were recoded numerically (A=4, B=3, C=2, D=1, F=0) and
treated as a continuous measure to conduct an exploratory test for significant difference by
condition. As with Cohort 1, grades for students in the f2f courses were higher than those for students
in the online course, as shown in Table 23. The means by condition show that the average grade
in the online course was just above a D and in the f2f course was just below a C.
Table 23. Impact of Online vs. F2F Algebra I Credit Recovery on End-of-Course Grade (Cohort 2)
Online M = 1.25 (SD = 1.11); F2F M = 1.90 (SD = 1.32); β = -0.70 (SE = 0.08); t = -9.16; p < 0.001; d = -0.57
Note: Sample includes 13 schools; 792 students (395 Online, 397 F2F). Values represent unadjusted means and standard
deviations. Beta coefficients are unstandardized. The effect size d was calculated by dividing the unstandardized coefficient by
the pooled standard deviation for the online and f2f conditions.
Source: Chicago Public Schools (CPS) Administrative Data
(4) Credit Recovery Rates. Across conditions, more than three-quarters (77%) of all
students who were randomly assigned to condition on the first days of summer school
successfully recovered their Algebra 1B credit. By condition, we observed that credit was
successfully recovered by 72% of all students randomly assigned to the online course and 82%
of all students randomly assigned to the f2f course (see Table 24). This difference in recovery
rates by condition was statistically significant, indicating that students assigned to a f2f credit
recovery course for second-semester algebra were more likely to successfully recover the credit
than those assigned to an online version of the course.
Table 24. Impact of Online vs. F2F Algebra I Credit Recovery on Students' Recovery of Algebra IB Credit (Cohort 2)
Online: 72.15%; F2F: 81.61%; Odds Ratio = 0.49 (SE = 0.10); z = -3.64; p < 0.001
Note: Sample includes 13 schools; 792 students (395 Online, 397 F2F). Values represent unadjusted percentages.
Source: Chicago Public Schools (CPS) Administrative Data
The rates of students dropping the credit recovery courses (or otherwise having missing
summer grade data) were clearly lower in 2012 than in 2011. Overall, this was the case for 13%
of Cohort 2 students (versus 32% of Cohort 1 students). The percentage of students who did not
have a summer grade was 14% in the online course and 12% in the f2f course. Among the rest
of the students who did have summer grade data, only 13% in the online course and 6% in the f2f
classes earned a failing grade.
DISCUSSION
This efficacy trial presents an opportunity to examine the effects of online and f2f credit
recovery options for at-risk students as they unfold over time. At this point, the project is about
halfway through a four-year investigation. Two cohorts of students who failed second-semester
algebra have participated in summer credit recovery courses—half of them online and half f2f—
made possible by the research grant. Students in Cohort 1 were freshmen in the 2010-11 school
year, failed Algebra IB in spring 2011, and participated in the credit recovery courses in summer
2011. Students in Cohort 2 were freshmen in the 2011-12 school year, failed second-semester
algebra in spring 2012, and participated in the credit recovery courses as part of this efficacy trial
in summer 2012. For both cohorts, we are now able to assess the effects of online vs. f2f credit
recovery on a full complement of short-term outcomes, and can also begin to examine outcomes
into the second year of high school.
Results thus far appear to be mixed, and where significant differences were observed,
they favored the f2f condition. For both cohorts, credit recovery rates and grades were higher in
the f2f condition than in the online course, and students in the online course rated their class as significantly more difficult than did students in the f2f classes. Overall, we found that the majority
of students in both conditions recovered their Algebra IB credits in both summers: 61% in
summer 2011 and 77% in summer 2012. Recovery rates were higher for students in the f2f
classes than the online course in both years. Most of the students who did not recover credits in either condition dropped the course, or otherwise had missing summer grade data, rather than actually earning a failing grade, and the proportion of students who dropped the course or otherwise had missing grade data did not differ by condition.
Grades earned were higher in the f2f classes than in the online course in both years. In
2011, over 44% of all students assigned to the f2f condition earned a grade of C or higher, versus
30% of students assigned to the online course. In 2012, a grade of C or higher was earned by
59% of f2f students versus 34% of online students. This result was consistent with the finding
that students in both summers found the online course significantly more difficult and
demanding than did students in the f2f classes. This perception among students, supported by
lower pass rates and grades, is consistent with the perception held by district and school staff that
the online course content is too difficult for some students, particularly those who are
substantially behind in mathematics.
The findings that grades and credit recovery rates were lower in the online course, and
that students found the online course more difficult were also consistent with our analysis of the
summer 2011 f2f classroom materials and the f2f teachers’ and online mentors’ grading criteria.
While the online course focused exclusively on second semester algebra topics and online course
grades were determined primarily by quizzes and test scores (especially in 2011), our analysis of
classroom materials revealed that the f2f classes focused on second semester topics only 58% of
the time (the remaining 42% was devoted to first semester algebra topics) and only about 50% of
f2f teachers’ grading systems were based on tests and quizzes (the other 50% focused on
behavior, attendance, in-class work, etc.). Thus, it is possible that earning credit in the online
credit recovery course requires more content mastery than earning credit in a f2f version, and
furthermore, it is plausible that students who took the online course learned more algebra than
their counterparts in the f2f condition.
However, analysis of end-of-course algebra posttest scores detected no significant
differences by condition for Cohort 1. Similarly, analysis of longer-term outcomes for Cohort 1
showed no significant differences in tenth-grade outcomes the following school year.
Specifically, there were no differences in PLAN ("pre-ACT") scores in the fall of 2011 (full composite scores, mathematics scores, or algebra strand scores). Nor were there differences by
condition in students’ likelihood of taking Geometry (the next course in the sequence after
Algebra) or earning credit in Geometry. Most of the students in both conditions took Geometry
the following year (88% f2f, 90% online); but only about 40% of them earned credit in
Geometry (42% f2f; 39% online).
End-of-course algebra posttest scores were significantly higher for Cohort 2 students in the f2f classes than for those in the online course. This is the main exception to the otherwise
consistent results across the two cohorts. The posttest difference by condition for Cohort 2 may
or may not be related to system disruptions experienced in the second 3-week session of summer
school that year. However, with the posttest scores, we did not detect a significant interaction
between treatment status and session (first or second) that would suggest a strong divergence of
the two study groups in session 2, when the online course implementation problems occurred.
Impacts on Grade 10 outcomes for Cohort 2 are still to be determined as the data become
available, as are longer-term outcomes for Cohort 1. Over time, the results for this study will
build and continue to shed light on important questions regarding the impacts of expanding
access to credit recovery early in high school for students who are already falling off-track.
As we wait for the current and future data to become available for analysis, the first two
years of this study offer extensive lessons about the conduct of efficacy trials in urban schools,
and the potential pitfalls of implementing online courses, particularly in extremely brief and
condensed summer school sessions. Overall, the implementation of this study in the field was
successful and provides an example for other researchers planning to implement on-site student-level random assignment. This approach mitigates the risk of having substantial numbers of "no-shows" in the study's ITT sample. The successful implementation of the RCT design allows for
over-time analysis of student outcomes that directly address the field’s need for evidence about
the impact of online courses relative to traditional f2f versions of the same course. However,
after three summer sessions of problem-free implementation of the online Algebra I course in
this study (two sessions in summer 2011 and the first session in summer 2012), we observed
substantial technology problems in the second session of 2012 that caused the loss of valuable
instructional time for students and frustration for them and school staff alike. Thus, in the context of a high-quality efficacy trial, this study offers a note of caution regarding potential pitfalls that could affect any district in the country, though perhaps more drastically for low-income urban schools trying to serve their most at-risk students.
REFERENCES
Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through
college. Washington, DC: U.S. Department of Education. Retrieved September 24, 2009, from
http://www.ed.gov/rschstat/research/pubs/toolboxrevisit/toolbox.pdf.
Alliance for Excellent Education (2007). The high cost of high school dropouts: What the nation pays for inadequate high schools. Washington, DC: Author. Retrieved
September 24, 2009, from http://www.all4ed.org/files/archive/publications/HighCost.pdf.
Allensworth, E., & Easton, J. Q. (2007). What matters for staying on-track and graduating in
Chicago Public High Schools: A close look at course grades, failures and attendance in
the freshman year. Chicago: Consortium on Chicago School Research.
Allensworth, E., & Easton, J. (2005). The on-track indicator as a predictor of high school
graduation. Chicago: Consortium on Chicago School Research.
Allensworth, E. M., Nomi, T., Montgomery, N., & Lee, V. E. (2009). College preparatory
curriculum for all: Academic consequences of requiring Algebra and English I for ninth
graders in Chicago. Educational Evaluation and Policy Analysis.
Archambault, L., Diamond, D., Coffey, M., Foures-Aalbu, D., Richardson, J., Zygouris-Coe, V.,
Brown, R., Cavanaugh, C. (2010). INACOL Research Committee Issues Brief: An
exploration of at-risk learners and online education. Vienna, VA: International
Association for K-12 Online Learning.
Balfanz, R., & Legters, N. (2004). Locating the dropout crisis: Which high schools produce the
nation’s dropouts? Where are they located? Who attends them? Baltimore: Johns
Hopkins University.
Blackboard K-12 (2009). Credit recovery: Exploring answers to a national priority. Retrieved
September 7, 2009, from
http://www.blackboard.com/resources/k12/Bb_K12_WP_CreditRecovery.pdf.
Bridgeland, J. M., DiIulio, J. J., & Morison, K. B. (2006). The silent epidemic: Perspectives of
high school dropouts. Washington, DC: Civic Enterprises & Peter D. Hart Research
Associates.
Cavalluzzo, L., Burke, D., Harris, J., Ackerman, D. & Husted, T. (2003). The road to
improvement: Access, attainment and achievement in a CPMSA district. Alexandria, VA:
CNA Corporation.
Council of Chief State School Officers. (2009). Key state education policies on PK-12
education: 2008. Washington, DC: Authors.
Dynarski, M., Clarke, L., Cobb, B., Finn, J., Rumberger, R., and Smink, J. (2008). Dropout
prevention: A practice guide (NCEE 2008–4025). Washington, DC: National Center for
Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S.
Department of Education.
Everson, H., & Dunham, M. (1996). Equity 2000: Signs of success: Preliminary evidence of
effectiveness. New York: The College Board.
Fine, M. (1991). Framing dropouts: Notes on the politics of an urban public high school.
Albany: State University of New York Press.
Gamoran, A., & Hannigan, E. C. (2000). Algebra for everyone? Benefits of college-preparatory
mathematics for students with diverse abilities in early secondary school. Educational
Evaluation and Policy Analysis, 22, 241–254.
Greaves, T. W., & Hayes, J. (2008). America’s digital schools 2008: Six trends to watch.
California: The Greaves Group.
Greene, J. P., & Winters, M. A. (2005). The effect of residential school choice on public high
school graduation rates. Education working paper No. 9. New York: Manhattan Institute
for Policy Research.
Ham, S., & Walker, E. (1999). Getting to the right algebra: The Equity 2000 initiative in
Milwaukee Public Schools. New York, NY: MDRC. Retrieved September 24, 2009, from
http://www.mdrc.org/publications/98/workpaper.html.
Helfand, D. (2006, January). A formula for failure in L.A. schools. LA Times.
http://www.latimes.com/news/local/la-me-dropout30jan30,0,1678653.story
Heppen, J. B., & Therriault, S. B. (2008). Developing early warning systems to identify potential
high school dropouts. Washington, DC: American Institutes for Research, National High
School Center.
Herlihy, C. (2007). State and district-level supports for successful transition into high school.
Washington, DC: American Institutes for Research, National High School Center.
Higgins, L. (2008, May 27). Algebra I stumping high school freshmen: Class of 2011 confronts
tougher state requirements. Detroit Free Press. Retrieved September 24, 2009, from
http://mymassp.com/files/Algebra%20I%20stumping%20high%20school%20freshmen.p
df.
Jerald, C. (2006). Identifying potential dropouts: Key lessons for building an early warning data
system. Washington, DC: Achieve, Inc.
Kim, J., Crasco, R., Smith, G., Johnson, A., & Karantonis, D. (2001). USI evaluative study: Academic excellence for all urban students. Massachusetts: Systemic Research, Inc.
Lee, V. E., & Ready, D. D. (2009). U.S. high school curriculum: Three phases of contemporary research and reform. In C. Rouse & J. Kemple (Eds.), America's high schools. The Future of Children, 19, 135–156.
Lee, V. E., & Smith, J. B. (1999). Social support and achievement for young adolescents in Chicago: The role of school academic press. American Educational Research Journal, 36(4), 907–945.
Legters, N., & Kerr, K. (2001). Easing the transition to high school: An investigation of reform
practices to promote ninth grade success. Baltimore, MD: Johns Hopkins University,
Center for Social Organization of Schools.
Levin, H., Belfield, C., Muennig, P., & Rouse, C. (2007). The public returns to public educational investments in African American males. Economics of Education Review, 26(6), 699–708.
Neild, R., & Balfanz, R. (2006). Unfulfilled promise: The dimensions and characteristics of
Philadelphia’s dropout crisis, 2000-2005. Baltimore, MD: Johns Hopkins University,
Center for Social Organization of Schools.
Neild, R., & Farley, E. (2004). Whatever happened to the class of 2000? The timing of dropout
in Philadelphia’s schools. In Orfield, G. (Ed.), Dropouts in America: Confronting the
graduation rate crisis (pp. 207–220). Cambridge, MA: Harvard Education Press.
Newmann, F. M., & Associates. (1996). Authentic achievement: Restructuring schools for intellectual quality. San Francisco: Jossey-Bass.
Orfield, G. (Ed.). (2004). Dropouts in America: Confronting the graduation rate crisis. Cambridge, MA: Harvard Education Press.
Picciano, A., & Seaman, J. (2009). K–12 online learning: A 2008 follow-up survey of the U.S.
school district administrators. Needham, MA: The Sloan Consortium. Retrieved April
10, 2009, from http://www.sloan-c.org/publications/survey/pdf/k12_online_learning_2008.pdf.
Roderick, M. (1993). The path to dropping out: Evidence for intervention. Westport, CT: Auburn
House.
Rouse, C. E. (2005). Labor market consequences of an inadequate education. Paper prepared for the symposium on the Social Costs of Inadequate Education, Teachers College, Columbia University, October 2005.
Slavin, R. E., & Madden, N. E. (1989). What works for students at risk: A research synthesis.
Educational Leadership, 46, 4-13.
Stillwell, R. (2010). Public school graduates and dropouts from the Common Core of Data:
School year 2007–08 (NCES 2010-341). Washington, DC: U.S. Department of
Education, Institute of Education Sciences, National Center for Education Statistics.
Retrieved June 9, 2010, from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010341.
Tucker, B. (2007). Laboratories of reform: Virtual high schools and innovation in public
education. Washington, DC: Education Sector.
U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2009).
Evaluation of evidence-based practices in online learning: A meta-analysis and review of
online learning studies. Washington, DC: Author.
U.S. Department of Education, Office of Special Education and Rehabilitative Services, Office
of Special Education Programs. (2006, April). 26th Annual (2004) report to Congress on
the implementation of the Individuals with Disabilities Education Act, Vol. 1.
Washington, DC: Author.
Watson, J., & Gemin, B. (2008, June). Promising practices in online learning: Using online learning for at-risk students and credit recovery. Vienna, VA: North American Council for Online Learning. Retrieved September 14, 2009, from
http://www.inacol.org/research/promisingpractices/NACOL_CreditRecovery_Promising
Practices.pdf.
Watson, J., & Ryan, J. (2006). Keeping pace with K–12 online learning: A review of state-level
policy and practice. Evergreen, CO: Evergreen Consulting Associates. Retrieved
January 5, 2007, from http://evergreenassoc.com/online_education.html.