Are Charters the Best Alternative? A Cost Frontier Analysis of
Alternative Education Campuses in Texas
by
Timothy J. Gronberg
Department of Economics
4228 TAMU
Texas A&M University
College Station, TX 77843-4228
[email protected]
979-845-8849
Dennis W. Jansen
Department of Economics
4228 TAMU
Texas A&M University
College Station, TX 77843-4228
[email protected]
979-845-7375
and
Lori L. Taylora
The Bush School of Government and Public Service
4220 TAMU
Texas A&M University
College Station, TX 77843-4220
[email protected]
979-458-3015
A paper prepared for the 38th annual AEFP conference
March 2013
a Corresponding author
Abstract
Previous research suggests that charter schools are able to produce educational outcomes at
lower cost than traditional public schools but are not systematically more efficient relative to
their frontier than are traditional public schools. However, that analysis focused exclusively on
charter schools and traditional public schools which serve a general student population. In
Texas, as in many other states, some charter schools have been designed specifically to serve
students who are at risk of dropping out of school. Such schools, which are designated as
“alternative education campuses,” may have very different cost and efficiency profiles than
schools designed to serve students in regular education programs. In this paper, we estimate a
translog stochastic cost frontier model using panel data for alternative public school campuses in
Texas over the five-year period 2007-2011, and provide evidence as to the relative cost
efficiency of alternative education charter schools.
JEL Classification: I22, H75
Keywords: charter schools; costs; school choice; efficiency; teacher salaries; alternative
education
Introduction
Programs designed to prevent or discourage students from dropping out of school before
they receive a high school diploma can generate significant benefits to both individuals and to
society. In the current environment of increasing returns to skill, the effect of completing
high school on future wages is magnified. In addition to the direct effect on the individual
students, the potential reduction in the incidence of various correlates of dropping out, such as
incarceration, represents a social return to retention-type programs.
One institutional approach to addressing the dropout problem is the creation of
designated campuses that specialize in the education of students identified as being “at risk of
dropping out of school”. These alternative education campuses offer nontraditional programs
and accelerated instructional services to targeted “at risk” students. Students are identified as
“at risk” based on statutory criteria including poor performance on classroom and/or
standardized tests and difficult personal circumstances such as limited English proficiency or
pregnancy. As a result of the selected student population and of the differentiated program
objectives, the alternative education campuses may have very different cost and efficiency
profiles than schools designed to serve students in regular education programs.
In this paper, we estimate a translog stochastic cost frontier model using panel data for
public school campuses in Texas over the five-year period 2007-2011, and provide evidence as
to cost structure and relative cost efficiency of alternative education schools. In Texas, as in
many other states, charter schools have emerged as a major player in the alternative education
segment of the K-12 education market. We focus on two major cost structure hypotheses. First,
we predict that the reduction in the scope of classroom types and non-classroom activities will
result in a lower cost frontier for alternative education schools relative to the cost frontier for
standard accountability campuses. Second, we hypothesize that the reduction in binding
regulations will yield a lower cost frontier for charter alternative education campus suppliers
than the cost frontier for traditional public alternative education operators. We also provide
evidence as to the relative efficiency of charter school suppliers and traditional public school
suppliers measured with respect to their respective estimated cost frontiers.
Nontraditional Education in Texas
In 1995, the 74th Texas Legislature authorized the State Board of Education (SBOE) to
establish charter schools in the state. Texas charter schools are exempted from many of the
process-oriented laws and rules that apply to other public schools. For example, charter schools
are not required to follow state rules regarding minimum teacher salaries, maximum class sizes
or teacher certification.
There are two basic types of charters in Texas--district or campus charters, and open-enrollment (OE) charters.1 District charter schools are wholly-owned subsidiaries of traditional
public school districts. In contrast, OE charter schools are completely independent local
education agencies. Although legally designated as schools, they function as school districts, and
will therefore be referred to herein as OE charter districts.
By Texas law, colleges, universities, nonprofit corporations, and governmental entities
can establish OE charter districts, but no more than 215 charters can be granted to entities other
than public institutions of higher education.2 Like traditional public school (TPS) districts, OE
charter districts are monitored and accredited under the statewide testing and accountability
1 Under Texas law, there can also be home rule charters and college or university charters. The only functional
difference between a college or university charter and the other OE charters is that the state has no limit on the
number of college and university charters whereas there is a limit on the number of OE charters that can be issued.
We treat the three college or university charter schools as OE charters for the purposes of this analysis. There are no
home rule charters.
2 Texas Education Code (TEC). Chapter 12, Subchapter D. Texas Constitution and Statutes Web site.
http://www.statutes.legis.state.tx.us/Docs/ED/htm/ED.12.htm. Accessed August 29, 2012.
system. They may operate multiple campuses, and they are not allowed to charge tuition. Unlike
traditional school districts, OE charters may operate in more than one metropolitan area, may
serve only a subset of grades, and may place limits on the number of children allowed to enroll.3
According to the Texas Education Agency (TEA), in 2010-11 there were 199 OE charter
districts operating 482 campuses in Texas.4 Those 482 campuses served 133,697 students—or
nearly 3% of the public school students in Texas.
Table 1 provides information about the composition of OE charter school campuses in
Texas. As the table illustrates, a disproportionate number of OE charter campuses are classified
as Alternative Education Campuses (AECs). AECs are campuses that (1) are dedicated to
serving students at risk of dropping out of school; (2) are eligible to receive an alternative
education accountability (AEA) rating; and (3) register annually for evaluation under AEA
procedures (TEA 2011). There are two types of AECs—AECs of Choice, which are day
schools, and residential AECs. More than half of the residential AECs in Texas are OE charter
school campuses, and nearly 10 percent of the OE charter campuses are residential.
There are obvious differences between residential and nonresidential campuses. As the
name implies, residential campuses provide round-the clock services to students. Most
residential campuses provide services to students in a juvenile justice or treatment center context.
This analysis excludes residential campuses because their cost structures and objectives are
clearly different from those of day schools.
3 Only OE charter districts with the appropriate provisions in their charter may operate in multiple metropolitan
areas. There are 23 charter school districts with campuses in more than one metropolitan area.
4 Nineteen charter holding companies operated two or more OE charter districts during the 2010-11 school year
(Taylor and Perez 2012). KIPP-affiliated charter holders operated 5 OE charter districts in Texas during 2010-11.
Uplift Education and America Can! each operated another 5 OE charter districts in Texas during 2010-11, while
Cosmos Foundation, Inc., operated 11 OE charter districts (including the Harmony Science Academy Laredo and
the Harmony Science Academy Waco). If the Cosmos Foundation charter-holding company were considered a
single OE charter district, it would have been the largest in the state, with 33 campuses and a total enrollment of
16,721.
There are also potentially meaningful differences between AECs of Choice and what
Texas terms standard accountability campuses. AECs of choice are not disciplinary alternative
or juvenile justice alternative education campuses (TEA 2011). They also are not stand-alone
General Educational Development (GED) programs. Rather, AECs of Choice are campuses that
offer nontraditional programs and accelerated instructional services to students designated at risk
of dropping out of school, where students are identified as “At Risk” based on statutory criteria
including poor performance on standardized tests, a history of being held back in school, limited
English proficiency, pregnancy, homelessness, placement in an alternative education program, or
residence in a residential placement facility (AEIS Glossary). Each AEC of Choice must have a
minimum percentage of at-risk students enrolled at the AEC in order to remain registered and be
evaluated under AEA procedures. “The at-risk criterion began at 65% in 2006 and increased by
five percentage points annually until it reached 75% in 2008, where it remains” (TEA 2011).
However, if an AEC of Choice met the at-risk criterion in the previous year (or was not operating
in the previous year), it can still be considered an AEC of Choice during the current year
even if the at-risk percentage falls below the eligibility threshold.
Differing Accountability Standards
In Texas, all standard campuses were rated Exemplary, Recognized, Academically
Acceptable, or Academically Unacceptable based on Texas Assessment of Knowledge and Skills
(TAKS) passing rates, English language learner (ELL) progress rates, completion rates, and
annual dropout rates. AECs of Choice are rated either Academically Acceptable or Academically
Unacceptable based on a TAKS progress measure, a modified completion rate, and the annual
dropout rate. Campuses with no students enrolled in tested grades (such as early elementary
campuses) are paired with other campuses in the same district for evaluation purposes. Campuses
with no students enrolled in grades higher than kindergarten, Juvenile Justice Alternative
Education Program (JJAEP) campuses, and Disciplinary Alternative Education Program (DAEP)
campuses, as well as campuses with very small numbers of usable test scores and campuses
where TEA has concerns about data quality, are not rated.
For both standard accountability campuses and AECs of Choice, the TEA accountability
ratings are based not only on average TAKS performance for all students but also on the
performance of the lowest-performing student subgroup. The four subgroups are African
American, Hispanic, non-Hispanic white, and economically disadvantaged students. Any
subgroup with at least 50 students is evaluated separately, as is any subgroup with at least 30
students that also represents at least 10% of campus enrollment.
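The subgroup-size rule just described can be expressed as a small predicate. This is purely an illustration of the rule as stated above; the function and argument names are our own, not TEA's:

```python
def subgroup_evaluated(subgroup_size: int, campus_enrollment: int) -> bool:
    """A subgroup is evaluated separately if it has at least 50 students,
    or at least 30 students that also make up at least 10% of campus
    enrollment (per the accountability rules described above)."""
    if subgroup_size >= 50:
        return True
    return subgroup_size >= 30 and subgroup_size >= 0.10 * campus_enrollment
```

For example, a 35-student subgroup is evaluated on a 300-student campus but not on a 400-student one.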
To receive a rating of Exemplary in 2010-11, 90% of the students as a whole and 90% of
the students in each evaluated subgroup must have passed the TAKS in reading/ELA, writing,
social studies, mathematics, and science. Furthermore, 25% of the students as a whole and 25%
of the economically disadvantaged subgroup must have passed the reading/ELA and
mathematics tests at the commended performance level. The campus must have also satisfied
rating criteria with respect to the ELL progress measure, completion rates, and annual dropout
rates.
To receive a rating of Academically Acceptable, 70% of the students as a whole and in
each evaluated subgroup must have passed the TAKS in reading/ELA, writing, and social
studies, 65% must have passed in mathematics, and 60% must have passed in science. Campuses
that were below the TAKS performance threshold, but were making required improvement (i.e.,
their passing rate was rising fast enough to meet the standard in 2 years) were rated as
Academically Acceptable. There were no necessary performance levels with respect to the ELL
progress measure or the percentage of students passing TAKS at the commended performance
level, but the campus must also have satisfied rating criteria with respect to completion rates and
annual dropout rates.
The highest rating possible for an AEC is Academically Acceptable. To have been
assigned this rating in 2010-11, either 55% of the TAKS tests taken by all students and by each
evaluated subgroup must have met the passing standard (regardless of the subject matter of the
test) or else the campus must have been making required improvement (defined as above). The
campus must have also satisfied rating criteria with respect to the ELL progress measure,
modified completion rates, and annual dropout rates.
The accountability standards for AECs of Choice are clearly different from those that
apply to standard campuses. To be considered academically acceptable in 2011, AECs of Choice
must have achieved a 55% passing rate whereas standard accountability campuses must have
achieved a 70% passing rate. The graduation standard for AECs of Choice was also less
rigorous, not only because AECs of Choice were evaluated using a modified completion rate that
considers GED recipients as completers (the standard accountability completion rate does not)
but also because the threshold completion rate is lower for AECs of Choice (60%) than for
standard accountability campuses (75%). Finally, the dropout rate standard is much less rigorous
for AECs of Choice. To be considered academically acceptable, a standard accountability
campus must have an annual dropout rate for grades 7 and 8 below 1.6 percent whereas an AEC
of Choice must have an annual dropout rate for grades 7 through 12 below 20 percent.
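Abstracting from required improvement, subgroup evaluation, and the commended-performance criteria, the differing 2010-11 "Academically Acceptable" thresholds described above can be summarized as two simple predicates (a sketch only; function and argument names are ours):

```python
def standard_acceptable(pass_rates: dict, completion: float, dropout_7_8: float) -> bool:
    """Simplified 2010-11 standard-campus test: subject-specific TAKS
    passing thresholds, a 75% completion rate, and an annual grade 7-8
    dropout rate below 1.6 percent."""
    ok_taks = (pass_rates["reading"] >= 70 and pass_rates["writing"] >= 70
               and pass_rates["social_studies"] >= 70
               and pass_rates["math"] >= 65 and pass_rates["science"] >= 60)
    return ok_taks and completion >= 75 and dropout_7_8 < 1.6

def aec_acceptable(pass_rate_all_tests: float, modified_completion: float,
                   dropout_7_12: float) -> bool:
    """Simplified AEC-of-Choice test: 55% of all TAKS tests passing
    (regardless of subject), a 60% modified completion rate (GED
    recipients count as completers), and an annual grade 7-12 dropout
    rate below 20 percent."""
    return (pass_rate_all_tests >= 55 and modified_completion >= 60
            and dropout_7_12 < 20)
```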
Differing Demographics
Table 2 presents the demographic characteristics of students who attended non-elementary campuses in Texas in 2010-11. As the table illustrates, at these grade levels, students
who attended non-traditional campuses in Texas were systematically different from those who
did not. Nontraditional campuses (i.e. OE charter campuses and AECs of Choice) served a
student population that was disproportionately nonwhite, low income and limited English
proficient. Students attending nontraditional campuses were less likely to participate in gifted
education programs or career and technology programs, and more likely to participate in
bilingual education programs than were students attending standard accountability TPS
campuses.
There were important differences among nontraditional campuses, however. Standard
accountability OE charter schools served a smaller percentage of special education students, at-risk
students, or career and technology students than did standard TPS campuses or either type of
AEC of Choice. In addition, AECs of Choice operated by OE charter districts served a higher
percentage of low income students than did AECs of Choice operated by TPS districts.
Importantly, nontraditional campuses were also disproportionately located in
metropolitan areas. Only 28 of the 482 OE charter campuses were located outside of a
metropolitan area, and 9 of those were residential AECs. More than half of the OE charter
campuses were located in 3 metropolitan areas—Houston (118 campuses), Dallas (100
campuses), and San Antonio (60 campuses). Similarly, only 89 of the 450 AECs were located
outside of a metropolitan area, and 15 of those were residential AECs.
Differing Financial Conditions
Funding is another important difference between OE charter and traditional public school
districts in Texas. OE charter districts do not have a tax base from which to draw funds and are,
therefore, solely dependent on state and federal transfers, charitable donations, and other non-tax
revenues such as food service sales. TPS districts receive funds from their own local tax base, as
well as all of the sources available to OE charter districts.
Texas’ school finance formula, the Foundation School Program (FSP), heavily influences
funding levels for both OE charter and TPS districts. The FSP guarantees each OE charter and
TPS district a minimum level of revenue per pupil and directs additional revenues to OE charter
and TPS districts for students who participate in particular programs, including special
education, career and technology education, bilingual/ESL education, state compensatory
education, and/or gifted and talented education programs. OE charter and TPS districts that
choose to provide transportation to students receive additional state funds.
TPS districts—and only TPS districts—receive a number of additional funding
adjustments. TPS districts that are small (those with average daily attendance below 1,600) and
midsized (those with average daily attendance below 5,000) receive the small and midsized
funding adjustments. All TPS districts have a cost-of-education index (CEI) value that is used to
adjust funding upward in high cost areas. TPS districts may receive additional funding if they
choose to levy a higher enrichment tax rate or if they are eligible for Additional State Aid for
Tax Relief (ASATR). They may also participate in programs that assist with facilities funding.
OE charter districts have neither a CEI nor an enrichment tax rate and are not eligible for
the district size adjustments or the facilities aid programs. Because they do not collect local
taxes, OE charter districts are also not eligible for ASATR. For the OE charter districts that
began operation after September 1, 2001, state aid in 2010-11 was based on the statewide
average values for the CEI, the small and midsize district adjustments, the ASATR, and the
enrichment tax rate. For the OE charter districts that began operation on or before September 1,
2001, 70% of their state aid in 2010-11 was based on those statewide average values, but 30% of
their state aid was based on either the statewide averages or the CEIs, size adjustments,
ASATRs, and enrichment tax rates of the TPS districts their students would otherwise attend,
whichever was higher.
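Treating the blend as applying to a single adjustment value, the rule for pre- and post-September 2001 OE charters described above can be sketched as follows (a simplified illustration; the function and variable names are ours):

```python
def charter_adjustment(statewide_avg: float, sending_district_value: float,
                       opened_after_sept_2001: bool) -> float:
    """Sketch of the 2010-11 funding-adjustment rule described above:
    OE charters opened after September 1, 2001 receive the statewide
    average value; older charters receive a 70/30 blend, with the 30%
    share based on the statewide average or the sending district's
    value, whichever is higher."""
    if opened_after_sept_2001:
        return statewide_avg
    return 0.7 * statewide_avg + 0.3 * max(statewide_avg, sending_district_value)
```

So an older charter whose students would otherwise attend a high-adjustment district receives a value above the statewide average, while one drawing from a low-adjustment district simply receives the statewide average.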
In 2010-11, OE charter and TPS districts received similar amounts of federal funding per
pupil (see figure 1). OE charter schools received a larger share of revenue from the state and a
smaller share from local sources (e.g., charitable donations and local taxes) than did TPS
districts. Most of the local revenue for TPS districts came from local taxes.
On average, TPS districts received only $17 per pupil in charitable donations in 2010-11.
In contrast, approximately one-half of the local revenue for OE charter districts came from
charitable donations. However, that charitable revenue was not evenly distributed across the OE
charter districts in the state. Most OE charter districts (80%) received less than $100 per pupil in
charitable donations in 2010-11, while a handful of OE charter districts received more than
$2,500 per pupil. For example, KIPP San Antonio reported more than $4,100 per pupil in
charitable donations in 2010-11.
Importantly, there are no systematic differences in FSP funding between AECs of Choice
and standard accountability campuses. Students who are economically disadvantaged and
therefore part of the compensatory education program generate additional revenues for a school
district through the corresponding funding formula weights, but the formula weights are no
greater for an AEC of Choice than for a standard accountability campus.
The Cost Function Model
Schools produce education outcomes using both input factors that are purchased (for
example, teachers and other personnel) and environmental input factors that are not purchased
(for example, student skills acquired in an earlier grade). They must take the quantities of
environmental inputs as given, so we treat them as quasi-fixed factors in our analysis. Thus we
model educational cost as a function of the quantity and quality of educational outcomes, the
prices of variable inputs and the quantities of the environmental factors that directly influence the
education production process.
Our baseline model uses a (modified) translog stochastic frontier specification. This
model nests the popular Cobb-Douglas as a special case, as well as the modified Cobb-Douglas
specification including a limited set of quadratic terms that has been used by Imazeki and
Reschovsky (2004), among others. It also nests the classical (non-frontier) linear regression
specification of the translog (if the one-side error term is restricted to be identically zero).
We model school expenditures per pupil as a function of three output indicators (q), one
measure of input prices (w) and six environmental factors (x). To enhance the flexibility of the
specification, we also include year and metropolitan area fixed effects for Dallas, Houston and
San Antonio (which are not indicated in equation (1)). Formally, our model is:
ln(Ei) = a0 + Σ(j=1..3) aj qj + Σ(j=1..2) bj wj + Σ(j=1..5) cj xj + 0.5 Σ(j=1..3) Σ(k=1..3) djk qj qk
       + Σ(j=1..3) Σ(k=1..2) ejk qj wk + 0.5 Σ(j=1..2) Σ(k=1..2) fjk wj wk + Σ(j=1..5) Σ(k=1..2) gjk xj wk        (1)
       + Σ(j=1..3) Σ(k=1..5) hjk qj xk + 0.5 Σ(j=1..5) Σ(k=1..5) mjk xj xk + k6 x1³ + vi + ui
where vi is a random noise component (representing an exogenous random shock, such as a rainy
testing day), and ui is a one-sided error term that captures inefficiency. 5
We impose the usual
symmetry restrictions (djk = dkj, fjk = fkj, and mjk = mkj) and assume that the one-sided error ui has a
half-normal distribution. Inefficiency increases cost above minimum cost, so ui ≥ 0, and cost
efficiency is defined as exp(-ui) ≤ 1.
5 To accommodate the extraordinary range of this variable and to enhance the flexibility of the model, we also include
the cube of the first environmental factor—school district enrollment.
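The normal/half-normal cost frontier described above can be fit by maximum likelihood. The sketch below is purely illustrative — it is not the paper's estimation code. It simulates a one-regressor cost frontier, maximizes the standard normal/half-normal cost-frontier likelihood, and then computes a JLMS-style (Jondrow et al.) conditional estimate of ui and the implied cost efficiency exp(-ui). All variable names and parameter values are our own assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated data: log cost = frontier + noise v + one-sided inefficiency u.
n = 2000
x = rng.normal(size=n)
v = rng.normal(scale=0.1, size=n)
u = np.abs(rng.normal(scale=0.2, size=n))   # half-normal inefficiency, u >= 0
ln_cost = 1.0 + 0.5 * x + v + u

def neg_loglik(theta):
    """Normal/half-normal COST-frontier log-likelihood. The composed
    error is eps = v + u, so the skew term is Phi(+eps*lam/sigma)
    (a production frontier would use Phi(-eps*lam/sigma))."""
    b0, b1, ln_sv, ln_su = theta
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma = np.hypot(sv, su)
    lam = su / sv
    eps = ln_cost - (b0 + b1 * x)
    ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(eps * lam / sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.0, 0.0, np.log(0.1), np.log(0.1)],
               method="BFGS")
b0, b1, ln_sv, ln_su = res.x
sv, su = np.exp(ln_sv), np.exp(ln_su)

# JLMS point estimate of u_i from the composed residual, and the
# implied cost efficiency exp(-u_i) <= 1.
eps = ln_cost - (b0 + b1 * x)
sigma2 = sv**2 + su**2
mu_star = eps * su**2 / sigma2
s_star = sv * su / np.sqrt(sigma2)
u_hat = mu_star + s_star * norm.pdf(mu_star / s_star) / norm.cdf(mu_star / s_star)
cost_eff = np.exp(-u_hat)
```

With the seeded data, the estimated slope and the half-normal scale recover their true values, and every estimated cost efficiency lies strictly between 0 and 1.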
The stochastic frontier methodology is particularly well-suited to the analysis of
educational cost functions because the literature strongly suggests that public schools are
inefficient. The stochastic frontier approach allows the data to reveal information about the
degree of cost inefficiency while identifying properties of the true cost function. The advantages
and the challenges of applying the stochastic frontier methods to school cost function estimation
are discussed in a recent paper by Gronberg, Jansen, and Taylor (2011a).
The stochastic cost frontier framework can accommodate impacts of exogenous
environmental factors on efficiency via modeling of the one-sided error term, u. In particular,
we can specify that
u = u( x, δ) with u ≥ 0
(2)
where x is a vector of environmental efficiency factors and δ is a parameter vector.
Because school quality is frequently thought of as a choice variable for school district
administrators, the possible endogeneity of school quality indicators is a common concern for
researchers estimating educational cost functions. (For example, see the discussion in
Duncombe and Yinger 2011, 2005, Imazeki and Reschovsky 2004 or Gronberg, Jansen and
Taylor 2011a.). Unfortunately, the econometric literature provides little guidance as to the proper
way to address these concerns in a stochastic frontier setting. Furthermore, the translog
specification means that not only do we need instruments for the two quality indicators, we also
need instruments for all of the quality interaction terms—a total of 19 variables in all. The large
number of potentially endogenous variables compounds the usual problems associated with weak
instruments.6 Because weak instruments are often worse than no instruments at all, we treat all
of the independent variables as exogenous in our estimation.7
The Data
The data for our analysis come from administrative files and public records of the Texas
Education Agency and cover the five-year period from 2006-07 through 2010-11. The unit of
analysis is the non-elementary, metropolitan area campus. The analysis includes all eligible
campuses with complete data during the analysis period. All schools serving exclusively
elementary grades (i.e. grades PK through 6), all schools located outside of a metropolitan area,
all juvenile justice or disciplinary education campuses and all residential AECs have been
excluded from the analysis. Table 1 provides descriptive statistics on the schools included in this
analysis.
The Dependent Variable
The dependent variable used in the analysis is the log of actual current, per-pupil operating
expenditures excluding food and student transportation expenditures. As in Imazeki and
Reschovsky (2004) and Gronberg et al (2004, 2005 and 2011b), we exclude transportation
expenditures on the grounds that they are unlikely to be explained by the same factors that
explain student performance, and therefore that they add unnecessary noise to the analysis. We
exclude food expenditures on similar grounds. Because not all school district expenditures are
allocated to the campus level, and the share of allocated expenditures varies from district to
district, we distribute unallocated school district expenditures to the campuses on a per pupil
6 The references on weak instruments include classic early papers such as Nelson and Startz (1990) and more recent
papers on constructing confidence intervals with weak instruments such as Staiger and Stock (1997) and Zivot,
Startz and Nelson (1998). Murray (2006) has a fairly recent survey paper on instruments.
7 We note that this approach was also taken in Gronberg, Jansen and Taylor (2011b).
basis. Per-pupil operating expenditures below $3,000 or above $30,000 were deemed
implausible and treated as missing.
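The construction of the dependent variable — spreading unallocated district expenditures over pupils, trimming implausible per-pupil values, and taking logs — might be sketched as follows. The data frame and column names here are hypothetical, not the actual TEA file layout:

```python
import numpy as np
import pandas as pd

# Hypothetical campus- and district-level inputs; column names are ours.
campuses = pd.DataFrame({
    "district": ["A", "A", "B"],
    "campus_expend": [4_000_000.0, 2_000_000.0, 5_000_000.0],
    "enrollment": [800, 400, 1000],
})
district_unallocated = pd.Series({"A": 600_000.0, "B": 250_000.0})

# Spread each district's unallocated spending over its pupils.
district_pupils = campuses.groupby("district")["enrollment"].transform("sum")
per_pupil_unalloc = campuses["district"].map(district_unallocated) / district_pupils

per_pupil = campuses["campus_expend"] / campuses["enrollment"] + per_pupil_unalloc

# Treat implausible per-pupil values as missing, then take logs.
per_pupil = per_pupil.where(per_pupil.between(3_000, 30_000))
campuses["ln_expend_pp"] = np.log(per_pupil)
```

In this toy example both campuses in district A receive an extra $500 per pupil of unallocated spending, and the campus in district B receives $250.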
By construction, the dependent variable includes all operating expenditures in the
designated categories, regardless of the sources of revenue. Charitable donations, federal grants,
local tax revenues and state funding formula aid are all included. It also includes all types of
operating expenditures. Thus, the dependent variable includes not only direct salary
expenditures, but also contributions to the statewide teacher pension system, payments for group
health and life insurance, and other outlays for employee benefits. It includes payments for
contracted workers as well as employees. It also includes payments for rents, utilities and
supplies. It does not include capital outlays or debt service. Notably, it does not include any
imputed rent for traditional public school buildings.
Outputs
As noted above, our independent variables include both a quantity dimension of output — the
number of students served — and two quality dimensions. We measure quantity as the number
of students in fall enrollment at the campus. The enrollment variable ranges from 26 to 5,094,
with a mean of 922 and a median of 753.
We measure one dimension of quality using a transformed version of the annual dropout
rate. Our quality measure is 100 minus the annual dropout rate for students in grades 7 through
12, where the annual dropout rate is the number of students in grades 7 through 12 who dropped
out during the school year, divided by the number of students who enrolled.8 It ranges from a
8 According to the Texas Education Agency, a dropout is a student who is enrolled in public school in Grades 7-12,
does not return to public school the following fall, is not expelled, and does not: graduate, receive a GED, continue
school outside the public school system, begin college, or die. A student who completes the school year but does
not return in the fall is: (a) considered a dropout from the grade, district, and campus in which he or she was enrolled
at the end of the school year just completed; and (b) included in the dropout count for the school year just completed
(TEA 2011).
low of 52.2 percent to a high of 100 percent. We transform the dropout rate so that increases in
the indicator can be interpreted as increases in campus quality, but this transformation has no
qualitative effect on the analysis.
We measure a second dimension of quality using a normalized gain score indicator of
student performance on the Texas Assessment of Knowledge and Skills (TAKS).9 Every year,
Texas administers tests of student achievement in mathematics and reading/language arts in
grades 3-11.10 We have data on the TAKS scores of individual students in reading and math
from 2005 through 2011 and use those data to generate our measure of school quality. Although
we recognize that schools produce unmeasured outcomes that may be uncorrelated with math
and reading test scores, and that standardized tests may not measure the acquisition of important
higher-order skills such as problem solving, these are the performance measures for which
districts are held accountable by the state, and the most common measures of school district
output in the literature (e.g. Gronberg, Jansen and Taylor 2011a, and 2011b or Imazeki and
Reschovsky 2004). Therefore, we rely on them here.
As an approach to dealing with mean reversion, we normalize the annual gain scores in
each subject a la Reback (2008). In this normalization, we use test scores for student (i), grade
(g), and time or year (t), denoted as Sigt. We measure each student’s performance relative to
others with same past score in the subject as:
Yigt = [Sigt − E(Sigt | Si,g−1,t−1)] / [E(Sigt² | Si,g−1,t−1) − E(Sigt | Si,g−1,t−1)²]^0.5        (5)
9 Our use of standardized test scores as an indicator for the quality of district output – the quality of the educational
experience districts offer to students – is mandated by data availability. While standardized tests provide a measure
of student achievement, they typically measure minimal expected learning, and emphatically do not measure deeper
learning or student performance at the highest end of the achievement scale.
10 Tests in other subjects such as science and history are also administered, but not in every grade level.
In calculating Yigt for math, we calculate the average test score in math at time t, grade g, for
students scoring Si,g-1,t-1 in math at time t-1, grade g-1. Thus, for example, we consider all tenth-grade students with a given ninth-grade score in math, and calculate the expected score on the
tenth-grade test as the average math score at time t for all tenth-grade students with that common
lagged score. Our variable Yigt measures individual deviations from the expected score, adjusted
for the variance. This is a type of z-score. Transforming individual TAKS scores into z-scores
allows us to aggregate across different grade levels despite the differences in the content of the
various tests. Thus, the second quality measure in our analysis is the average normalized gain
score in reading and mathematics for each campus.
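A minimal sketch of the normalization in equation (5) and its aggregation to a campus-level quality measure is below. The data are toy values and the column names are our own:

```python
import numpy as np
import pandas as pd

# Hypothetical student-level scores with one lagged score per student.
df = pd.DataFrame({
    "campus": ["c1", "c2", "c1", "c2", "c2", "c1"],
    "score": [60.0, 70.0, 80.0, 90.0, 55.0, 85.0],
    "lagged_score": [50, 50, 70, 70, 50, 70],
})

# Condition on the prior-year score: mean and (population) standard
# deviation of the current score among students with the same lag.
grp = df.groupby("lagged_score")["score"]
mu = grp.transform("mean")
sd = grp.transform(lambda s: s.std(ddof=0))

# Normalized gain: deviation from the conditional mean in sd units,
# matching the conditional moments in equation (5).
df["y"] = (df["score"] - mu) / sd

# Campus quality measure: the average normalized gain by campus.
campus_gain = df.groupby("campus")["y"].mean()
```

Because the normalized gains are z-scores within each lagged-score group, they average to zero and have unit variance overall, which is what permits aggregation across grades and test forms.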
Input Prices
Teachers are obviously the most important educational input, and one of the ways in which
charter schools differ from traditional public schools is in their teacher hiring and compensation
practices. Table 3 compares the mean salaries and demographic characteristics for nontraditional
public school teachers with those for traditional public school teachers in Texas. As the table
illustrates, charter schools pay much lower full-time-equivalent salaries, on average, than do
traditional public schools. The lower salaries are at least partially explained by the fact that the
average teacher in a charter school is less experienced, less highly educated and less likely to
hold a teaching certificate than the average teacher in a traditional school district.11 In contrast,
TPS districts pay higher salaries for teachers assigned to AECs of Choice than to teachers
assigned to standard accountability campuses.
If there were a teacher type that was hired by all OE charter schools and traditional public
schools—say, for example, a teacher with a bachelor’s degree from a selective university and
11 Texas has a minimum salary scale for teachers that does not apply to charter schools. However, the salary scale is generally not binding for Texas’ metropolitan districts either. Less than 0.3 percent of metropolitan area teachers in Texas received the state’s minimum salary during the analysis period.
two years of experience—then arguably we would be best served by using the wages paid to
those teachers as our input price measure. However, it is impossible to identify a teacher type
that is hired by all the OE charter and TPS districts under analysis, and any observed average
wage—such as the average salary for beginning teachers—would reflect school and district
choices about the mix of teachers to hire.
One way to deal with this source of potential endogeneity is to use a wage index that is
independent of school and district choices. To construct such an index, we developed a hedonic
wage model for teacher salaries and used that model to predict the wages each school would
have to pay to hire a teacher with constant characteristics.12
The hedonic model is a very simple one, wherein wages are a function of labor market
characteristics, job characteristics, observable teacher characteristics, and unobservable teacher
characteristics. Formally, the specification can be expressed as:
ln(Widjt) = αi + δ′Ddt + γ′Tit + μj + εidjt
where the subscripts i,d,j and t stand for individuals, districts, labor markets and time,
respectively, Widjt is the teacher’s full-time-equivalent monthly salary, Ddt is a vector of job and
labor market characteristics that could give rise to compensating differentials, Tit is a vector of
individual characteristics that vary over time, the μj are labor market fixed effects and the αi are
individual teacher fixed effects. Any time-invariant differences in teacher quality—such as the
teacher’s verbal ability or the selectivity of the college the teacher attended—will be captured by
the fixed effects.
12 We note that teachers in charter schools participate in the same statewide teacher retirement system as teachers in traditional public schools, and that years of service in the system are counted without regard to whether the employer was a charter school or a traditional public school district. Thus, teachers can move from one TPS district to another, or from a charter school to a TPS district, without affecting their pension eligibility or their credited years of service. Contributions to the teacher retirement system are a function of the salaries paid to individual teachers, so the price index for teacher salaries should be highly correlated with a price index for teacher salaries and benefits.
Arguably, charter schools could differ from traditional public schools in the premium
they pay for teacher characteristics, so the estimating equation becomes
ln(Widjt) = αi + νCt + δ′Ddt + γ′Tit + (Ct × Tit)′γc + μj + εidjt
where Ct is an indicator variable that takes on the value of one if a teacher is employed by a
charter school in year t, and zero otherwise, the subscript c indicates the coefficient on the
corresponding interaction term, and ν is the coefficient on a charter school intercept. If charter
schools systematically pay a premium (or a discount) for observable teacher characteristics, then
the elements of γc should be significantly different from zero.
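As a sketch of how the interaction specification identifies γc, the simulation below generates log wages with a known charter differential on one teacher characteristic and recovers it by least squares. All variable names and magnitudes are illustrative assumptions, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical teacher and employer characteristics
log_exp = rng.uniform(0.0, 3.0, n)              # a T_it characteristic
masters = rng.integers(0, 2, n).astype(float)   # another T_it characteristic
charter = rng.integers(0, 2, n).astype(float)   # the C_t indicator

# Illustrative data-generating process: charters pay an overall discount
# (nu = -0.10) and a different premium for a master's degree (gamma_c = -0.03)
log_wage = (8.0 + 0.05 * log_exp + 0.06 * masters
            - 0.10 * charter - 0.03 * charter * masters
            + rng.normal(0.0, 0.01, n))

# Columns: intercept, T_it terms, charter intercept, C_t x T_it interaction
X = np.column_stack([np.ones(n), log_exp, masters, charter, charter * masters])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
# beta[3] estimates nu; beta[4] estimates the element of gamma_c
```

A significantly nonzero interaction coefficient is what would signal that charter schools price that teacher characteristic differently than traditional public schools.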
The data on teacher salaries and individual teacher characteristics come from the Texas
Education Agency (TEA) and Texas’ State Board for Educator Certification (SBEC). The
measure of teacher salaries that is used in this analysis is the total, full-time equivalent monthly
salary, excluding supplements for athletics coaching. The hedonic model includes controls for
teacher experience (the log of years of experience, the square of log experience and an indicator
for first-year teachers) and indicators for the teacher’s educational attainment (no degree,
master’s degree or doctorate), teaching assignment (math, science, special education, health and
physical education or language arts) and certification status (certified in any subject, and
specifically certified in mathematics, science, special education or bilingual education). Only
teachers with complete data who worked at least half time for an OE charter or TPS district
during the analysis period are included in the analysis. The hedonic wage analysis covers the
same five-year period as the cost function analysis (2006-07 through 2010–11).
The job characteristics used in this analysis allow for teachers to expect a compensating
differential based on student demographics, school size, school type or district size. The student
demographics used in this analysis are the percentage of students in the district who are
economically disadvantaged, limited English proficient, black and Hispanic. We measure school
size as the log of average campus enrollment in the district. There are three indicators for school
type (elementary schools, middle schools, and high schools). The analysis also includes four
indicators for school district size—one indicator variable for very small districts (those with less
than 800 students in average daily attendance), one for small districts (those with at least 800, but
less than 1,600 students), one for midsized school districts (those with at least 1,600 but less than
5,000 students) and one for very large school districts (those with more than 50,000 students in
average daily attendance Finally, the model includes an indicator for AECs of Choice.
In addition to the metropolitan area fixed effects, we include three indicators for local
labor market conditions outside of education. We updated the National Center for Education
Statistics’ Comparable Wage Index to measure the prevailing wage for college graduates in each
school district (Taylor and Fowler, 2006). We include the Department of Housing and Urban
Development’s estimate of Fair Market Rents (in logs) and the Bureau of Labor Statistics’ measure of the metropolitan area unemployment rate.
Table 4 presents the coefficient estimates and robust standard errors for the hedonic wage
model. As the table illustrates, there are systematic differences between OE charter schools and
traditional public schools in the premiums paid for teacher characteristics. On average, even
after adjustments for teacher, job and location characteristics, OE charter schools pay lower
salaries than traditional public schools. Compared with other public schools, OE charter schools
pay a larger premium for certified teachers, particularly those with bilingual/ESL or mathematics certification, and a much larger premium for teachers with a bachelor’s degree. There are no
differences between OE charter schools and other public schools with respect to the premiums
paid for advanced degrees.
OE charter schools also pay a much smaller premium for teacher experience than do
traditional public school districts. Figure 2 illustrates the relationship between years of teaching
experience and teacher salaries for AECs of Choice and standard accountability campuses in OE
charter and traditional public school districts. As the figure illustrates, salaries rise more rapidly
with experience for beginning teachers in OE charter schools, but more slowly with experience
for experienced teachers in OE charter schools. In other words, OE charter schools pay a much
smaller premium for highly experienced teachers.
Intriguingly, whereas traditional public school districts pay a very small premium to
teachers in AECs of Choice, OE charter schools pay a significant discount. All other things
being equal, salaries are nearly 5 percent higher in AECs of Choice operated by TPS districts
than in AECs of Choice operated by OE charter districts.
The teacher salary index for each campus is based on the predicted wage for a teacher
with zero years of experience and a bachelor’s degree, holding all other observable teacher
characteristics constant at the statewide mean, and suppressing any charter school or AEC
differentials. Thus, the predicted wage for an OE charter school is the same as the predicted
wage for an otherwise equal traditional public school, and the predicted wage for an AEC of
Choice is the same as the predicted wage for a similarly situated standard accountability campus.
The predicted wage is then divided by the minimum predicted wage to yield the salary index.
The index indicates that labor cost is more than 40 percent higher in some campuses than in
others each year.
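The index construction described above can be sketched as follows. The inputs are a row of standardized characteristics per campus and the fitted hedonic coefficients; both are hypothetical placeholders here, not the paper's estimates.

```python
import numpy as np

def salary_index(campus_X: np.ndarray, beta: np.ndarray) -> np.ndarray:
    """Campus salary index from a fitted log-wage hedonic model (a sketch).

    campus_X holds one row per campus: labor market characteristics vary
    across campuses, teacher characteristics are fixed at the comparison
    standard (zero experience, bachelor's degree, statewide means for the
    rest), and the charter and AEC indicators are suppressed (set to zero).
    """
    predicted_wage = np.exp(campus_X @ beta)      # predicted monthly salary
    return predicted_wage / predicted_wage.min()  # cheapest campus indexed to 1.0
```

Because teacher characteristics and the charter/AEC differentials are held fixed, any variation in the index reflects labor market conditions rather than hiring choices, which is what removes the endogeneity concern.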
Ideally, we would also include direct measures of local prices for instructional equipment
and classroom materials. Unfortunately, such data are not available. However, prices for
pencils, paper, computers, and other instructional materials are largely set in a competitive
market, and prices for nonprofessional labor are largely a function of school district location.
Therefore we include metropolitan area fixed effects to control for variations in the price of
classroom materials and nonprofessional staff.
Other Environmental Factors
The model includes indicators for several environmental factors that influence district
cost but which are not purchased inputs. Because previous researchers have found significant
economies of scale in district-level cost functions (Andrews, Duncombe and Yinger 2002), we
include the log of school district enrollment. To capture variations in costs that derive from
variations in student needs, we include the percentages of students in each district who were
limited English proficient (LEP), special education or economically disadvantaged students. To
allow for the possibility that the education technology differs according to the grade level of the
school, we include an indicator for the lowest grade served in the school. To allow for the
possibility that the dropout rate could vary with the age of the students, we also include an
indicator for the highest grade served in the school.
To test for systematic differences by school type we also include three indicators, one for
each of the three types of nontraditional campuses. The first takes on the value of one if the
campus is a standard accountability charter school, and zero otherwise. The second takes on the
value of one if the campus is an AEC of Choice operated by a charter school, and zero otherwise,
while the third takes on the value of one if the campus is an AEC of Choice operated by a
traditional public school district and zero otherwise. These three indicators allow for the
possibility that the unregulated charter school technology is different from that of traditional
public schools and that the AEC technology differs from the standard accountability technology.
Finally, to control for inflation and other time trends in Texas education, the model
includes fixed effects for year.
Efficiency Factors
The error terms for all frontier specifications depend on a number of factors that theory
suggests may explain differences in school efficiency. We model the one-sided variance
function as a linear combination of five indicator variables and one continuous variable.13 The
first indicator variable takes on the value of one if the school is part of an OE charter district
(and zero otherwise). It allows for the possibility that charter school efficiency may differ
systematically from the efficiency of traditional public schools. The next indicator is for AECs
of Choice. It allows for the possibility that AECs of Choice may have an efficiency profile that
differs systematically from that of other campuses. The third indicator is the interaction between
the charter school indicator and the AEC of Choice indicator. The final two indicator variables
take on a value of one if the district is small (enrollment below 1,600 students) or midsized
(enrollment between 1,600 and 5,000 students). These two indicators allow for the greater
inefficiency of small school districts found in previous analysis (e.g. Gronberg, Jansen and
Taylor 2011). The continuous variable is a time trend that allows the Texas public school system
as a whole to become more, or less, efficient over time.
We model the two-sided variance as a function of the inverse of the number of students
taking TAKS because measurement error in this key indicator of campus quality is
likely to be a function of the number of students tested. This approach allows for
heteroskedasticity as a function of campus size.
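The composed-error structure behind these variance functions can be illustrated with the standard normal/half-normal cost frontier log-likelihood. This is a generic sketch of that textbook density, not the authors' estimation code, and the simulated magnitudes are arbitrary; the key point is that sigma_u may vary by observation, which is how the charter, AEC and district-size indicators enter the one-sided variance.

```python
import numpy as np
from scipy.stats import norm

def cost_frontier_loglik(eps, sigma_v, sigma_u):
    """Log-likelihood of eps = v + u for a cost frontier (a sketch).

    v ~ N(0, sigma_v^2) is two-sided noise; u ~ |N(0, sigma_u^2)| is
    one-sided inefficiency that raises cost above the frontier.  Either
    sigma may be a vector, so the one-sided variance can be modeled as a
    function of covariates such as charter or AEC indicators, and the
    two-sided variance as a function of the number of students tested.
    """
    sigma = np.sqrt(sigma_v**2 + sigma_u**2)
    lam = sigma_u / sigma_v
    return np.sum(np.log(2.0) - np.log(sigma)
                  + norm.logpdf(eps / sigma)
                  + norm.logcdf(eps * lam / sigma))

# Simulated residuals: the likelihood should favor the true variance split
rng = np.random.default_rng(1)
v = rng.normal(0.0, 0.1, 5000)
u = np.abs(rng.normal(0.0, 0.2, 5000))   # inefficiency component
eps = v + u
```

In estimation, maximizing this likelihood jointly with the frontier parameters yields both the cost function and the observation-specific inefficiency distribution.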
13 We specify the one-sided error term as having a half-normal distribution. Jensen (2005) finds that specifying a half-normal distribution for the inefficiency term generates more reliable estimates of technical efficiency than other assumptions about the distribution of inefficiency.
Estimation Results
Marginal effects evaluated at the sample mean for two models are reported in Table 6.
The first model assumes that OE charters, AECs of Choice and traditional public schools share a
common cost frontier (i.e. a common cost function). In this specification the only difference
between OE charters, AECs of Choice and other public schools is in the efficiency estimate, as
we explicitly allow OE charters and AECs of Choice to have a different one-sided error term
than other public schools. This model allows us to judge the relative efficiency of OE charters,
AECs of Choice and other public schools assuming these schools share a common technology.
The second specification reported in Table 6, called the baseline model, estimates a
version of the common cost function but adds indicators for standard accountability OE charter
schools, alternative accountability OE charter schools and alternative accountability traditional
public schools to the explanatory variables in our modified translog cost function. These
indicators are interacted with the constant and the first order terms in the translog. That is, we
interact the standard accountability, OE charter school indicator with the levels of all other
variables in the model. This allows standard accountability OE charter schools to have a
different slope as well as a different cost intercept, while not estimating two distinct frontiers or
cost functions. Unfortunately, the relatively small number of nontraditional campuses with
complete data makes it impossible to estimate separate frontiers for each type of campus.
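A minimal sketch of the interacted design just described: the school-type indicator multiplies the intercept and each first-order regressor, shifting both the cost intercept and the slopes for the flagged campuses without estimating a separate frontier. The function and names are illustrative, not the authors' code.

```python
import numpy as np

def interacted_design(X: np.ndarray, indicator: np.ndarray) -> np.ndarray:
    """Augment a first-order design with indicator interactions (a sketch).

    X: n x k matrix of first-order cost-function regressors (no constant).
    indicator: length-n 0/1 vector, e.g. a standard accountability OE
    charter flag.  Returns [1, X, indicator, indicator*X], which lets the
    flagged campuses take a different intercept and first-order slopes.
    """
    n = X.shape[0]
    ones = np.ones((n, 1))
    d = indicator.reshape(-1, 1)
    return np.hstack([ones, X, d, d * X])
```

With three such indicators (one per nontraditional campus type), a single estimated frontier nests type-specific intercepts and slopes, which is feasible even with few nontraditional campuses.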
As the first column of results in Table 6 illustrates, estimated marginal effects for the
model with a common cost function generally correspond to expectations.14 The TAKS output
measure has positive and statistically significant impacts on cost. The annual dropout rate
measure has a negative marginal effect at the mean, reflecting a highly non-linear relationship
14 Even in cases where the marginal effects are not statistically significant at the mean, we still reject the hypothesis that each quality indicator and all its interaction terms are jointly zero.
between the dropout rate and cost. The salary index has a positive and statistically significant
impact on cost. The grade level indicators generally have the expected sign, and they indicate
that cost is an increasing function of both the lowest grade served and the highest grade served.
Taken together, these two indicators imply that high schools are more costly to operate than
other school configurations. The student body characteristics we include – percent economically
disadvantaged, percent limited English proficient, percent special education – are all associated
with a higher average cost per student. Finally, campus enrollment at the mean has a negative
impact on cost (after controlling for district enrollment, of course).
The second column in Table 6 reports results when we allow some differentiation in the
cost function faced by our four campus types. Here our estimates indicate that OE charters have
a lower estimated cost per student than TPS campuses, and thus appear to be accessing a lower-cost technology for producing education. AECs of Choice also appear to be accessing a lower-cost technology. Finally, the cost of operating an AEC of Choice that is also an OE charter
school is even lower. The cost advantages of AECs of Choice appear for both traditional public
schools and charter schools, and may reflect the narrower focus for such schools. Few AECs of
Choice attempt to provide the full array of fine arts, languages, AP courses or athletics typically
found in a standard accountability TPS campus.
Table 7 drills down to explore differences in the estimated coefficients behind the
marginal effects. This table presents the coefficients on the interaction term between the
designated variable and the district and accountability type of the campus. Thus it indicates how
marginal costs differ in the three listed school types from the marginal costs in a traditional
public school. As the table illustrates, the coefficients on the interactions between either of the
campus quality measures and the OE charter indicators are insignificant, indicating that the
marginal cost of producing gains in measured quality is not statistically significantly different
between OE charter schools and TPS districts. Among OE charter campuses, the marginal cost of
an increase in the share of special education students is lower for standard accountability
campuses than for AECs of Choice, all other things being equal. Somewhat surprisingly, among
TPS campuses the marginal cost of an increase in the share of special education students is
higher for standard accountability campuses than for AECs of Choice. The marginal cost of an
increase in the share of special education students is lower for AECs of Choice operated by TPS
districts than for AECs of Choice operated by OE Charter districts.
Table 7 indicates that the relationship between campus size and the cost of education is
systematically different for OE charter campuses than it is for TPS campuses. Figure 3
illustrates that relationship, holding all other campus characteristics constant at the mean for
AECs of Choice. As the figure illustrates, among small campuses, costs are significantly lower
for nontraditional schools than for traditional ones. The predicted costs for all four types of
school converge as the size of the campus increases. However, at the maximum campus
enrollment observed for nontraditional schools (1,507 students, or 7.3 on the horizontal axis in
Figure 3) the predicted cost for standard accountability TPS campuses remains higher than the
predicted cost for any of the nontraditional campuses. The predicted cost for both types of AECs
of Choice is more than $1,000 per pupil below the predicted cost for either type of standard
accountability campus.
Table 8 reports efficiency estimates for the two models reported in Table 6. The top half
of the table reports results for the common cost model, while the bottom half reports results for
the baseline model. In the common cost model, AECs of Choice operated by OE charter districts are the most efficient of the four types, and AECs of Choice operated by TPS districts
are the least efficient. However, the baseline model indicates that standard accountability
campuses operated by TPS districts are most efficient, relative to their frontier. On the other
hand, standard accountability OE charter campuses have the lowest mean efficiency, the highest standard deviation of efficiency scores, and the widest range from minimum to maximum
efficiency. AECs of Choice, both those operated by charters and those operated by TPS districts, fall between these two extremes in terms of average efficiency.
The results on efficiency can perhaps be seen best in a graph, and Figures 4 and 5 provide
plots of the distribution of efficiency estimates for the common cost frontier specification (Figure
4), and the baseline specification (Figure 5). In both cases, the figure plots the efficiency
distributions for the four types of non-elementary campuses under analysis. As Figure 4
illustrates, when we restrict traditional and nontraditional public schools to have a common cost
function we estimate that standard accountability campuses operated by TPS districts are more
efficient than standard accountability OE charter schools and more efficient than AECs of
Choice, regardless of district type. AECs of Choice at TPS districts are particularly inefficient, even compared to AECs of Choice in OE charter districts. Further, the distribution of efficiency at traditional public schools is much tighter than at OE charter schools; in fact, the spread of the distribution at OE charter schools is wider than that of AECs of Choice in either charter or TPS districts. Among AECs of Choice, those operated by OE charter districts are more efficient than those operated by TPS districts, and, as indicated above, AECs of Choice operated by OE charter districts are the most efficient and have the least dispersed efficiency estimates of all the campus types.
When we allow OE charters and AECs of Choice to have a different cost function than
other public schools (Figure 5) we see that for standard accountability campuses charter schools
fall even further behind traditional public schools in relative efficiency, and the
spread of efficiency measures at charters is noticeably larger than the spread at traditional public
schools. This fall in the relative efficiency of OE charters when we allow different cost
functions indicates that, while OE charters have lower per pupil costs than traditional public
schools, they are less efficient with respect to that lower cost frontier. That is, OE charters are
estimated to have a lower cost frontier but also to be more inefficient relative to that frontier than
traditional public schools. We also see that nontraditional campuses continue to exhibit a lower
mean efficiency than standard accountability TPS campuses but a higher mean efficiency than
standard accountability OE charters. AECs of Choice have similar estimated mean efficiencies, whether operated by TPS districts or by OE charter districts. AECs of Choice operated by charter districts do have a greater dispersion of estimated efficiencies than AECs of Choice operated by TPS districts.
It is important to note that in Figure 5 we are presenting estimates of efficiency relative to
a frontier that differs between traditional and non-traditional public schools. That is, OE charters
have a lower estimated cost per student, and thus appear to be accessing a lower-cost technology
for producing education. But OE charters are less efficient relative to their charter cost frontier
than are traditional schools relative to their cost frontier. The same is true for AECs of Choice,
regardless of whether they are operated by an OE charter or a TPS district, although the
difference in estimated efficiency between these two categories of AECs of Choice is minimal.
Table 9 presents the estimated relationship between the one-sided errors and the model
determinants of inefficiency. These are not coefficient estimates in the traditional sense. Rather,
they represent the marginal effect of each variable on the standard deviation of the one-sided
error. Nevertheless, a positive coefficient can be interpreted as indicating a factor that increases
inefficiency, while a negative coefficient indicates a factor that decreases inefficiency.
As the table illustrates, there are significant differences in efficiency between charter and
traditional school districts. Across both specifications, we find that small districts are more
inefficient, and midsized districts somewhat more inefficient, than larger districts. Charters are
more inefficient than traditional public schools, especially so in the baseline model, and AECs of Choice are significantly more inefficient than standard accountability campuses in both models. However, charter AECs of Choice are more efficient than AECs of Choice in TPS districts.
As with any such analysis, there are a few important caveats. First, the families of
students make a conscious decision to attend charter schools instead of traditional public schools,
and charter schools may make a deliberate effort to attract students with specific demographic
characteristics. As a result, charter school students may be systematically different from the
students who remain in traditional public school districts. Similarly, the students who attend
AECs of Choice may be systematically different from those who attend standard accountability
campuses. We address this concern as best we can by including a school quality measure that is
based on changes over time in the performance of individual students. However, our approach
may not be sufficient to account for unobserved differences between traditional and
nontraditional public school students. Some of the differences in measured efficiency between
OE charter and traditional public school districts or between AECs of Choice and standard
accountability campuses may arise from differences in student and family inputs rather than from
differences in the school districts. Second, we note that our measures of school quality may not
be capturing all of the important dimensions of school district output. Schools that appear
relatively less efficient in this analysis may simply be producing outputs, such as art, music or athletics, that are costly to produce and uncorrelated with the basic academic outcomes measured here.
In the end, our results point to an interesting situation: charter schools and AECs of Choice have access to a lower-cost technology that allows them to produce school quality at a lower cost per student, at least in the relevant range. At the same time, charters are on average
less efficient relative to their frontier than are non-charters relative to their frontier. We can only
speculate on the reasons behind this finding. Possibly charters exploit their cost advantage by operating less efficiently while still providing lower-cost education services than non-charters. Possibly charters face less competition from other charters, on average, than non-charters face from other non-charters and from charters themselves. The state of Texas limits the
number of charters that it issues, making it more difficult for new charters to spring up and hence
limiting competition among charters. This issue certainly bears further investigation in future
research.
Conclusion
We investigated the relative efficiency of charter schools and traditional public schools
over the period 2007-2011 in Texas. We find that the cost frontier for alternative education campuses is estimated to lie below the frontier for regular accountability campuses. This ranking holds whether we compare campus types within traditional public school districts or within charter districts. We interpret this finding as evidence in support of our hypothesis
that the reduction in the scope of classroom types and non-classroom activities will result in a
lower cost frontier for alternative education schools relative to the cost frontier for standard
accountability campuses. When comparing traditional public alternative education providers to
charter alternative education suppliers, we find that the cost frontier is lower over a wide range
of campus enrollment levels, and charter schools are typically closer to that frontier than are
traditional public schools. In other words, the evidence suggests that alternative education
charter schools are able to produce educational outcomes at lower cost than traditional public
alternative education schools—which is consistent with our second hypothesis that the reduction
in binding regulations will yield a lower cost frontier for charter alternative education campuses.
Furthermore, our evidence suggests that they are at least as efficient with respect to reaching that
frontier as are alternative education campuses in traditional public schools.
References:
Abdulkadiroglu, A., Angrist, J., Cohodes, S., Dynarski, S., Fullerton, J., Kane, T. & Pathak, P.
(2009). Informing the debate: Comparing Boston’s charter, pilot and traditional schools.
Report prepared for the Boston Foundation. Boston, MA
(http://www.masscharterschools.org/advocacy/HarvardStudy_54pg.pdf).
Academic Excellence Indicator System (AEIS). (2009). AEIS 2008-2009 Reports. Austin, TX:
Texas Education Agency (http://ritter.tea.state.tx.us/perfreport/aeis/).
Alexander, C. D., Gronberg, T. J., Jansen, D. W., Keller, H., Taylor, L. L. & Treisman, P. U.
(2000). A study of uncontrollable variations in the costs of Texas public education: A
summary report. Prepared for the 77th Texas Legislature, The Charles A. Dana Center,
University of Texas – Austin, November.
Andrews, M., Duncombe, W., & Yinger, J. (2002). Revisiting economies of size in American
education: Are we any closer to a consensus? Economics of Education Review, 21(3),
245-262.
Bettinger, E. P. (2005). The effect of charter schools on charter students and public schools.
Economics of Education Review, 24 (2), 133–147.
Bifulco, R. & Ladd, H. F. (2006). The impacts of charter schools on student achievement:
Evidence from North Carolina. Education Finance and Policy, 1 (1), 50–90.
Booker, T. K., Gilpatric, S. M., Gronberg, T. J. & Jansen, D. W. (2008). The effect of charter
school competition on traditional public schools in Texas: Are children who stay behind
left behind? Journal of Urban Economics, 64 (1), 123–145.
Booker, T. K., Gilpatric, S. M., Gronberg, T. J. & Jansen, D. W. (2007). The impact of charter
school attendance on student performance. Journal of Public Economics, 91 (5-6), 849–
876.
Carnoy, M., Jacobsen, R., Mishel, L. & Rothstein, R. (2005). The charter school dust-up:
Examining the evidence on enrollment and achievement. Washington, DC: Economic
Policy Institute.
Center for Research on Education Outcomes (CREDO). (2009). Multiple choice: Charter school
performance in 16 States. Stanford, CA: Stanford University.
Conroy, S. J., & Arguea, N. M. (2008). An estimation of technical efficiency for Florida public
elementary schools. Economics of Education Review, 27(6), 655-663.
Duncombe, W., Ruggiero, J., & Yinger, J. (1996). Alternative approaches to measuring the cost
of education. In H. F. Ladd (Ed.) Holding schools accountable: Performance-based
reform in education. Washington, DC: The Brookings Institution.
Duncombe, W.A. & Yinger, J. (2005). How much more does a disadvantaged student cost?
Economics of Education Review, 24 (5), 513–532.
Duncombe, W.A. & Yinger, J. (2011). Are education cost functions ready for prime time? An
examination of their validity and reliability. Peabody Journal of Education, 86 (1), 28-57.
Gronberg, T. J., Jansen, D. W. & Naufel, G. S. (2006). Efficiency and performance in Texas
public schools. In T. J. Gronberg & D.W. Jansen (Ed.) Improving school accountability:
Checkups or choice, vol. 14, Advances in Applied Microeconomics, Elsevier Science
Press, pp. 81–101.
Gronberg, T. J., Jansen, D. W. & Taylor, L. L. (2011a). The adequacy of educational cost
functions: Lessons from Texas. Peabody Journal of Education, 86 (1), 3–27.
Gronberg, T., Jansen, D. W. & Taylor, L. L. (2011b). The impact of facilities on the cost of
education. National Tax Journal, 64 (1), 193–218.
Gronberg, T. J, Jansen, D. W., Taylor, L. L. & Booker, K. (2004). School outcomes and school
costs: The cost function approach. Texas Joint Select Committee on Public School
Finance, Austin, TX; (http://bush.tamu.edu/research/faculty/TXSchoolFinance/).
Gronberg, T. J., & Jansen, D. W. (2009). Charter schools: Rationale and research. Working
Paper. Columbia, MO: Show-Me Institute, University of Missouri.
Gronberg, T. J., Jansen, D. W., Taylor, L. L. & Booker, T. K. (2005). School outcomes and
school costs: A technical supplement. Texas Joint Select Committee on Public School
Finance, (Austin, TX, 2004). (http://bush.tamu.edu/research/faculty/TXSchoolFinance/).
Grosskopf, S., Hayes, K. J. & Taylor, L. L. (2009). The relative efficiency of charter schools.
Annals of Public and Cooperative Economics, 80 (1), 67–87.
Grosskopf, S., & Moutray, C. (2001). Evaluating performance in Chicago public schools in the
wake of decentralization. Economics of Education Review, 20(1), 1-14.
Hanushek, E. A., Kain, J. F., Rivkin, S. G. & Branch, G. F. (2005). Charter school quality and
parental decision making with school choice. NBER Working Paper #11252.
Hoxby, C. M. & Murarka, S. (2009). Charter schools in New York City: Who enrolls and how
they affect their students' achievement. NBER Working Paper #14852. Cambridge, MA:
National Bureau of Economic Research.
Hoxby, C. M. & Rockoff, J. E. (2004). The impact of charter schools on student achievement.
Unpublished manuscript.
Imazeki, J., & Reschovsky, A. (2004). Is No Child Left Behind an un (or under) funded federal
mandate? Evidence from Texas. National Tax Journal, 57(3), 571-588.
Jensen, U. (2005). Misspecification preferred: The sensitivity of inefficiency rankings.
Journal of Productivity Analysis, 23 (2), 223–244.
Murray, M. P. (2006). Avoiding invalid instruments and coping with weak instruments. Journal
of Economic Perspectives, 20 (4), 111-132.
Nelson, C. R., & Startz, R. (1990). Some further results on the exact small sample properties of
the instrumental variable estimator. Econometrica, 58 (4), 967–976.
Ni, Y. (2009). The impact of charter schools on the efficiency of traditional public schools:
Evidence from Michigan. Economics of Education Review, 28(5), 571-584.
Primont, D. F., and Domazlicky, B. (2006). Student achievement and efficiency in Missouri
schools and the No Child Left Behind Act. Economics of Education Review, 25(1), 77-90.
Reback, R. (2008). Teaching to the rating: School accountability and the distribution of student
achievement. Journal of Public Economics, 92, (5-6), 1394–1415.
Sass, T. R. (2006). Charter schools and student achievement in Florida. Education Finance and
Policy, 1 (1), 91–122.
Staiger, D., and Stock, J. H. (1997). Instrumental variables regression with weak instruments.
Econometrica, 65 (3), 557-586.
Taylor, L. L. (2008). Washington wages: An analysis of educator and comparable non-educator
wages in the state of Washington. Washington State Institute for Public Policy and the
Joint Task Force on Basic Education Finance.
Taylor, L. L., Alexander, C. D., Gronberg, T. J., Jansen, D.W. & Keller, H. (2002). Updating the
Texas cost of education index. Journal of Education Finance, 28 (2), 261–284.
Taylor, L. L. & Fowler, W.J. (2006). A comparable wage approach to geographic cost
adjustment. National Center for Education Statistics Research and Development Report
#2006-321.
Taylor, L. L., Alford, B. L., Rollins, K. B., Brown, D. B., Stillisano, J. R. & Waxman, H. C.
(2012). Evaluation of Texas charter schools 2009-10. Report prepared for the Texas
Education Agency, Austin, TX.
Texas Education Agency (2011). Accountability Manual: The 2011 Accountability Rating
System for Texas Public Schools and School Districts, Austin, TX. Accessed March
4, 2013 from http://ritter.tea.state.tx.us/perfreport/account/2011/manual/manual.pdf.
Texas Education Agency (2012) Secondary School Completion and Dropouts in Texas Public
Schools 2010-11, Austin, Texas. Accessed March 4, 2013 from
http://www.tea.state.tx.us/acctres/dropcomp_index.html.
U.S. Department of Education. (2007). K–8 charter schools: Closing the achievement gap.
Jessup, MD: Education Publications Center.
Wang, H. J. & Schmidt, P. (2002). One-step and two-step estimation of the effects of exogenous
variables on technical efficiency levels. Journal of Productivity Analysis, 18 (2), 129–
144.
Zimmer, R., Gill, B., Booker, K., Lavertu, S. & Witte, J. (forthcoming) Examining charter
student achievement effects across seven states. Economics of Education Review.
Zivot, E., Startz, R., and Nelson, C. R. (1998). Valid confidence intervals and inference in the
presence of weak instruments. International Economic Review 39 (4), 1119-1144.
Figure 1
Revenues per Pupil for OE Charter and Traditional Public School (TPS) Districts
(2010–11)
[Stacked bar chart comparing revenues per pupil ($0 to $12,000) for OE charter schools and
TPS districts, by source: Federal, State, Local Tax Revenues, Other Local Sources, and
Charitable Donations]
Source. Public Education Information Management System (PEIMS).
Figure 2
Predicted Teacher Salaries by Years of Experience for AECs of Choice Operated by OE
Charter and Traditional Public School Districts
[Line graph of predicted monthly salary ($3,000 to $7,000) against years of experience (0 to
50) for four groups: OE Charter Alternative, OE Charter Standard, TPS Alternative, and TPS
Standard]
Figure 3
The Estimated Relationship Between Predicted Cost of an Average Quality Education and
Campus Enrollment, by District Type and Accountability Status.
Figure 4
Cost Efficiency Estimates from a Common Cost Frontier Model for Campuses, by District
Type and Accountability Status
[Box plots of common cost frontier efficiency (scale 0.4 to 1.0) for TPS and OE charter
districts, shown separately for standard accountability campuses and AECs of Choice]
Figure 5
Cost Efficiency Estimates from Model with Nontraditional Campus Type Treated as a
Campus Characteristic, by District Type and Accountability Status
[Box plots of cost efficiency (scale 0.2 to 1.0) for TPS and OE charter districts, shown
separately for standard accountability campuses and AECs of Choice]
Table 1: The Number of Campuses by Grade Level and AEC Type, 2010-11

                                      Standard    AEC of    Residential
                                      Campuses    Choice    AEC
OE charter districts
  Elementary schools                  167         15        3
  Middle schools                      40          2         2
  High schools                        29          72        15
  Multi-level schools                 72          38        27
  Total                               308         127       47
Traditional public school districts
  Elementary schools                  4,354       2         1
  Middle schools                      1,602       13        0
  High schools                        1,320       188       23
  Multi-level schools                 300         13        18
  Total                               7,576       216       42
Notes: Non-charter campuses with fewer than 5 students have been excluded. Multi-level schools serve at least 1 high
school grade (9-12) and at least 1 elementary grade (PK-6).
Sources: Academic Excellence Indicator System (AEIS) and the Texas Education Directory.
Table 2. Student Demographics by Charter and Accountability Status for Non-residential,
Non-elementary Campuses (2010-11)

                                      OE Charter      Traditional Public
                                      Districts       School Districts
Standard Campuses
Percent of students who were:
  Non-Hispanic white                  19.46% *        34.23%
  African American                    16.81%          13.09%
  Hispanic                            56.77%          47.00%
  Economically disadvantaged          61.89%          53.07%
  At risk                             37.79% *        42.93%
  Limited English proficient          10.69% *        7.59%
  Special education program          5.85% *         9.85%
  Gifted education program            2.31% *         10.33%
  Bilingual education program         10.12% *        7.12%
  Career & technology program         5.80% *         43.14%
Number of campuses                    141             3,392
Number of students                    52,487          2,348,616

AECs of Choice
Percent of students who were:
  Non-Hispanic white                  19.80% #        20.95%
  African American                    18.93% #        19.03%
  Hispanic                            58.84%          56.56%
  Economically disadvantaged          77.35% *#†      64.93%
  At risk                             86.96% #†       92.86%
  Limited English proficient          13.61% #        14.71%
  Special education program          11.11% †        9.37%
  Gifted education program            0.18% *#†       0.65%
  Bilingual education program         13.08% #        14.00%
  Career & technology program         29.79% #†       34.24%
Number of campuses                    112             230
Number of students                    22,691          21,127
Notes: Pupil-weighted averages from campus-level data. Residential campuses and traditional public school
campuses with fewer than 5 students have been excluded. The asterisk indicates a difference between OE charter
school districts and traditional public school districts that is statistically significant at the 5% level, adjusting for
clustering of the data by district. The # indicates a difference between TPS standard accountability campuses and
TPS AECs of Choice that is statistically significant at the 5% level, adjusting for clustering of the data by district.
The † indicates a difference between OE charter standard accountability campuses and OE charter AECs of Choice
that is statistically significant at the 5% level, adjusting for clustering of the data by district.
Sources: Academic Excellence Indicator System (AEIS) and authors' calculations.
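The significance markers in the notes above rest on tests that compare group means while adjusting for the clustering of campuses within districts. A minimal sketch of such a cluster-robust comparison on synthetic data (variable names and values are illustrative, not the paper's dataset or code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic campus-level data: outcome, group dummy, and district id.
n = 200
district = rng.integers(0, 40, n)
group = (rng.random(n) < 0.5).astype(float)
y = 0.5 + 0.1 * group + rng.normal(0, 0.05, 40)[district] + rng.normal(0, 0.1, n)

# OLS of y on an intercept and the group dummy: the slope is the
# difference in group means.
X = np.column_stack([np.ones(n), group])
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta

# Cluster-robust (Liang-Zeger) variance: sum over districts of
# X_g' u_g u_g' X_g, sandwiched between (X'X)^-1 terms.
XtX_inv = np.linalg.inv(X.T @ X)
meat = np.zeros((2, 2))
for g in np.unique(district):
    idx = district == g
    s = X[idx].T @ resid[idx]
    meat += np.outer(s, s)
V = XtX_inv @ meat @ XtX_inv
t_stat = beta[1] / np.sqrt(V[1, 1])
print(f"difference in means: {beta[1]:.3f}, cluster-robust t: {t_stat:.2f}")
```

Because district-level shocks make campuses within a district correlated, the clustered standard error is typically larger than the naive one, and differences that look significant under independence may not survive the adjustment.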
Table 3: The Characteristics of Texas Teachers, 2007-2011

                                    OE Charter Districts     TPS Districts
                                    Mean       Std. Dev.     Mean        Std. Dev.
Standard Campuses
  FTE Monthly Salary                $3,881.017 735.494       $4,836.496  737.545
  Years of Experience               3.643      5.511         10.870      9.389
  No degree                         0.027      0.161         0.006       0.074
  MA                                0.137      0.343         0.218       0.413
  Ph.D.                             0.006      0.079         0.005       0.074
  Certified in:
    Mathematics                     0.026      0.160         0.078       0.267
    Science                         0.027      0.163         0.068       0.252
    Bilingual/ESL                   0.040      0.196         0.097       0.296
    Special Education               0.044      0.204         0.132       0.338
    Any teaching certificate        0.405      0.491         0.872       0.335
  New hire                          0.621      0.485         0.165       0.372
  Teaching Assignment:
    Mathematics                     0.244      0.429         0.209       0.406
    Science                         0.230      0.421         0.186       0.389
    Special Education               0.028      0.165         0.056       0.231
    Health and Physical Education   0.110      0.313         0.117       0.322
    Language arts                   0.275      0.447         0.257       0.437
    Coach                           0.012      0.108         0.082       0.274
  Number of Observations            14,878                   1,220,705

AECs of Choice
  FTE Monthly Salary                $3,669.972 706.094       $4,968.199  762.287
  Years of Experience               4.602      7.160         12.744      10.292
  No degree                         0.033      0.179         0.011       0.102
  MA                                0.144      0.351         0.293       0.455
  Ph.D.                             0.010      0.098         0.014       0.116
  Certified in:
    Mathematics                     0.056      0.230         0.166       0.373
    Science                         0.056      0.231         0.146       0.353
    Bilingual/ESL                   0.027      0.163         0.018       0.132
    Special Education               0.086      0.281         0.158       0.365
    Any teaching certificate        0.413      0.492         0.878       0.327
  New hire                          0.632      0.482         0.157       0.364
  Teaching Assignment:
    Mathematics                     0.204      0.403         0.183       0.387
    Science                        0.188      0.391         0.153       0.360
    Special Education               0.066      0.249         0.045       0.207
    Health and Physical Education   0.114      0.318         0.094       0.292
    Language arts                   0.251      0.434         0.255       0.436
    Coach                           0.000      0.016         0.020       0.140
  Number of Observations            4,036                    7,113
Source: PEIMS.
Table 4: The Hedonic Model of Teacher Salaries, 2007-2011

                                            Baseline            Charter Interaction
                                            Coefficient         Term
OE Charter School                                               -0.154 (0.014)**
Alternative Education Campus                0.006 (0.002)**     -0.047 (0.008)**
Years of Experience (log)                   -0.037 (0.001)**    0.169 (0.018)**
Log Experience, squared                     0.041 (0.001)**     -0.046 (0.006)**
Log Experience, cubed                       -0.028 (0.001)**    0.081 (0.011)**
First-year teacher                          0.002 (0.002)       -0.040 (0.018)*
No Degree                                   0.024 (0.000)**     0.003 (0.007)
MA                                          0.031 (0.003)**     0.025 (0.023)
Ph.D.                                       0.001 (0.001)*      0.019 (0.009)*
Mathematics certified                       -0.001 (0.001)      -0.012 (0.010)
Science certified                           0.002 (0.000)**     0.029 (0.009)**
Bilingual/ESL certified                     0.001 (0.000)**     0.006 (0.008)
Special Education certified                 0.003 (0.000)**     0.013 (0.003)**
Certified teacher                           -0.002 (0.000)**    0.011 (0.004)**
New Hire                                    0.000 (0.000)       -0.007 (0.003)*
Mathematics                                 0.000 (0.000)       0.002 (0.005)
Science                                     0.001 (0.000)*      0.001 (0.005)
Special Education                           0.004 (0.000)**     -0.012 (0.011)
Health and P.E.                             -0.001 (0.000)**    -0.004 (0.005)
Language Arts                               -0.029 (0.001)**    -0.003 (0.004)
Coach                                       -0.029 (0.001)**    0.008 (0.015)
Percent Economically Disadvantaged Students 0.006 (0.001)**
Percent LEP students                        0.006 (0.001)**
Percent Special Education students          0.004 (0.003)
Campus enrollment                           0.004 (0.000)**
Very small district                         -0.100 (0.003)**
Small district                              -0.093 (0.002)**
Midsized district                           -0.051 (0.001)**
Big district                                0.013 (0.001)**
Elementary school                           -0.002 (0.002)
Middle school                               0.004 (0.001)**
Secondary school                            0.008 (0.001)**
Comparable Wage Index                       -0.003 (0.005)
Fair market rent (log)                      0.034 (0.002)**
Unemployment rate                           0.001 (0.000)**
Observations                                1,246,488
Number of teachers                          370,916
R-squared                                   0.72

Note: Robust standard errors in parentheses. The model also includes individual teacher fixed
effects, metropolitan area fixed effects and year fixed effects. The asterisks indicate a coefficient
that is statistically significant at the * 5% or ** 1% level.
Source: Authors' calculations from PEIMS.
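The hedonic salary model regresses log monthly salary on teacher and campus characteristics with individual teacher fixed effects, so the coefficients are identified from within-teacher variation. A minimal sketch of the within (demeaning) estimator on synthetic panel data (names and parameter values are illustrative, not the paper's estimation code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic panel: teachers observed over several years.
n_teachers, n_years = 500, 5
n = n_teachers * n_years
teacher = np.repeat(np.arange(n_teachers), n_years)
charter = (rng.random(n) < 0.1).astype(float)
exper = np.tile(np.arange(1, n_years + 1), n_teachers) \
    + rng.integers(0, 10, n_teachers)[teacher]
alpha_i = rng.normal(0, 0.2, n_teachers)[teacher]  # unobserved teacher effect
log_wage = (8.0 + 0.04 * np.log(exper) - 0.15 * charter
            + alpha_i + rng.normal(0, 0.05, n))

def demean(v, ids):
    """Subtract each teacher's mean: the within transformation that
    sweeps out the teacher fixed effects."""
    means = np.bincount(ids, weights=v) / np.bincount(ids)
    return v - means[ids]

X = np.column_stack([np.log(exper), charter])
Xd = np.column_stack([demean(X[:, j], teacher) for j in range(X.shape[1])])
yd = demean(log_wage, teacher)
beta = np.linalg.lstsq(Xd, yd, rcond=None)[0]
print(f"log-experience effect: {beta[0]:.3f}, charter effect: {beta[1]:.3f}")
```

The recovered coefficients should be close to the true values built into the simulation (0.04 and -0.15), illustrating how within-teacher comparisons separate the charter salary differential from fixed teacher quality.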
Table 5: Descriptive Statistics for Non-Elementary, Non-Residential Public Schools in
Texas, 2006-07 to 2010-11

                                       OE Charter            Traditional Public
                                       School Districts      School Districts
                                       Mean      Std. Dev.   Mean      Std. Dev.
Standard Campuses
  Per-pupil expenditure (log)          8.999     0.268       9.015     0.198
  District enrollment (log)            6.502     1.004       9.408     1.765
  TAKS output                          -0.040    0.289       -0.008    0.171
  Completion rate                      99.066    2.548       99.090    1.548
  Teacher salary index                 0.247     0.057       0.301     0.085
  Campus enrollment (log)              5.673     0.763       6.617     0.808
  Lowest grade served                  2.929     3.304       7.130     1.914
  Highest grade served                 9.821     1.883       9.662     1.946
  Percent economically disadvantaged   0.632     0.279       0.524     0.261
  Percent Limited English proficient   0.093     0.146       0.077     0.094
  Percent special education            0.075     0.055       0.111     0.042
  Number of campuses                   532                   9,470
AECs of Choice
  Per-pupil expenditure (log)          9.039     0.341       9.444     0.371
  District enrollment (log)            6.557     1.236       9.765     1.027
  TAKS output                          -0.350    0.278       -0.420    0.310
  Attendance rate                      92.265    8.485       88.782    8.635
  Teacher salary index                 0.198     0.058       0.304     0.067
  Campus enrollment (log)              5.174     0.560       4.761     0.640
  Lowest grade served                  6.606     3.224       8.428     1.451
  Highest grade served                 11.665    1.095       11.871    0.673
  Percent economically disadvantaged   0.727     0.196       0.585     0.198
  Percent Limited English proficient   0.134     0.219       0.056     0.088
  Percent special education            0.133     0.100       0.087     0.061
  Number of campuses                   358                   201

Note: Juvenile Justice and Disciplinary justice campuses have been excluded, as have all
campuses with fewer than 25 students or fewer than 10 students with test data.
Table 6: Marginal Effects at the Sample Mean

                                      Common Cost           Baseline
                                      Frontier              Model
District Enrollment (log)             0.0102*** (0.0032)    0.0134*** (0.0037)
TAKS output                           0.1100*** (0.0098)    0.0956*** (0.0099)
100 - Dropout Rate                    -0.0006*** (0.0022)   -0.0028*** (0.0022)
Teacher Salary Index                  1.3851*** (0.1199)    1.0909*** (0.1235)
Campus Enrollment                     -0.1696*** (0.0045)   -0.1848*** (0.0054)
Lowest Grade Served                   0.0022*** (0.0021)    -0.0073*** (0.0023)
Highest Grade Served                  0.0429*** (0.0023)    0.0520*** (0.0026)
Percent Economically Disadvantaged    0.1566*** (0.0109)    0.1337*** (0.0120)
Percent LEP                           0.1560*** (0.0374)    0.2390*** (0.0397)
Percent Special Education             0.8176*** (0.0442)    0.6776*** (0.0458)
TPS AEC of Choice                     .                     -0.1406*** (0.1319)
Charter School                        .                     -0.1003*** (0.0387)
Charter AECs of Choice                .                     -0.4746*** (0.0690)
Number of Observations                10,561                10,561

Note: All models also include fixed effects for year and metropolitan area. Marginal effects
estimated at the corresponding sample mean. The asterisks indicate that the hypothesis that a
variable and its interaction terms are jointly zero is rejected at the ** 5-percent or *** 1-percent
levels, respectively.
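In a translog cost function the effect of each (log) variable on log cost depends on where it is evaluated, which is why Table 6 reports marginal effects at the sample mean. A small sketch of that calculation with hypothetical coefficients (not the paper's estimates):

```python
import numpy as np

# Hypothetical translog: ln C = a0 + sum_j a_j x_j + 0.5 * sum_jk b_jk x_j x_k,
# where the x_j are log outputs/inputs. The marginal effect of x_j is
# d ln C / d x_j = a_j + sum_k b_jk x_k, evaluated here at the sample means.
a = np.array([0.10, 1.20, -0.18])        # first-order coefficients (hypothetical)
B = np.array([[0.02, 0.01, -0.01],       # symmetric second-order coefficients
              [0.01, 0.05, 0.00],
              [-0.01, 0.00, 0.03]])
x_mean = np.array([0.5, 9.0, 6.2])       # sample means of the log variables

marginal_effects = a + B @ x_mean
print(marginal_effects)
```

Evaluating at a different point (say, a small campus rather than the mean campus) would change the reported effects, which is the flexibility that motivates the translog form.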
Table 7: Selected Coefficients from the Baseline Model

Accountability Type:              Standard            AEC of Choice       AEC of Choice
District Type:                    OE Charter          OE Charter          TPS District
District Enrollment (log)         0.0122 (0.0186)     -0.157*** (0.0186)  0.134*** (0.0405)
TAKS output                       0.0462 (0.0527)     -0.0345 (0.0700)    0.123 (0.123)
100-Dropout Rate                  -0.00229 (0.00680)  0.00284 (0.00499)   0.00350 (0.00608)
Teacher Salary Index              0.284 (0.273)       0.0680 (0.301)      -0.714 (0.488)
Campus Enrollment                 0.122*** (0.0271)   0.137*** (0.0374)   0.0552 (0.0650)
High School                       0.00161 (0.00481)   -0.000117 (0.00657) -0.0365* (0.0212)
Highest Grade Served              -0.0199*** (0.00729) -0.0330** (0.0141) 0.0180 (0.0292)
Percent Economically
  Disadvantaged                   0.0516 (0.0529)     0.00391 (0.0897)    -0.220 (0.176)
Percent LEP                       0.0435 (0.109)      0.0659 (0.134)      0.654 (0.407)
Percent Special Education         -0.376 (0.294)      1.516*** (0.300)    -2.222*** (0.559)
Observations                      10,561              10,561              10,561
Probability that the above
  coefficients are jointly zero   0.0000              0.0000              0.0000
Probability that the two sets of
  charter coefficients are the same                   0.0001
Probability that the two sets of
  AEC coefficients are the same                       0.0001
Standard errors in parentheses
*** p<0.01, ** p<0.05, * p<0.1
Table 8: Efficiency Estimates for Non-Residential, Non-Elementary Campuses

                                              Mean    Std. Dev.  Minimum  Maximum
Common Cost Function
  Standard Accountability TPS Campuses        0.9046  0.0567     0.5012   0.9915
  Standard Accountability OE Charter Campuses 0.8496  0.0825     0.3833   0.9835
  AECs of Choice, TPS Campuses                0.8092  0.0503     0.6147   0.9491
  AECs of Choice, OE Charter Campuses         0.9270  0.0244     0.8138   0.9810
Baseline Model
  Standard Accountability TPS Campuses        0.9053  0.0545     0.5015   0.9923
  Standard Accountability OE Charter Campuses 0.7715  0.1158     0.3156   0.9820
  AECs of Choice, TPS Campuses                0.8437  0.0425     0.6071   0.9367
  AECs of Choice, OE Charter Campuses         0.8364  0.0638     0.5150   0.9699
Table 9: The Determinants of Campus Inefficiency for Non-Residential, Non-Elementary
Campuses

                                                    Common Cost        Baseline
                                                    Function           Analysis
Time trend                                          0.159*** (0.0243)  0.128*** (0.0223)
Small District (enrollment < 1,600)                 1.433*** (0.0858)  1.158*** (0.0858)
Midsized District (enrollment >= 1,600 and < 5,000) 0.524*** (0.0822)  0.488*** (0.0789)
Charter                                             0.147 (0.132)      1.175*** (0.125)
AEC of Choice                                       1.713*** (0.345)   1.214*** (0.406)
Charter*AEC of Choice                               -3.315 (0)         -1.954*** (0.490)
Constant                                            -4.875*** (0.0849) -4.729*** (0.0787)
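The efficiency scores in Table 8 come from a stochastic cost frontier, in which observed log cost equals frontier cost plus symmetric noise v and a one-sided inefficiency term u >= 0; a campus's cost efficiency is then predicted as E[exp(-u) | eps]. A minimal numerical sketch of one common predictor of this kind (the Battese-Coelli form under a half-normal u, with illustrative parameter values rather than the paper's estimates):

```python
import math

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cost_efficiency(eps, sigma_u, sigma_v):
    """Predictor E[exp(-u) | eps] for a cost frontier with composed error
    eps = v + u, where v ~ N(0, sigma_v^2) and u ~ half-normal(sigma_u)."""
    s2 = sigma_u ** 2 + sigma_v ** 2
    mu_star = eps * sigma_u ** 2 / s2          # conditional mode shifts with eps
    sigma_star = sigma_u * sigma_v / math.sqrt(s2)
    return (math.exp(-mu_star + 0.5 * sigma_star ** 2)
            * Phi(mu_star / sigma_star - sigma_star)
            / Phi(mu_star / sigma_star))

# A campus with a large positive residual (cost well above the frontier)
# receives a lower efficiency score than one at or below the frontier.
print(cost_efficiency(0.4, sigma_u=0.2, sigma_v=0.1))
print(cost_efficiency(-0.1, sigma_u=0.2, sigma_v=0.1))
```

Scores near 1 indicate a campus operating close to its estimated frontier, matching the 0-1 scale of the means and extremes reported in Table 8.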