What`s Decentralization Got To Do With Learning?

What’s Decentralization Got To Do With Learning?
The Case of Nicaragua’s School Autonomy Reform
Elizabeth M. King and Berk Özler
Development Research Group
The World Bank
Please do not quote. For comments only.
Revised April 27, 1998
Abstract. Despite its growing popularity, school-based management is seldom evaluated
systematically with respect to its impact on student performance. This study examines the
impact of the current school autonomy reform in Nicaragua on learning within an
educational production function approach. Results show that autonomous public schools
are indeed making more decisions about pedagogical and administrative matters than do
traditional public schools, but because there is a lag in transforming school decision-making
after a school becomes legally autonomous, autonomy de jure does not appear to have any
impact on student test scores. However, another autonomy variable which measures the
actual level of decision-making by the school is positively associated with student test
scores. In particular, schools that exert greater autonomy with respect to teacher staffing
and the monitoring and evaluation of teachers appear to be more effective in raising student
performance.
*Paper presented at the Annual Meetings of the American Educational Research Association held in San
Diego, CA, April 13-17, 1998.
**This study has been funded by the Development Research Group of the World Bank and its Research
Support Budget (RPO 679-18). The authors wish to thank the other members of the Nicaragua Reform
Evaluation Team, especially Patricia Callejas, Nora Gordon, Adolfo Huete, Liliam Lopez, Reina Lopez,
Nora Mayorga, and Laura Rawlings, and Manuel Vera. The authors also wish to thank Deon Filmer,
Jyotsna Jalan, Peter Lanjouw, Stefano Paternostro, and attendees at our World Bank DECRG seminar for
their insightful comments. The findings, interpretations, and conclusions are the authors’ own and should
not be attributed to the World Bank, its Board of Directors or any of its member countries. Comments are
welcome and should be sent directly to the authors: E-mail addresses: [email protected];
[email protected].
What’s Decentralization Got To Do With Learning?
The Case of Nicaragua’s School Autonomy Reform
Elizabeth M. King and Berk Özler
The World Bank
1. Introduction
One type of education reform has been gaining support in developing countries in
the past decade. It is transforming the way public schools operate, making them more
directly accountable to students, parents, and communities. This reform is known by
several names -- school-based management, school autonomy reform, school improvement
programs -- but is really different forms of administrative decentralization. The argument
for the reform goes as follows: actors who have the most to gain or lose and who have the
best information about what actually goes on in schools are best able to make appropriate
decisions about how schools should use ever more scarce resources and how students
should be taught. Following this argument, countries have shifted responsibility and power
to communities, school actors (principals and teachers), parents, and even students.
Despite its growing popularity, school-based management is seldom evaluated
systematically with respect to its impact on student performance in developing countries, or
even in the United States (Summers and Johnson, 1994; Hanushek, 1994). Whether or not
changes in school management produce better learning outcomes for students is the
principal question we address in this paper. We examine this with respect to a current
education reform in Nicaragua. The reform gives public schools greater autonomy by
shifting responsibility for key areas of decision-making from the Ministry of Education
directly to the schools themselves. Our results show that autonomous public schools which
1
are expected to be managed more like private schools than traditional public schools are
indeed making more decisions about pedagogical and administrative matters. However, the
degree to which the autonomous schools are really autonomous, as measured by their
decisionmaking, varies widely among them -- in part a result of the usual lag in
implementation of reforms, and in part a result of local school capacity. The results also
indicate that it is realized or de facto autonomy, rather than simply whether or not a public
school has signed up for the reform, that is positively and significantly associated with
student performance.
The next section revisits the literature on education production functions and
discusses why decentralized schools might affect student performance. Section 3 gives a
brief overview of the reform in Nicaragua. Section 4 describes the data sources and the
empirical approach used in this study. Section 5 presents and discusses the results. Section
6 summarizes the conclusions.
2. Why Decentralize? Revisiting the Education Production Function
There is no developed theory in economics of why and how school governance
would affect student performance. However, if one were to consider, even momentarily,
that schools are like other enterprises and teachers and school directors like other workers,
then the economic literature does offer relevant models (de Groot, 1988; Stiglitz, 1988).
Economic models of decision-making in organizations emphasize the costs associated with
collecting and exchanging information and the costs associated with coordinating various
functions or parts of the organization (monitoring and transaction costs), the costs
associated with divergence in the goals of the organizations and those of the employees
2
involved in decision-making (agency costs), and the difficulties involved in measuring
outcomes resulting from decisions made (moral hazards). In education, the principal
argument for administrative decentralization is that the actors who are closer to the
classroom -- school principals, teachers, parents, and students -- have better information
than the officials of the central government or even subnational governments, and thus
better able to make the best decisions for improving school operations and consequently,
learning. It has been argued that the distance between government officials and school
actors is just too great to enable speedy and informed decisions. It has also been argued that
closer parent-school partnerships through decentralization can improve both the school and
home environment with respect to learning. These partnerships can elicit commitment to
self-made decisions and greater accountability on the part of teachers and the school
principal.
In delegating responsibility and power to the school, governments are faced with the
challenge of designing contracts and incentive systems that will minimize diverging
interests between principals and agents and that will ensure that central mandates are
achieved, or that the local agents behave as closely as possible along the goals of the
principals (de Groot, 1988; Hannaway 1993). Governments are also faced with the choice
of to whom to devolve responsibility and decision-making authority. Several alternatives
exist -- decentralization to subnational administrative units; school-based management in
which some degree of control is transferred to principals and teachers in a school; and
increased parental and community influence in schools by way of electing parent and citizen
representatives to school councils.
3
Another issue in decentralization reforms has to do with which of the many
functions in the system to decentralize. There are no simple rules to follow. Some argue
that there is no such thing as a decentralized educational system because almost all
decisions (e.g., finance, personnel, curriculum) retain degrees of centralization and
decentralization (Hanson 1995). The issue then becomes one of finding the appropriate
balance, given the system objectives. For example, when the goal is clearly that of
improving student performance, which decision needs to be more than less decentralized?
One expected outcome of administrative decentralization as exemplified by schoolbased management reforms is better classroom instruction and better student performance.
To achieve these, the reform has to affect either one, or both, of two things: the quality and
quantity of educational inputs, and the efficiency with which these inputs are used. These
are the elements of an education production function, or the relationship between school
outcomes for a student and the measurable education inputs in the school and the home.
The hypotheses underlying this function are that more school and family inputs into the
education process produce more learning -- that is, more highly educated and more
experienced teachers, smaller classes, more books, better facilities, and more educated
parents should lead to higher student achievement.
A review of hundreds of production function studies by Hanushek (1995) does not
reveal a strong or systematic relationship between observable school inputs and student
performance. He concludes that “the results of studies in developing countries do not make
a compelling case for specific input policies. They do, however, indicate that direct school
resources might be important in developing countries” (1995, p. 281). Levin (1995) makes
a similar observation: “Although the educational production approach continues to be
4
pursued, results since the 1960s have provided little consistency in findings. Family inputs
are always important statistically in explaining student achievement, but there is wide
variability from study to study in terms of which teacher and other school inputs are related
to achievement.”
One explanation offered by Levin for these findings is that schools are a more
complex production organization than most firms, with its output and some inputs not
easily measured. For example, while teachers’ education and years of experience can be
directly observed, it is difficult to monitor the quality of teachers’ work behind closed doors.
In fact, “school policies that attempt to control teacher activity are important mediating
devices in transforming teacher inputs into specific educational outcomes, but these are
almost never considered in educational production functions” (Levin, 1995, p. 285).1
Herein lies a principal argument by proponents of school-based management or
decentralized schools -- that besides inputs, how motivated teachers are or how well schools
are managed and inputs allocated are important determinants of the learning process.2
These factors pertain to the efficiency of use of observable inputs.
The role of having observable measures of school management has been
examined by Lockheed and Zhao (1993) and Glewwe et al. (1995). They estimated the
effect of variables such as the relative influence of the central authority compared with
1
2
In a fuller discussion of this point, Levin (1980) wrote: “The traditional educational production function
utilizes only measures of teacher capacity in specifying teacher inputs. That is, it is assumed that
capacity will be automatically transformed into the effort levels and time allocations that are consistent
with the agenda of the school. On the contrary, success in converting the capacity of teachers (or labor
power) into teacher effort and time allocation (or labor) will depend on how the school is organized to
make this transformation” (p. 215).
Hoxby offers increased teacher unionization as another explanation of why measured school inputs appear
to have little effect on student achievement in the United States, especially for the cohorts educated
after 1960 (199x).
5
the school principal’s on the school’s organization, the principal’s and teachers’ influence
on the curriculum amd selection of students, and of community involvement variables. In
the Philippines, Lockheed and Zhao found the decentralization to be an “empty
opportunity”: local public schools created by the decentralization did not, in fact, exert
local control and had fewer resources to work with than the traditional government
schools. They found no positive association between the extent of school decisionmaking and student learning, but this finding could have been a result of the stratification
of the sample by type of school. In Jamaica, Glewwe et al. found the variables on
teachers and school management to be only weakly significant. We are unaware of other
similar estimates of the education production function that include the effect of school
management. The current study contributes to this gap; further, it examines the effect of
school management within the context of an ongoing reform.
6
3. Nicaragua’s Education Reform
This section provides a brief overview of Nicaragua’s school autonomy reform. In
1991, the new coalition government of Nicaragua established councils in all public schools
to ensure the participation of the school community and parents in making school decisions
(Ministry of Education, 1993). These school councils are composed of the school
principal, teachers, parents and students, with the number of representatives varying
according to the size of the school. Except for the student members, each council member
has an equal vote in making decisions.
The reform was expanded in 1993, first through a pilot program which transformed
the school councils (Consejos Consultivos) of 20 public secondary schools into school
management boards (Consejos Directivos), thus creating “autonomous” public schools.
This program transferred key management tasks from central authorities to the directive
councils. In 1994, 33 more secondary schools signed the requisite contract with the
Ministry and became autonomous, and by the end of 1995, participation had increased to
well over 100 secondary schools.
The reform was extended to primary schools in 1995. It then took on two forms:
one for urban schools which is similar to the secondary school model, and another for rural
schools. For rural schools a new model of autonomy was introduced: the Nucleos
Educativos Rurales Autónomos (NER) is a group of schools, formed around one center
school, that acts as one autonomous school with a shared council. Its directive council is
based in the center school which is usually the largest in the group and the only school to
have a director. As of December 1995, there were over 200 single autonomous primary
schools and 42 NERs consisting of two to four schools each.
7
The public schools that have become autonomous are legally vested with many of
the features of private schools. Table 1 compares public, traditional and autonomous, and
private schools with respect to who has the responsibility for various functions, the Ministry
of Education, subnational governments, or the school. For example, the councils in
autonomous public schools and private schools have the ability to hire and fire the school
director and are involved generally in maintaining their school’s physical and academic
quality, whereas the councils in traditional public schools rely more on the government.
However, the Ministry of Education retains responsibility for structuring the education
system, establishing norms for staff promotions and teacher certification, and setting the
curriculum in all schools. The essence of the transformation of Nicaragua’s basic education
system can be seen in the differences in councils’ responsibilities across schools.3
[Table 1 about here]
3
Autonomous schools vary from traditional schools on several counts regarding pedagogy. First, in
formulating the annual pedagogical plan, directive councils can make changes in the curriculum from
year to year. Additionally, directive councils can choose their own textbooks and set their own
norms for evaluating students. Consultative councils can make these changes only with the approval
of the Ministry. There are also important differences in administrative responsibilities. Directive
councils in autonomous schools can hire and fire the school principal and teachers. In traditional
public schools, the school principal is selected by the Ministry, and the Ministry also has to approve
the principal’s selection of teachers and administrative personnel. Consultative councils in traditional
schools are vested with none of these rights. All councils are responsible for setting and administering
the school budget, setting voluntary fees, and informing the community about the state of the school's
finances. Autonomous schools can set the level of monthly fees paid by students. In practice, a
council’s financial authority depends upon the school being able to generate local resources since the
base salaries for teaching staff and the regular fee schedule for goods and services provided by the
school are set by the Ministry of Education. Funding for autonomous schools is a combination of
monthly lump-sum transfers from the central government and locally generated resources collected
from student fees, community contributions and school activities. The lump-sum transfers are
expected to cover base salaries and expenditures associated with routine maintenance of the school.
All secondary schools are also encouraged to collect a fee of ten córdobas per month (equivalent to
US$1.22) from each student, which may be retained in the school while traditional public schools
must return one-half to the central government. The constitution which guarantees primary education
prevents primary schools from charging fees. However, it is customary for primary schools to collect
a “voluntary” fee of five córdobas per month per student.
8
4. Data Sources and Empirical Model
The evaluation of Nicaragua’s reform has been underway since 1995 and is still
going on. It is being undertaken by a team from the Ministry of Education and the World
Bank’s Development Research Group, including the authors. The evaluation strategy
chosen by the team is a matched comparison design which is based on selecting a sample of
treatment or program schools (the autonomous schools) and comparison groups among nonautonomous public schools and private schools to provide the counterfactual to the program
schools. This evaluation approach was selected because of the way in which the reform has
been implemented: in the initial phase of the reform, the government conducted a
promotion campaign with a focus on large, urban secondary schools; some schools were
virtually hand-picked, while others were persuaded to volunteer. This mode of inscription
into the program ruled out an experimental evaluation strategy. Moreover, the reform began
without baseline data. By the time the evaluation began, almost a hundred secondary
schools had already signed the autonomy contract, with nearly half of them having been
autonomous for at least one year. The first data collection in 1995 provided a baseline for
the non-autonomous public schools for future data collection, but for the early reformers,
the best that could be hoped for was a comparison between traditional (non-reform) schools
and autonomous schools of different program duration. Bringing the time dimension into
the analysis was expected to measure the impact of the reform over time, especially the lag
in realizing the effects of the reform.
Data Sources
9
Three components of the evaluation has been completed so far -- a panel of two
matched school-household surveys conducted in November-December 1995 and AprilAugust 1997, and student achievement tests in November 1996. Each matched schoolhousehold survey collected information on a wide array of variables, including school
enrollment, levels of student grade repetition and dropout; schools’ physical and human
resources; and characteristics of the school principal, teachers, students and their families.
Different questionnaires were applied to school directors, teachers, and council members to
obtain school- and individual-level information. A special module was developed to
inquire about school decision-making: who the primary decision-maker was in aspects such
as budget allocations, hiring and firing of school personnel, pedagogical methods, and the
choice and distribution of textbooks; and how the respondents feel about their influence in
how each of these decisions are made (Appendix A). A sample of students was also
randomly selected from each school and followed to their homes in order to obtain
information on their families’ socioeconomic status and parents’ participation in school
affairs. These same students, with exceptions (to be discussed later), were given
achievement tests in mathematics and language in December 1996.
The school sample was selected according to the chosen evaluation strategy; thus, it
is not nationally representative. The first school-household survey covered 116 schools at
the secondary level, 73 of which were autonomous public schools and 43 were traditional
public or private schools. At the primary level, the sample included 80 autonomous schools
and 46 traditional schools. In all, the survey interviewed about 400 teachers, 182 council
members, and about 3,000 students and their parents. To the extent possible, the
respondents from the council were selected such that they were not the same teachers or
10
parents who answered the teacher or parent questionnaire. In very small schools, especially
at the primary level, this was not always possible.
For the most part, the unit of analysis in this paper is the student. A random
sample of 10-15 students in the third grade of primary school or in the second year of
secondary school was surveyed in the relevant sample schools in 1995. After taking
account of missing data due primarily to non-matching of students and parents, the
student sample numbers 1,484 and 1,430 at the primary and secondary levels,
respectively. For a variety of reasons, summarized in Table 2, not all these students were
given the achievement tests at the end of the 1996 school year: Some students had
dropped out of school by then, or had repeated the previous grade and were thus ineligible
to take the grade four (primary) or third year (secondary) tests.4 Some students had
transferred to another school. Some of these were followed in cases where students
transferred to a school within the sample of schools, but those who moved to other
schools were lost to the sample. Yet still other students simply did not appear for the test.
Finally, there appears to have been unexplained discrepancies between the student lists
presented by schools during the school-household survey and the tests.
[Table 2 about here]
All these reasons diminished the original sample of students. Because of the
intention of the evaluation team to continue the study over several years, the student
11
sample of replacement students from the same classes as the original sample. Ultimately,
we are able to work with a sample of 1,691 students in 92 schools at the primary level,
and 1,885 students in 95 secondary schools. Of these, 1,146 and 1,253 students have test
scores at the primary and secondary levels, respectively.
The usual concern about sample attrition is that it may not be random such that
the observable characteristics of students are not independent of the disturbance term in
the education production function. This is certainly the case with respect to students who
repeated the grade, and is also true for a fraction of students who dropped out to the
extent that these students also would have repeated the grade had they continued in
school. Even the students who did not repeat but transferred to a different school are
suspect. Apparently, students prefer to transfer schools to avoid the ridicule of their
peers. We test the hypothesis that replacing students who were promoted but were absent
from the test with randomly selected students from the pool of students already promoted
does not change the results of a probit model of the probability of grade promotion and
continuation. Both a log-likelihood test and a joint equality of means test for the
coefficients of the probit equation confirm that we cannot reject the null hypothesis.5
Therefore, we substitute the students who were absent from the test but were promoted
with the replacement students.
We tackle the sample attrition problem by explicitly estimating the probability
that a student will pass a grade and continue in school. This probability is estimated
using the 1995 original sample of students who were present at testing, the replacement
5
Likelihood ratio tests give Chi-square values of 1.02 and 1.08 for primary and secondary schools,
respectively. With 30 restrictions, we cannot reject the hypothesis that the parameter estimates are
equal for each sample.
12
students, and the students who were repeating their respective grade or had dropped out.
The estimated parameters are then used to calculate a predicted probability of being
promoted a grade between 1995 and 1996, which is included in the achievement test
equations as a selectivity bias factor. This model is well known in the labor economics
literature for its application in wage functions (Heckman 1976b). This predicted
probability is estimated not only for the original sample of students for whom the test
scores are available, but also for the replacement students who were given the tests in
1996. Since the replacement students were randomly drawn from the same classes as the
original sample, they are assigned the same school characteristics and mean family
characteristics in their respective schools. An alternative approach to deal with this issue
would be to impute their family characteristics in 1995 using student-specific data from
the follow-up school-household survey in 1997. Unfortunately, these newer data have
only very recently become available; we plan to incorporate them in the next version of
this paper. The econometric model is discussed further in the next section.
It is more difficult to deal with sample attrition due to no-show during the test and
due to errors in student lists. We assume the latter to be at least independent of the test
results and thus does not produce biased coefficients. We have also thus far assumed the
same for no-shows since the absence rate from the exam is not unduly high.
Empirical Model
Put simply, we estimate an expanded education production function, which can be
written as follows:
Qij = Q(Z1ij, Mj),
13
where Z1 is a composite variable consisting of the vectors of school inputs, household and
student characteristics, and Mj indicates the management regime in school j. The “technical
efficiency” factor thus enters the production function additively like other inputs; an
alternative formulation that will be tried in the future is a multiplicative specification.
Assuming a linear functional form and discarding the subscripts for student and
school for simplicity, the educational production regression is:
Q = Z1β1 + Mδ + u1
where u1 is a stochastic error term and Z1 is assumed to be exogenous and therefore
orthogonal to u1. M, however, is unlikely to be exogenous in that many of the school
characteristics that determine student performance are also probably related to the choice of
management structure of the school. In the Nicaragua case, although the reform is meant to
be system-wide, thus far, participation is not yet universal. Schools have been allowed to
phase in gradually, and participation measured at any point in time since 1993 has depended
on both selection by the central government and the willingness of individual schools to be
persuaded to join.6 Since participation is at least partially demand-driven, we cannot
assume that M is exogenous and must estimate the following structural model:
Q = Z1β1 + Mδ + Aγ + u1
(1)
A = Z2β2 + u2
(2)
Equation (1) is an expanded production function that allows for the non-random selection of
schools into the reform, while equation (2) represents the underlying selection process into
6
One could also argue that the vector of school characteristics that pertains to a student is not exogenous
because of school choice. We ignore this issue in this paper. Although school choice may exist in the
larger urban areas of Nicaragua (and other countries), it is quite limited in most communities because
of the constraints on school supply and means of transportation.
14
the reform by the central government. Participation in the reform by any school depends on
two forces which can be viewed as occurring in seriatim -- first, the central government
makes an offer to a school to join the reform, an invitation which depends on the likelihood
that the school will benefit from the reform and thus provide a good demonstration case for
other schools; second, given the offer, the willingness of the school and/or community to
participate, which presumably depends on its own perception of the net benefits of
participation. The government offer is assumed to be a function of Z2 , which is a vector of
a subset of variables belonging to Z1. The community and/or school willingness to
participate depends on also a subset of Z1 and on a vector of community-level variables, G.
After the necessary substitutions, the model can be written as follows:
Q = Z1β1 + Mδ + Z2β2γ + v1, where v1 = u1 + γu2
(3)
and
M = 1 if A* > 0
M = 0 otherwise
(4)
A* = Z2β2 + Gθ + v2.
(5)
given
This model can be estimated by a two-stage estimation procedure or a maximumlikelihood method.7 Note that v1 and v2 do not need to be independent, nor does Z2 have to
include a variable that is not in Z1.8 After getting consistent estimates of β2 and θ by using
the probit maximum-likelihood method for equation (5), the parameters β1, β2γ, and δ can
7
See Limited-Dependent and Qualitative Variables in Econometrics by G.S. Maddala for a detailed
description of these two estimation procedures.
15
be estimated and the resulting estimates can be shown to be consistent under some general
conditions.9 The estimates are not efficient, however. The standard errors need to be
corrected for the two-stage estimation procedure and we will make the necessary corrections
in the next draft of this paper. Since the primary focus of the paper is whether or not the
autonomy reform has had any affect on student achievement, whether or not δ is significant
is its principal empirical question.
So far, we have been defining the education reform variable M as a dummy variable
representing de jure autonomy, that is, whether a public school has officially signed a
contract with the Ministry of Education transforming its school council into a Consejo
Directivo. This dummy variable, M1, is equal to one if the school is autonomous, and zero
otherwise. We also examine a second measure, M2, which measures de facto autonomy.
This is a discrete variable indicating the percentage of key decisions made by the school
council rather than by the central government or the local government. This variable is
derived from a special questionnaire given to school principals and a random sample of
council members and teachers in each sample school on the locus of decision-making for 25
school decision areas (Appendix A).10
De facto autonomy is an important alternative measure for the reform for two
reasons: although there are, in principle, clear differences in the management of the various
8
9
Equation (3) can still be distinguished from linear combinations of (3) and (5), because (5) contains A*
and (3) contains M.
Since Z2 is a subset of Z1, some of the coefficients will reflect both the direct effects of factors on student
performance and their indirect effect through their influence on the probablity that a school will have
participated in the autonomy reform. That is, the regression function is Q = Z*1β1 + Mδ + Z*2(β1+β2γ)
+ v1, where Z1 = Z*1+ Z*2.
10
These decision areas are similar to the list used in OECD (19xx).
16
types of schools in the past and the current regimes, as shown in Table 1, there is, in fact, a
distribution by type of school with respect to how many and which decisions are being
taken in the school than at some other organizational level. Table 3 shows the expected
pattern that private schools take more decisions than do autonomous public schools, and
autonomous schools take more decisions than do traditional public schools; however, there
is some variation around this pattern. Since, in practice, not all traditional public schools
are the same with respect to decision-making, the distance that a school has to go in order to
achieve autonomy will differ from school to school. The dummy variable on de jure
autonomy cannot reflect this diversity. One reason for the diversity is the failure of the
central government to enforce legal arrangements such that traditional public schools are
taking more decisions than they ought to. This corroborates the view that there is room for
a great deal of autonomy even within a centralized system.
[Table 3 about here]
The second reason for why the de facto autonomy measure might make more sense
is that inscription into the reform does not necessarily mean immediate implementation. In
fact, because the reform involves organizational and administrative changes, we expect
some participating schools to take more time than others in achieving autonomy. The
model is estimated using each measure separately, as well as in interaction with one
another. Just as we explicitly address the issue that M1 may be endogenous, we do likewise
for M2. Note that we find that M1 and M2 are not only not perfectly correlated but their
correlation coefficient is low -- 0.35 at the primary level and 0.3 at the secondary level (with
statistical significance of 1 percent). This finding itself justifies the examination of the
effect of the two school autonomy measures.
17
Two final comments are in order before turning to the results. First, the measure of
education output that is used here is test scores on two student achievement exams at a
particular grade or level, one on mathematics and another on Spanish. There are other
measures of student performance such as completion of the education cycle, performance
on a school-leaving test, or even wages in employment after leaving school. These are
outcomes that may eventually be observed. Secondly, as mentioned earlier, achievement
test scores are so far available only at one point in time. The results from a second round of
tests are not yet available. A benefit from a second round of tests is to estimate a valueadded education production function (Ehrenberg and Brewer, 1994; Goldhaber and Brewer
1997; Hanushek 1986, 1995). This empirical approach would help in controlling for
differences in achievement test scores between autonomous schools and their comparators
that are not due to the reform.
5. Results
We begin the discussion of our results with the simplest specification of the
education production function and then proceed to compare this with other models. We
present results for primary education alongside those for secondary education for
comparison. We remind the reader that the reform has been applied more recently to
primary schools than to secondary schools, such that differences in results may be due more
to the timing of participation in the reform than to differences in its true impact.
Basic student achievement functions
18
The simplest specifications, presented in Tables 4 and 5, are OLS results of a linear
achievement function without considering the selection of students (that is, without
accounting for their drop-out probabilities), and without addressing the endogeneity of
reform participation. Columns (1) and (2) of Tables 4 and 5 present the models with only
student and family characteristics. The coefficients of students’ age and sex are significant,
with older and female students achieving less than younger and male students; the male
advantage, however, appears only in math test scores, not in language test scores. For
primary schoolers, household wealth has a positive and significant effect on student
performance, but only on math tests; these variables do not appear to affect the performance
of secondary schoolers. Mother’s education has a positive effect, but this is significant at
the 10 percent level only for primary students in the math test. The sign on father’s
education is negative on math scores, and positive on language scores; both sets of results
are not significant. A variable that turns out to have a large positive and significant effect is
the student’s access to textbooks, with the only exception to this being the result for
language test scores at the primary level. The regional variables also have significant
coefficients but the patterns obtained from the primary and secondary level estimates are not
consistent across the regressions.
[Tables 4 and 5 about here]
Following the literature, we reestimate the above regressions (minus the regional
variables) with school fixed-effects. These specifications substantially raise the power of
the models to explain student achievement (columns (3) and (4) of Tables 4 and 5). Note
too that the results on some of the student and family level variables have changed. The
most notable change pertains to the access to textbook variable. When school variables are
19
included, individual access to textbooks does not appear to be important at the secondary
level and appears to have a significant negative effect at the primary level. The latter result
is a puzzle to us thus far, and may simply indicate non-robust results for this variable. In the
following set of results, the variable loses statistical significance at the primary level.
Columns (5) and (6) of Tables 4 and 5 show the results for expanded forms of the
traditional function. These specifications include far more information about teacher and
school principal characteristics and variables representing the school environment than is
usually the case. Substituting specific school variables in lieu of the school fixed-effects
decreases the predictive power of the achievement function; but although isolated
statistically significant results may be reflecting random factors, these specifications suggest
that specific school and teacher characteristics (holding constant the socioeconomic
background of students) may indeed influence student performance. For example, teachers’
education has a positive and significant effect on math test scores. Teachers’ number of
years of teaching also has a positive effect, though imprecisely estimated in about half the
regressions.11 The index variable representing school facilities has a positive coefficient in
all the regressions, albeit with different degrees of statistical significance, implying that
improvements in school infrastructure influence student achievement. Another index
variable, that which characterizes the disciplinary environment of the school, suggests that
schools with more problems regarding the behavior of students and teachers have lower
performing students.
11
In the regressions for secondary education, we included the field of education of teachers. Hence, for
math test scores, it matter significantly whether or not teachers have a degree in mathematics. The
same cannot be said for a degree in Spanish and the language test scores.
20
Determinants of the probability of grade promotion
This probability is defined as being equal to one if a student has been promoted to
grade four for primary students or the third year for secondary students and continues in
school. Otherwise, the variable is equal to zero for students who are promoted to the next
grade but decide to drop out and for students who repeat the grade but continue in school.
Students who transfer schools are classified similarly, provided that information on their
enrollment status is available. Of students sampled in December 1995 and could be found
in 1997 for the second school-household survey, 2.4 percent of students in grade three and
4.9 percent of students in the second year of secondary school dropped out the following
year. These dropout figures may be misleading, because the students who could not be
found in the second phase are more likely to have dropped out. The corresponding
percentages for grade repetition were 10.5 and 9.2 percent. About 10 percent of students
transferred to another school at the primary level, of which 11.3 percent repeated the grade.
The corresponding percentages are 10.9 and 22.3 at the secondary level. According to Belli
and Ayadi (1998), the drop out rate in grade three in Nicaragua is 11 percent and the grade
repetition rate is 12 percent. Hence, while our sample’s repetition rate is quite similar, our
dropout rate is considerably lower, most likely because of unexplained sample attrition.
The variables that have been included in the probit estimates of the probability of
promotion and continuation reflect demand and supply factors as measured by family and
community background and school and teacher characteristics (Table 6). These variables
do not explain the probability satisfactorily, and only a few are significant. The parental
variables do not appear to be significant factors, in contrast to the general findings of other
studies on the determinants of schooling (for example, see Behrman and Wolfe 1987 on
21
pre-revolutionary Nicaragua). Older students are more likely to dropout or repeat a grade.
At the primary level, students in schools that enforce payment of fees through penalties tend
to have a higher probability of dropout or grade repetition; this variable appears to have the
opposite effect at the secondary level. There is no clear pattern in the effect of teachers’
characteristics: teachers’ education has a negative effect but this is significant only at the
secondary level. Teachers’ years of experience has a statistically significant positive effect
at the primary level, but a small, insignificant negative effect at the secondary level. The
regional variables consistently indicate that the probability of promotion and continuation in
school is lower in areas outside Managua, though some results are not statistically
significant.
[Table 6 about here]
When these probit results are used to correct for selection bias in the OLS models
discussed in the previous section, the results do not change qualitatively overall but there
are a few notable changes regarding the statistical significance of some key policy variables,
among them being students’ access to textbooks, school facilities, and teachers’ education
and experience (Table 7). However, these gains are realized mainly at the primary level.
The selection bias is significant in all cases except for primary school students in the
language test. This justifies the correction with a selection equation. At the secondary
level, the selection bias factor is negative and statistically significant. This means that
students who have dropped out or repeated would have, as expected, performed worse in
the math and language tests. At the primary level, the sign of the selection factor for the
math equation is positive and significant, but the very large value of rho causes us to
22
speculate that the underlying probit function may not be properly specified and deserves a
further look.
[Table 7 about here]
Models with de jure autonomy
We now turn to the education function specifications that include the school
management variables. The estimated underlying relationship describing a school’s
selection into the reform is given in columns 1 and 2 of Table 8. As discussed above, this
relationship depends on community-level variables and school characteristics. The results
show these variables to have quite different effects on participation in the reform by primary
schools and secondary schools, and a clear picture of which public schools are more likely
to have joined the reform is elusive. For example, primary schools that are located in richer
districts (as measured by average per capita monthly household expenditure) are more likely
to be participating in the reform, while urban location per se appears to have a negative
(though insignificant) impact. The opposite is true for secondary schools. Participation is
more likely for schools in urban areas (as was the intent of the central government); given
this, schools in richer communities are not more likely to have joined. Moreover, whereas
primary schools that are located in districts which have a larger proportion of its children
aged 13-19 in school are more likely to be participating in the reform, secondary schools in
similar areas are less likely to do so. These differences between primary and secondary
schools are understandable; indeed, the government’s selection criteria played a greater role
in the participation of secondary schools, whereas demand-driven participation is more
23
likely the dominant force at the primary level due to the phased implementation of the
reform.
[Table 8 about here]
Three school-level variables that appear to have relatively consistent effects on
participation for primary and secondary schools are the average number of students per
section, school infrastructure, and the quality of the school environment as measured by an
index of behavioral problems pertaining to students and teachers. These variables could be
indicating the avenues for potential gains from the reform. Schools with larger average
class size are more likely to be in the reform. At least in the case of secondary schools,
enrollment size was a criterion used by the government in its selection of schools to invite
into the reform. Public schools with better physical infrastructure are less likely to be
participating in the reform, although this is significant (at the 10 percent level) only for
primary schools. Lastly, secondary schools that have more behavioral problems tend more
to be participating in the reform. This is not a significant factor for primary schools.
As a second-stage in the estimation, a cumulative probability of participation is
derived from the above probit estimates and is included in the student achievement
functions. Note that the achievement function also includes the correction for the selection
bias due to dropout and grade repetition. Table 9 presents the results for the school-level
variables; school principal and family background variables have been omitted to simplify
the tables since their coefficients are not altered by inclusion of the de jure autonomy
variable.
[Table 9 about here]
24
For primary students, de jure autonomy has a positive but insignificant effect on
performance in the math or language tests. None of the results on school level variables
appear to have been affected by the inclusion of this management variable. For secondary
25
In the next set of estimates, we include both autonomy measures in the student
achievement function. There are no changes in the direction or the significance of the
public-private and the de jure autonomy variables. The results for de facto autonomy also
remain similar to the estimates in the previous model with the exception of the equation
for secondary school math scores. In this specification, all the coefficients of the de facto
autonomy variable are positive, although not all are statistically significant.
[Table 12 about here]
Next, we disaggregate the de facto autonomy variable into two types of decisionmaking areas. Instead of the percentage of all decisions made by the school, we examine
only those decisions that are related to teachers and instruction -- staffing as well as
pedagogical issues. The former includes such decisions as hiring and firing, evaluation,
supervision, training, and relations with the teachers union. The latter pertains to
decisions such as class size, curriculum, textbooks, educational plans and programs, and
the school hours and calendar.
We find that the variable on pedagogy gives mixed results. It is negative for the
primary level and positive for the secondary level, with one subject in each level being
statistically significant. More interesting, however, are the results for the variable on
teacher-related issues. The effect of this variable is positive for both levels and both
subjects, and is statistically significant throughout, except for math scores at the
secondary level. Recalling Levin (1995), these results suggest that the schools that are
more active in tracking and monitoring teacher activity and in controlling staffing issues
are likely to be more successful in increasing student achievement.
[Table 13 about here]
26
Next, we turn our attention to the relationship between student achievement and
the degree of influence the teachers feel in decisions made. We find that in secondary
schools where teachers feel more influential in school decision-making, the test scores in
both math and language are significantly higher. When the explanatory variable for
influence is restricted only to areas regarding pedagogical decisions, we still find a
positive and significant impact on secondary school language scores. One interpretation
of these findings echoes the research findings about the Chicago school reform:12 that
“teachers who are more involved in school governance efforts are more likely to report
changes in their classroom practices”.
[Table 14 about here]
Finally, we assess the relative magnitude of the autonomy effect to establish its
policy implication. We compare the effect size of de facto autonomy with school inputs
that are generally of policy interest, namely, the availability of textbooks, teacher’s years
of education, and class size. We simulate the change in test scores as a result of an
increase of one standard deviation in any of these variables, holding all the other inputs
constant at their current values.
At the primary level, were each school to increase its real decison-making power
by one standard deviation (i.e. approximately by 20 percent more decisions) the average
math score would increase by 6.7 percent (Table 15). This effect size is twice as large as
that of textbooks books (3.3 percent), 1.5 times that of teacher’s education (4.5 percent),
and 1.4 times that of a one-standard-deviation reduction in class size (-4.8 percent). The
effect size for language scores of de facto autonomy at the primary level is much smaller
12
Consortium on Chicago School Research, 1991.
27
than for the math score (and is not statistically significant), and is smaller than those of
other inputs, except textbooks. At the secondary level, the effect size of school autonomy
is large for language scores (insignificant for math scores) and is larger than for textbooks
and teacher’s education.
[Table 15 about here]
6. Conclusions
Many governments in developing countries have been quick to adopt
decentralization in its various forms without a firm knowledge of how and under what
conditions changes in the allocation of responsibility and power among the central
authority, subnational governments, and the school can affect education outcomes.
Despite its growing popularity around the world, the impact of these reforms on learning
and student performance is seldom evaluated systematically. This study has aimed to fill in
some of this gap with respect to one form of decentralization, school-based management,
through an impact evaluation of the current school autonomy reform in Nicaragua.
Although its link to student achievement would seem to be a natural focus of past studies,
most of the existing literature has focused instead on the effect of the reform on education
administration, not on instruction and learning (Summers and Johnson 1994).
Using data on autonomous public schools and on comparable traditional public
schools and private schools (obtained through a matched comparison evaluation strategy),
this study finds first that autonomous public schools in Nicaragua are indeed making more
decisions about pedagogical and personnel matters than traditional public schools, although
less than do private schools. However, within each of these groups of schools there is
diversity in the observed levels of autonomy, in which autonomy is measured by the
28
proportion of a set of twenty-five decision areas that is made by the school and not by the
central or subnational government. Part of this diversity is perhaps due to the inability of
the central authority to enforce legal arrangements, part due to problems of perceptions by
the school community, and part due to a lag in transforming school decision-making after a
school becomes legally autonomous.
Because of this diversity, our education production function estimates show that de
jure autonomy, as reflected by a dummy variable, does not appear to have any statistically
significant impact on student achievement test scores. However, a second measure of
school autonomy, the proportion of decisions made by the school or de facto autonomy, is
positively and significantly associated with student performance. In particular, focusing on
decisions related to teacher staffing issues and the monitoring of teacher activities illustrates
the pathways through which greater school autonomy positively affects student
achievement. It may be too early to judge the true impact of the Nicaragua school
autonomy reform on learnng and student performance; our results suggest that there is cause
for optimism.
Finally, with respect to future research, the inclusion of school management
variables in the standard education production function does not change the effect of the
school and home inputs (and may not even add to the explanatory power of the function as
the whole), but the results strongly indicate that assessing the role of school organization
and management variables in predicting student performance deserves greater attention.
29
30
Hanushek, Erik A. and others. 1994. Making Schools Work: Improving Performance
and Controlling Costs. Brookings Institution, Washington, D.C.
Hanushek, Erik A. 1995. “Interpreting Recent Research on Schooling in Developing
Countries,” World Bank Research Observer 10(2): 227-246.
Hanushek, Erik A. 1995. “Education Production Functions,” in M. Carnoy (ed.).
Kremer, Michael. 1995. “Research on Schooling: What We Know and What We Don’t
(A Comment on Hanushek),” World Bank Research Observer 10(2): 247-254.
Levin, Henry M. 1980. “Educational Production Theory and Teacher Inputs,” In
Bidwell and Windham (eds.), The Analysis of Educational Productivity. Ballinger
Publishing Co.
Levin, Henry M. 1995. “Raising Educational Productivity,” in M. Carnoy (ed.), 1995.
Levin, Henry M. and Marlaine E. Lockheed (eds.). 1993. Effective Schools in
Developing Countries. Washington DC: The Falmer Press.
Lockheed, Marlaine E. and Qinghua Zhao. 1993. “The Empty Opportunity: Local
Control and Secondary School Achievement in the Philippines,” International
Journal of Educational Development 13(1): 45-62.
Maddala, G.S. 1983. Limited-Dependent and Qualitative Variables in Econometrics.
Cambridge University Press.
Ministry of Education. 1993. Reglamento General de Educación Primaria y Secundaria.
Managua, Nicaragua.
Nicaragua Reform Evaluation Team. Nicaragua’s School Autonomy Reform: A First
Look. Working Paper Series on Impact Evaluation of Education Reforms, The
World Bank, October 1996.
Organisation for Economic Cooperation and Development. 1993. Education at a
Glance. OECD Indicators.
Stiglitz, Joseph E. 1988. “Principal and Agent,” John M. Olin Program for the Study of
Economic Organization and Public Policy: 12.
Summers, Anita A. and Amy W. Johnson. 1994. “A Review of the Evidence of the
Effects of School-Based Management Plans”, Review of Educational Research.
31
Witte, John F. 1990. “Choice and Control: An Analytical Overview,” In W. H. Clune
and J. F. Witte (eds.) Choice and Control in American Education. Volume 2.
32
Table 1. Previous v. Present Regime: Comparing Autonomous, Traditional and Private Schools
Previous Regime
Functions
Present Regime
All Public Schools
Traditional
Public Schools
Autonomous
Schools
Private
Schools
Ministry
Ministry
Ministry
Ministry
Staff promotions policy
Ministry
Ministry
Ministry
Ministry
Setting the curriculum
Ministry
Ministry
Ministry
Ministry
Ministry
Ministry
Ministry
Ministry
Expanding classroom hours by
subject
Ministry
School
School
School
Programming additional curricular
and extracurricular activities
Ministry
School
School
School
Establishing pedagogical methods
Ministry
School
School
School
Formulating the annual pedagogical
plan
Ministry
Ministry
School
School
Selecting textbooks
Ministry
Ministry
School
School
Evaluating students
Ministry
Ministry
School
School
Setting equivalencies*
Ministry
Ministry
Hiring and firing director
Ministry
Ministry
School
School
Hiring and firing teachers and
administrative personnel
Ministry
School
School
School
Setting student and staff obligations,
rights and sanctions
Ministry and School
Ministry
Setting and administering the school
budget
Ministry and School
School
School
School
Setting school fees for goods and
services
Ministry
Ministry
Ministry
School
Setting voluntary school fees
School
School
School
School
Setting monthly fee paid by students
Ministry
Ministry
School
School
Structuring the education system
Certifying teachers
School
School
School
School
Note: *This pertains to academic requirements that must be fulfilled in order to determine the academic level of students
who transfer schools.
Table 2: Sample Attrition
Primary Schools
Secondary Schools
Completed HH Surveys, ‘95
After clean-up
After merge w/ parent data (a)
After merge w/ school data
1,561
1,515
1,528
1,484
1,455
1,455
1,474
1,430
Test Scores, ‘96 (b)
1,296
1,312
1,744
1,896
607
350
787
642
465
789
1,691
1,885
555
350
786
632
465
788
Student Data & Test Scores
of which:
Student data, no test score
Test score, no student data (c)
Test Score and student data
After final clean-up (d)
Student Data & Test Scores
of which:
Student data, no test score
Test score, no student data
Test Score and student data
(a) Students with no matching parents or guardian were assigned the mean parent characteristics in their respective
schools. Similarly, parents with missing student data were assgned the mean student characteristics in the school.
(b) The figure includes replacement students for students who were absent from the test for a variety of reasons. See the
‘Data Sources’ under section 4 for a more detailed discussion.
(c) These students were assigned mean school, student and parent characteristics in their respective schools. See the
text for a more detailed discussion of alternative approaches to this problem.
(d) See “Notes on Data” in Appendix C for details on data cleaning.
Table 3a. Percentage of Respondents Who Claim that the School is the Decision-maker in Specific Areas,
Primary Schools, 1995
Decision Areas
Traditional
Autonomous
NER
Classroom & pedagogy
27.31
31.88
43.44
Personnel
22.62
50.83
47.89
Supervision & evaluation of teachers
53.61
57.08
63.69
Setting salaries & incentives
19.89
36.84
42.91
School budget & plan
28.08
49.77
60.88
9.58
21.94
25.22
Teacher training
Table 3b. Percentage of Respondents Who Claim that the School is the Decision-maker in Specific Areas,
Secondary Schools, 1995
Decision Areas
Traditional
Autonomous
Private
Classroom & pedagogy
32.60
41.16
64.81
Personnel
13.77
65.20
86.63
Supervision & evaluation of teachers
58.58
68.97
89.38
Setting salaries & incentives
31.00
59.04
92.36
School budget & plan
51.73
73.87
96.08
Teacher training
10.44
23.97
49.17
Table 4: Basic Student Achievement Functions in Primary Schools
Variable
s_age
Student and HH
With School fixed- With school
variables only
effects added
characteristics added
Math
Spanish Math
Spanish Math
Spanish
-0.351**
(0.159)
s_male
books
-0.324** -0.341**
(0.094)
(0.168)
-0.338**
-0.296*
-0.260**
(0.104)
(0.161)
(0.095)
1.147**
-0.184
0.624
-0.171
1.087**
-0.259
(0.473)
(0.278)
(0.464)
(0.288)
(0.478)
(0.283)
1.120**
(0.545)
0.161 -1.788**
(0.320)
.
d_edyrs
(
.)
(
.)
(
.)
.
(
.)
(
.)
(
.)
.
d_exp
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
(
.)
.)
0.030
0.030
.)
(0.102)
(0.061)
(
.)
(
.)
(0.542)
.
0.405**
0.124
(
.)
(0.164)
(0.097)
.
0.111**
0.040
(
.)
(0.047)
(0.028)
.
3.204**
0.896
(
.)
(0.988)
(0.586)
.
0.012
-0.188**
(
.)
(0.144)
(0.086)
.
0.383
(
.)
(3.712)
(
.)
(
.
-0.043*
-0.016
(
.)
(0.025)
(0.015)
.
0.363*
0.200
.)
(0.209)
(0.128)
.
-0.301
-0.238*
(0.133)
. -0.070**
(
.)
(
.)
.
.
.
.
(
.)
(
.)
(
.)
.
.
(
.)
(0.017)
. -1.189**
-0.811**
(
.
(
.)
.
-0.719
.)
(1.069)
.
.)
(0.223)
hh_input
0.434
0.270
0.113
0.088
0.345
0.100
(0.300)
(0.176)
(0.391)
(0.243)
(0.345)
(0.204)
.)
(
.)
.
(0.319)
problem
(
.
-0.018
(0.028)
.
.
.
(
.
(
.
.
.
sc_input
-0.052
(0.336)
.
.
.
stu_sect
.)
.
.
t_trspan
(
.
.
t_trmath
0.682
(0.570)
.
.
.
t_abs
.)
.
.
t_male
(
.
.
t_exp
.)
.
.
t_edyrs
-0.603
(0.412)
.
(
.
.
d_male
(0.665)
(
.)
(
1.511*
0.210
0.775
-0.166
1.196
0.018
(0.776)
(0.456)
(0.826)
(0.513)
(0.805)
(0.475)
m_edyrs
0.550*
0.238
0.212
0.120
0.507
0.198
(0.331)
(0.195)
(0.326)
(0.202)
(0.333)
(0.197)
f_edyrs
-0.399*
0.183
-0.211
0.144
-0.436*
0.182
(0.233)
(0.137)
(0.233)
(0.145)
(0.233)
(0.138)
hometype
mom_inhh
dad_inhh
int_m1
int_f1
int_h1
0.294
-0.591
0.513
-0.821
0.452
-0.664
(1.537)
(0.904)
(1.514)
(0.939)
(1.583)
(0.937)
-0.998
-0.155
-1.070
-0.655
-1.185
-0.218
(0.880)
(0.518)
(0.870)
(0.540)
(0.891)
(0.526)
-0.254
-0.286*
-0.179
-0.198
-0.209
-0.262
(0.266)
(0.156)
(0.255)
(0.158)
(0.269)
(0.159)
0.207
-0.023
0.227*
0.040
0.218
-0.015
(0.141)
(0.083)
(0.135)
(0.084)
(0.141)
(0.084)
-0.215
-0.081
-0.075
-0.009
-0.133
-0.073
(0.288)
(0.169)
(0.280)
(0.174)
(0.289)
(0.171)
Variable
1
reg1
reg2
Student and HH
With School fixed- With school
variables only
effects added
characteristics added
Math
Spanish Math
Spanish Math
Spanish
0.461
-0.417
(1.172)
(0.689)
1.339**
0.242
(0.516)
(0.304)
.
(
.)
.
1.171
-0.460
.)
(1.213)
(0.717)
.
1.003*
0.106
(
.)
(0.572)
(0.338)
.
2.816**
0.213
(
.)
(0.630)
(0.360)
.
-0.499
-2.995**
.)
(1.039)
(0.618)
.
3.065
1.545
.)
(1.904)
(1.126)
(
.
(
.)
(
.)
2.535**
0.153
(0.578)
(0.340)
reg5
-0.631
-2.731**
(0.963)
(0.567)
reg6
5.337**
2.571**
(1.801)
(1.060)
N
1116
1116
1116
1116
1097
1097
Adj. R^2
.049
.069
.213
.143
.075
.084
reg4
.
.
(
.)
(
.
(
.)
(
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates.
1
See Appendix B for the geographic definition of each region. Our comparison region in all the regressions is Managua
(Region 3).
Table 5: Basic Student Achievement Functions in Secondary Schools
Student and HH
variables only
With School fixedeffects added
Variable
int_m1
int_f1
int_h1
Student and HH
variables only
With School fixedeffects added
Math
Math
Spanish
-0.305
-0.078
-0.489**
-0.182
-0.453*
-0.167
(0.251)
(0.240)
(0.261)
(0.240)
(0.255)
0.037
-0.067
0.115
-0.038
0.112
-0.039
(0.105)
(0.109)
(0.108)
(0.118)
(0.104)
(0.111)
-0.091
-0.056
-0.165
-0.182
-0.154
-0.076
(0.192)
(0.199)
(0.193)
(0.210)
(0.189)
(0.201)
.
(
reg2
reg4
reg5
reg6
reg7
Spanish
(0.243)
private
reg1
With school
characteristics
added
Math
Spanish
.)
.
(
.)
0.915
0.219
(0.607)
(0.630)
-1.256**
1.604**
(0.500)
(0.519)
0.059
1.750**
(0.505)
(0.524)
1.666**
0.334
(0.814)
(0.845)
0.681
3.792**
(0.665)
(0.691)
-0.562
3.590**
.
(
.)
(
.)
(
.)
(
.)
.
-1.657**
0.442
(
.)
(0.591)
(0.649)
.
1.213*
0.631
(
.)
(0.697)
(0.777)
.
-1.148**
1.044*
(
.)
(0.542)
(0.582)
.
0.317
1.524**
(
.)
(0.586)
(0.678)
.
1.837**
0.928
(
.)
(0.888)
(0.948)
.
1.951**
3.870**
(
.)
(0.752)
(0.767)
.
1.344
4.936**
.
.
.
.
(
.)
(
.)
.
.
(0.905)
(0.940)
.)
(1.031)
(1.002)
N
1237
1237
1237
(
.)
1237
(
1219
1219
Adj. R^2
.039
.106
.202
.179
.096
.116
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates.
Table 6: Probability of Being Promoted and Continuing Education
Variable
mon_fee
sc_acts
allowan
freebook
s_age
Primary Schools Secondary Schools
-0.001
0.001
( 0.003)
( 0.001)
-0.072**
0.038*
( 0.033)
( 0.022)
-0.008
0.000
( 0.006)
( 0.003)
-0.002
-0.017
( 0.023)
( 0.021)
-0.017**
-0.092**
( 0.007)
( 0.031)
s_age2
(
s_male
books
t_edyrs
-0.029
0.008
( 0.022)
0.071**
0.015
( 0.031)
( 0.019)
-0.002
-0.007**
( 0.008)
( 0.004)
(
t_male
t_abs
stu_sect
sc_input
problem
hh_input
hometype
m_edyrs
f_edyrs
mom_inhh
dad_inhh
int_m1
int_f1
int_h1
0.002**
( 0.001)
( 0.022)
t_span
t_exp
.
.)
.
-0.017
.)
( 0.026)
0.006**
-0.001
( 0.002)
( 0.001)
0.096**
-0.022
( 0.047)
( 0.025)
-0.001
-0.008
( 0.006)
( 0.005)
0.000
-0.001
( 0.001)
( 0.001)
0.004
0.001
( 0.010)
( 0.010)
-0.014
-0.004
( 0.009)
( 0.009)
0.023
-0.001
( 0.016)
( 0.015)
0.039
0.029
( 0.033)
( 0.059)
-0.006
0.003
( 0.014)
( 0.013)
0.010
0.008
( 0.009)
( 0.008)
0.068
0.034
( 0.064)
( 0.061)
-0.032
0.018
( 0.040)
( 0.041)
-0.004
-0.006
( 0.012)
( 0.012)
0.003
-0.004
( 0.006)
( 0.005)
-0.002
-0.001
( 0.012)
( 0.010)
Variable
reg1
reg2
reg4
reg5
reg6
Primary Schools Secondary Schools
-0.026
-0.029
( 0.077)
( 0.044)
-0.083**
-0.055*
( 0.034)
( 0.034)
-0.067**
-0.011
( 0.037)
( 0.037)
-0.150**
-0.063
( 0.065)
( 0.061)
-0.210**
-0.101**
( 0.129)
( 0.053)
.
-0.081
.)
( 0.073)
N
1262
1366
Pseudo R^2
.063
.044
reg7
(
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates.
Table 7: Basic Student Achievement Functions corrected for Selection Bias due to Dropout and Repetition
Variable
s_age
Primary Math Primary Spanish Secondary Math Secondary Spanish
-0.485**
-0.240**
-1.460**
-0.938*
( 0.175)
( 0.098)
( 0.436)
( 0.499)
.
s_age2
(
s_male
books
t_edyrs
t_abs
t_trmath
.
0.032**
0.014
.)
( 0.011)
( 0.013)
0.712
-0.251
1.021**
0.596
( 0.283)
( 0.407)
( 0.454)
1.395**
-0.143
0.905**
0.497
( 0.645)
( 0.351)
( 0.347)
( 0.387)
0.426**
0.117
-0.012
0.025
( 0.184)
( 0.096)
( 0.060)
( 0.066)
.
(
.)
(
.)
.
2.992**
(
.)
( 0.636)
(
.)
.
t_span
t_male
(
( 0.531)
t_math
t_exp
.)
.
(
.
(
.)
.
-0.468
.)
( 0.544)
0.149**
0.032
0.025
0.071**
( 0.054)
( 0.029)
( 0.031)
( 0.030)
4.508**
0.996
-1.008**
0.192
( 1.126)
( 0.611)
( 0.433)
( 0.496)
-0.039
-0.205**
0.020
0.240**
( 0.158)
( 0.085)
( 0.066)
( 0.121)
-2.480
( 3.305)
(
.
0.422
.)
( 0.503)
.
.
-0.210
.)
( 0.615)
(
.)
t_trspan
.
-0.779
.)
( 1.058)
stu_sect
-0.047*
-0.015
0.007
0.053**
( 0.027)
( 0.015)
( 0.012)
( 0.019)
(
sc_input
problem
hh_input
hometype
m_edyrs
f_edyrs
mom_inhh
dad_inhh
(
0.451*
0.215*
0.701**
0.447
( 0.231)
( 0.127)
( 0.247)
( 0.274)
-0.441*
-0.232*
-0.326**
-0.108
( 0.242)
( 0.132)
( 0.159)
( 0.180)
0.545
0.091
-0.155
-0.405
( 0.385)
( 0.205)
( 0.274)
( 0.308)
1.894**
-0.007
-0.442
0.636
( 0.874)
( 0.475)
( 1.221)
( 1.376)
0.446
0.206
0.551**
0.004
( 0.361)
( 0.195)
( 0.251)
( 0.276)
-0.330
0.178
-0.152
0.169
( 0.245)
( 0.137)
( 0.141)
( 0.155)
1.650
-0.713
2.742**
1.100
( 1.728)
( 0.935)
( 1.274)
( 1.388)
-1.669*
-0.184
-0.365
0.218
( 0.981)
( 0.524)
( 0.819)
( 0.909)
.
.
-1.657**
0.312
.)
( 0.583)
( 0.655)
1262
1262
1403
1366
ρ [Prob(ρ=0)]
0.93 [.000]
-0.17 [.481]
.
-0.559 [.003]
log-likelihood
-4023
-3414
-4318
-4274
Prob>Chi^2
.0001
.002
.154
.074
private
(
N
.)
(
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates. ρ is the correlation coeffiecient between the error terms in the Heckman
model.
Table 8: Probability of Being Selected into the Reform
Variable
pc30
schaged
prienrol
secenrol
pried
Primary Schools Secondary Schools
0.027**
-0.002
( 0.009)
( 0.002)
24.105
-16.834*
( 47.272)
( 9.524)
-11.488
1.506
( 11.296)
( 2.176)
30.389**
-5.344
( 10.064)
( 3.438)
-11.524
( 24.719)
seced
(
timereq
urban_ls
d_edyrs
d_exp
d_male
t_edyrs
t_exp
t_male
t_abs
stu_sect
sc_input
problem
N
Pseudo R^2
.
(
.)
.
-2.292
.)
( 4.067)
-0.990**
-0.067
( 0.446)
( 0.075)
-13.754
2.651
( 10.907)
( 1.647)
0.207
0.055
( 0.147)
( 0.079)
-0.065*
-0.010
( 0.038)
( 0.023)
-0.347
-0.995**
( 0.791)
( 0.450)
-0.759**
0.129*
( 0.312)
( 0.068)
-0.092
-0.027
( 0.085)
( 0.033)
-5.079**
-0.144
( 1.835)
( 0.469)
0.087
-0.186
( 0.190)
( 0.152)
0.161**
0.048**
( 0.059)
( 0.019)
-0.547*
-0.245
( 0.321)
( 0.298)
0.681
0.409*
( 0.725)
( 0.209)
86
93
.581
.402
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates.
Table 9: Student Achievement Functions corrected for Selection Bias and Endogeneity of the Reform
Variable
s_age
Primary Math Primary Spanish Secondary Math Secondary Spanish
-0.526**
-0.236**
-0.889*
-0.916*
( 0.177)
( 0.098)
( 0.477)
( 0.500)
.
.
0.017
0.014
( 0.013)
s_age2
(
s_male
books
t_edyrs
.)
( 0.012)
0.684
-0.230
0.994**
0.612
( 0.534)
( 0.286)
( 0.434)
( 0.455)
1.408**
-0.168
0.718*
0.474
( 0.349)
( 0.373)
( 0.388)
0.463**
0.137
0.040
0.049
( 0.198)
( 0.104)
( 0.066)
( 0.073)
.
.
3.168**
.
.)
( 0.682)
.
.
-0.419
.)
( 0.547)
(
(
t_abs
t_trmath
.)
.)
(
0.012
0.071**
( 0.055)
( 0.029)
( 0.034)
( 0.030)
4.324**
0.815
-1.144**
0.174
( 1.204)
( 0.654)
( 0.465)
( 0.498)
-0.011
-0.210**
0.032
0.230*
( 0.159)
( 0.087)
( 0.071)
( 0.124)
-1.985
.
0.334
.)
( 0.500)
(
problem
.)
.)
0.042
t_trspan
sc_input
(
(
0.163**
( 3.269)
stu_sect
(
.
t_span
t_male
(
( 0.645)
t_math
t_exp
.)
(
.
-1.072
.)
( 1.078)
(
.
(
.)
.
-0.234
.)
( 0.623)
-0.046*
-0.015
0.019
0.059**
( 0.027)
( 0.015)
( 0.014)
( 0.021)
0.457**
0.217*
0.752**
0.404
( 0.231)
( 0.128)
( 0.260)
( 0.286)
-0.452*
-0.230*
-0.273
-0.052
( 0.244)
( 0.134)
( 0.188)
( 0.193)
.
.
-1.935**
0.255
.)
( 0.644)
( 0.672)
0.068
0.468
-0.629
-0.912
( 0.775)
( 0.414)
( 1.205)
( 1.119)
1242
1242
1403
1366
ρ [Prob(ρ=0)]
0.945 [.000]
-0.213 [.304]
-0.753 [.000]
-0.573 [.002]
log-likelihood
-3956
-3408
-4313
-4272
Prob>Chi^2
.0000
.0006
.0498
.0649
private
(
autonomous
public school
N
.)
(
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates. ρ is the correlation coeffiecient between the error terms in the Heckman
model.
Table 10: 1st Stage OLS for De Facto Autonomy
Variable
pc30
schaged
prienrol
secenrol
pried
Primary Schools Secondary Schools
0.001
0.000
( 0.000)
( 0.000)
0.803
0.729
( 1.459)
( 1.293)
0.075
0.107
( 0.506)
( 0.315)
0.296
-0.029
( 0.283)
( 0.483)
-0.459
( 0.619)
seced
(
timereq
urban_ls
d_edyrs
d_exp
d_male
t_edyrs
t_exp
t_male
t_abs
stu_sect
sc_input
problem
N
Pseudo R^2
.
(
.)
.
-0.133
.)
( 0.549)
-0.020
0.023**
( 0.015)
( 0.012)
-0.230
0.043
( 0.156)
( 0.222)
0.010
0.000
( 0.012)
( 0.011)
-0.002
0.000
( 0.003)
( 0.003)
0.013
0.016
( 0.066)
( 0.058)
0.003
0.013
( 0.026)
( 0.008)
-0.011*
-0.001
( 0.006)
( 0.005)
-0.088
-0.006
( 0.127)
( 0.064)
-0.022
0.004
( 0.017)
( 0.019)
0.005
-0.001
( 0.003)
( 0.002)
0.016
0.040
( 0.023)
( 0.033)
0.020
0.036
( 0.027)
( 0.023)
66
81
.144
.003
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates.
Table 11: Student Achievement Functions corrected for Selection Bias and Endogeneity of De Facto Autonomy
Variable
s_age
Primary Math Primary Spanish Secondary Math Secondary Spanish
-0.467**
-0.225**
-0.903*
-1.013**
( 0.175)
( 0.098)
( 0.476)
( 0.497)
.
.
0.018
0.016
( 0.013)
s_age2
(
s_male
books
t_edyrs
.)
( 0.012)
0.710
-0.251
0.990**
0.551
( 0.529)
( 0.283)
( 0.435)
( 0.450)
1.287**
-0.197
0.717*
0.618
( 0.351)
( 0.375)
( 0.386)
0.337*
0.091
0.063
-0.063
( 0.187)
( 0.098)
( 0.076)
( 0.075)
.
.
3.140**
.
.)
( 0.683)
.
.
-0.556
.)
( 0.540)
(
(
t_abs
t_trmath
.)
.)
(
0.013
0.049
( 0.063)
( 0.034)
( 0.034)
( 0.031)
4.685**
1.102*
-1.203**
0.461
( 1.131)
( 0.614)
( 0.467)
( 0.503)
0.133
-0.145
0.029
0.266**
( 0.175)
( 0.095)
( 0.070)
( 0.121)
-3.086
.
0.393
.)
( 0.500)
(
problem
.)
.)
0.058*
t_trspan
sc_input
(
(
0.225**
( 3.362)
stu_sect
(
.
t_span
t_male
(
( 0.647)
t_math
t_exp
.)
(
.
-1.026
.)
( 1.067)
(
.
(
.)
.
-0.652
.)
( 0.631)
-0.084**
-0.026
0.014
0.073**
( 0.031)
( 0.017)
( 0.013)
( 0.020)
0.208
0.144
0.769**
0.069
( 0.252)
( 0.137)
( 0.283)
( 0.305)
-0.625**
-0.297**
-0.233
-0.502**
( 0.254)
( 0.140)
( 0.197)
( 0.236)
.
.
-1.783**
0.037
.)
( 0.630)
( 0.666)
8.716**
3.121
-2.037
7.155**
( 3.766)
( 2.104)
( 2.638)
( 2.827)
1262
1262
1403
1366
ρ [Prob(ρ=0)]
0.932 [.000]
-0.191 [.400]
-0.766 [.000]
-0.513 [.013]
log-likelihood
-4020
-3461
-4310
-4267
Prob>Chi^2
.0001
.0019
.0205
.0601
private
(
de facto
autonomy
N
.)
(
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates. ρ is the correlation coeffiecient between the error terms in the Heckman
model.
Table 12: Student Achievement Functions including Indicators for Reform and Management
Variable
s_age
Primary Math Primary Spanish Secondary Math Secondary Spanish
-0.492**
-0.235**
-0.552
-0.745
( 0.177)
( 0.099)
( 0.484)
( 0.509)
.
.
0.009
0.011
( 0.013)
s_age2
(
s_male
books
t_edyrs
.)
( 0.012)
0.688
-0.232
0.942**
0.653
( 0.532)
( 0.286)
( 0.430)
( 0.453)
1.213*
-0.167
0.692*
0.618
( 0.351)
( 0.370)
( 0.389)
0.454**
0.145
0.036
-0.027
( 0.199)
( 0.105)
( 0.069)
( 0.075)
.
(
.)
(
.)
t_abs
t_trmath
(
.)
.
(
.
(
.)
.
-0.193
.)
( 0.549)
0.045
0.012
0.062**
( 0.030)
( 0.033)
( 0.030)
4.140**
0.741
-1.316**
0.353
( 1.206)
( 0.656)
( 0.463)
( 0.498)
0.063
-0.200**
0.095
0.300**
( 0.161)
( 0.088)
( 0.073)
( 0.125)
-3.532
(
problem
( 0.693)
0.188**
t_trspan
sc_input
3.257**
.)
( 0.056)
( 3.678)
stu_sect
.
(
.
t_span
t_male
(
( 0.646)
t_math
t_exp
.)
(
.
0.239
.)
( 0.524)
.
-1.177
.)
( 1.091)
(
.
(
.)
.
-0.224
.)
( 0.623)
-0.049*
-0.014
0.024*
0.059**
( 0.028)
( 0.015)
( 0.014)
( 0.021)
0.418*
0.234*
0.556**
0.331
( 0.234)
( 0.130)
( 0.264)
( 0.290)
-0.448*
-0.247*
-0.298
-0.258
( 0.246)
( 0.135)
( 0.194)
( 0.201)
.
.
-1.692**
-0.018
.)
( 0.629)
( 0.673)
-0.063
0.399
-0.362
-0.242
( 0.775)
( 0.419)
( 1.216)
( 1.129)
3.488**
0.275
0.433
3.419**
( 1.254)
( 0.783)
( 0.944)
( 0.978)
1242
1242
1403
1366
ρ [Prob(ρ=0)]
0.946 [.000]
-0.216 [.297]
-0.733 [.000]
-0.574 [.001]
log-likelihood
-3952
-3407
-4307
-4266
Prob>Chi^2
.0000
.007
.0499
.00546
private
(
autonomous
public school
de facto
autonomy
N
.)
(
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates. ρ is the correlation coeffiecient between the error terms in the Heckman
model.
Table 13: Student Achievement Functions including Indicators for Reform and Decision-Making on teacherrelated Issues
Variable
s_age
Primary Math Primary Spanish Secondary Math Secondary Spanish
-0.520**
-0.229**
-0.538
-0.770
( 0.177)
( 0.098)
( 0.483)
( 0.510)
.
.
0.009
0.011
.)
( 0.012)
( 0.013)
s_age2
(
s_male
books
t_edyrs
0.832
-0.170
0.957**
0.675
( 0.286)
( 0.430)
( 0.453)
1.172*
-0.239
0.694*
0.560
( 0.647)
( 0.350)
( 0.369)
( 0.387)
0.460**
0.137
0.025
-0.023
( 0.200)
( 0.104)
( 0.068)
( 0.076)
.
(
.)
(
.)
t_abs
t_trmath
.
(
.
-0.411
.)
( 0.544)
0.011
0.066**
( 0.033)
( 0.030)
4.094**
0.712
-1.301**
0.285
( 1.208)
( 0.654)
( 0.463)
( 0.496)
0.104
-0.162*
0.099
0.298**
( 0.162)
( 0.089)
( 0.073)
( 0.126)
-5.009
(
.
0.215
.)
( 0.507)
.
-1.295
.)
( 1.083)
(
.
(
.)
.
-0.294
.)
( 0.624)
-0.042
-0.012
0.025*
0.062**
( 0.028)
( 0.015)
( 0.014)
( 0.021)
0.344
0.187
0.570**
0.404
( 0.236)
( 0.131)
( 0.263)
( 0.290)
-0.418*
-0.228*
-0.319*
-0.226
( 0.246)
( 0.135)
( 0.191)
( 0.199)
.
.
-1.718**
0.083
.)
( 0.626)
( 0.670)
0.126
0.431
-0.298
-0.236
( 0.774)
( 0.416)
( 1.206)
( 1.132)
3.721**
1.359**
0.675
2.301**
( 0.981)
( 0.617)
( 0.667)
( 0.716)
(
de facto
.)
.)
0.041
private
autonomous
public
schools
(
.
(
( 0.029)
(
problem
( 0.689)
0.157**
t_trspan
sc_input
3.326**
.)
( 0.055)
( 3.568)
stu_sect
.
(
.
t_span
t_male
(
( 0.536)
t_math
t_exp
.)
.)
(
2
autonomy
1242
1242
1403
1366
ρ [Prob(ρ=0)]
0.957 [.000]
-0.225 [.262]
-0.732 [.000]
-0.561 [.003]
log-likelihood
-3949
-3405
-4306
-4267
Prob>Chi^2
.0000
.0007
.0497
.00578
N
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates. ρ is the correlation coeffiecient between the error terms in the Heckman
model.
2
de facto autonomy here refers to decision-making on teacher-related issues only. See text for a detailed
description of those decisions.
Table 14: Student Achievement Functions including indicators for Reform, Decision-Making on teacher-related
Issues, and Teachers’ Influence on Pedagogical Decisions
Variable
s_age
Primary Math Primary Spanish Secondary Math Secondary Spanish
-0.327
-0.227**
-0.769
-0.522
( 0.940)
( 0.098)
( 0.499)
( 0.522)
.
.
0.015
0.005
.)
( 0.013)
( 0.013)
s_age2
(
s_male
books
t_edyrs
1.257
-0.185
1.086**
0.583
( 0.286)
( 0.449)
( 0.471)
0.339
-0.218
0.814**
0.501
( 2.292)
( 0.350)
( 0.399)
( 0.412)
0.414**
0.134
0.087
-0.013
( 0.154)
( 0.107)
( 0.074)
( 0.078)
.
(
.)
(
.)
t_abs
t_trmath
(
.)
.
(
.
(
.)
.
-0.412
.)
( 0.598)
0.048
0.042
0.082**
( 0.030)
( 0.039)
( 0.033)
2.751
0.607
-0.732
0.851
( 4.253)
( 0.671)
( 0.537)
( 0.567)
0.249
-0.137
0.152
0.223
( 0.152)
( 0.090)
( 0.154)
( 0.206)
-4.061
(
problem
( 0.775)
0.145
t_trspan
sc_input
2.499**
.)
( 0.147)
( 4.787)
stu_sect
.
(
.
t_span
t_male
(
( 1.333)
t_math
t_exp
.)
(
.
0.067
.)
( 0.551)
.
-1.460
.)
( 1.089)
(
.
(
.)
.
-0.259
.)
( 0.638)
-0.046
-0.015
0.018
0.068**
( 0.035)
( 0.015)
( 0.015)
( 0.025)
0.302
0.198
0.806**
0.508
( 0.222)
( 0.131)
( 0.287)
( 0.326)
-0.091
-0.199
-0.174
-0.239
( 0.478)
( 0.136)
( 0.206)
( 0.221)
.
private
(
.)
(
.
-2.451**
-0.299
.)
( 0.689)
( 0.749)
-0.169
0.389
-0.289
0.229
( 1.430)
( 0.427)
( 1.291)
( 1.188)
3.343**
1.491**
1.103
2.717**
( 1.460)
( 0.625)
( 0.698)
( 0.736)
-0.594
( 0.389)
1242
-0.295
( 0.221)
1242
-0.120
( 0.399)
1267
0.865**
( 0.423)
1229
ρ [Prob(ρ=0)]
-0.006 [.999]
-0.228 [.249]
-0.714 [.000]
-0.640 [.000]
log-likelihood
-3952
-3404
-3856
-3783
Prob>Chi^2
.0009
.0008
.0469
.00137
autonomous public schools
3
de facto autonomy
teachers’ influence on pedagogical
decisions
N
** denotes statistical significance at the 5% level. * denotes the same at the 10% level. The standard errors are reported
in parantheses below the parameter estimates. ρ is the correlation coeffiecient between the error terms in the Heckman
model.
3
de facto autonomy here refers to decision-making on teacher-related issues only. See text for a detailed
description of those decisions.
Table 15: Effect Size of Various Variables on Test Scores4
Variable
Textbooks
Teacher’s yrs of ed.
Class size
De facto autonomy
4
Primary Math Primary Spanish Secondary
Secondary
Math
Spanish
3.31
-0.55
1.75
1.34
4.52
1.73
0.61
-0.40
-4.79
-1.59
2.71
5.81
6.73
0.64
0.60
4.05
Effect size should be interpreted as follows: Adding one standard deviation to the variable in
question, holding all other variables constant at their current values (not means), increases the
relevant test scores by x%. e.g. A one standard deviation increase in each teacher’s years of
education (i.e an increase of 1.38 years) at the primary level, would increase the average
primary school math score by 4.52%. On the other hand increasing class size by approximately
14 students would decrease the average score by 4.79%.
Appendix A: Key Areas of Decision-making
•
•
•
Salaries and Incentives
Setting salaries
Establishing incentives for teachers
and administrative staff
Personnel
Hiring and firing teachers
Hiring and firing the director
Hiring and firing administrative personnel
Classroom and Pedagogy
Determining class size
Designing the curriculum
Selecting textbooks
Defining educational plans and programs
Pedagogical supervision
Determining schools hours
Setting the school calendar
Training Teachers
• Maintenance and Infrastructure
Maintaining the schools
Developing infrastructure projects
• Administration
Planning and preparing school budget
Setting goals for the school
Planning and preparing school budget
Providing textbooks
Distributing textbooks
Informing the community about school
activities
Accrediting new schools
Relations with teachers’ union
• Teacher Supervision and Evaluation
Evaluating teachers
Supervising teachers
Appendix B: Regions
The 7 regions defined consist of the following departments in Nicaragua:
Region 1: Esteli, Madriz, Nueva Segovia
Region 2: Chinandega, León
Region 3: Managua
Region 4: Carazo, Granada, Masaya, Rivas
Region 5: Boaco, Chontales
Region 6: Matagalpa, Jinotega
Region 7: RAAN, RAAS, Rio San Juan.
Appendix C: Notes on Data Work on the Nicaragua School-Household and Testing Instruments
Primary Schools:
1. 13 schools were eliminated from the 1997 follow-up survey, because they did not have fourth grade, but
the evaluation team in Nicaragua managed to follow some students from these schools to administer tests.
The students that were followed belonged to 5 of these schools mentioned above. The other 8 schools
have been dropped from our sample
2. There were 5 schools where tests were not administered for various reasons (flooding, etc.). No
systematic patterns in this attrition is detected.
3. In one school, we had problems with the quality of data and consequently dropped it from our sample.
Secondary Schools:
1. Two schools had no tests administered in the center for various reasons.
Variable Name
allowance
books
d_edyrs
d_exp
d_male
dad_inhh
f_edyrs
freebook
hh_input
hometype
int_f1
int_h1
int_m1
m_edyrs
mom_inhh
mon_fee
pc30
pried
prienrol
private
problem
s_age
s_male
sc_acts
sc_input
schaged
seced
secenrol
stu_sect
t_abs
t_edyrs
t_exp
t_male
t_math
t_span
t_trmath
t_trspan
timereq
Appendix D: Descriptive Statistics of the Variables Used in Data Analysis
Description
Primary School
Secondary School
Mean
Standard
# of
Mean
Standard
Deviation
Obs.
Deviation
Daily allowance for
1.28
1.35
1691
3.72
3.83
school
=1 if student has some or
.82
.39
1691
.52
.50
all books
Director’s # of yrs of
10.59
4.79
1691
14.51
5.83
educ.
Director’s yrs of exp in
13.83
9.73
1691
18.80
12.37
teaching field
=1 if director is male
.24
.43
1691
.34
.47
Father lives in HH
.65
.41
1691
.60
.41
Father’s # of yrs of educ.
4.73
3.55
1691
6.31
3.75
=1 if school gives or
.44
.46
1691
.42
.45
lends books for free
1.83
.78
1670
2.57
.76
Index of hh assets (e.g.
water, sewer, phone,
electricity; max=5)
=1 if student lives in a
.87
.29
1691
.97
.14
house or apt.
f_edyrs * dad_inhh
3.00
3.51
1670
3.65
3.74
.30
1.14
1648
.74
2.03
Guardian’s # of year of
ed. if no parent lives in
HH
m_edyrs * mom_in
4.14
3.77
1670
5.25
4.06
Mother’s # of yrs of educ.
4.72
3.65
1691
6.31
3.83
Mother lives in HH
.87
.29
1691
.84
.31
Monthly fee paid by stu
3.71
3.73
1691
20.53
24.89
373.10
187.88
1670
513.01
201.83
Per capita total 30-day
expenditure (municipality
level)
.69
.14
1670
.75
.14
% of persons aged 20+
completed pri
(municipality level)
% of chil aged 6-12 in
.77
.16
1670
.77
.14
school (municipality level)
=1 if school is private
N.A.
N.A.
N.A.
.14
.35
Index of various problems
N.A.
N.A.
N.A.
1.27
1.34
at school ; max=8
Student’s age
10.44
1.51
1691
16.02
2.99
=1 if stu is male
.45
.45
1691
.42
.43
.13
.29
1691
.49
.44
=1 if sch takes negative
action when student can’t
pay the fee
2.71
1.55
1691
3.94
1.67
Index of sch inputs (e.g.
library, water, etc.;
max=5)
% of persons aged 6-19
.35
.04
1670
.35
.03
(municipality level)
.22
.16
1670
.35
.14
% of persons aged 20+
completed sec
(municipality level)
% of chil aged 13-19 in
.62
.22
1670
.60
.16
school (municipality level)
Mean section size
29.31
13.73
1667
43.26
22.54
# of days teacher was
.89
1.53
1691
1.13
2.86
absent last month
Teacher’s # of yrs of
11.00
1.38
1691
14.81
3.38
educ.
Teacher’s yrs of exp in
9.41
5.02
1687
11.75
6.98
teaching field
=1 if teacher is male
.09
.24
1691
.48
.44
Teacher’s degree in math
0
0
1691
.11
.31
Teacher’s degree in
.01
.09
1691
.10
.20
span.
Teacher received training
.01
.06
1691
.19
.39
in math
Teacher received
.03
.12
1691
.05
.15
training in spanish
Ave. min. of travel to sch.
16.42
3.28
1670
17.63
3.28
# of
Obs.
1885
1885
1885
1885
1885
1885
1885
1885
1823
1885
1823
1801
1823
1885
1885
1885
1885
1885
1885
1823
1885
1885
1885
1885
1885
1885
1885
1885
1856
1885
1885
1885
1885
1885
1885
1885
1885
1885
urban_ls
Math test scores
Span test scores
De jure autonomy
De facto autonomy
Rural or urban (‘93
LSMS)
=1 if legally autonomous
% decisions made by sch
.42
.41
1670
.67
.34
1885
15.89
11.46
0.75
0.50
6.69
3.98
0.34
0.20
1136
1136
86
67
18.76
22.34
0.70
0.61
5.80
6.25
0.26
0.21
1253
1253
93
83