Impact Evaluation of the Potential for
Teacher Incentives
to Improve Outcomes

Deon Filmer
Development Research Group, The World Bank

Evidence-Based Decision-Making in Education Workshop
Africa Program for Education Impact Evaluation (APEIE)
Accra, Ghana
May 10-14, 2010
Teacher incentives
• How can we ensure that teachers exert as much
effort as possible?
Teacher absenteeism
Percent of teachers absent on the day of an unannounced visit to the school

[Bar chart, y-axis 0-30%: absence rates for Ecuador, India, Indonesia,
Papua New Guinea, Peru, Zambia, and Uganda]
Teacher incentives
• Intrinsic motivation
• Extrinsic motivation
• Models for improving incentives:
– Higher standards for entry, higher average pay and
pay gradients, career progression linked to skills and
performance
– “Contract teachers” – local hiring, no tenure,
performance evaluated directly by the school
community
– “Pay for performance” – bonus pay linked to effort
and/or results
How could teacher incentives lead to
better outcomes?
• Quality of staff
– At entry: Accreditation and merit-based incentives lead to higher quality
teaching professionals joining the education system
– In remote areas: Locality-based financial incentives improve equity of teacher
placements
• Increased attendance and effort
– Decentralized hiring and monitoring of teacher performance leads to higher
teacher attendance rates & teacher effort
– Pay-for-performance systems incentivize improved quality and quantity of
teaching (at least in the short term) and thus improve student test scores.
• Sustained effort
– Teacher career advancement incentives stimulate sustained teacher effort,
thereby improving student performance
• The underlying (empirically testable) assumption is that increased
quality and effort increase learning
Focus today:
Teacher pay for performance schemes
• Linking pay to performance
– As measured by tests
– Note: Could also be linked to effort
• e.g. presence in the classroom at the beginning and
end of the school day
Teacher pay for performance schemes:
Potential downsides
• Assumes teachers know how to improve their teaching
• Difficulty of accounting for characteristics of student body
• Perverse impacts at the level of students:
– teaching to the test
– manipulating who takes a test
• Perverse impacts at the level of teachers:
– Demoralization
– undermining intrinsic motivation
⇒ Impact evaluation will help us understand the tradeoffs
between the potential upsides and downsides of incentives
Case Study 1:
Teacher incentives in India
• Teacher Performance Pay: Experimental
Evidence from India
• By Karthik Muralidharan (University of
California San Diego) and Venkatesh
Sundararaman (World Bank)
Location of Study
• Indian State of Andhra Pradesh (AP)
– 5th most populous state of India; population of 80 million
– 23 districts (2-4 million each)
• Close to all-India averages on many measures of human development

                               India    AP
Gross Enrollment (6-11) (%)    95.9     95.3
Literacy (%)                   64.8     60.5
Teacher Absence (%)            25.2     25.3
Infant Mortality (per 1,000)   63       62
Incentive design
• Teachers were given bonus payments, over and above their
regular salary, based on the average improvement in test
scores of all students in the grade/school over baseline
– Subjects considered were math and language
– Assessment papers were designed by an independent testing
agency (EI)
– All assessments were conducted by an independent NGO (APF)
• Bonus formula
– Rs. 500 bonus for every 1 percentage point improvement in
average scores
– Calibrated to be around 3% of annual pay (and equal to input
treatments)
• Both group and individual level incentives were studied
– Free-riding/Peer monitoring/Gains to cooperation
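The bonus formula above can be sketched as a small function. This is a hypothetical illustration: the Rs. 500-per-point rate and the roughly-3%-of-pay calibration come from the slide, while the function name and the assumption that declining scores earn no bonus (rather than a penalty) are mine.

```python
def incentive_bonus(baseline_avg, endline_avg, rate_per_point=500):
    """Rs. 500 bonus per percentage-point improvement in the average
    test score over baseline (per the bonus formula above)."""
    improvement = endline_avg - baseline_avg  # in percentage points
    return max(0.0, improvement) * rate_per_point  # assumes no penalty for declines

# A 3-point average gain earns Rs. 1,500 -- on the order of 3% of annual pay.
```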
Design Overview

INCENTIVES (conditional on improvement in student learning):
– GROUP BONUS: 100 schools
– INDIVIDUAL BONUS: 100 schools
INPUTS (unconditional):
– EXTRA CONTRACT TEACHER: 100 schools
– EXTRA BLOCK GRANT: 100 schools
Neither (CONTROL): 100 schools
Summary of Experimental Design
• Study conducted across a representative sample of 500 primary
schools in 5 districts of AP
• Conduct baseline tests in these schools (June/July 05)
• Stratified random allocation of 100 schools to each treatment
(2 schools in each mandal to each treatment) (August 05)
• Monitor process variables over the course of the year via
unannounced monthly tracking surveys (Sep 05 – Feb 06)
• Conduct 2 rounds of follow-up tests to assess the impact of various
interventions on learning outcomes (March/April 06)
• Interview teachers after program but before outcomes are
communicated to them (August 06)
• Provide bonus payments and communicate continuation of
program (Sept 06)
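The stratified allocation step (2 schools in each mandal to each treatment) can be sketched as below. This is a simplified illustration under stated assumptions: the arm names, the seed, and the assumption that each mandal contributed exactly 10 sampled schools are mine, not the authors' code.

```python
import random

ARMS = ["control", "group_bonus", "individual_bonus",
        "contract_teacher", "block_grant"]

def stratified_assignment(schools_by_mandal, seed=2005):
    """Randomly assign 2 schools per mandal to each of the 5 study arms,
    assuming each mandal contributed exactly 10 sampled schools."""
    rng = random.Random(seed)
    assignment = {}
    for mandal, schools in schools_by_mandal.items():
        shuffled = sorted(schools)
        rng.shuffle(shuffled)  # random order within the mandal (stratum)
        for i, school in enumerate(shuffled):
            assignment[school] = ARMS[i // 2]  # slots 0-1 -> arm 0, 2-3 -> arm 1, ...
    return assignment
```

Stratifying by mandal guarantees geographic balance across arms instead of relying on chance.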
Results
Table 2: Impact of Incentives on Student Test Scores
Dependent Variable = Normalized End-of-Year Test Score

                         -------- Combined --------    ----- Maths -----    ----- Telugu ----
                         Y1 on Y0  Y2 on Y1  Y2 on Y0  Y1 on Y0  Y2 on Y0  Y1 on Y0  Y2 on Y0
                           [1]       [2]       [3]       [4]       [5]       [6]       [7]
Normalized Lagged         0.500     0.553     0.450     0.490     0.418     0.516     0.484
  Test Score            (0.013)*** (0.016)*** (0.015)*** (0.017)*** (0.022)*** (0.014)*** (0.014)***
Incentive School          0.153     0.143     0.217     0.188     0.277     0.119     0.158
                        (0.042)*** (0.035)*** (0.047)*** (0.049)*** (0.055)*** (0.038)*** (0.043)***
Observations             68,702    78,613    49,516    34,121    24,592    34,581    24,924
R-squared                 0.29      0.29      0.23      0.28      0.22      0.32      0.25

Note: A smaller impact was also found on non-incentivized subjects (science; social studies)
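The value-added specification behind Table 2, normalized endline score regressed on the normalized lagged score and an incentive-school dummy, can be sketched as a simplified OLS. The authors' actual estimation includes additional controls and clustered standard errors, which are omitted here; the function name and synthetic setup are mine.

```python
import numpy as np

def value_added_ols(endline, lagged, incentive):
    """OLS of normalized endline score on the normalized lagged score and an
    incentive-school dummy: endline = b0 + b1*lagged + b2*incentive + error.
    Returns the coefficient vector (b0, b1, b2)."""
    X = np.column_stack([np.ones_like(lagged), lagged, incentive])
    coefs, *_ = np.linalg.lstsq(X, endline, rcond=None)
    return coefs
```

Here b1 plays the role of the "Normalized Lagged Test Score" row and b2 the "Incentive School" row in Table 2.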
Group versus Individual Incentives
Dependent Variable = Normalized Endline Test Score

                                   Y1 on Y0   Y2 on Y1   Y2 on Y0   Y2 on Y0   Y2 on Y0
                                   Combined   Combined   Combined    Maths     Telugu
                                     [1]        [4]        [7]        [8]        [9]
Normalized Lagged Score             0.500      0.554      0.451      0.418      0.485
                                  (0.013)*** (0.016)*** (0.015)*** (0.022)*** (0.014)***
Individual Incentive School (II)    0.160      0.198      0.271      0.321      0.223
                                  (0.049)*** (0.044)*** (0.058)*** (0.068)*** (0.053)***
Group Incentive School (GI)         0.146      0.087      0.162      0.232      0.092
                                  (0.050)*** (0.045)*   (0.058)*** (0.071)*** (0.052)*
Observations                        68,702     78,613     49,516     24,592     24,924
F-Stat p-value (testing GI = II)     0.78       0.05       0.12       0.29       0.03
R-squared                            0.29       0.30       0.23       0.23       0.25

Notes:
* significant at 10%; ** significant at 5%; *** significant at 1%
How did teacher behavior change?
Table 11: Teacher Behavior (Observation and Interviews)
Incentive versus Control Schools (all figures in %)

Teacher Behavior                                Incentive  Control   p-Value of   Coefficient of behavior
                                                Schools    Schools   Difference   indicator on test score
Teacher Absence (%)                               0.24       0.24      0.82          -0.110 **
Actively Teaching at Point of Observation (%)     0.44       0.42      0.57           0.124 ***
Did you do any special preparation for the
  end-of-year tests? (% Yes)                      0.63       0.25      0.000***       0.102 ***
What kind of preparation did you do?
(UNPROMPTED) (% Mentioning)
  Extra Homework                                  0.42       0.15      0.000***       0.085 **
  Extra Classwork                                 0.46       0.17      0.000***       0.091 ***
  Extra Classes/Teaching Beyond School Hours      0.16       0.04      0.000***       0.181 ***
  Gave Practice Tests                             0.31       0.10      0.000***       0.111 ***
  Paid Special Attention to Weaker Children       0.21       0.05      0.000***       0.017
Summary of results
• Incentive schools perform significantly better (0.22 SD)
– Improvements are across the board (all grades, districts, baseline
scores)
– Limited evidence of heterogeneous treatment effects
– Children in incentive schools perform better on mechanical and
conceptual components of test, and also on non-incentive subjects
• No difference between group and individual incentives in the first
year – but in the second year the individual incentives start
outperforming the group incentives
• Teacher absence does not change, but incentive school teachers
report higher levels of teaching activity conditional on attendance
• These differences in behavior are correlated with learning
outcomes
• Much more cost-effective than inputs of the same value
Case Study 2:
Teacher incentives in Brazil
• Encouraging Quality: Evaluating the Effects of a
Bonus for Performance in Education in Brazil
• By Claudio Ferraz (PUC Rio) and Barbara Bruns
(World Bank)
Brazil: Study Aims to Understand…
• The effects of the introduction of a system of
bonuses for students’ performance based on
standardized tests.
• Variation in the impact of bonus according to
characteristics of schools (e.g. social cohesion;
teacher profiles).
• Strategies used to improve performance.
Study area: Pernambuco State, Brazil
Features of Brazil case study
• The Program of Educational Performance Bonus in
Pernambuco was created by a 2008 law
• Its goal was to create incentives for improvement in
the quality of education, rewarding employees of
schools that meet school-specific performance targets
• In the first year, targets were based on an index* of
performance in 2005 (the last available information).
Three groups of schools:
• high performance
• mid-performance
• low performance
* Index = Average test score * pass rate
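The footnoted index can be written out directly (a trivial sketch; the function name is mine):

```python
def school_index(avg_test_score, pass_rate):
    """Performance index used to group schools: average test score x pass rate."""
    return avg_test_score * pass_rate

# e.g. an average score of 4.0 with an 80% pass rate gives an index of 3.2
```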
Features of Brazil case study
• System-wide implementation (not an
“experiment”)
• Causal analysis of impacts possible using:
– Differences-in-differences
– Regression Discontinuity designs exploiting
annual targets and rules for bonus
Impact evaluation methodology
• Differences-in-differences
– Compare the performance of State schools of
Pernambuco with State schools in other neighbouring
States, before the bonus program (2005-2007) and
after (2009)
• Regression discontinuity
– Targets are set according to whether the school was in
the low, middle, or high category
• Low: reach the average score for the state of Pernambuco.
• Middle: reach 10% over the average index level for the
Northeast region
• High: reach the average index level for all Brazilian states.
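The differences-in-differences comparison can be sketched with group means. This is a minimal illustration of the identification idea only, not the study's estimation code; the function name and example numbers are mine.

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DiD impact = (change in treated Pernambuco schools)
                  - (change in control schools in neighbouring states)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# If Pernambuco schools improve by 6 points while neighbouring-state schools
# improve by 2, the estimated bonus effect is 4 points.
```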
Illustration of RD design

[Scatter plot: each school's 2008 goal in Portuguese (y-axis, roughly 1-4)
plotted against its 2005 level (x-axis: IDEPE 2005, Língua Portuguesa, 8th
grade, roughly 1-4). Annotations ask how outcomes in schools just on one side
of a target cutoff differ from outcomes in schools just on the other side.]
Bonus determination
• The proportion of goal reached by school is
calculated as
PI = (actual progress) / (required progress)
• Schools with at least 50% earn bonus
• Bonus is determined by initial salary and with
the percentage of the target achieved
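The bonus rule above can be sketched as follows. The PI formula and the 50% eligibility threshold are from the slide; the exact salary-scaling factor is not stated in the source, so the `scaling` parameter below is a hypothetical placeholder.

```python
def proportion_of_goal(actual_progress, required_progress):
    """PI = (actual progress) / (required progress)."""
    return actual_progress / required_progress

def bonus(initial_salary, pi, scaling=0.1):
    """Schools reaching at least 50% of their target earn a bonus that grows
    with the initial salary and the percentage of the target achieved.
    The 'scaling' factor is a hypothetical placeholder, not from the source."""
    if pi < 0.5:
        return 0.0  # below the eligibility threshold: no bonus
    return initial_salary * scaling * min(pi, 1.0)  # assume capped at 100% of target
```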
Brazil: Outcome measures
• Student learning and repetition, teacher
attendance, school-level planning activities
• School level trust and social capital
• Teacher behavior “inside the black box” via
standardized classroom observations
• Dynamic effects of schools’ receiving/not
receiving bonus on subsequent years’ strategy
and effort … and do schools know what to do?
Preliminary results on Portuguese
test scores
Brazil: “Stallings” method of classroom
observation
• Used in all study schools to measure potential
changes in in-classroom behavior
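The Stallings method codes what is happening at repeated time-sampled snapshots of a classroom, so the time-use figures on the next slide reduce to share calculations. A minimal sketch, with hypothetical category names:

```python
from collections import Counter

def time_use_shares(snapshots):
    """Fraction of classroom-observation snapshots coded to each activity
    category (e.g. 'learning', 'organization', 'non-academic')."""
    counts = Counter(snapshots)
    total = len(snapshots)
    return {category: n / total for category, n in counts.items()}

# 13 of 20 snapshots coded to learning activities -> 65% of class time
```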
Brazil: Example of data generated

Use of Classroom Time             TOTAL Sample   3rd   5th   9th
1. Learning Activities                65%        62%   68%   63%
2. Reorganization of classroom        25%        29%   21%   25%
3. Non-academic                       10%         9%   11%   11%
   3.1. Teacher outside room           7%         7%    8%    8%
   No teacher                          8%         2%    4%    5%
Thank you