
EVALUATION & EVIDENCE IMPROVING
EDUCATIONAL OUTCOMES
WELCOME
SOCIAL INNOVATION FUND
Lois Nembhard
Deputy Director
ABOUT CNCS
• The Corporation for National and Community Service
is an independent federal agency
• Dedicated to improving lives and strengthening
communities by fostering civic engagement through
service and volunteering, and identifying and scaling
effective solutions to community challenges
• The nation's largest grantmaker in support of service
and volunteering; engages more than five million
Americans annually in service to their communities
through programs such as Senior Corps and AmeriCorps
ABOUT THE SIF
The Social Innovation Fund (SIF) is a
program of the Corporation for National and
Community Service (CNCS).
SIF combines public and private resources
to grow the impact of innovative, community-based solutions that have compelling
evidence of improving the lives of people in
low-income communities throughout the
United States.
WHY THE SIF?
“The bottom line is clear: solutions to America’s
challenges are being developed every day at the grass
roots – and government shouldn’t be supplanting
those efforts, it should be supporting those efforts.
“Instead of wasting taxpayer money on programs that
are obsolete or ineffective, government should be
seeking out creative, results-oriented programs … and
helping them replicate their efforts across America.”
- President Obama, June 30, 2009
OUR PROGRAMS
• SIF Classic
• SIF Pay for Success (up to 20% of appropriations)
The SIF can use up to 20% of its appropriations for Pay for Success (PFS): advancing and developing emerging models that direct resources toward interventions that produce measurable outcomes.
WHY THE SIF?
Find what works, make it work for more people.
• $295+ million to 43 grantees
• $600+ million in match funding
• 328 local organizations funded
• 625,000+ individuals benefiting
THE SIF CLASSIC APPROACH
OUR FOCUS AREAS
Youth Development
Economic Opportunity
Healthy Futures
READING PARTNERS
Kristarae Flores
National Deputy Director of
Development
SCALING
[Chart: Students Served per year, growing from roughly 20-25 students at founding in 1999 to about 9,000 in the most recent year shown]
Milestones:
• 1999: YES Reading founded by 3 retired California teachers in East Menlo Park
• 2002: YES Reading expands to a second school site
• 2004: First full-time Executive Director hired
• 2008: Name changed to "Reading Partners" and expansion to 3 schools in LA, the first outside the Bay Area
• 2010: Awarded first AmeriCorps grant and expansion to DC, the first sites outside of CA
• 2011: Reading Partners selected as a Social Innovation Fund recipient
• 2013: 2018 Strategic Plan adopted
Reading Partners has rapidly scaled since 2008.
SCALE + IMPACT
EXPANSION HAS BEEN
RAPID
• We’ve grown from serving
855 students in ‘08-09 to
over 11,000 in ‘15-16
• We now serve 11 states,
14 metro areas, and more
than 160 schools across
the country
OUR IMPACT HAS BEEN
CONSISTENT
[Chart: monthly reading-skill growth by school year (2011-12, 2012-13, 2013-14): about 0.6 months of growth per month prior to Reading Partners enrollment vs. about 1.6 months of growth per month while enrolled in Reading Partners]
• Students have consistently shown average
gains of 1.6 months of reading skills for every
month in our program
• 74% of students narrow their grade-level
equivalency (GE) gap, indicating they have
made progress toward grade-level reading
STATE OF EDUCATION
8.7 MILLION LOW INCOME STUDENTS
READING BELOW GRADE LEVEL
Reading Partners’ Vision
We envision a future where all
children in this nation have the
reading skills they need to reach
their full potential
OUR MISSION IS BIGGER THAN US
[Chart: of 10.6 million low-income K-5 students nationally, 18% read proficiently or above and 82% read below proficient]
Reading Partners’ Mission
To help children become lifelong
readers by empowering
communities to provide
individualized instruction with
measurable results
NATIONAL
IMPACT
GOAL: MAKE A MEANINGFUL IMPACT ON LITERACY ACROSS THE US.
Our first ten years: focus on rapid expansion of our one-on-one tutoring model, which has had significant benefits for individual students in our program.
Our next ten years: aspiration to dramatically expand impact and ultimately have a positive effect on national reading proficiency, by building on success achieved to date and expanding access to proven interventions.
PRE-SIF: READING PARTNERS
• Developed a 3-year strategic plan and wanted rigorous research, but had not set a specific objective of an RCT design
• Engaged in a 3-year matched-pairs quasi-experimental design in 2009 with Dr. Deborah Stipek at the Stanford School of Education; analysis carried out by her doctoral students
• Analysis indicated statistically significant and greater results for Reading Partners students vs. non-Reading Partners matched students using our internal assessment measure. However, due to budget constraints it was not a very rigorous, causal design.
• Internally, we had data from our internal measure indicating improvements, as well as reports from teachers and principals suggesting they were seeing improvements in reading among Reading Partners students.
EVALUATION CROSSROADS
• The MDRC study was funded by EMCF
through our 2011 CNCS Social Innovation
Fund grant
• While positive results were a huge
opportunity, not seeing impact in a
rigorous, public evaluation was an equally
large risk
• We decided to approach the evaluation as a chance to "learn in" to our mission
EMCF + READING PARTNERS
• Funded Evaluation
• Engaged others in the True North Fund
• Placed a member of their staff on our
national board
• Connected us to MDRC
• Provided TA during
planning/communication/dissemination
phases
• Continues to support us financially – enabling
us to replicate the core program and
engage in new work related to strategic
plan goals to help close the early reading
achievement gap
THE MDRC STUDY
General approach and methodology
• The evaluation was conducted by MDRC, a leading, independent educational policy research firm
• The Edna McConnell Clark Foundation (EMCF) funded the MDRC evaluation through its CNCS Social Innovation Fund (SIF) grant to Reading Partners
• The study involved more than 1,200 second- to fifth-graders across 19 schools in three states and is one of the largest random assignment studies ever done for a reading intervention program administered primarily by community volunteers
READING PARTNERS WORKS!
Findings
• The Reading Partners program had a positive and statistically significant impact on student reading proficiency
• Improved all 3 measures of reading proficiency examined: reading comprehension, reading fluency, and sight-word reading
• Impact equal to approximately 2 months of growth in literacy achievement compared to the control group
MDRC’s COST STUDY
Cost methodology
• The MDRC study provides a snapshot cost analysis in 2012 dollars to describe the average value of resources and program costs per student, per program year (see the sketch below)
• All resources used to implement a program (including in-kind contributions) were given a cost, regardless of who provides them
• Volunteer time and transportation were counted as "in-kind" donations or non-cash resources
• The same resource cost estimate method was applied to other literacy services
• All costs reflect an average across six sites
• Costs varied by site depending on factors including number of students served, tutoring sessions, and in-kind costs for volunteers
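A minimal sketch, in Python, of the kind of snapshot cost calculation described above; the function, inputs, and dollar figures are hypothetical assumptions for illustration, not MDRC's actual model or data.

    # Illustrative per-student cost calculation that prices all resources,
    # including in-kind volunteer time and transportation (hypothetical values).
    def cost_per_student(cash_costs, volunteer_hours, value_per_hour,
                         volunteer_transport, students_served):
        """Average value of all resources per student, per program year."""
        in_kind = volunteer_hours * value_per_hour + volunteer_transport
        total_resources = cash_costs + in_kind
        return total_resources / students_served

    # Hypothetical site: $150,000 in cash costs, 8,000 volunteer hours valued
    # at $25/hour, $10,000 in volunteer transportation, 100 students served.
    print(cost_per_student(150_000, 8_000, 25.0, 10_000, 100))  # -> 3600.0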
CONSISTENTLY EFFECTIVE
Findings
• Effective for a wide variety of students from
different grades and baseline reading levels, for
male and female students, different ethnicities
and for non-native English speakers
• Reading Partners’ volunteer tutoring program
was implemented with high fidelity
How will we build on the MDRC evaluation learning?
CONTINUOUS LEARNING
• We have a second SIF evaluation underway in Colorado
– The Mile High United Way SIF impact evaluation is in year 3 of 3, with implementation and impact results expected in 2017
• We are exploring additional research that would benefit both Reading Partners and the broader fields of early literacy, tutoring and social-emotional development, education reform, national service, civic engagement, etc.
SCALE + IMPROVEMENT
Target Population Redefinition
While innovation will play an important part in achieving this vision, this effort will focus on expanding and improving our current program to boost more students into proficiency.
[Chart: beginning-of-year reading level vs. grade level (grades 1-4), comparing the current target population and a possible new target population relative to the proficiency line]
• Strategically reframe student target population to
serve those furthest behind earlier, and focus on
pushing fourth graders into proficiency
• Research-based changes and improvements to
curriculum
• Serve more students per site with higher dosage
NATIONAL IMPACT
TODAY: SCHOOL-LEVEL IMPACT
• The average RP elementary school has ~450 students and ~70-90 students per grade
• Serving up to 80 students in a school over a multi-year period could lead to notable gains in school-level reading proficiency rates by the time students reach 4th grade

WITHIN 4 YEARS: DISTRICT-LEVEL IMPACT
• Focusing on districts with supportive conditions could enable Reading Partners to achieve district-level scale and increase district-wide 4th grade reading proficiency amongst low-income students by ten (10) percentage points

WITHIN 5-10 YEARS: STATE-LEVEL IMPACT
• Success in districts could draw the attention of policymakers and ultimately change funding flows at the state level
• More funding for evidence-based literacy interventions could support more students across states to achieve 4th grade reading proficiency

>10 YEARS: NATIONAL-LEVEL IMPACT
• As more states support evidence-based literacy interventions, more youth across the nation could have access to the supports required to be proficient in reading by 4th grade
• This success could ultimately result in increased federal funding flows as well
INVESTMENT IN INNOVATION
READING PARTNERS TODAY
• +11,000 students
• +14,000 volunteers
• +160 school partners
• 9 states (including DC)
• 12 metro areas
• $22M operating budget
• Student-level outcomes: 75% narrow achievement gaps; MDRC evidence of significant outcomes

PLANNED GROWTH FUND INNOVATIONS
• Deep district partnerships to saturate target areas
• Piloting additional interventions to increase proficiency: summer programming; engaging populations in addition to community volunteers as tutors; multi-year support for high-need students
• Licensing our curriculum to reach thousands more students
• Influencing funding for what works, sharing best practices, and shaping policy

READING PARTNERS IN 2018
• +20,000 students
• +320 schools
• $30M operating budget
• 90% of students narrow achievement gaps
• Increase students reading proficiently by >10% in target areas
UNIVERSITY OF MICHIGAN
Robin Jacob
Research Associate Professor
University of Michigan
EVALUATION DESIGN
• Reading Partners was competitively selected
for the EMCF SIF known as the True North
Fund
• Evaluation period: School Year 2012-2013
• Study sample
– Grades 2-5
– Eligible for Reading Partners
– 1,250 students randomly assigned within 19 schools (see the sketch below)
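A minimal sketch of within-school (blocked) random assignment, to picture how a design like this allocates students; the even split, roster layout, and names are assumptions for illustration, not the study's actual procedure.

    import random

    # Illustrative within-school random assignment: each school's eligible
    # students are split at random between the program and control groups.
    def assign_within_schools(students_by_school, seed=0):
        rng = random.Random(seed)
        assignment = {}
        for school, students in students_by_school.items():
            shuffled = list(students)
            rng.shuffle(shuffled)
            half = len(shuffled) // 2
            for student in shuffled[:half]:
                assignment[student] = "program"
            for student in shuffled[half:]:
                assignment[student] = "control"
        return assignment

    # Hypothetical roster of eligible 2nd-5th graders in two schools.
    roster = {"School 1": ["s1", "s2", "s3", "s4"], "School 2": ["s5", "s6", "s7"]}
    print(assign_within_schools(roster))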
RESEARCH QUESTIONS
Implementation: In what context was Reading
Partners implemented and was it implemented
with fidelity?
Impact: What effect does the Reading Partners
program have on students who participate?
– Impact on receipt of services
– Impact on reading proficiency
Cost: What resources are needed to implement the program? What proportion of these costs is borne by the school?
OVERVIEW: IMPLEMENTATION
Average Student's Experience in Reading Partners

Program Characteristic                            Program Group   School-Level Minimum   School-Level Maximum
Number of sessions per week                            1.55              1.11                  1.76
Length of participation in program (weeks)            28.13             24.24                 32.01
Student attendance rate (%)                           78.76             55.75                 88.98
Number of tutors assigned                              2.52              1.67                  3.60
Duration of each tutoring relationship (weeks)        19.81             11.20                 26.01
Scheduled sessions per week with primary tutor:
  Scheduled once per week (%)                         76.38             39.58                 91.24
  Scheduled twice per week (%)                        23.62              8.76                 60.42

Sample size: 594
FINDINGS: FIDELITY OF
IMPLEMENTATION
Fidelity Scores of Study Schools
[Chart: fidelity scores for the 19 study schools (A-S), ranging from 12 to 22; average score = 17.9]
Fidelity components and maximum scores: Regular 1:1 tutoring (max 3), Space and materials (max 5), Data-driven instruction (max 5), Training (max 5), Supervision and support (max 7)
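Summing the component maxima listed above gives the scale for these scores (a reading of the figure, not a statement from the report):

    \[ 3 + 5 + 5 + 5 + 7 = 25, \qquad \frac{17.9}{25} \approx 72\%\ \text{of the maximum score, on average} \]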
FINDINGS: TREATMENT
CONTRAST
• 65% of students in the control group were also receiving supplemental services
• 21% of students in the control group received one-on-one tutoring
• The impact of the program is therefore the impact relative to other supplemental service receipt, not the impact of Reading Partners compared to no intervention (see the note below)
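One way to write this contrast (a standard framing, not wording from the report):

    \[ \widehat{\text{Impact}} \;=\; \bar{Y}_{\text{program group}} \;-\; \bar{Y}_{\text{control group}} \]

where the control-group mean already reflects the supplemental services many control students received, so the estimate answers "what does Reading Partners add on top of usual services?" rather than "what does it do compared to nothing?"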
TIME SPENT IN READING
INSTRUCTION
[Chart: Time Spent in Reading Instruction and Supplemental Services (weekly minutes), program group vs. control group, broken down into Reading Partners, supplemental services excluding Reading Partners, in-class one-on-one instruction, and in-class group instruction; the program-group bar carries a "48 minutes" annotation]
FINDINGS
Reading Partners had a positive impact on
all three measures of reading proficiency
• Sight word reading
• Fluency
• Reading comprehension
Effect sizes around .10 standard deviations
• 1.5 to 2 months of additional growth compared to the control group (a rough conversion follows this list)
No statistically significant impact on teacher
reports of academic behavior or
performance
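A rough sense of how an effect size of this magnitude translates into months of growth, assuming (as a benchmark not stated in this deck) that a typical student gains about 0.5 standard deviations of reading achievement over a 9-month school year:

    \[ \frac{0.10\ \text{SD}}{0.5\ \text{SD per school year}} \times 9\ \text{months} \approx 1.8\ \text{months} \]

which is consistent with the 1.5 to 2 months of additional growth reported above.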
VARIATION: STUDENT
CHARACTERISTICS
The program was effective for a wide range
of students
• Boys and girls
• English language learners and fluent English
speakers
• Different baseline reading abilities
The program appears to be particularly
effective for those beginning the study with
the weakest reading skills
VARIATION: INCOMING ABILITY
Reading Partners Improves Reading Skills for the Lowest-Achieving Students
[Chart: effect sizes on reading comprehension, sight word efficiency, and fluency, comparing students in the bottom quartile of incoming reading ability with the top 3 quartiles; effects are larger for the bottom quartile]
COST OF OTHER READING
SERVICES
Site-Level Costs of Reading Partners and Other Supplemental Services

         School Contribution            Total Resources               Contribution Provided
         per Student ($)                per Student ($)               by School (%)
Site     Reading     Other Suppl.       Reading     Other Suppl.      Reading     Other Suppl.
         Partners    Services           Partners    Services          Partners    Services
Site A      690        1,840              3,450       1,840              20          100
Site B      520        1,850              3,420       2,230              15           83
Site C      940        2,680              3,570       2,680              26          100
Site D    1,270        1,040              5,190       1,050              25           99
Site E      660        1,310              4,210       1,980              16           66
Site F      480        4,890              2,740       4,890              17          100
Pooled      710        1,700              3,610       1,780              20           96

SOURCES: MDRC calculations from cost data.
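Reading the pooled row, the "Contribution Provided by School (%)" column appears to be the school contribution divided by total resources per student (an interpretation of the table, not text from the report):

    \[ \frac{710}{3{,}610} \approx 20\%\ \text{(Reading Partners)}, \qquad \frac{1{,}700}{1{,}780} \approx 96\%\ \text{(other supplemental services)} \]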
PROGRAM EVALUATION:
THINGS TO CONSIDER
• Evaluation should not be a simple thumbs-up/thumbs-down assessment
– A way to learn about your program
• What works well and what doesn’t?
• For whom does your program work and in what
context?
• If you don’t find positive impacts, what would you want
to know?
• What is your program’s theory of change?
– This can help clarify what your key evaluation
questions should be
A FRAMEWORK FOR STUDYING
PROGRAM EFFECTIVENESS
(Weiss, Bloom and Brock, 2014)
EVALUATION QUESTIONS
• Measure implementation and fidelity
– What are the key elements of your program that
should be measured?
• Understand service contrast
– What will the participants in the control group be
getting?
• Measure variation in outcomes
– What works, for whom and in what context
• Consider cost
– A program with a small impact but a low cost can still be cost-effective (see the illustration below)
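One simple way to weigh impact against cost is impact per dollar. A minimal sketch in Python, using hypothetical programs and numbers rather than figures from the MDRC study:

    # Illustrative cost-effectiveness comparison: effect size (in standard
    # deviations) purchased per $1,000 spent per student (hypothetical numbers).
    def impact_per_thousand_dollars(effect_size_sd, cost_per_student):
        return effect_size_sd / (cost_per_student / 1_000)

    print(impact_per_thousand_dollars(0.10, 700))    # low-cost program: ~0.14 SD per $1,000
    print(impact_per_thousand_dollars(0.25, 4_000))  # higher-cost program: ~0.06 SD per $1,000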
WORKING WITH A 3RD PARTY
EVALUATOR
• Communication is key to an effective
partnership
– Set up regular meetings
– Work together to specify the evaluation questions
and an analysis plan
• Agree to stick to the plan
• Consider what you would want to know if the results are
not positive
– Ask to see early findings
– Make sure results are easily understood by the
general public and your constituents
QUESTIONS & ANSWERS
CLOSING: 3-2-1:
3 ideas, 2 insights, 1 question