
Minnesota Reading Corps
Final Evaluation
2007-2008
Statewide Report
Prepared By:
Kerry Bollman, SSP, NCSP
Academic Collaborative Planner, Reading Center Director
Saint Croix River Education District
[email protected]
Benjamin Silberglitt, PhD
Senior Consultant, Assessment and Implementation
Technology and Information Educational Services
[email protected]
David Heistad, PhD
Director, Research Evaluation and Assessment
Minneapolis Public School District
[email protected]
Table of Contents
Background of Minnesota Reading Corps ..... 3
Evaluation Design ..... 4
 Assessment Data Collection ..... 6
Evaluation Report ..... 7
 Is the goal of the Minnesota Reading Corps valid in terms of the expectations of student learning? ..... 7
 What is the current impact of the MRC on the state of Minnesota in terms of students and programs receiving support? ..... 9
 Are the data collection tools being used valid and reliable to determine whether children are attaining the literacy-learning goal? ..... 11
 Are the Members implementing assessments correctly? ..... 19
 Do the interventions used with children have a research base? ..... 20
 Are the Members implementing the interventions correctly? ..... 25
 Is the performance of students in terms of their literacy improvement consistent with expectations? ..... 26
    Pre-Kindergarten performance ..... 26
    Pre-Kindergarten matched sample analysis ..... 32
    K-3 performance ..... 36
    K-3 pilot analysis on MCA II outcomes ..... 41
 Are the organizations with which the MRC is working changing to adopt the practices of the MRC? ..... 45
 What is the impact of the MRC experience on the AmeriCorps Members? ..... 47
References ..... 49
Background of Minnesota Reading Corps
Minnesota Reading Corps (MRC) is an AmeriCorps program that provides trained literacy tutors
(Members) for children age three to grade three. Some MRC Members work with preschoolers and
focus on integrating talking, reading, and writing into all activities. Other Members provide
supplemental literacy skills tutoring for children in kindergarten to third grade. Still others recruit,
train, and manage community volunteers to expand the capacity of the program.
MRC Members and volunteers from the community are trained in specific research-based, leveled
literacy instructional protocols, and are supported by expert coaches. Members use reliable, valid
assessment tools to monitor student progress on a regular basis, and with help from their expert
coaches, use data from assessments to inform tutoring strategies for each student. Use of specific
research-based instructional techniques and technically adequate assessment tools for decision
making makes the MRC program unique, and highly sought after, across the literacy landscape.
See the body of this report for additional information regarding the instruction and assessment tools
used.
The vision of the Minnesota Reading Corps is to provide a permanent part of Minnesota’s solution
to childhood illiteracy by impacting children, AmeriCorps members and communities as follows:
 All children in MN, ages 3 to grade 3, who qualify for MRC, will have access to MRC and will
meet reading standards by third grade.
 AmeriCorps members, through the training, development and service opportunity provided by
MRC, will pursue education related careers and/or continue to be ambassadors for children's
literacy throughout their lives.
 Schools and community institutions/organizations, through their experiences with MRC, will
understand, adopt, and promote the MRC methods for increasing literacy.
Evaluation Design
The evaluation of the Minnesota Reading Corps (MRC) program has multiple purposes that will be
explained below. The student performance results reported will be an aggregation of the formative
evaluation information gathered by AmeriCorps Members. A variety of tools are used to gather the
student performance data depending on the age of the student and the literacy indicators deemed
crucial for ultimate grade-level reading.
The Evaluation is designed to address the following questions:
1. Is the Goal of the Minnesota Reading Corps valid in terms of the expectations of student
learning?
The Evaluation will consist of a literature review to address this question.
2. What is the current impact of the MRC on the state of Minnesota in terms of students and
programs receiving support?
The Evaluation will address:
i. The number of children receiving MRC support
ii. Demographics of program participants
3. Are the data collection tools being used valid and reliable to determine whether children are
attaining the literacy-learning goal?
The Evaluation will address:
i. The validity and reliability of the tools being used to measure student learning; and,
ii. The fidelity of the AmeriCorps Members' use of those tools.
4. Do the interventions used with children have a research base and are Members
implementing the interventions correctly?
The Evaluation will address:
i. The research base of the instructional interventions being used with children; and,
ii. The fidelity of the AmeriCorps Members' implementation of those interventions.
5. Is the performance of students in terms of their literacy improvement consistent with
expectations?
The Evaluation will address:
i. The performance of children in pre-k programs during the year;
ii. The performance of children in k-3 programs during the year; and,
iii. To the extent that data are available, the performance of children that have been in
MRC programs over a period of more than one year.
6. Are the organizations with which the MRC is working changing to adopt the practices of the
MRC?
The Evaluation will address the level of “systems change” that is occurring at the sites involved
with the MRC to adopt the basic model of the MRC.
7. What is the impact of the MRC experience on the AmeriCorps Members?
The Evaluation will seek to determine how this participation in community service has
impacted the persons serving as AmeriCorps Members.
Assessment Data Collection
The following assessment data will be collected by Minnesota Reading Corps (MRC) Members for
all participating MRC students during the 2007-2008 school year:
Pre-school Programs

Benchmark windows: Fall (Oct 1-12), Winter (Jan 14-25), Spring (April 21 - May 2)

Age 3 on or before Sept 1st*
   Each window: IGDI Rhyming, IGDI Picture Naming, IGDI Alliteration

Age 4 on or before Sept 1st
   Each window: IGDI Rhyming, IGDI Picture Naming, IGDI Alliteration,
   Letter Naming Fluency, Letter Sound Fluency

Age 5 on or before Sept 1st but not enrolled in Kindergarten
   Each window: IGDI Rhyming, IGDI Picture Naming, IGDI Alliteration,
   Letter Naming Fluency, Letter Sound Fluency
Program participants are strongly encouraged, but not required, to collect benchmark data on
three-year-old students in classrooms served by Minnesota Reading Corps Members.
K-3 Programs

Benchmark windows: Fall (Sept 17-28), Winter (Jan 7-18), Spring (May 12-23)

Kindergarten
   Fall: Letter Naming Fluency, Letter Sound Fluency
   Winter: Letter Naming Fluency, Letter Sound Fluency, Nonsense Word Fluency
   Spring: Letter Naming Fluency, Letter Sound Fluency, Nonsense Word Fluency

Grade 1
   Fall: Letter Naming Fluency, Letter Sound Fluency, Nonsense Word Fluency
   Winter: Nonsense Word Fluency, Oral Reading Fluency (3 passages)
   Spring: Oral Reading Fluency (3 passages)

Grade 2
   Fall: Oral Reading Fluency (3 passages)
   Winter: Oral Reading Fluency (3 passages)
   Spring: Oral Reading Fluency (3 passages)

Grade 3
   Fall: Oral Reading Fluency (3 passages)
   Winter: Oral Reading Fluency (3 passages)
   Spring: Oral Reading Fluency (3 passages)
All Programs
End of Year MRC Member Surveys
End of Year MRC Site Supervisor and Internal Coach Surveys
Evaluation Report
Is the Goal of the Minnesota Reading Corps valid in terms of the expectations of student
learning?
There is near universal agreement in Minnesota and the United States that teaching children to read
is of critical importance to our society at large and to each child individually. Rod Paige, former US
Secretary of Education (2001-2005) wrote “The child who can read on grade level will have a
reasonable chance at every opportunity that life sends his way. The child who cannot read will be in
trouble every step of his life until that handicap is lifted from his shoulders.” In their important
work, Preventing Reading Difficulties in Young Children (1998), Catherine Snow and her colleagues
write:
“Reading is essential to success in our society. The ability to read is highly valued and
important for social and economic advancement….In a technological society, the demands
for higher literacy are ever increasing, creating more grievous consequences for those who
fall short.”
Research on effective methods for promoting childhood literacy has been conducted for many
decades, with the period beginning in the 1980s proving particularly productive in identifying the
factors that support literacy gains as well as those that inhibit them (Rayner, Foorman,
Perfetti, Pesetsky, & Seidenberg, 2001). This research has also unequivocally demonstrated that
providing strong early language and literacy experiences in both the pre-school and early elementary
years, along with early intervention for any reading difficulties within these first few years of school,
is the most successful and efficient method for ensuring that large numbers of children become
proficient readers (Vellutino et al., 1998; Torgesen et al., 1999; Torgesen et al., 2003).
The work of promoting literacy for all children moved boldly to the political forefront most notably
within the United States when the federal No Child Left Behind Act of 2001 (PL 107-110) was
passed, requiring states to set reading standards that determine whether a child has attained adequate
reading skills by third grade. As a result of this legislation, there has been a rapid development of
statewide assessment across the country, and each state now has explicit standards for third grade
level reading, along with an assessment tool that further operationalizes these state standards in the
form of test questions students must pass to demonstrate proficiency.
In the state of Minnesota, the assessment tool used to determine reading proficiency in third grade
and beyond is the Minnesota Comprehensive Assessment II, taken annually by students each spring.
Each following fall, results of student performance statewide are broadly advertised, sharing the
percent of students attending each school earning proficient scores. Effectiveness of educational
programming within schools may be ultimately evaluated based on the extent to which students
meet these standards.
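The statewide reporting described above reduces to a simple aggregation: for each school, the share of tested students whose scores meet the proficiency standard. A minimal sketch in Python (the school names, scores, and the passing score of 350 are all hypothetical illustrations, not actual MCA II values):

```python
from collections import defaultdict

def percent_proficient(results, passing_score):
    """Aggregate (school, score) pairs into percent proficient per school."""
    tested = defaultdict(int)   # students tested per school
    passed = defaultdict(int)   # students at or above the standard
    for school, score in results:
        tested[school] += 1
        if score >= passing_score:
            passed[school] += 1
    return {s: round(100.0 * passed[s] / tested[s], 1) for s in tested}

# Hypothetical scores; 350 is an illustrative cut-off, not the MCA II standard.
results = [("Lincoln", 362), ("Lincoln", 341), ("Lincoln", 355),
           ("Oak Grove", 349), ("Oak Grove", 371)]
print(percent_proficient(results, 350))
```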
A primary goal of the Minnesota Reading Corps (MRC) is to ensure that all children in MN, ages 3
to grade 3, who qualify for MRC, will have access to MRC and will meet reading standards by third
grade based on the Minnesota Comprehensive Assessment II.
In order to assess progress toward this primary goal of the Minnesota Reading Corps program, a
plan has been designed to track and evaluate student progress from age 3 to grade 3. Fluency based
measures of pre-literacy, early literacy, and oral reading skills were selected as appropriate for each
age level. These measures are empirically supported by over 3 decades of extensive research
documenting their impressive statistical validity and reliability for use in determining current literacy
skills of children, and in predicting future reading success. References to this body of research are
listed later in this report. A second significant benefit of the chosen assessment materials is that
outcomes on these literacy assessments have documented predictive and concurrent validity with the
Minnesota Comprehensive Assessment II (MCA II) (Hintze & Silberglitt, 2005; Silberglitt et al.,
2005; Silberglitt et al., 2006).
As a result of the strong correlations between performance on the selected fluency measures and on
the MCA II, a series of cut scores has been identified for the fluency measures used in the MRC
project. These cut scores, or target scores, define levels of performance on the fluency measures
that strongly predict future success on the grade 3 Minnesota Comprehensive Assessment II. A cut
score defines the critical score such that students who are at or above the cut score are considered to
be proficient in their developing skills. These cut scores can then act as a benchmark against which
teachers can quickly judge students’ reading proficiency. Students who are below target at a given
grade and season (fall, winter, or spring) are determined to be in need of additional support
(Silberglitt & Hintze, 2005; Bollman, Silberglitt, & Gibbons, 2007). These cut scores are used by the
Minnesota Reading Corps program to identify students who are at increased risk for not passing the
grade three MCA II for participation in the MRC program. The use of fluency based target scores
to predict which students may be at risk for not passing the MCA II in the future is powerful, as it
allows school systems to intervene in the early grades when these interventions are more likely to be
effective (Kame’enui, 1993). At the Pre-K level, all children in participating classrooms receive a
literacy-rich instructional environment, and have access to more intentional and intensive supports
as needed. In the K-3 system, students at risk participate in regular supplemental tutoring in
addition to the core instruction provided by the school. For all students participating in the MRC
program, frequent fluency measures data are collected to determine progress toward grade level cut
scores. Students exit the K-3 program when the fluency data indicate that they are on track to meet
spring targets, and therefore have a strong likelihood of passing the MCA II in grade 3.
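The screening and exit logic described above can be sketched as a comparison of each student's fluency score against a seasonal target. The target values below are placeholders for illustration only, not the actual MRC cut scores:

```python
# Hypothetical (grade, season) -> target score table; real MRC cut scores differ.
TARGETS = {(1, "winter"): 50, (1, "spring"): 71}

def needs_support(grade, season, score):
    """A student below the seasonal target is flagged for supplemental tutoring."""
    return score < TARGETS[(grade, season)]

def ready_to_exit(grade, score):
    """K-3 students exit when data indicate they have reached the spring target."""
    return score >= TARGETS[(grade, "spring")]

print(needs_support(1, "winter", 42))   # below target: flagged for tutoring
print(ready_to_exit(1, 75))             # at/above spring target: on track to exit
```

Keeping the targets in a single lookup table mirrors the way cut scores are defined per grade and season in the program.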
There is some precedent in our country for a model such as this one. A recent review of the
literature strongly suggests that programs such as the Minnesota Reading Corps, which harness the
power of community service to provide reading support in school settings, have the potential to be
successful (Erlbaum, Vaughn, Hughes, & Moody, 2000). The success of these programs appears to
be related to a series of attributes that are very consistent with the defining features of the MRC
program: (a) specific, well-defined skills targeted for intervention, (b) a generous amount of
quality education and support provided to community partners, (c) high levels of investment among
community members and the integrity with which they apply tutoring procedures, and (d) a high
level of commitment to assessing intervention outcomes (Johnston et al., 1998; Vadasy et al., 1997;
Power et al., 2004).
What is the Current Impact of the MRC on the State of Minnesota in Terms of Students and
Programs Receiving Support?
In the tables below, the number of Minnesota Reading Corps Members, full or part time, serving
during the 2007-2008 school year, who collected data for students and submitted the data for
evaluation is recorded, along with the number of students receiving MRC services for whom data
are recorded. Numbers of participating students are compiled according to the following criteria:
 Number of students for whom at least 1 assessment data point was collected
 Number of preschool students with complete data from 3 benchmark windows
 Number of K-3 students with at least 5 consecutive weeks of data on at least 1 measure
 Number of K-3 students with at least 10 consecutive weeks of data on at least 1 measure
 Number of K-3 students with at least 20 consecutive weeks of data on at least 1 measure
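The K-3 inclusion criteria above depend on counting consecutive weeks of data on a single measure. One way that check might be implemented (the week numbers shown are illustrative, not drawn from the evaluation data):

```python
def longest_consecutive_run(weeks):
    """Return the longest run of consecutive week numbers that have data."""
    run = best = 0
    prev = None
    for week in sorted(set(weeks)):
        # Extend the run if this week directly follows the previous one.
        run = run + 1 if prev is not None and week == prev + 1 else 1
        best = max(best, run)
        prev = week
    return best

# Weeks in which a score was recorded for one hypothetical student:
weeks_with_data = [3, 4, 5, 6, 7, 10, 11, 12, 13, 14, 15]
print(longest_consecutive_run(weeks_with_data) >= 5)   # meets the 5-week criterion
```

Using the longest run, rather than the total number of weeks, matches the "consecutive weeks" wording of the criteria.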
Table 1: Pre-Kindergarten Participation

Region          Number of Members   1 or More Assessments   3 Assessments
Duluth                  3                    34                   20
Grand Rapids           15                   146                  100
Metro                  24                   427                  203
Moorhead               13                   204                   92
Rochester               6                   109                   40
Saint Cloud            14                   148                   97
Total                  75                  1068                  552
Table 2: Kindergarten-Grade 3 Participation
(Assessment counts reflect at least 1 measure: LSF, NWF, or ORF)

Region          Number of    1 or More      5 or More      10 or More     20 or More
                Members      Assessments    Assessments    Assessments    Assessments
Duluth              20           573            555            412            207
Grand Rapids         8           203            193            122             56
Metro               46          1475           1358            916            387
Moorhead            12           364            338            215             80
Rochester           10           233            191            134             56
Saint Cloud          7           219            203            112             38
Total              103          3067           2838           1911            824
In order to more fully describe the population of children served by the Minnesota Reading Corps
program, data regarding gender, ethnicity, special education entitlement and primary language
spoken were collected by Reading Corps Members. These demographic data are summarized in the
table below. Regarding special education entitlement, since the pre-k program serves all students in
participating classrooms, rates of entitled students reflect the naturally occurring rates of entitlement
in participating sites. At the K-3 level, it is more rare for a student entitled to receive special
education services to participate in Reading Corps as they typically have a specific intensive reading
instructional program already in place.
Table 3: Pre-Kindergarten – Grade 3 Participant Demographic Data

                        Pre-K                             K-3
Gender                  51% Male                          49.39% Male
                        49% Female                        44.61% Female
                        0% Unknown                        6.00% Unknown
Ethnicity               18% African American              13.34% African American
                        7% American Indian                2.77% American Indian
                        2% Asian                          2.00% Asian
                        13% Hispanic                      4.35% Hispanic
                        7% Multiple                       2.53% Multiple
                        0% Pacific Islander               0.11% Pacific Islander
                        5% Unknown                        28.99% Unknown
                        49% White                         45.17% White
Special Education       88% General Education             96.81% General Education
Entitlement             12% Students with IFSPs           1.68% Title 1
                                                          1.51% Students with IEPs
Primary Language        77% English as Primary Language   94.59% English as Primary Language
                        23% ELL                           5.41% ELL
Are the Data Collection Tools Being Used Valid and Reliable to Determine Whether
Children are Attaining the Literacy-Learning Goal?
As listed above, the assessment tools used to determine the literacy progress of MRC-participating
students include the following measures:
 Picture Naming Fluency
 Alliteration Fluency
 Rhyming Fluency
 Letter Naming Fluency
 Letter Sound Fluency
 Nonsense Word Fluency
 Oral Reading Fluency
These tools were selected for use in the MRC because of their well-established statistical reliability
and validity for screening and progress monitoring purposes. Picture Naming, Alliteration, and
Rhyming measures were developed through the University of Minnesota, and are commonly
referred to as “Individual Growth and Development Indicators” (IGDIs) of literacy. Letter
Naming, Letter Sounds, and Nonsense Words are measures of early literacy skills thoroughly
researched by many groups, but most famously packaged by two assessment programs: DIBELS
and AIMSweb. Oral Reading Fluency provides an assessment of connected text reading. Early and
ongoing research on this measure has also been conducted at the University of Minnesota. All these
measures fit under the umbrella of “Curriculum-Based Measurement” (CBM), and are fluency based
assessments, meaning that students are given an unlimited opportunity to respond to items within a
fixed amount of time, and the number of correct responses is counted. The information that follows
summarizes empirical findings related to the statistical reliability and validity of the measures used in
the Minnesota Reading Corps.
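This fluency-based format (unlimited items, a fixed time window, a count of correct responses) can be illustrated with a small scoring sketch; the response data and the one-minute limit below are hypothetical examples, not actual administration records:

```python
def score_fluency(attempts, time_limit_seconds=60):
    """Count correct responses given within a fixed time window.

    Each attempt is (seconds_elapsed, correct).  The item supply is
    unlimited; only the clock ends the assessment, so responses after
    the time limit do not count toward the score.
    """
    return sum(1 for elapsed, correct in attempts
               if correct and elapsed <= time_limit_seconds)

# Hypothetical Letter Sound Fluency administration: (time of response, correct?)
attempts = [(2, True), (5, True), (9, False), (14, True), (58, True), (63, True)]
print(score_fluency(attempts))   # the response at 63 s falls outside the window
```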
Picture Naming Fluency:
r= .44 to .78 1 month alternate form reliability
r=.67 test-retest 3-week reliability
r=.47 to .75 with PPVT-3 and .63 to .81 with PLS-3
r=.32 to .37 with DIBELS Letter Naming Fluency and .44 to .49 with DIBELS Initial Sound
Fluency
r=.41 (longitudinal) and .60 (cross sectional) between scores and chronological age, with correlations
of .63, .32, and .48 for typically developing, HeadStart, and ECSE populations respectively
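Coefficients such as the alternate-form and test-retest reliabilities above are Pearson correlations between paired administrations of a measure. A minimal longhand computation, using made-up score pairs for two hypothetical forms:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired scores from two test forms."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical Picture Naming scores on form A and form B for six children:
form_a = [12, 18, 25, 9, 22, 15]
form_b = [14, 17, 27, 10, 20, 16]
print(round(pearson_r(form_a, form_b), 2))
```

With real data one would typically use scipy.stats.pearsonr; the longhand form is shown only to make the computation explicit.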
Sources:
McConnell, S.R., Priest, J.S., Davis, S.D., & McEvoy, M.A. (2002). Best Practices in Measuring
Growth and Development for Preschool Children, In A. Thomas & J. Grimes (Eds.), Best Practices
in School Psychology (4th ed., Vol. 2, pp. 1231-1246). Washington, DC: National Association of School
Psychologists.
Missall, K.N., & McConnell, S.R. (April, 2004). Psychometric Characteristics for Individual Growth
and Development Indicators: Picture Naming, Rhyming, and Alliteration (Technical Report).
Minneapolis, MN: University of Minnesota. Accessed online at
http://ggg.umn.edu/techreports/ecri_report8.html July 27, 2004.
Missall, K.N., et al. (2007). Examination of Predictive Validity of Preschool Early Literacy Skills.
School Psychology Review, 36(3), 433-452.
Missall, K. N., McConnell, S. R., & Cadigan, K. (2006). Early literacy development: Skill growth and
relations between classroom variables for preschool children. Journal of Early Intervention, 29, 1-21.
Phaneuf, R. L., & Silberglitt, B. (2003). Tracking preschoolers' language and pre-literacy
development using a general outcome measurement system. Topics in Early Childhood Special
Education, 23, 114-123.
Priest, J. S., McConnell, S. R., Walker, D., Carta, J. J., Kaminski, R. A., McEvoy, M. A., Good, R.,
Greenwood, C. R., & Shinn, M. R. (2001). General growth outcomes for young children:
Developing a foundation for continuous progress measurement. Journal of Early Intervention, 24,
163-180.
Wackerle, Alisha K. (2007). Test review: Selection of general growth outcomes for children between
birth and age eight. Assessment for Effective Intervention. 33(1), 51-54.
Alliteration:
r= .46 to .80 test-retest reliability over 3 weeks
r= .40 to .57 with PPVT-3
r=.34 to .55 with Clay’s Concepts about Print
r=.75 to .79 with TOPA
r=.39 to .71 with DIBELS Letter Naming Fluency
r=.61 with chronological age
Sources:
McConnell, S.R., Priest, J.S., Davis, S.D., & McEvoy, M.A. (2002). Best Practices in Measuring
Growth and Development for Preschool Children, In A. Thomas & J. Grimes (Eds.), Best Practices
in School Psychology (4th ed., Vol. 2, pp. 1231-1246). Washington, DC: National Association of School
Psychologists.
Missall, K.N., & McConnell, S.R. (April, 2004). Psychometric Characteristics for Individual Growth
and Development Indicators: Picture Naming, Rhyming, and Alliteration (Technical Report).
Minneapolis, MN: University of Minnesota. Accessed online at
http://ggg.umn.edu/techreports/ecri_report8.html July 27, 2004.
Missall, K.N., et al. (2007). Examination of Predictive Validity of Preschool Early Literacy Skills.
School Psychology Review, 36(3), 433-452.
Missall, K. N., McConnell, S. R., & Cadigan, K. (2006). Early literacy development: Skill growth and
relations between classroom variables for preschool children. Journal of Early Intervention, 29, 1-21.
Phaneuf, R. L., & Silberglitt, B. (2003). Tracking preschoolers' language and pre-literacy
development using a general outcome measurement system. Topics in Early Childhood Special
Education, 23, 114-123.
Priest, J. S., McConnell, S. R., Walker, D., Carta, J. J., Kaminski, R. A., McEvoy, M. A., Good, R.,
Greenwood, C. R., & Shinn, M. R. (2001). General growth outcomes for young children:
Developing a foundation for continuous progress measurement. Journal of Early Intervention, 24,
163-180.
VanDerHeyden, A.M., Snyder, P.A., Broussard, C., & Ramsdell, K. (2007). Measuring Response to
Early Literacy Intervention with Preschoolers at Risk. Topics in Early Childhood Special Education
27(4), 232-249.
Wackerle, Alisha K. (2007). Test review: Selection of general growth outcomes for children between
birth and age eight. Assessment for Effective Intervention. 33(1), 51-54.
Rhyming:
r= .83 to .89 test-retest reliability over 3 weeks
r= .56 to .62 with PPVT-3
r= .54 to .64 with Clay’s Concepts about Print
r= .44 to .62 with TOPA
r= .44 to .63 with IGDI Picture Naming and .43 with IGDI Alliteration
r=.48 to .59 with DIBELS Letter Naming Fluency
r=.44 to .68 with DIBELS Initial Sound Fluency
r= .46 with chronological age
Sources:
McConnell, S.R., Priest, J.S., Davis, S.D., & McEvoy, M.A. (2002). Best Practices in Measuring
Growth and Development for Preschool Children, In A. Thomas & J. Grimes (Eds.), Best Practices
in School Psychology (4th ed., Vol. 2, pp. 1231-1246). Washington, DC: National Association of School
Psychologists.
Missall, K.N., & McConnell, S.R. (April, 2004). Psychometric Characteristics for Individual Growth
and Development Indicators: Picture Naming, Rhyming, and Alliteration (Technical Report).
Minneapolis, MN: University of Minnesota. Accessed online at
http://ggg.umn.edu/techreports/ecri_report8.html July 27, 2004.
Missall, K.N., et al. (2007). Examination of Predictive Validity of Preschool Early Literacy Skills.
School Psychology Review, 36(3), 433-452.
Missall, K. N., McConnell, S. R., & Cadigan, K. (2006). Early literacy development: Skill growth and
relations between classroom variables for preschool children. Journal of Early Intervention, 29, 1-21.
Phaneuf, R. L., & Silberglitt, B. (2003). Tracking preschoolers' language and pre-literacy
development using a general outcome measurement system. Topics in Early Childhood Special
Education, 23, 114-123.
Priest, J. S., McConnell, S. R., Walker, D., Carta, J. J., Kaminski, R. A., McEvoy, M. A., Good, R.,
Greenwood, C. R., & Shinn, M. R. (2001). General growth outcomes for young children:
Developing a foundation for continuous progress measurement. Journal of Early Intervention, 24,
163-180.
VanDerHeyden, A.M., Snyder, P.A., Broussard, C., & Ramsdell, K. (2007). Measuring Response to
Early Literacy Intervention with Preschoolers at Risk. Topics in Early Childhood Special Education
27(4), 232-249.
Wackerle, Alisha K. (2007). Test review: Selection of general growth outcomes for children between
birth and age eight. Assessment for Effective Intervention. 33(1), 51-54.
Letter Naming Fluency:
r= .94 inter rater reliability
r= .90 2 week test retest reliability
r= .88 1 month alternate reliability
r=.93 alternate forms reliability
r= .70 with WJ-R Readiness Cluster
r= .70 with WJ Psychoeducational Battery
r= .53 to .58 with CTOPP Composite
Predictive r= .65 with WJ Total Reading Cluster
Predictive r= .71 with R-CBM
ELL Predictive r = .67 with a composite of DIBELS NWF and R-CBM
Sources:
Assessment Committee Report for Reading First. (2002). Analysis of Reading Assessment Measures.
Retrieved February 21, 2007, from http://dibels.uoregon.edu/techreports/dibels_5th_ed.pdf
Good, R.H., Kaminski, R.A., Shinn, M. Bratten, J., Shinn, M., & Laimon, L. (in preparation).
Technical Adequacy and Decision Making Utility of DIBELS (Technical Report). Eugene, OR:
University of Oregon
Good, R.H. III., Kaminski, R.A., Simmons, D., Kame’enui, E.J. (2001). Using Dynamic Indicators
of Basic Early Literacy Skills (DIBELS) in an outcomes-driven model: Steps to reading outcomes.
Unpublished manuscript, University of Oregon at Eugene.
Elliott, J., Lee, S.W., & Tollefson, N. (2001). A Reliability and Validity Study of the Dynamic Indicators
of Basic Early Literacy Skills – Modified. School Psychology Review, 30 (1), 33-49.
Haager, D. & Gersten, R (April, 2004). Predictive Validity of DIBELS for English Learners in
Urban Schools. DIBELS Summit conference presentation, Albuquerque, NM
Hintze, J.M., Ryan, A.L., & Stoner, G. (2003). Concurrent Validity and Diagnostic Accuracy of
DIBELS and the CTOPP. School Psychology Review
Kaminski, R.A. & Good, R.H. (1996). Toward a Technology for Assessing Basic Early Literacy
Skills. School Psychology Review, 25, 215-227.
Rouse, H., & Fantuzzo, J.W. (2006). Validity of the Dynamic Indicators of Basic Early Literacy Skills
as an Indicator of Early Literacy for Urban Kindergarten Children. School Psychology Review, 35(3),
341-355.
Letter Sound Fluency:
r= .83 2-week test-retest reliability
r=.80 alternate form reliability
r= .79 with Letter Naming Fluency
Predictive r=.72 with R-CBM
Sources:
Elliott, J., Lee, S.W., & Tollefson, N. (2001). A Reliability and Validity Study of the Dynamic
Indicators of Basic Early Literacy Skills – Modified. School Psychology Review, 30 (1), 33-49.
Fuchs, L., Fuchs D. (2004). Determining Adequate Yearly Progress from Kindergarten through
Grade 6 with Curriculum Based Measurement. Assessment for Effective Intervention 29 (4) 25-37.
Howe, K. B., Scierka, B. J., Gibbons, K. A., & Silberglitt, B. (2003). A School-Wide Organization
System for Raising Reading Achievement Using General Outcome Measures and Evidence-Based
Instruction: One Education District’s Experience. Assessment for Effective Intervention, 28, 59-72.
Scott, S.A., Sheppard, J., Davidson, M.M., & Browning, M.M. (2001). Prediction of First Graders’
Growth in Oral Reading Fluency Using Kindergarten Letter Naming Fluency. Journal of School
Psychology, 39(3), 225-237.
Ritchey, K.D (2008). Assessing Letter Sound Knowledge: A Comparison of Letter Sound Fluency
and Nonsense Word Fluency. Exceptional Children 74 (4) 487-506.
Nonsense Word Fluency:
r= .83 one month alternate form reliability
r=.36 to .59 with WJ-R Readiness Cluster
Predictive r= .82 with Spring R-CBM in Spring of grade 1
Predictive r = .65 with oral reading and .54 with maze in grade 3
ELL Predictive r= .63 with a composite of DIBELS NWF and R-CBM
Sources:
Burke, M. D., Hagan-Burke, S. (2007). Concurrent criterion-Related validity of early literacy
indicators for middle of first grade. Assessment for Effective Intervention. 32(2), 66-77.
Good, R.H., Kaminski, R.A., Shinn, M. Bratten, J., Shinn, M., & Laimon, L. (in preparation).
Technical Adequacy and Decision Making Utility of DIBELS (Technical Report). Eugene, OR:
University of Oregon
Good, R.H., Kaminski, R.A., Simmons, D., & Kame-enui, E.J. (2001). Using DIBELS in an
Outcomes Driven Model: Steps to Reading Outcomes. Unpublished manuscript, University of
Oregon, Eugene.
Haager, D. & Gersten, R (April, 2004). Predictive Validity of DIBELS for English Learners in
Urban Schools. DIBELS Summit conference presentation, Albuquerque, NM
Howe, K. B., Scierka, B. J., Gibbons, K. A., & Silberglitt, B. (2003). A School-Wide Organization
System for Raising Reading Achievement Using General Outcome Measures and Evidence-Based
Instruction: One Education District’s Experience. Assessment for Effective Intervention, 28, 59-72
Kaminski, R.A. & Good, R.H. (1996). Toward a Technology for Assessing Basic Early Literacy
Skills. School Psychology Review, 25, 215-227.
Ritchey, K.D (2008). Assessing Letter Sound Knowledge: A Comparison of Letter Sound Fluency
and Nonsense Word Fluency. Exceptional Children 74 (4) 487-506.
Rouse, H., & Fantuzzo, J.W. (2006). Validity of the Dynamic Indicators of Basic Early Literacy Skills
as an Indicator of Early Literacy for Urban Kindergarten Children. School Psychology Review, 35(3),
341-355.
Vanderwood, M., Linklater, D., & Healy, K. (2008). Predictive Accuracy of Nonsense Word Fluency
for English Language Learners. School Psychology Review 37 (1) 5-17.
Oral Reading Fluency:
r= .92 to .97 test retest reliability
r= .89 to .94 alternate form reliability
r= .82 to .86 with Gates-MacGinitie Reading Test
r= .83 to Iowa Test of Basic Skills
r = .88 to Stanford Achievement Test
r= .73 to .80 to Colorado Student Assessment Program
r= .67 to Michigan Student Assessment Program
r=.73 to North Carolina Student Assessment Program
r= .74 to Arizona Student Assessment Program
r=.61 to .65 to Ohio Proficiency Test, Reading Portion
r= .58 to .82 with Oregon Student Assessment Program (SAT 10)
Sources:
Barger, J. (2003). Comparing the DIBELS Oral Reading Fluency indicator and the North Carolina
end of grade reading assessment (Technical Report). Ashville, NC: North Carolina Teacher
Academy.
Baker, S., et al. (2008). Reading Fluency as a Predictor of Reading Proficiency in Low-Performing,
High-Poverty Schools. School Psychology Review, 37(1), 18-37.
Burke, M. D., Hagan-Burke, S. (2007). Concurrent criterion-Related validity of early literacy
indicators for middle of first grade. Assessment for Effective Intervention. 32(2), 66-77.
Deno, S. L., Mirkin, P. K., & Chiang, B. (1982). Identifying valid measures of reading. Exceptional
Children, 49. 36-45.
Howe, K. B., Scierka, B. J., Gibbons, K. A., & Silberglitt, B. (2003). A School-Wide Organization
System for Raising Reading Achievement Using General Outcome Measures and Evidence-Based
Instruction: One Education District’s Experience. Assessment for Effective Intervention, 28, 59-72.
Hintze, J.M., et al. (2002). Oral Reading Fluency and Prediction of Reading Comprehension in
African American and Caucasian Elementary School Children. School Psychology Review, 31(4),
540-553.
Hintze, J. M. & Silberglitt, B. (in press). A Longitudinal Examination of the Diagnostic Accuracy
and Predictive Validity of R-CBM and High-Stakes Testing. School Psychology Review.
Marston, D., Fuchs, L., & Deno, S. (1987). Measuring pupil progress: a comparison of standardized
achievement tests and curriculum-related measures. Diagnostique, 11, 77-90.
Marston, D. (1989). Curriculum-based measurement: What is it and why do it? In M. R. Shinn (Ed.),
Curriculum-based measurement: Assessing special children (pp. 18-78). New York: Guilford Press.
McGlinchey, M. T., & Hixson, M. D. (2004). Contemporary research on curriculum-based
measurement: Using curriculum-based measurement to predict performance on state assessments in
reading. School Psychology Review, 33(2), 193-204.
Schilling, S. G., Carlisle, J. F., Scott, S. E., & Zeng, J. (2007). Are fluency measures accurate
predictors of reading achievement? The Elementary School Journal, 107(5), 429-448.
Silberglitt, B. & Hintze, J. M. (in press). Formative Assessment Using Oral Reading Fluency Cut
Scores to Track Progress Toward Success on State-Mandated Achievement Tests: A Comparison of
Methods. Journal of Psychoeducational Assessment.
Shaw, R., & Shaw, D. (2002). DIBELS Oral Reading Fluency-Based Indicators of the third-grade
reading skills for Colorado State Assessment Program (CSAP) (Technical Report). Eugene, OR:
University of Oregon.
Shinn, M., Good, R., Knutson, N., Tilly, W., & Collins, A. (1992). Curriculum-based measurement
of oral reading fluency: A confirmatory analysis of its relation to reading. School Psychology Review,
21, 459-479.
Stage, S. A., & Jacobsen, M. D. (2001). Predicting student success on a state-mandated performance-based assessment using oral reading fluency. School Psychology Review, 30(3), 407-420.
Tindal, G., Germann, G., & Deno, S. (1983). Descriptive research on the Pine County Norms: A
compilation of findings (Research Report No. 132). Minneapolis, MN: University of Minnesota
Institute for Research on Learning Disabilities.
Vander Meer, C. D., Lentz, F. E., & Stollar, S. (2005). The relationship between oral reading fluency
and Ohio proficiency testing in reading (Technical Report). Eugene, OR: University of Oregon.
Wilson, J. (2005). The relationship of Dynamic Indicators of Basic Early Literacy Skills (DIBELS)
Oral Reading Fluency to performance on Arizona Instrument to Measure Standards (AIMS).
Tempe, AZ: Tempe School District No. 3.
Are the Members Implementing the Assessments Correctly?
Analysis of the fidelity with which assessments are conducted is a critical first step in the evaluation
of the MRC program, so that results based on these data may be reported with confidence.
To accomplish this, a series of Accuracy of Implementation Rating Scales (AIRS) was
compiled from each Minnesota Reading Corps (MRC) site. MRC Internal Coaches were trained in
August 2007 to administer and score the assessment measures, and to conduct observations of Reading
Corps Members as they administer and score these measures. The AIRS are structured
observational protocols that allow observers to certify that each aspect of the
standardized administration of each assessment measure has been fully conducted. Internal
Coaches completed a minimum of one AIRS for each Reading Corps Member, for each type of
assessment the Member conducted, at least three times each year, around the benchmark data collection
periods. The table below documents the number of AIRS observations compiled and the percent fidelity
documented for each measure. In addition to providing aggregate documentation of
high fidelity of assessment procedures across the state, this observation system also gave
Members immediate feedback regarding the quality of their own assessment skills, and a timely
opportunity to receive clarification or re-training as needed.
Table 4: Fidelity of Assessment Data Collection Procedures

Measure              | Total AIRS Collected | Fidelity Range Reported | Median % Fidelity Reported | Mean % Fidelity Reported | Standard Deviation
Picture Naming       | 126 | 18% - 100% | 100% | 96% | 0.10
Alliteration         | 124 | 28% - 100% | 100% | 95% | 0.13
Rhyming              | 127 | 33% - 100% | 100% | 95% | 0.11
Letter Naming        | 190 | 12% - 100% | 100% | 95% | 0.12
Letter Sounds        | 217 | 44% - 100% | 100% | 96% | 0.09
Nonsense Words       | 116 | 62% - 100% | 100% | 96% | 0.08
Oral Reading Fluency | 303 | 62% - 100% | 100% | 94% | 0.10
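To make the aggregation in Table 4 concrete, the sketch below shows one way a percent-fidelity score could be computed from a single observation checklist and then summarized across observations. This is an illustrative Python sketch only; the checklist structure and function names are assumptions, not the actual AIRS forms or MRC data system.

```python
# Illustrative sketch (not the actual AIRS scoring code): each observation is
# a checklist of pass/fail judgments, one per step of the standardized
# administration, and a measure's observations are summarized as in Table 4.
from statistics import mean, median, stdev

def percent_fidelity(checklist):
    """checklist: list of True/False values, one per certified step."""
    return 100.0 * sum(checklist) / len(checklist)

def summarize(observations):
    """observations: list of percent-fidelity scores for one measure."""
    return {"n": len(observations),
            "range": (min(observations), max(observations)),
            "median": median(observations),
            "mean": mean(observations),
            "sd": stdev(observations)}
```

A member observed completing 9 of 10 steps correctly would receive a fidelity score of 90%, and the per-measure summaries would then be reported in aggregate.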
Do the Interventions Used with Children have a Research Base?
The K-3 interventions identified for use in the MRC program are each designed to provide
additional practice that is supplemental to the core reading instructional program offered by the local
school site. This practice is intended to build automaticity and fluency in
important reading skills that have already been introduced by local classroom teachers. It is
important to note at the outset that MRC students participate in addition to, not in
place of, a comprehensive core reading instructional program, and that the MRC program
should in no way be viewed as a substitute for high-quality core instruction. MRC provides
important additional guided practice time in reading for students who need this support. For further
discussion of the benefit of supplemental support to students at risk for reading failure, see
Harn (2008). For a discussion of the benefit of well-matched interventions, see Wagner et al. (2006).
The chosen interventions share a common focus on building fluency in basic reading
skills such as phonemic awareness, letter sound knowledge, decoding skill, and sight word
recognition. Fluency is interpreted in this program as incorporating rate, accuracy, and prosody, or
expression. Richard Allington, former president of the International Reading Association, writes:
"There are a substantial number of rigorously designed research studies demonstrating (1)
that fluency can be developed, most readily through a variety of techniques that involve
rereading texts and (2) that fostering fluency has reliable positive impacts on comprehension
and performance. Thus when fluency is an instructional goal, as it should be for struggling
readers, we have a wealth of research to guide our instructional planning.” (Allington, 2001)
For further discussion of the relationship between oral reading fluency and comprehension skills, the
interested reader is referred to Tenenbaum & Wolking (1989).
A unique feature of MRC is the consistent use of research-based intervention protocols with
participating students to provide this additional support. In the K-3 Program, MRC members select
from eleven research-based supplemental reading interventions for use with participating MRC
students as listed below. For each intervention protocol, a description of the research base, and/or
sources of empirical evidence of intervention effectiveness are listed.
Repeated Reading
Moyer, S.B. (1982). Repeated reading. Journal of Learning Disabilities, 15, 619-623.
Rashotte, C.A., & Torgesen, J.K. (1985). Repeated reading and reading fluency in learning disabled
children. Reading Research Quarterly, 20, 180-188.
Samuels, S. J. (1979). The method of repeated reading. The Reading Teacher, 32, 403-408.
Samuels, S.J., (1987). Information processing abilities and reading. Journal of Learning Disabilities,
20(1), 18-22.
Sindelar, P.T., Monda, L.E., & O’Shea, L.J. (1990). Effects of repeated reading on instructional
and mastery level readers. Journal of Educational Research, 83, 220-226.
Therrien, W.J. (2004). Fluency and comprehension gains as a result of repeated reading: A meta-analysis. Remedial and Special Education, 25(4), 252-261.
Duet Reading
Aulls, M.W. (1982). Developing Readers in Today’s Elementary Schools. Boston: Allyn & Bacon.
Blevins, W. (2001). Building Fluency: Lessons and Strategies for Reading Success. New York:
Scholastic Professional Books.
Dowhower, S.L. (1991). Speaking of prosody: Fluency’s unattended bedfellow. Theory into
Practice, 30 (3), 165-175.
Mathes, P.G., Simmons, D.C., & Davis, B.I. (1992). Assisted reading techniques for developing
reading fluency. Reading Research and Instruction, 31, 70-77.
Weinstein, G., & Cooke, N. L. (1992). The effects of two repeated reading interventions on generalization of fluency. Learning Disability Quarterly, 15, 21–27.
Newscaster Reading
Armbruster, B.B., Lehr, F., & Osborn, J. (2001). Put reading first: The research building blocks for
teaching children to read. Washington, DC: US Department of Education, National Institute for
Literacy.
Dowhower, S.L. (1987). Effects of repeated reading on second-grade transitional readers’ fluency
and comprehension. Reading Research Quarterly, 22, 389-406. (listening to a tape)
Heckelman, R.G. (1969). A neurological-impress method of remedial reading instruction. Academic
Therapy, 4, 277-282.
Rasinski, T.V. (2003). The fluent reader: Reading strategies for building word recognition, fluency,
and comprehension. New York, NY: Scholastic Professional Books.
Searfoss, L. (1975). Radio Reading. The Reading Teacher, 29, 295-296.
Stahl, S. (2004). What Do We Know About Fluency?: Findings of the National Reading Panel. In
McCardle, P., & Chhabra, V. (Eds.), The Voice of Evidence in Reading Research. Baltimore, MD: Brookes.
Stop Go
Blevins, W. (2001). Building Fluency: Lessons and Strategies for Reading Success. New York:
Scholastic Professional Books.
Rasinski, T., & Padak, N. (1994). Effects of fluency development on urban second-graders. Journal
of Educational Research, 87.
Rasinski, T.V. (2003). The fluent reader: Reading strategies for building word recognition, fluency,
and comprehension. New York, NY: Scholastic Professional Books.
Pencil Tap
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1),
81-112.
Howell, K.W., & Nolet, V. (2000). Curriculum-Based Evaluation: Teaching and Decision Making
(3rd Ed.). Belmont, CA: Wadsworth.
Lysakowski, R.S., & Walberg, H.J. (1982). Instructional effects of cues, participation, and corrective
feedback: A quantitative synthesis. American Educational Research Journal, 19(4), 559-578.
Tenenbaum, G., & Goldring, E. (1989). A meta-analysis of the effects of enhanced instruction:
Cues, participation, reinforcement and feedback and correctives on motor skill learning. Journal of
Research & Development in Education, 22(3), 53-64.
Repeated Reading with Question Generation
Therrien, W.J., Wickstrom, K., & Jones, K. (2006). Effect of a Combined Repeated Reading and
Question Generation Intervention on Reading Achievement. Learning Disabilities Research &
Practice, 21(2), 89-97.
Great Leaps
Mercer, C.D., Campbell, K.U., Miller, W.D., Mercer, K.D., & Lane, H.B. (2000). Effects of a
Reading Fluency Intervention for Middle Schoolers with Specific Learning Disabilities. Learning
Disabilities Research and Practice, 15(4), 179-189.
Meyer, M. (2002). Repeated Reading: An Old Standard is Revisited and Renovated. Perspectives,
28(1), 15-18.
Letter Sound Identification
Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT
Press.
Adams, M.J. (2001). Alphabetic anxiety and explicit, systematic phonics instruction: A cognitive
science perspective. In S.B. Neuman & D.K. Dickinson (eds.), Handbook of Early Literacy
Research (pp. 66-80). New York: Guilford Press.
Chard, D.J., & Osborn, J. (1999). Word Recognition: Paving the road to successful reading.
Intervention in School and Clinic, 34(5), 271-277.
Word Blending
Adams, M.J. (2001). Alphabetic anxiety and explicit, systematic phonics instruction: A cognitive
science perspective. In S.B. Neuman & D.K. Dickinson (eds.), Handbook of Early Literacy
Research (pp. 66-80). New York: Guilford Press.
Goswami, U. (2000). Causal connections in beginning reading: The importance of rhyme. Journal of
Research in Reading, 22(3), 217-240.
Greaney, K.T., Tunmer, W.E., & Chapman, J.W. (1997). Effects of rime-based orthographic analogy
training on the word recognition skills of children with reading disability. Journal of Educational
Psychology, 89(4), 645-651.
Phoneme Blending
Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT
Press.
Bos, C.S., & Vaughn, S. (2002). Strategies for teaching students with learning and behavioral
problems (5th Ed.). Boston: Allyn & Bacon.
Ehri, L.C., Nunes, S.R., & Willows, D.M. (2001). Phonemic awareness instruction helps children
learn to read: Evidence from the National Reading Panel’s meta-analysis. Reading Research
Quarterly, 36(3), 250-287.
Elkonin, D.B. (1973). U.S.S.R. In J. Downing (Ed.), Comparative Reading (pp.551-579). New York:
MacMillan.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the
scientific research literature on reading and its implications for reading instruction. Bethesda, MD:
National Institutes of Health.
Santi, K.L., Menchetti, B.M., & Edwards, B.J. (2004). A comparison of eight kindergarten phonemic
awareness programs based on empirically validated instructional principles. Remedial and Special
Education, 25(3), 189-196.
Smith, C.R. (1998). From gibberish to phonemic awareness: Effective decoding instruction.
Teaching Exceptional Children, 30(6), 20-25.
Smith, S.B., Simmons, D.C., & Kame’enui, E.J. (1998). Phonological Awareness: Research bases.
In D.C. Simmons & E.J. Kame’enui (Eds.), What Reading research tells us about children with
diverse learning needs: Bases and basics. Mahwah, NJ: Lawrence Erlbaum Associates.
Snider, V. E. (1995). A primer on phonemic awareness: What it is, why it is important, and how to
teach it. School Psychology Review, 24, 443–455.
Phoneme Segmentation
Adams, M.J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT
Press.
Blachman, B. A. (1991). Early intervention for children’s reading problems: Clinical applications of
the research on phonological awareness. Topics in Language Disorders, 12, 51–65.
Bos, C.S., & Vaughn, S. (2002). Strategies for teaching students with learning and behavioral
problems (5th Ed.). Boston: Allyn & Bacon.
Ehri, L.C., Nunes, S.R., & Willows, D.M. (2001). Phonemic awareness instruction helps children
learn to read: Evidence from the National Reading Panel’s meta-analysis. Reading Research
Quarterly, 36(3), 250-287.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the
scientific research literature on reading and its implications for reading instruction. Bethesda, MD:
National Institutes of Health.
Santi, K.L., Menchetti, B.M., & Edwards, B.J. (2004). A comparison of eight kindergarten phonemic
awareness programs based on empirically validated instructional principles. Remedial and Special
Education, 25(3), 189-196.
Smith, C.R. (1998). From gibberish to phonemic awareness: Effective decoding instruction.
Teaching Exceptional Children, 30(6), 20-25.
Smith, S.B., Simmons, D.C., & Kame’enui, E.J. (1998). Phonological Awareness: Research bases.
In D.C. Simmons & E.J. Kame’enui (Eds.), What Reading research tells us about children with
diverse learning needs: Bases and basics. Mahwah, NJ: Lawrence Erlbaum Associates.
Snider, V. E. (1995). A primer on phonemic awareness: What it is, why it is important, and how to
teach it. School Psychology Review, 24, 443–455.
Are the Members Implementing the Interventions Correctly?
As with the assessment tools, analysis of the level of fidelity with which the student intervention
protocols are followed is a critical initial aspect to the evaluation of the MRC program so that results
of student growth analysis may be attributed to accurate implementation of intervention scripts. In
order to accomplish this, a series of intervention integrity observations have been compiled from
each MRC site. MRC Master Coaches were trained to evaluate implementation integrity for each of
the MRC interventions. The integrity checklists provide an opportunity for observers to certify that
each aspect of a standardized administration for each intervention has been fully conducted. Master
coaches completed a minimum of 1 intervention integrity checklist for each MRC member during
each monthly visit, for a possible total of 9 checklists per member per year. The interested reader is
referred to Ehrhardt, Barnett, Lentz, Stollar, & Reifin, (1996) for a description of how to use scripts
to improve intervention integrity. The table below documents the number of integrity checklists
compiled and percent fidelity documented for each intervention:
Table 5: Fidelity of Intervention Implementation Procedures

Intervention                            | Total Fidelity Checks Collected | Fidelity Range Reported | Median % Fidelity Reported | Mean % Fidelity Reported | Standard Deviation
Repeated Reading                        | 287 | 17% - 100%  | 100% | 93%  | 0.13
Duet Reading                            | 207 | 15% - 100%  | 100% | 94%  | 0.12
Newscaster Reading                      | 51  | 40% - 100%  | 100% | 92%  | 0.13
Stop Go                                 | 5   | 56% - 100%  | 83%  | 78%  | 0.20
Pencil Tap                              | 9   | 100% - 100% | 100% | 100% | 0.00
Repeated Reading w/ Question Generation | 32  | 39% - 100%  | 100% | 93%  | 0.15
Great Leaps                             | 111 | 21% - 100%  | 93%  | 89%  | 0.15
Letter Sound Identification             | 150 | 0% - 100%   | 88%  | 83%  | 0.23
Word Blending                           | 80  | 25% - 100%  | 100% | 92%  | 0.17
Phoneme Blending                        | 35  | 63% - 100%  | 100% | 93%  | 0.10
Phoneme Segmenting                      | 26  | 67% - 100%  | 100% | 94%  | 0.11
Is the Performance of Students in Terms of their Literacy Improvement Consistent with
Expectations?
The following sections document the growth and achievements of children age 3 to grade 3 who
participated in the MRC program during the 2007-2008 school year. It is important to acknowledge
that MRC participating students are also supported by a variety of resources, most notably the
instruction and guidance provided by their schools and families. This evaluation is not intended to
address or control for the variables related to these resources, nor to suggest that student progress
or lack thereof must be attributed solely to the service provided through the efforts of the
MRC program. This design’s purpose is to focus on the desired literacy outcomes for all children.
Pre-Kindergarten Student Performance
The five measurement tools utilized for the Pre-K Reading Corps program are listed below. For
each assessment tool, a target score was identified as the goal for the end of the year. These target
scores were based on the target scores used in Minneapolis Public Schools for incoming
kindergarten students, and upon 50th percentile scores for incoming kindergarten students within school
districts served by Saint Croix River Education District. Prior to use of these targets for the current
project, these targets were reviewed by experts at the University of Minnesota who had created the
Individual Growth and Development Indicators. The measures and target scores for this project are
listed below:
Measure               | Spring Target Score
Rhyming               | 12
Picture Naming        | 26
Alliteration          | 8
Letter Sound Fluency  | 8
Letter Naming Fluency | 14
Pre-Kindergarten student performance on fall, winter, and spring IGDI measures is listed in the
tables below for all students with a birthdate reported. Score ranges are also reported. Students
score “NA” when they do not complete the sample items sufficiently to warrant participation in the
assessment. A score of “0” indicates adequate performance on the sample items, but no accurate
responses during the assessment.
Table 6: Pre-Kindergarten Participant Performance on IGDIs: Fall Benchmark

Measure               | Output                                | Three Year Olds | Four Year Olds | Five Year Olds
Picture Naming        | Number Students Tested                | 96      | 716        | 22
Picture Naming        | Range of Scores                       | NA-31   | NA-38      | NA-43
Picture Naming        | (Number) Percent Above Spring Target  | (3) 3%  | (134) 19%  | (4) 18%
Alliteration          | Number Students Tested                | 96      | 715        | 23
Alliteration          | Range of Scores                       | NA-11   | NA-17      | NA-6
Alliteration          | (Number) Percent Above Spring Target  | (2) 2%  | (42) 6%    | (0) 0%
Rhyming               | Number Students Tested                | 96      | 716        | 23
Rhyming               | Range of Scores                       | NA-22   | NA-21      | NA-20
Rhyming               | (Number) Percent Above Spring Target  | (3) 3%  | (50) 7%    | (4) 17%
Letter Naming Fluency | Number Students Tested                | 68      | 715        | 23
Letter Naming Fluency | Range of Scores                       | NA-34   | NA-55      | NA-28
Letter Naming Fluency | (Number) Percent Above Spring Target  | (7) 10% | (106) 15%  | (5) 22%
Letter Sound Fluency  | Number Students Tested                | 69      | 716        | 23
Letter Sound Fluency  | Range of Scores                       | NA-20   | NA-25      | NA-15
Letter Sound Fluency  | (Number) Percent Above Spring Target  | (3) 4%  | (45) 6%    | (2) 9%
Table 7: Pre-Kindergarten Participant Performance on IGDIs: Winter Benchmark

Measure               | Output                                | Three Year Olds | Four Year Olds | Five Year Olds
Picture Naming        | Number Students Tested                | 96         | 749         | 24
Picture Naming        | Range of Scores                       | NA-34      | NA-56       | NA-33
Picture Naming        | (Number) Percent Above Spring Target  | (22) 22.9% | (278) 37.1% | (11) 45.8%
Alliteration          | Number Students Tested                | 54         | 508         | 19
Alliteration          | Range of Scores                       | NA-14      | NA-22       | NA-16
Alliteration          | (Number) Percent Above Spring Target  | (9) 16.7%  | (143) 28.2% | (8) 42.1%
Rhyming               | Number Students Tested                | 66         | 630         | 21
Rhyming               | Range of Scores                       | NA-17      | NA-66       | NA-22
Rhyming               | (Number) Percent Above Spring Target  | (13) 19.7% | (171) 27%   | (8) 38.1%
Letter Naming Fluency | Number Students Tested                | 65         | 687         | 23
Letter Naming Fluency | Range of Scores                       | NA-41      | NA-67       | NA-39
Letter Naming Fluency | (Number) Percent Above Spring Target  | (24) 36.9% | (317) 46.1% | (15) 65.2%
Letter Sound Fluency  | Number Students Tested                | 48         | 529         | 20
Letter Sound Fluency  | Range of Scores                       | NA-30      | NA-50       | NA-22
Letter Sound Fluency  | (Number) Percent Above Spring Target  | (20) 41.7% | (177) 33.4% | (6) 30.0%
Table 8: Pre-Kindergarten Participant Performance on IGDIs: Spring Benchmark

Measure               | Output                                | Three Year Olds | Four Year Olds | Five Year Olds
Picture Naming        | Number Students Tested                | 96? 101    | 649         | 17
Picture Naming        | Range of Scores                       | NA-45      | NA-42       | 8-38
Picture Naming        | (Number) Percent Above Spring Target  | (35) 34.7% | (382) 58.9% | (12) 70.6%
Alliteration          | Number Students Tested                | 99         | 645         | 17
Alliteration          | Range of Scores                       | NA-17      | NA-25       | NA-16
Alliteration          | (Number) Percent Above Spring Target  | (17) 17.2% | (280) 43.4% | (7) 41.2%
Rhyming               | Number Students Tested                | 99         | 646         | 17
Rhyming               | Range of Scores                       | NA-20      | NA-34       | NA-28
Rhyming               | (Number) Percent Above Spring Target  | (21) 21.2% | (284) 44.0% | (10) 58.8%
Letter Naming Fluency | Number Students Tested                | 75         | 646         | 17
Letter Naming Fluency | Range of Scores                       | NA-59      | NA-100      | 6-45
Letter Naming Fluency | (Number) Percent Above Spring Target  | (38) 50.0% | (409) 63.3% | (13) 76.5%
Letter Sound Fluency  | Number Students Tested                | 72         | 647         | 17
Letter Sound Fluency  | Range of Scores                       | NA-34      | NA-65       | NA-28
Letter Sound Fluency  | (Number) Percent Above Spring Target  | (22) 30.6% | (322) 49.8% | (7) 41.2%
The figure below shows the normative performance of all 4-year-old students participating in the
IGDI measures during the 2007-2008 school year. Dark horizontal lines represent the target score
for each measure. As clarification, students who did not successfully complete the
sample items on an assessment measure, and therefore did not continue on to the actual assessment, were
given a score of “NA,” which is recorded in this figure as a -6. The NA score is distinguished from a
score of 0, which reflects a performance in which the student’s appropriate responses to the sample
items warranted continuation with the assessment in accordance with standardized procedure, but
responses during the timed assessment yielded no correct answers.
Figure 1: Normative Performance of 4 Year Olds on IGDI Measures 2007-2008
A cross-cohort analysis of performance by 4-year-old students has been compiled across the
five years of the Minnesota Reading Corps program. The following figure shows the percent of 4-year-old MRC participants meeting the assessments’ spring target scores at fall, winter, and spring
assessment times across years. It is noted that the 2006-2007 data were analyzed by an outside
agency and include only students enrolled in Head Start MRC classrooms. Data from the current
year show performance roughly equivalent to or somewhat above that of previous years on fall
measures, and above previous years’ performance on four of five assessments on both winter and
spring measures.
Figure 2: Cross-Cohort Percent Above Target on Early Literacy Measures
In addition, the number of children demonstrating growth across the school year from fall to spring,
or who met the target on each assessment measure, has been calculated. These calculations include
only data from children who were assessed on all 5 Pre-K measures during both the fall and
spring benchmark windows.
Table 9: Pre-Kindergarten Student Growth

Region       | # Assessed on 5 Measures in Both Fall and Spring | % Growth on At Least 2 Measures | % Growth on At Least 3 Measures | % Growth on At Least 4 Measures | % Growth on All 5 Measures
Duluth       | 20  | 100%  | 100%  | 95.0% | 60.0%
Grand Rapids | 101 | 96.0% | 89.1% | 71.3% | 41.6%
Metro        | 207 | 100%  | 97.6% | 83.1% | 56.5%
Moorhead     | 97  | 91.8% | 87.6% | 69.1% | 39.2%
Rochester    | 46  | 97.8% | 95.7% | 89.1% | 73.9%
Saint Cloud  | 100 | 99.0% | 95.0% | 80.0% | 44.0%
Total        | 571 | 97.5% | 93.9% | 79.0% | 50.3%
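The growth tally behind Table 9 can be sketched as follows. This is an illustrative Python sketch only; the data structures, measure names, and function names are assumptions, not the MRC data system.

```python
# Hypothetical sketch of the Table 9 tally: only children with all five
# Pre-K measures at both fall and spring benchmarks are included, and a
# child is credited with growth on a measure when the spring score exceeds
# the fall score.
MEASURES = ["rhyming", "picture_naming", "alliteration",
            "letter_sounds", "letter_names"]

def growth_counts(fall, spring):
    """fall/spring: dicts mapping student id -> {measure: score}.
    Returns {student_id: number of measures showing fall-to-spring growth},
    restricted to students assessed on all five measures both times."""
    counts = {}
    for sid in fall.keys() & spring.keys():
        f, s = fall[sid], spring[sid]
        if all(m in f and m in s for m in MEASURES):
            counts[sid] = sum(s[m] > f[m] for m in MEASURES)
    return counts

def pct_at_least(counts, k):
    """Percent of included students showing growth on at least k measures."""
    return 100.0 * sum(c >= k for c in counts.values()) / len(counts)
```

A region's "% Demonstrating Growth On At Least 3 Measures" would then be `pct_at_least(counts, 3)` over that region's included students.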
Pre-Kindergarten Matched Sample Analysis
In order to address the evaluation question regarding growth across years for students participating
in the Minnesota Reading Corps program, analysis has been completed for a subset of MRC
participating students who were enrolled in PICA Head Start programming during the 2006-2007
school year, and continued on to elementary school within the Minneapolis Public Schools (MPS) system
as kindergarteners in fall of 2007.
For comparison purposes, two additional groups of students were identified: students in PICA
(Head Start Program) who did not receive intervention from a Reading Corps member and enrolled
in MPS in fall of 2007, and students not in PICA who enrolled in MPS kindergarten in fall of 2007.
Minneapolis Beginning of Kindergarten Assessment
Beginning of Kindergarten literacy was assessed using the Minneapolis Beginning of Kindergarten
Assessment (BKA). This assessment is individually administered by retired school teachers who are
trained to be reliable data collectors. The assessment includes 5 literacy domains: phonological
awareness (initial letter sounds and rhyming), alphabetic principles (letter names and letter sounds),
vocabulary (picture naming), concepts of print (front of the book, right-to-left sweep, etc.), and oral
comprehension.
The BKA has scores with a high level of reliability (Betts, Heistad, Pickart, & Anderson, 2005).
The overall total test score, called the Early Literacy Composite, has internal consistency
reliability at or above 0.90. Similarly high internal consistency reliability estimates have been found
among the three norm groups (below 5 ½, between 5 ½ and 6, and above 6 years of age as
measured by age at time of BKA). The test-retest reliability was found to be 0.92.
The scores on the BKA have been found to have strong validity evidence (content, criterion, and
construct). The predictive validity of the BKA to 1st grade oral reading (0.80), 1st grade
comprehension (0.66), and 2nd grade reading (0.66), as measured by the Northwest Achievement
Levels Tests (NALT), is moderately strong. The BKA Early Literacy Composite is also highly
correlated with the End of Kindergarten Assessment (EKA) Early Literacy Composite (0.74).
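As background on the internal-consistency figures cited above, the sketch below computes Cronbach's alpha, a standard internal-consistency statistic. The item data and function name are illustrative; this is not the BKA scoring code, and the actual reliability analyses may have used a different estimator.

```python
# Generic sketch of Cronbach's alpha, the usual internal-consistency
# statistic behind reliability figures like the .90 reported for the
# Early Literacy Composite.
def cronbach_alpha(item_scores):
    """item_scores: list of per-item score lists of equal length
    (one score per student per item)."""
    k = len(item_scores)

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per student is the sum across items.
    totals = [sum(items) for items in zip(*item_scores)]
    return (k / (k - 1)) * (1 - sum(var(i) for i in item_scores) / var(totals))
```

Perfectly consistent items yield an alpha of 1.0; weaker inter-item agreement or unequal item variances pull the value down.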
Matched Sample Methodology
In order to locate matched samples for comparison, files of all kindergarten students were sorted
hierarchically in the MPS data system in fall 2007, using the following sort order:
• Home Language
• Special Education Status
• Special Education Disability Category
• Free or Reduced Price Lunch
• Racial Ethnic Category
• English Language Learner Status
• Gender
• Residential Zip Code
• Resides with (e.g. single parent vs. both parents)
• Student Birthday (matched students needed to be the same age within 6 months)
32
Students in the comparison groups were matched on at least 7 of 10 variables. The best match was
chosen by computer as the student directly above or directly below the Reading Corps student with
the most matches. Ties (e.g. if the student above and below in the file each had a match on 10
variables) were broken based on birth date closest to the Reading Corps student.
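The sort-and-nearest-neighbor matching described above can be sketched as follows. This is an illustrative reconstruction, not the MPS matching program; the variable names and record layout are assumptions.

```python
# Illustrative sketch of the matching rule described in the text: within the
# hierarchically sorted file, each Reading Corps student is paired with the
# adjacent record that matches on the most of the ten variables, with ties
# broken by the closest birth date.
from datetime import date

MATCH_VARS = ["home_language", "sped_status", "sped_category", "frl",
              "race_ethnicity", "ell", "gender", "zip", "resides_with"]

def n_matches(a, b):
    """Count matched variables; birth dates within 6 months count as the
    tenth matched variable."""
    same_age = abs((a["dob"] - b["dob"]).days) <= 183
    return sum(a[v] == b[v] for v in MATCH_VARS) + same_age

def best_neighbor(sorted_rows, i):
    """Choose the better match of the records directly above and below row i."""
    target = sorted_rows[i]
    candidates = [sorted_rows[j] for j in (i - 1, i + 1)
                  if 0 <= j < len(sorted_rows)]
    # Most shared variables wins; ties broken by closest birth date.
    return max(candidates,
               key=lambda c: (n_matches(target, c),
                              -abs((target["dob"] - c["dob"]).days)))
```

Because the file is pre-sorted on the match variables, only the immediately adjacent records need to be examined for each Reading Corps student.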
Matching between Reading Corps students and non-Reading Corps students in the PICA (Head
Start) program was as follows:
• 34 pairs (44%) had 10 out of 10 variables matched
• 32 pairs (41%) had 9 out of 10 variables matched
• 11 pairs (14%) had 8 out of 10 variables matched
• 2 pairs (3%) had 7 out of 10 variables matched
Matching between Reading Corps students and students not enrolled in the PICA (Head Start)
program was as follows:
• 45 pairs (57%) had 10 out of 10 variables matched
• 22 pairs (28%) had 9 out of 10 variables matched
• 10 pairs (13%) had 8 out of 10 variables matched
• 1 pair (2%) had 7 out of 10 variables matched
Demographic data for students in this analysis are listed below:
Table 10: Home Language of Study Participants

Home Language    | Reading Corps | PICA Non-Reading Corps | Non-PICA (Other pre-K)
English          | 24 (30%) | 24 (30%) | 24 (30%)
English Dialect  | 1 (1.3%) | 0 (0%)   | 1 (1.3%)
Hmong            | 1 (1.3%) | 1 (1.3%) | 1 (1.3%)
Laotian          | 1 (1.3%) | 1 (1.3%) | 1 (1.3%)
Oromo (Ethiopia) | 1 (1.3%) | 2 (3%)   | 1 (1.3%)
Spanish          | 29 (37%) | 29 (37%) | 29 (37%)
Somali           | 22 (28%) | 22 (28%) | 22 (28%)
Total            | 79       | 79       | 79
Table 11: “Resides with” Codes for Study Participants

Resides With           | Reading Corps | PICA Non-Reading Corps | Non-PICA (Other pre-K)
Both Parents           | 34 (43%) | 43 (55%) | 30 (38%)
Mother only            | 20 (25%) | 17 (22%) | 28 (35%)
Father only            | 2 (3%)   | 3 (4%)   | 3 (4%)
Guardian               | 1 (1.3%) | 1 (1.3%) | 1 (1.3%)
Mother and Step-Father | 3 (4%)   | 3 (4%)   | 3 (4%)
Other Relative         | 2 (3%)   | 1 (1.3%) | 0 (0%)
Not Given              | 17 (21%) | 10 (13%) | 14 (18%)
Table 12: Other Demographic Data for Study Participants

Kindergarten Entry Codes    | Reading Corps | PICA Non-Reading Corps | Non-PICA (Other pre-K)
Free or reduced price lunch | 76 (95%) | 76 (95%) | 76 (95%)
Special Education           | 10 (13%) | 9 (12%)  | 10 (13%)
Male                        | 46 (58%) | 41 (52%) | 46 (58%)
American Indian             | 1 (1.3%) | 3 (4%)   | 1 (1.3%)
Asian                       | 3 (4%)   | 3 (4%)   | 3 (4%)
Hispanic                    | 29 (36%) | 29 (36%) | 29 (36%)
Black (Non-Hispanic)        | 46 (58%) | 44 (56%) | 44 (56%)
White                       | 1 (1.3%) | 1 (1.3%) | 2 (3%)
Results of Matched Sample Analysis
The dependent t-test comparing mean total literacy scores on the Minneapolis Public Schools
Beginning Kindergarten Assessment (BKA) for students in Reading Corps relative to students in
PICA non-Reading Corps was not statistically significant, t(78) = 1.85, p = .069. The dependent t-test
comparing mean total literacy scores on the Minneapolis Public Schools BKA for Reading Corps relative to
non-PICA (other pre-K) was statistically significant, t(78) = 2.95, p = .004. To control for familywise
error, a Bonferroni correction was applied across the two t-tests. The effect size (in standard deviation
units) for Reading Corps vs. PICA non-Reading Corps was .23. The effect size (in standard
deviation units) for Reading Corps vs. non-PICA was .42, which is a moderate effect.
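For readers interested in how these statistics are computed, the sketch below implements a dependent t-test and a standard-deviation-unit effect size on made-up paired scores. It is an illustration of the general method, not the analysis code used for this study, and the effect-size pooling shown here is one common convention among several.

```python
# Minimal sketch of a dependent (paired) t-test and an effect size in
# standard-deviation units, as reported in the matched sample analysis.
from math import sqrt
from statistics import mean, stdev

def paired_t(scores_a, scores_b):
    """Return (t, df) for a dependent t-test on matched pairs."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1

def effect_size(scores_a, scores_b):
    """Mean difference divided by the pooled standard deviation."""
    sd_pooled = sqrt((stdev(scores_a) ** 2 + stdev(scores_b) ** 2) / 2)
    return (mean(scores_a) - mean(scores_b)) / sd_pooled
```

With 79 matched pairs, as here, the test has 78 degrees of freedom, matching the t(78) values reported above.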
The Minneapolis Public School system has identified a score of 80 as the target for the literacy
portion of the Beginning Kindergarten Assessment, representing a performance that predicts future
success in reading development and adequate performance on state assessments of reading. As seen
in Figure 3, the average total literacy score for Reading Corps participating students within the PICA
Head Start program who continued on to kindergarten within the Minneapolis Public Schools
system exceeds this target score.
Figure 3: Results of Matched Sample Analysis. Mean BKA Total Literacy scores: Reading Corps = 94.4; PICA Non-Reading Corps = 77.8; Non-PICA = 70.5.
Kindergarten-Grade 3 Student Performance
The four assessment tools utilized for the K-3 Reading Corps program are listed below. For each
assessment tool, a target score was identified as the goal for the beginning, middle, and end of the
year. These target scores were based on research conducted at the St. Croix River Education
District which documented the predictive and concurrent validity of these measures with the
Minnesota Comprehensive Assessment.
As a result of the strong correlations between performance on the selected fluency measures and on
the MCA, a series of cut scores has been identified. The table below specifies assessments given at
each grade level and the cut scores for each assessment during several points throughout the school
year. These cut scores, or target scores, define levels of performance on the fluency measures that
strongly predict future success on the grade 3 Minnesota Comprehensive Assessment. For example,
a student who reads a Grade 1 passage in the winter of first grade at a rate of 20 words read correctly
per minute has an 80% chance of earning a score of 1420 or above on the 3rd grade MCA two years
later.
Grade   Measure                 Fall Target   Winter Target   Spring Target
K       Letter Sound Fluency    8             16              36
1       Nonsense Word Fluency   28            52              —
1       Oral Reading Fluency    —             20              49
2       Oral Reading Fluency    43            72              90
3       Oral Reading Fluency    70            91              107
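As a sketch, the seasonal targets tabulated above can be encoded as a simple lookup so that a student's score can be checked against the target for that grade, measure, and season. The dictionary below is a hypothetical helper built from the table, not part of the MRC tooling:

```python
# Seasonal target scores from the table above, keyed by (grade, measure).
# LSF = Letter Sound Fluency, NWF = Nonsense Word Fluency,
# ORF = Oral Reading Fluency. Seasons without a target are omitted.
TARGETS = {
    ("K", "LSF"): {"fall": 8, "winter": 16, "spring": 36},
    ("1", "NWF"): {"fall": 28, "winter": 52},
    ("1", "ORF"): {"winter": 20, "spring": 49},
    ("2", "ORF"): {"fall": 43, "winter": 72, "spring": 90},
    ("3", "ORF"): {"fall": 70, "winter": 91, "spring": 107},
}

def on_track(grade, measure, season, score):
    """True if a score meets the seasonal target, i.e. the student is
    on the pathway that predicts success on the grade 3 MCA."""
    return score >= TARGETS[(grade, measure)][season]
```

For example, `on_track("1", "ORF", "winter", 20)` corresponds to the first grader described above who has an 80% chance of a passing MCA score two years later.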
The target scores listed above for each assessment used in ongoing student literacy
measurement in Reading Corps grow across years from age 3 to grade 3, defining a pathway to
success. By considering the growth required of a child who met each of the targets, an expected
growth rate at each grade level can be defined. For example, the fall grade 2 target score on oral
reading fluency is 43, and the spring grade 2 target score on this measure is 90. To grow from 43
to 90 in one academic year, a student would need to gain 1.31 words correct per minute per week
on the oral reading fluency assessment. Thus, 1.31 words of growth per week becomes the
expectation for grade 2 growth rates. Because our targets are connected to the statewide
assessment rather than to the normative performance of other students in local districts, we have a
consistent and meaningful comparison across the state.
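The arithmetic above can be made explicit. The 36-week figure below is an assumption about the length of the fall-to-spring window; the report quotes the resulting 1.31 words per week but not the week count it used:

```python
def weekly_growth_target(fall_target, spring_target, weeks=36):
    """Gain per week needed to move from the fall target to the spring
    target; weeks=36 is an assumed fall-to-spring span, not a figure
    stated in the report."""
    return (spring_target - fall_target) / weeks

# Grade 2 oral reading fluency: fall target 43, spring target 90.
rate = weekly_growth_target(43, 90)   # about 1.31 words correct per week
```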
Students participating in the Reading Corps program are monitored frequently. The primary
purpose of this progress monitoring is to enable those supporting the student to evaluate the
effectiveness of current reading instruction and to make data-based decisions about changes in
instruction. For the purposes of outcomes evaluation, the progress monitoring data also provide a
means of comparing the rate of growth of participating Reading Corps students to the expected
grade level growth rate. Students are selected for participation in the Reading Corps program
because they are identified as having below grade level skills in reading. Students who achieve
growth rates higher than those indicated by our targets are "catching up" to grade level
expectations by making more than one year's growth in one year's time. For the statewide Reading
Corps project, one measure of our success is the extent to which our participating students are
achieving this primary goal.
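One way to turn a student's weekly progress-monitoring scores into a growth rate is an ordinary least-squares slope, which can then be compared to the grade level expectation. The report does not specify the exact fitting method used in its growth analyses, so this is an illustrative sketch:

```python
def growth_rate(weekly_scores):
    """Least-squares slope of weekly scores: the estimated gain in
    words correct per minute per week of monitoring."""
    n = len(weekly_scores)
    mean_x = (n - 1) / 2                  # weeks are 0, 1, ..., n-1
    mean_y = sum(weekly_scores) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in enumerate(weekly_scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Six weekly grade 2 ORF scores; is the student beating the 1.31 target?
catching_up = growth_rate([41, 44, 44, 47, 49, 52]) > 1.31   # True
```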
36
The table below compares weekly growth rate expectations with the average weekly growth
rates, on program assessment measures, of children participating in K-3 Reading Corps programs
who had at least 3 data points collected per measure. The current analysis includes all data
collected between 9/4/2007 and 6/19/2008. Notably, at every grade level the average growth rate
of Reading Corps participants exceeded the target growth rate (in Grade 1, while mean growth on
Oral Reading Fluency in the second half of the year was slightly below target, mean growth on
Nonsense Word Fluency in the first half of the year exceeded target). Said another way, the
average growth rate for Reading Corps participants exceeded a rate of one year's growth in one
year's time. This is significant, as it demonstrates that participating students are actually catching
up to grade level expectations.
Table 13: Kindergarten-Grade 3 Participant Growth

                                 Grade K   Grade 1 NWF        Grade 1 ORF          Grade 2   Grade 3
                                 (LSF)     (Fall to Winter)   (Winter to Spring)   (ORF)     (ORF)
Reading Corps Mean Growth Rate   1.75      1.87               1.48                 1.47      1.23
Target Growth Rate               1.15      1.11               1.67                 1.31      1.08
Number of Students               665       613                491**                741       758
* Only students with 3 or more data points on the given measure were included in growth rate calculations
** Students in this group may have also participated in Grade 1 (NWF)
The table below examines K-3 students who had at least 3 data points collected per measure,
represented by the "Total # of Students." It presents the percentage of these students whose
individual growth rates exceeded the target growth rate for that grade level and measure, that is,
the portion of the population participating in a Reading Corps intervention whose growth
exceeded one year's growth in one year's time. Percentages are given by region and overall.
Table 14: Kindergarten – Grade 3 Percentage of Students Above Growth Targets by Region

Each cell shows the percentage of students above the target growth rate, with the total number of students in parentheses.

Region         Grade K (LSF)   Grade 1 (NWF)   Grade 1 (R-CBM)**   Grade 2 (R-CBM)   Grade 3 (R-CBM)   TOTAL*
Duluth         76.27% (118)    80.49% (164)    28.70% (115)        62.99% (154)      67.92% (106)      64.54% (657)
Grand Rapids   81.08% (37)     64.29% (28)     33.33% (21)         58.33% (36)       69.70% (66)       64.89% (188)
Metro          81.50% (346)    75.09% (269)    31.31% (214)        63.66% (366)      65.97% (382)      65.69% (1577)
Moorhead       74.19% (62)     73.91% (69)     30.16% (63)         55.68% (88)       52.43% (103)      56.88% (385)
Rochester      76.92% (26)     79.07% (43)     48.28% (29)         72.22% (54)       77.27% (66)       72.48% (218)
St. Cloud      77.63% (76)     77.50% (40)     48.98% (49)         83.72% (43)       82.86% (35)       73.66% (243)
TOTAL          79.25% (665)    76.35% (613)    33.40% (491)        64.10% (741)      66.49% (758)      65.42% (3268)
* TOTAL represents the total number of slopes analyzed, not the total number of students, as students in
Grade 1 may have participated in two categories
** Students in this group may have also participated in Grade 1 (NWF)
Regarding student performance during the 2007-2008 school year, stakeholder groups asked an
additional question that was not a part of the initial evaluation plan. This question was, “What is the
typical number of weeks that a successful MRC program participant could expect to receive tutoring
sessions before graduating out of the program?” In order to respond to this question, the data set
was reviewed to identify the average number of weeks of measurement, only for those students with
individual growth rates above the target growth rate. The results of this analysis are displayed below,
by region and overall.
Table 15: Average Number of Weeks of Data, for Students with Growth Rate Above Target Growth Rate

                                           Grade K   Grade 1   Grade 1   Grade 2   Grade 3
                                           (LSF)     (NWF)     (R-CBM)   (R-CBM)   (R-CBM)
Duluth         Average Weeks               11.09     10.59     14.97     15.02     14.18
               Students above Target       90        132       33        97        72
Grand Rapids   Average Weeks               7.80      10.67     10.14     12.14     12.41
               Students above Target       30        18        7         21        46
Metro          Average Weeks               9.04      8.52      10.88     15.15     14.68
               Students above Target       282       202       67        233       252
Moorhead       Average Weeks               8.43      9.14      11.42     14.22     13.02
               Students above Target       46        51        19        49        54
Rochester      Average Weeks               11.45     7.65      11.64     14.38     14.22
               Students above Target       20        34        14        39        51
St. Cloud      Average Weeks               9.49      8.48      9.42      11.36     10.45
               Students above Target       59        31        24        36        29
TOTAL          Average Weeks               9.41      9.19      11.59     14.54     13.93
               Students above Target       527       468       164       475       504
Through the Minnesota Reading Corps program, it is possible for students to participate for more
than one year. The tables below present data collected during the 2007-08 school year, comparing
students who had participated in the MRC in prior years in addition to the current year ("Prior
Participants") with students who had not participated before ("No Prior Participation"). Note that
K-3 students involved in MRC for more than one year represent a potentially biased sample of
relatively low responders to the initial intervention, as they likely would not have qualified for
MRC in subsequent years if the first year of intervention had been successful. Despite this
potential bias, no statistically significant differences in average growth rates were found between
prior participants and new participants; there is not sufficient evidence to suggest a difference in
growth between these two groups. Table 17 also provides the percentage of students whose
2007-08 growth rates were above the target growth rate, across the two groups.
Table 16: Average Growth in 2007-08, by Participation in MRC in Prior Years

                         Grade K   Grade 1   Grade 1   Grade 2   Grade 3
                         (LSF)     (NWF)     (R-CBM)   (R-CBM)   (R-CBM)
No Prior Participation   1.75      1.87      1.51      1.47      1.22
Prior Participants       *         1.93      1.18      1.43      1.32
TOTAL                    1.75      1.87      1.48      1.47      1.23
* Data not provided due to small sample size
** No statistically significant differences found (p<.01) for any measure
Table 17: Percentage of Students Above Growth Targets, by Participation in MRC in Prior Years

Each cell shows the percentage above target, with the total number of students in parentheses.

                         Grade K (LSF)   Grade 1 (NWF)   Grade 1 (R-CBM)   Grade 2 (R-CBM)   Grade 3 (R-CBM)   TOTAL
No Prior Participation   79.46% (662)    76.22% (555)    34.97% (449)      63.96% (666)      65.41% (688)      65.63% (3020)
Prior Participants       33.33% (3)      77.59% (58)     16.67% (42)       65.33% (75)       77.14% (70)       62.90% (248)
TOTAL                    79.25% (665)    76.35% (613)    33.40% (491)      64.10% (741)      66.49% (758)      65.42% (3268)
Pilot Analysis of MCA II Outcomes for MRC Participating Students
Purpose and Participants in Pilot Study
In order to begin evaluating the outcomes of MRC participating students on the Grade 3 Minnesota
Comprehensive Assessment II, a pilot analysis was conducted using data from six participating
schools collected during the 2005-2006 and 2006-2007 school year. The six schools all belong to the
St. Croix River Education District (SCRED) in East Central Minnesota. Schools in SCRED have a
30-year history of frequent progress monitoring for students at risk; have refined and broadly used
the reading intervention protocols that were later adopted for use in the MRC program; and have
been operating under a Problem Solving / RtI framework for 12 years. SCRED districts serve a
generally homogeneous population of students in a rural region with a moderate level of poverty
(9-17% across counties served, based on MN Kids Count 2002 data). Due to the small,
homogeneous sample and the unique and sophisticated organizational structures in place in the
participating schools, results of this pilot analysis should be interpreted with caution and should
not form generalized expectations for statewide performance.
Included in this analysis are all students in the six participating schools who participated in the MRC
program as third graders during the 2005-2006 school year; all the second graders who participated
in the MRC program during the 2005-2006 school year; and all the third graders who participated in
the MRC program during the 2006-2007 school year. Students were identified for participation in
the MRC program based on results of an Oral Reading Fluency assessment (Curriculum Based
Measurement of Reading) indicating below target performance. Target scores used for program
eligibility were developed by the St. Croix River Education District, and are set at levels that predict
successful completion of the Minnesota Comprehensive Assessment II. In order to participate in
the MRC program, students’ performance on the Oral Reading Fluency assessment must indicate
that they have a less than 75% chance of passing the upcoming grade 3 MCA II. In addition, to be
included in the current analysis, a minimum of 3 data points needed to be present. A requirement of
the program is that one data point is collected per week of MRC service. Thus, number of data
points serves as a proxy for length of participation in the program. This minimum cut-off point was
established to ensure that all students included in the analysis participated in a reasonable amount of
MRC service. The table below specifies the number of students involved in this pilot analysis and
the range and median number of data points collected for each student.
Table 18: Number of Students in Analysis with Range of Data Points

                 Number of   Range of      Median Number
                 Students    Data Points   of Data Points
All              203         3-36          15
05-06 Grade 2    89          3-36          20
05-06 Grade 3    48          3-33          16
06-07 Grade 3    66          3-34          10
Program Outcomes
Initial Program Outcomes for students were organized into three categories. First, many
participating students successfully graduated (exit) from the MRC program prior to the end of the
school year after meeting assessment criteria. These students scored above their expected aimline
score on the weekly assessment for 3-5 consecutive weeks, indicating they were very likely to meet
end of school year target scores. Second, a group of students continued to participate in the MRC
program through the end of the school year (continue), having not shown enough progress on
weekly assessments to graduate early. Third, a group of students were discontinued from the MRC
program for a reason other than successful exit (stop). Possible reasons included the student
moving away, or the student transitioning into a more intensive intervention program offered by the
local school. The following table specifies the number and percent of students included in the
analysis represented in each initial outcome group.
Table 19: Initial MRC Program Outcomes for Participating Students

                 (#) % Successful   (#) % Continued   (#) % Stopped
                 Exit               to Year End       Program
All              (77) 38%           (57) 28%          (69) 34%
05-06 Grade 2    (16) 18%           (24) 27%          (49) 55%
05-06 Grade 3    (18) 38%           (13) 27%          (17) 35%
06-07 Grade 3    (43) 65%           (20) 30%          (3) 5%
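The early-exit rule described above (scoring above the expected aimline score for 3-5 consecutive weeks) can be sketched as a simple streak check. The consecutive-week count is a parameter here because the report gives a range rather than a single value:

```python
def meets_exit_criterion(scores, aimline, weeks_required=3):
    """True once a student scores above the aimline for
    `weeks_required` consecutive weekly assessments."""
    streak = 0
    for score, expected in zip(scores, aimline):
        streak = streak + 1 if score > expected else 0
        if streak >= weeks_required:
            return True
    return False
```

A student whose scores dip back to the aimline resets the streak, so a single strong week does not trigger graduation.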
Table 20: Percent of Available MCA II Scores for Participating MRC Students

                 Number of Students   % of Possible MCA II
                 with MCA II Scores   Scores Available
All              170                  84%
05-06 Grade 2    63                   71%
05-06 Grade 3    45                   94%
06-07 Grade 3    62                   94%
Minnesota Comprehensive Assessment II Outcomes
Grade 3 Minnesota Comprehensive Assessment II scores were sought for each of these MRC
participating students in order to evaluate the relationship between successful completion of the
MRC program and the MCA II assessment. MCAII data were available for all students in the pilot
analysis who were still enrolled in a SCRED school at the time of the third grade MCA II
assessment, and participated in this assessment. The number of participants with available MCA II
data from each cohort is listed below, together with the percentage of total possible MCA II scores
that the number represents.
For MRC participating students with available grade 3 MCA II assessment scores, the number and
percentage of students who successfully exited the MRC program (exited), who completed the
year without exiting (continued), or who stopped the program for a reason other than successful
exit (stopped), and who also met or exceeded the grade 3 Minnesota Reading Standards as
measured by the MCA II assessment, are recorded below. In this, as in all results tables, scores are
shown for each grade cohort, though the individual grade cohorts are relatively small, particularly
when divided by program outcome, so any patterns noted should be interpreted cautiously. The
aggregate result of the three groups provides the strongest sample size.
Table 21: Percent of Students Meeting or Exceeding MCA II Standards, by MRC Program Outcome

Each cell shows the number (#) and percentage of students in that outcome group who met or exceeded grade level standards on the Grade 3 MCA II assessment.

                      All        05-06 Grade 2   05-06 Grade 3   06-07 Grade 3
Successfully Exited   (55) 80%   (10) 83%        (14) 82%        (31) 76%
Continued             (28) 58%   (11) 65%        (8) 65%         (9) 45%
Stopped               (25) 47%   (17) 50%        (6) 35%         (2) 100%
The following figure displays the aggregate MCA II outcomes for students who exited, continued, or
stopped the MRC program:
Figure 4: MCA II Outcomes for MRC Students
Discussion and Questions for Further Study
The group of students who met exit criteria for the MRC program after participation for at least 3
weeks met or exceeded grade level reading standards as measured by the MCA II assessment at
significantly higher rates relative to groups of students who did not meet program exit criteria.
Described relative to target performance: all students in the current pilot study began participation
in the MRC program at a time when they were judged, based on reliable and valid screening data,
to have less than a 75% chance of meeting the grade 3 state standards in reading as measured by
the MCA II. Eighty percent of those participating children who successfully exited from the MRC
program went on to meet or exceed state standards in reading. This promising result prompts
further investigation.
This pilot study has been conducted with a small, non-randomized, and arguably non-representative
sample of the full statewide population of Minnesota Reading Corps participating students. As such,
it will be important to conduct this analysis on a larger scale in order to determine if results from this
study are generalizable.
During the 05-06 and 06-07 school years, MRC Members did not collect additional data for students
after they had successfully exited from the program. Therefore, it cannot be confirmed that
students who successfully exited the program at one point in the year went on to meet or exceed end
of year program target scores. Starting in the 2008-2009 school year, Members will collect mid-year
and end-of-year assessment data on all available students who participate in the program regardless
of their active, exited, or stopped status. These additional data will allow Members to re-start MRC
service for any previously exited student who appears to need additional support, and will allow a
more thorough investigation of the reliability of the program exit criteria, and generalizability of
program target scores across the broader statewide population.
Additional study of the initial level of performance of participating students is also warranted at this
time, in order to determine the extent to which the size of the initial discrepancy from the target
score affects student program outcomes. It is possible that the size of the initial discrepancy from
the target score is predictive of success in the MRC program.
It is noted that MCA II data were not available for all participating MRC students, and that there
was a smaller percentage of students who participated in the MRC program in grade 2 for whom
grade 3 MCA II results were available relative to the grade 3 participating cohorts. With student
transience rates up to around 40% per year in some participating schools, this lack of data was not
unexpected. However, it may be desirable to investigate methods for collecting MCA II scores for
students who have moved out of MRC participating sites in order to obtain a more complete data
set for analysis.
In addition, without a comparison group, it is impossible to judge whether students participating in
the current study performed any differently on the MCA II than they would have without the
benefit of the MRC program. Analysis to place these outcomes into a broader context will help
inform the field regarding the success of this program.
Are the Organizations with Which the MRC is Working Changing to Adopt the Practices of
the MRC?
In order to address the extent to which participation in MRC results in systems-level change for
programs, surveys were distributed to site supervisors and internal coaches at each participating site
in the spring of the 2007-2008 school year. Respondents were asked about their perceptions of the
impact that MRC had on the local system, and the extent to which the MRC model is becoming an
organizational structure for the local building. In all, 60 respondents participated in the survey:
31.0% identified themselves as a Site Supervisor, 36.2% as an Internal Coach, and 32.8% as having
both roles. Overall, 97.8% of respondents agreed or strongly agreed that the MRC program adds
value to the instructional program at the local school or center, and 90.0% indicated that
participation in the MRC program has been somewhat or very influential in prompting systems
change at the local school or center.
As evidenced by results in the table below, respondents consistently indicated philosophical
alignment with the MRC program. This may be a result of selection bias, whereby sites that elect to
apply for participation in MRC program already share a similar vision for reading instruction.
Table 22: Internal Coach and Site Supervisor Perception of Philosophical Alignment with MRC

Responses: Describes well / Describes partially / Describes a little bit / Does not describe at all

The MRC model and approach to literacy complements our program.
    78.9% (45) / 17.5% (10) / 3.5% (2) / 0.0% (0)
Our school/center and Minnesota Reading Corps measure student progress in the same way.
    40.4% (23) / 45.6% (26) / 12.3% (7) / 1.8% (1)
The MRC program is an integrated part of our building's pre-referral, or other problem solving or targeted intervention system.
    36.4% (20) / 36.4% (20) / 18.2% (10) / 9.1% (5)
Our school/center staff is receptive to including Minnesota Reading Corps in our program.
    77.6% (45) / 17.2% (10) / 1.7% (1) / 3.4% (2)
Minnesota Reading Corps members make a positive difference in the way that students' literacy needs are met at our school/center.
    87.7% (50) / 8.8% (5) / 3.5% (2) / 0.0% (0)
Site Supervisors and Internal Coaches also responded to a series of questions designed to elicit their
perceptions of the role that MRC program participation played in specific systems-change outcomes
at their sites. The table below summarizes the results of these items, and provides further support
for the role the MRC program has played in systems change at participating sites.
Table 23: Internal Coach and Site Supervisor Perception of Systems Change Resulting from MRC

Responses: Strongly agree / Agree / Disagree / Strongly disagree

Due (at least in part) to our participation in the MRC program…

our building has begun or will begin collecting fluency based screening measures for all students K-3 at least 3 times per year.
    31.0% (9) / 31.0% (9) / 13.8% (4) / 24.1% (7)
teachers in our building now use screening data to assist in identifying age 3 to grade 3 students for supplemental interventions.
    30.6% (11) / 44.4% (16) / 8.3% (3) / 16.7% (6)
teachers now regularly review progress-monitoring data (weekly graphs) of k-3 students receiving supplemental interventions.
    20.0% (6) / 66.7% (20) / 0.0% (0) / 13.3% (4)
teachers now collect more frequent fluency data using literacy measures for pre-k students of concern.
    14.3% (1) / 71.4% (5) / 14.3% (1) / 0% (0)
teachers now view progress monitoring as an important method to evaluate the impact of instruction on age 3 to grade 3 students.
    24.3% (9) / 59.5% (22) / 8.1% (3) / 8.1% (3)
our school now uses aggregated data as one way to evaluate the instructional practices of the age 3 to grade 3 sites.
    20.6% (7) / 58.8% (20) / 5.9% (2) / 14.7% (5)
the building principal now shares data on k-3 student performance with the superintendent or school board.
    22.2% (6) / 44.4% (12) / 22.2% (6) / 11.1% (3)
teachers now share progress monitoring graphs with parents of age 3 to grade 3 students.
    24.3% (9) / 56.8% (21) / 8.1% (3) / 10.8% (4)
greater emphasis has been placed on selecting reading interventions for age 3 to grade 3 students that have a scientific research base.
    32.4% (12) / 51.4% (19) / 13.5% (5) / 2.7% (1)
instruction is now modified if age 3 to grade 3 student performance is not improving based on the progress monitoring data collected.
    27.0% (10) / 59.5% (22) / 8.1% (3) / 5.4% (2)
the district has now adopted its own data warehouse system for efficiently storing and accessing data for k-3 students.
    29.6% (8) / 29.6% (8) / 33.3% (9) / 7.4% (2)
the district has taken concrete steps I am aware of to formally link Pre-K with K-3 literacy instruction.
    22.2% (6) / 48.1% (13) / 25.9% (7) / 3.7% (1)
our pre-k instructional environments are more literacy rich.
    12.5% (1) / 50.0% (4) / 25.0% (2) / 12.5% (1)
our pre-k teachers use the "Big 5" as the central framework for literacy instruction.
    0% (0) / 42.9% (3) / 42.9% (3) / 14.3% (1)
the level of family involvement in literacy skill development for our pre-k students has increased.
    0% (0) / 50.0% (4) / 37.5% (3) / 12.5% (1)
our pre-k teachers more strongly display the attributes of a SEEDS Quality Teacher (Sensitive and responsive, Encourages and enjoys, Educates, Develops through doing, and Self-image for school readiness).
    0% (0) / 50.0% (4) / 25.0% (2) / 25.0% (2)
there has been an increase in the number of our pre-k staff who are pursuing more advanced credentials in early childhood education or in a related field.
    0% (0) / 25.0% (2) / 50.0% (4) / 25.0% (2)
What is the Impact of the MRC Experience on the AmeriCorps Members?
In order to address the impact of the MRC experience on the participating AmeriCorps Members,
an electronic survey was distributed to Members at each participating site in the spring of the
2007-2008 school year. Respondents were asked about their perceptions of the impact that the year
of service as a Reading Corps Member has had on their personal beliefs about literacy instruction
and their future plans related to children's literacy. In all, 133 respondents participated in the
survey: 41.4% identified themselves as K-3 Literacy Coordinators, 18.0% as K-3 Volunteer
Coordinators, and 40.6% as Pre-K Classroom Members; 75.4% of those responding identified
themselves as full-time Members. The table below summarizes aggregate responses from all survey
participants. Overall, the survey results indicate that Members had a very positive experience with
the Minnesota Reading Corps, believe their service benefited their sites, and intend to continue
supporting children's literacy in future endeavors.
Table 24: Reading Corps Member Perception of MRC Impact

Responses: Strongly agree / Agree / Disagree / Strongly disagree

I felt welcomed and part of the team at my site.
    52.3% (58) / 38.7% (43) / 5.4% (6) / 3.6% (4)
The MRC program was well integrated with other interventions or initiatives ongoing at my site.
    36.6% (41) / 49.1% (55) / 12.5% (14) / 1.8% (2)
There are many more students at my site who could benefit than the number I could fit in my schedule.
    29.4% (20) / 41.2% (28) / 26.5% (18) / 2.9% (2)
Participation in the MRC program had a positive impact on me this school year.
    67.9% (72) / 30.2% (32) / 0.9% (1) / 0.9% (1)
Participation in the MRC program had a positive impact on the site I served this school year.
    71.7% (76) / 27.4% (29) / 0.9% (1) / 0.0% (0)
Participation in the MRC program had a positive impact on the students I served this year.
    83.2% (89) / 16.8% (18) / 0.0% (0) / 0.0% (0)
As a result of my participation in the MRC program, I am considering a career involving children.
    44.1% (45) / 39.2% (40) / 11.8% (12) / 4.9% (5)
As a result of my participation in the MRC program, I am considering a career in teaching or education.
    38.8% (40) / 33.0% (34) / 21.4% (22) / 6.8% (7)
As a result of my participation in the MRC program, I am committed to continued volunteering in schools.
    38.5% (40) / 45.2% (47) / 10.6% (11) / 5.8% (6)
As a result of my participation in the MRC program, I am committed to ongoing promotion of childhood literacy.
    73.8% (79) / 24.3% (26) / 0.0% (0) / 1.9% (2)
As a result of my participation in the MRC program, I am committed to continued community service.
    50.5% (54) / 39.3% (42) / 6.5% (7) / 3.7% (4)
As a result of my participation in the MRC program, if a job I hold in the future does not have community service as part of its mission, I will encourage the organization to include opportunities for community service.
    40.6% (43) / 47.2% (50) / 9.4% (10) / 2.8% (3)
References
Allington, R. (1983). Fluency: The neglected reading goal. The Reading Teacher, 36, 556-561.
Ehrhardt, K. E., Barnett, D. W., Lentz, F. E., Stollar, S. A. & Reifin, L. H. (1996). Innovative
methodology in ecological consultation: Use of scripts to promote treatment acceptability
and integrity. School Psychology Quarterly, 11, 149-168.
Elbaum, B., Vaughn, S., Hughes, M. T. & Moody, S. W. (2000). How effective are one-to-one
tutoring programs in reading for elementary students at risk for reading failure? A meta-analysis
of the intervention research. Journal of Educational Psychology, 92, 605-619.
Harn, B. A., Linan-Thompson, S. & Roberts, G. (2008). Intensifying instruction: Does
additional instructional time make a difference for the most at-risk first graders?
Journal of Learning Disabilities, 41(2), 115-125.
Hintze, J. M. & Silberglitt, B. (2005). A longitudinal examination of the diagnostic accuracy
and predictive validity of R-CBM and high-stakes testing. School Psychology Review, 34(3),
372-386.
Johnston, F. R., Invernizzi, M. & Juel, C. (1998). Book Buddies: Guidelines for volunteer tutors
of emergent and early readers. New York: Guilford.
Kame'enui, E. J. (1993). Diverse learners and the tyranny of time: Don't fix blame; fix the leaky
roof. The Reading Teacher, 46, 376-383.
No Child Left Behind Act of 2001, PL 107-110, 115 Stat. 1425, 20 U.S.C. §§ 6301 et seq.
Phaneuf, R. L. & Silberglitt, B. (2003). Tracking preschoolers' language and preliteracy
development using a general outcome measurement system: One education district's
experience. Topics in Early Childhood Special Education, 23(3), 114-123.
Power, T. J., Dowrick, P. W., Ginsburg-Block, M. & Manz, P. H. (2004). Partnership-based,
community-assisted early intervention for literacy: An application of the participatory
intervention model. Journal of Behavioral Education, 13(2), 93-115.
Rashotte, C. A. & Torgesen, J. K. (1985). Repeated reading and reading fluency in learning
disabled children. Reading Research Quarterly, 20, 180-188.
Rayner, K., Foorman, B. R., Perfetti, C. A., Pesetsky, D. & Seidenberg, M. S. (2001). How
psychological science informs the teaching of reading. Psychological Science in the Public
Interest, 2, 31-73.
Silberglitt, B., Burns, M. K., Madyun, N. H. & Lail, K. E. (2006). Relationship of
reading fluency assessment data with state accountability test scores: A longitudinal
comparison of grade levels. Psychology in the Schools, 43(5), 527-535.
Silberglitt, B. & Hintze, J. M. (2005). Formative assessment using CBM-R cut scores to track
progress toward success on state-mandated achievement tests: A comparison of
methods. Journal of Psychoeducational Assessment, 23(4), 304-325.
Silberglitt, B. & Hintze, J. M. (2007). How much growth can we expect? A conditional analysis
of R-CBM growth rates by level of performance. Exceptional Children, 74(1), 71-84.
Snow, C. E., Burns, M. S. & Griffin, P. (1998). Preventing reading difficulties in young children.
Washington, DC: National Academy Press.
Tenenbaum, H. A. & Wolking, W. D. (1989). Effects of oral reading rate and inflection on
intraverbal responding. The Analysis of Verbal Behavior, 7, 83-89.
Torgesen, J. K., Rashotte, C. A., Alexander, A., Alexander, J. & MacPhee, K. (2003). Progress
toward understanding the instructional conditions necessary for remediating reading
difficulties in older children. In B. R. Foorman (Ed.), Interventions for children at-risk for reading
difficulties or identified with reading difficulties (pp. 275-298). Timonium, MD: York Press.
Torgesen, J. K., Wagner, R. K., Rashotte, C. A., Rose, E., Lindamood, P., Conway, T., et al. (1999).
Preventing reading failure in young children with phonological processing disabilities: Group
and individual responses to instruction. Journal of Educational Psychology, 91, 579-593.
Vadasy, P. F., Jenkins, J. R., Antil, L. R., Wayne, S. K. & O'Connor, R. E. (1997). The effectiveness
of one-to-one tutoring by community tutors for at-risk beginning readers. Learning Disability
Quarterly, 20, 126-140.
Vellutino, F. R., Scanlon, D. M. & Tanzman, M. S. (1998). The case for early intervention in
diagnosing specific reading disability. Journal of School Psychology, 36(4), 367-397.
Wasik, B. (1997). Volunteer tutoring programs: Do we know what works? Phi Delta Kappan, 79,
282-288.