Introduction to Response to Intervention:
Best Practices in Data-Based Decision Making Within an RTI Model
Gary L. Cates, Ph.D.
Illinois State University
GaryCates.net
Ben Ditkowsky, Ph.D.
Lincolnwood School District 74
MeasuredEffects.Com
Acknowledgments
• Cates, Blum, & Swerdlik (2011), authors of Effective RTI Training and Practices: Helping School and District Teams Improve Academic Performance and Social Behavior (Champaign, IL: Research Press) and of this PowerPoint presentation.
RTI decision-making flow:
Universal Core Curriculum → Universal Screening Measures → Identification of At-Risk Students → Standard Educational Diagnostic Tool → Tier II Standard Protocol Instruction → Progress Monitoring → Individualized Diagnostic Assessment → Tier III Individualized Instruction → Progress Monitoring → Special Education Entitlement → Progress Monitoring
Response to Intervention Is Data-Based Decision Making
• Comprehensive system of student support for
academics and behavior
• Has a prevention focus
• Matches instructional needs with scientifically
based interventions/instruction for all
students
• Emphasizes data-based decision making
across a multi-tiered framework
• Tier III: Individualized Instruction
• Tier II: Small-Group Standard Protocol Instruction
• Tier I: Core Universal Curriculum
Data-Based Decision Making with Universal Screening Measures
Presentation Activity 1
• What have you heard about universal
screening measures?
• What are your biggest concerns?
3 Purposes of Universal Screening
 Predict which students are at risk for not
meeting AYP (or long-term educational goals)
 Monitor progress of all students over time
 Reduce the need to do more in-depth diagnostic
assessment with all students
 Needed for reading, writing, math, and
behavior
Rationale for Using Universal
Screening Measures
 It is analogous to medical check-ups (but
three times a year, not once)
 Determine whether all students are meeting milestones (i.e., benchmarks) for predicted adequate growth
 Provide intervention/support if they are
not
Characteristics of Universal Screening
Measures
 Brief to administer
 Allow for multiple administrations
 Simple to score and interpret
 Predict fairly well which students are at risk of not meeting AYP
Presentation Activity 2
• What universal screening measures do you
have in place currently for:
– Reading?
– Writing?
– Math?
– Behavior?
• How do these fit with the characteristics of
USM outlined on the previous slide?
Examples of Universal Screening
Measures for Academic
Performance (USM-A)
Curriculum-Based Measurement
Data-Based Decision Making
with USM-A
Student Identification: Percentile
Rank Approach
• Dual discrepancy to determine a change in
intensity (i.e., tier) of service
• Cut Scores
– Consider percentiles
– District-derived cut scores are based on screening
instruments’ ability to predict state scores
• Rate of Improvement
– Average gain made per day/per week?
(Norms may be based on a sampling of students or on all students included.)

| Student | Teacher | Fall WRC | Winter WRC | Winter Percentile Rank | Classification |
|---|---|---|---|---|---|
| S, A | Smith | 209 | 208 | 1.00 | Well Above Average |
| K, D | Jones | 159 | 170 | 0.93 | Well Above Average |
| F, M | Smith | 134 | 156 | 0.90 | Above Average |
| H, A | Smith | 130 | 148 | 0.81 | Above Average |
| E, S | Smith | 115 | 145 | 0.75 | Average |
| P, A | Jones | 96 | 133 | 0.68 | Average |
| K, C | Jones | 109 | 114 | 0.51 | Average |
| S, D | Armstrong | 66 | 112 | 0.46 | Average |
| B, C | Armstrong | 92 | 94 | 0.36 | Average |
| E, A | Armstrong | 61 | 80 | 0.25 | Average |
| A, B | Smith | 39 | 65 | 0.24 | Below Average |
| R, P | Armstrong | 42 | 63 | 0.22 | Below Average |
| M, W | Jones | 50 | 60 | 0.20 | Below Average |
| G, S | Jones | 28 | 58 | 0.19 | Below Average |
| J, J | Smith | 20 | 54 | 0.17 | Below Average |
| M, A | Smith | 38 | 51 | 0.15 | Below Average |
| B, J | Jones | 47 | 48 | 0.14 | Below Average |
| P, M | Smith | 47 | 45 | 0.10 | Below Average |
| A, D | Armstrong | 38 | 45 | 0.10 | Below Average |
| M, T | Jones | 42 | 41 | 0.08 | Well Below Average |
| D, Z | Armstrong | 31 | 39 | 0.07 | Well Below Average |
| M, M | Smith | 30 | 38 | 0.03 | Well Below Average |
| D, A | Jones | 18 | 38 | 0.03 | Well Below Average |
| K, A | Armstrong | 8 | 21 | 0.02 | Well Below Average |
| A, J | Jones | 7 | 18 | 0.00 | Well Below Average |
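As a rough illustration of the percentile-rank approach, the sketch below (Python) computes a percentile rank for each winter score and assigns a classification band. The band cut points are hypothetical assumptions for illustration; the percentile ranks in the table above appear to come from broader norms, so a within-sample calculation like this will not reproduce them exactly.

```python
# Minimal sketch: percentile-rank classification of winter benchmark scores.
# The classification cut points below are illustrative assumptions, not
# district-validated values.

def percentile_rank(score, all_scores):
    """Proportion of students scoring strictly below this score (0.0 to 1.0)."""
    below = sum(1 for s in all_scores if s < score)
    return below / (len(all_scores) - 1)

def classify(pr):
    # Hypothetical bands for illustration only.
    if pr >= 0.93:
        return "Well Above Average"
    if pr >= 0.84:
        return "Above Average"
    if pr >= 0.25:
        return "Average"
    if pr >= 0.09:
        return "Below Average"
    return "Well Below Average"

winter_wrc = [208, 170, 156, 148, 145, 133, 114, 112, 94, 80, 65, 63,
              60, 58, 54, 51, 48, 45, 45, 41, 39, 38, 38, 21, 18]

for score in winter_wrc:
    pr = percentile_rank(score, winter_wrc)
    print(f"{score:>4} WRC -> percentile {pr:.2f} ({classify(pr)})")
```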
Student Identification:
Dual-Discrepancy Approach
• Rate of Improvement
• Average gain made per day/per week?
• Compared to peers (or cut score) over time
(Norms may be based on all students included or on a sampling of students.)

| Student | Teacher | Fall WRC | Winter WRC | Winter Percentile Rank | Classification | Rate of Progress (WRC/week) |
|---|---|---|---|---|---|---|
| S, A | Smith | 209 | 208 | 1.00 | Well Above Average | -0.1 |
| K, D | Jones | 159 | 170 | 0.93 | Well Above Average | 0.6 |
| F, M | Smith | 134 | 156 | 0.90 | Above Average | 1.2 |
| H, A | Smith | 130 | 148 | 0.81 | Above Average | 1.0 |
| E, S | Smith | 115 | 145 | 0.75 | Average | 1.7 |
| P, A | Jones | 96 | 133 | 0.68 | Average | 2.1 |
| K, C | Jones | 109 | 114 | 0.51 | Average | 0.3 |
| S, D | Armstrong | 66 | 112 | 0.46 | Average | 2.6 |
| B, C | Armstrong | 92 | 94 | 0.36 | Average | 0.1 |
| E, A | Armstrong | 61 | 80 | 0.25 | Average | 1.1 |
| A, B | Smith | 39 | 65 | 0.24 | Below Average | 1.4 |
| R, P | Armstrong | 42 | 63 | 0.22 | Below Average | 1.2 |
| M, W | Jones | 50 | 60 | 0.20 | Below Average | 0.6 |
| G, S | Jones | 28 | 58 | 0.19 | Below Average | 1.7 |
| J, J | Smith | 20 | 54 | 0.17 | Below Average | 1.9 |
| M, A | Smith | 38 | 51 | 0.15 | Below Average | 0.7 |
| B, J | Jones | 47 | 48 | 0.14 | Below Average | 0.1 |
| P, M | Smith | 47 | 45 | 0.10 | Below Average | -0.1 |
| A, D | Armstrong | 38 | 45 | 0.10 | Below Average | 0.4 |
| M, T | Jones | 42 | 41 | 0.08 | Well Below Average | -0.1 |
| D, Z | Armstrong | 31 | 39 | 0.07 | Well Below Average | 0.4 |
| M, M | Smith | 30 | 38 | 0.03 | Well Below Average | 0.4 |
| D, A | Jones | 18 | 38 | 0.03 | Well Below Average | 1.1 |
| K, A | Armstrong | 8 | 21 | 0.02 | Well Below Average | 0.7 |
| A, J | Jones | 7 | 18 | 0.00 | Well Below Average | 0.6 |

Average Rate of Progress: 1.3 WRC per week (the same comparison value is applied to every student).
Dual Discrepancy
• Discrepant from peers (or empirically
supported cut score) at data collection point 1
(e.g., fall benchmark)
• Discrepancy continues or becomes larger at point 2 (e.g., winter benchmark)
– The change between points is referred to as the student’s rate of improvement (ROI)
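A minimal sketch of a dual-discrepancy check follows. The level cut score is a hypothetical value, and the fall-to-winter window is assumed to be about 18 weeks, which is consistent with the rate-of-progress column in the table above.

```python
# Minimal sketch: flag students who are dually discrepant -- low level at the
# winter benchmark AND a below-average rate of improvement (ROI).
# The cut score and benchmark spacing are illustrative assumptions.

WEEKS_FALL_TO_WINTER = 18      # assumed spacing between benchmarks
LEVEL_CUT_SCORE = 51           # hypothetical winter WRC cut score
AVERAGE_PEER_ROI = 1.3         # WRC per week, from the example table

students = {                   # name: (fall_wrc, winter_wrc)
    "P, M": (47, 45),
    "D, A": (18, 38),
    "S, D": (66, 112),
}

for name, (fall, winter) in students.items():
    roi = (winter - fall) / WEEKS_FALL_TO_WINTER      # gain per week
    low_level = winter < LEVEL_CUT_SCORE
    low_growth = roi < AVERAGE_PEER_ROI
    flag = "dually discrepant" if (low_level and low_growth) else "not dually discrepant"
    print(f"{name}: ROI {roi:+.1f} WRC/week, winter {winter} -> {flag}")
```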
Resources as a Consideration
• Example of comparing percentile rank or some
national cut score without considering
resources
• You want to minimize:
– False positives
– False negatives
• This can be facilitated with an educational
diagnostic tool
Correlations
• Direction (positive or negative)
• Magnitude/strength (0 to 1)
• If you want to know how much of the variance in the two measures overlaps (i.e., is shared), square the correlation
– If r = .70, then about 49% of the variance overlaps (.70² = .49)
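A short sketch of the squared-correlation idea, using made-up screening and outcome scores (numpy assumed available):

```python
import numpy as np

# Minimal sketch: correlation between a screening score and an outcome score,
# and the shared variance (r squared). The data below are illustrative only.
screening = np.array([12, 25, 33, 41, 52, 60, 71, 80, 94, 105], dtype=float)
outcome   = np.array([138, 142, 150, 149, 160, 158, 170, 168, 181, 190], dtype=float)

r = np.corrcoef(screening, outcome)[0, 1]   # Pearson correlation
print(f"r = {r:.2f}")
print(f"r^2 = {r**2:.2f} (about {r**2:.0%} of the variance is shared)")
```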
[Scatter plot: Relationship Between ORF in Fall of 2nd Grade and High-Stakes Testing in 3rd Grade. X-axis: Words Read Correctly Per Minute, 2nd grade (0–200); Y-axis: Student Performance on High-Stakes Test (120–200). Regions of the plot are labeled Negatives for At-Risk, Positives for At-Risk, False Positives (Further Diagnostic Assessment), and False Negatives (Additional Data Currently Available), relative to the screening cut score and the high-stakes passing criterion.]
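To make the false-positive/false-negative idea concrete, here is a minimal sketch that tallies classification accuracy for a screening cut score against a high-stakes criterion. The cut score, passing criterion, and data pairs are illustrative assumptions, not values from the figure.

```python
# Minimal sketch: counting false positives and false negatives for a screening
# cut score. All numbers below are made up for illustration.

SCREEN_CUT = 70      # hypothetical fall ORF cut: below = flagged as at risk
MEETS_TEST = 156     # hypothetical high-stakes "meets standards" score

# (fall ORF, later high-stakes score) pairs -- example data only
pairs = [(40, 140), (55, 150), (62, 168), (75, 149), (88, 170),
         (95, 162), (110, 180), (130, 188), (35, 130), (68, 158)]

tp = fp = tn = fn = 0
for orf, test in pairs:
    flagged = orf < SCREEN_CUT
    failed = test < MEETS_TEST
    if flagged and failed:       tp += 1   # flagged and did fail
    elif flagged and not failed: fp += 1   # flagged but passed (false positive)
    elif not flagged and failed: fn += 1   # missed but failed (false negative)
    else:                        tn += 1   # not flagged and passed

print(f"true positives {tp}, false positives {fp}, "
      f"false negatives {fn}, true negatives {tn}")
```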
A Word About Correlations
• A correlation tells us about the strength of a relationship
• A correlation does not tell us…
– …the direction of causation (whether A causes B or B causes A), or
– …whether the relationship is causal at all; a third variable may be involved (e.g., C causes both A and B)
• Strong correlations do not always equate to accurate prediction for specific populations
Presentation Activity 3
• How are you currently making data-based
decisions using the universal screening
measures you have?
• Do you need to make some adjustments to
your decision-making process?
• If you answered yes to the question above,
What might those adjustments be?
Data-Based Decision Making
with USM-B
Some Preliminary Points
• Social behavior screening is just as important
as academic screening
• We will focus on procedures (common sense is
needed: If a child displays severe behavior,
then bypass the system we will discuss today)
• We will focus on PBIS and SSBD
– The programs are examples of basic principles
– You do not need to purchase these exact
programs
Screening (Office Discipline Referrals and Teacher Nomination) → Confirmation (Rating Scales)
Office Discipline Referrals
• Good as a stand-alone screening tool for
externalizing behavior problems
• Also good for analyzing schoolwide data
– Discussed later
Teacher Nomination
• Teachers are generally good judges
• Nominate three students as externalizers
• Nominate three students as internalizers
• Trust your instincts and make a decision
– There will be a more sophisticated process to confirm your choices
Confirming Teacher Nominations
with Other Data
• Teacher, Parent, and Student Rating Scales
– BASC
– CBCL (Achenbach)
Example: Systematic Screening for
Behavior Disorders (SSBD)
• Critical Events Inventory:
– 33 severe behaviors (e.g., physical assault,
stealing) in checklist format
– Room for other behaviors not listed
• Adaptive Scale: Assesses socially appropriate
functional skills (e.g., following teacher
directions)
• Maladaptive Scale: Assesses risk for developing
antisocial behavior (e.g., testing teacher limits)
Data-Based Decision Making Using
Universal Screening Measures for Behavior
• Computer software available
• Web-based programs also available
• See handout (Microsoft Excel Template)
Average Referrals Per Day Per Month
[Chart: Average office discipline referrals per school day, plotted by month from August through June (y-axis 0–2.5).]
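If you keep ODR records electronically, the schoolwide "referrals per day per month" summary can be computed directly. The sketch below is illustrative; the referral dates and the school-day counts per month are made-up assumptions.

```python
from collections import Counter
from datetime import date

# Minimal sketch: average office discipline referrals (ODRs) per school day,
# by month. Referral dates and school-day counts are example values only.

referral_dates = [date(2010, 9, 3), date(2010, 9, 17), date(2010, 9, 17),
                  date(2010, 10, 5), date(2010, 10, 12), date(2010, 10, 12),
                  date(2010, 10, 26), date(2010, 11, 2)]

school_days = {"September": 21, "October": 20, "November": 18}  # assumed calendar

referrals_per_month = Counter(d.strftime("%B") for d in referral_dates)
for month, days in school_days.items():
    avg = referrals_per_month.get(month, 0) / days
    print(f"{month}: {avg:.2f} referrals per school day")
```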
ODR Data by Behavior
[Chart: Number of Referrals by Behavior Type (y-axis 0–25).]
ODR Data by Location
[Chart: Number of Referrals by Location (y-axis 0–20) — Hallway, Bathroom, Classroom, Cafeteria, Locker Room, Office, Playground, Bus, Gym, Music Room, Library, Parking Lot, Unknown.]
ODR Data by Time of Day
[Chart: Number of Referrals by Time of Day, in 15-minute intervals from 7:00 AM to 5:00 PM (y-axis 0–9).]
ODR Data by Student
[Chart: Number of Referrals by Student for each student with at least one referral (student ID numbers on the x-axis; y-axis 0–14).]
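The same ODR records can be tallied by location or by student to spot schoolwide problem areas and individual students who may need Tier II support. A minimal sketch follows; the referral records and the "5 or more referrals" threshold are illustrative assumptions.

```python
from collections import Counter

# Minimal sketch: tallying ODRs by location and by student.
# The records and the threshold below are made-up examples.

referrals = [  # (student_id, location, behavior)
    (21, "Playground", "Aggression"), (21, "Hallway", "Defiance"),
    (21, "Playground", "Aggression"), (48, "Cafeteria", "Disruption"),
    (48, "Playground", "Aggression"), (92, "Classroom", "Disruption"),
    (21, "Playground", "Aggression"), (21, "Bus", "Aggression"),
]

by_location = Counter(loc for _, loc, _ in referrals)
by_student = Counter(sid for sid, _, _ in referrals)

print("Referrals by location:", by_location.most_common())
print("Students with 5 or more referrals:",
      [sid for sid, n in by_student.items() if n >= 5])
```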
Review of Important Points:
Academic Performance
• USMs are used for screening and progress monitoring
• It is important to adhere to the characteristics listed earlier when choosing a USM
• USM-As typically are similar to curriculum-based measurement procedures
• There are many ways to choose appropriate
cut scores, but it is critical that available
resources be considered
Review of Important Points:
Behavior
• Social behavior is an important area for
screening
• Number of office discipline referrals is a strong
measure for schoolwide data analysis and
external behavior
• Both internalizing and externalizing behaviors
should be screened using teacher nominations
• Follow-up with rating scales
• Use computer technology to facilitate the
data-based decision-making process
Data Based Decision Making with
Diagnostic Tools for Academic
Performance and Social Behavior
Presentation Activity 1
• What have you heard about diagnostic tools?
• What are your biggest concerns?
3 Purposes of Diagnostic Tools
 Follow up with any student identified on the
USM as potentially needing additional support
 Identify a specific skill or subset of skills for
which students need additional instructional
support
 Assist in linking students with skill deficits to
empirically supported intervention
Rationale for Using Diagnostic Tools
 Rule out any previous concerns flagged by
a universal screening measure
 Find an appropriate diagnosis
 Identify an effective treatment
Characteristics of Diagnostic Tools
 Might be administered in a one-to-one format
 Require more time to administer than a USM
 Generally contain a larger sample of items than a
USM
 Generally have a wider variety of items than a
USM
Presentation Activity 2
• What diagnostic tools (DT) do you have in
place currently for:
– Reading?
– Writing?
– Math?
– Behavior?
• How do these fit with the characteristics of
DTs outlined on the previous slide?
Examples of Diagnostic Tools for
Academic Skills (DT-A) at Tier III and
Special Education
Curriculum-Based Evaluation
1. Answer this: What does the student need in
addition to what is already being provided (i.e.,
intensification of service)?
2. Conduct an analysis of student responding
– Record review: Work samples
– Observation: Independent work time
– Interview: Ask the student why he or she struggles
3. Develop a hypothesis based on the above
4. Formulate a “test” of this hypothesis
Data-Based Decision Making
with DT-A
Example of CBE: Tammy
• Fourth-grade student
• Did not make adequate progress with the Tier II
standard protocol intervention in winter
• School psychologist administered an individual probe
(i.e., diagnostic tool) and observed Tammy’s
completion of this probe
• An analysis of responding yielded a diagnosis of the
problem
• This diagnosis of the problem informs intervention
selection
1. What seems to be the problem?
2. What should the intervention target?
3. Describe something a teacher could do to target this problem.
4. Do you have to buy an expensive program just for Tammy?
Revisiting the 3 Purposes of
Diagnostic Tools: Tammy
 Follow up with any student identified on the
USM as potentially needing additional support
 Identify a specific skill or subset of skills for
which students need additional instructional
support
 Assist in linking students with skill deficits to
empirically supported intervention
Revisiting the Characteristics of Diagnostic
Tools: Tammy
 Might be administered in a one-to-one format
 Require more time to administer than a USM
 Generally contain a larger sample of items than a
USM
 Generally have a wider variety of items than a
USM
Presentation Activity 3
• How are you currently making data-based
decisions using the diagnostic tools you have?
• Do you need to make some adjustments to
your decision-making process?
• If you answered yes to the question above,
what might those adjustments be?
Data-Based Decision Making
with Diagnostic Tools for Social
Behavior (DT-B)
Screening (Teacher Nomination and Office Discipline Referrals) → Confirmation (Rating Scales) → Descriptive Functional Assessment (Interviews, Record Review, Observations) → Experimental Functional Analysis (FBA plus manipulation of the environment to note effects)
Office Discipline Referrals
• Good as a stand-alone screening tool for
externalizing behavior problems
• Also good for analyzing schoolwide data
– Discussed later
• See example teacher nomination form –
Chapter 2 of book and on CD
Teacher Nomination
• Teachers are generally good judges
• Nominate three students as externalizers
• Nominate three students as internalizers
• Trust your instincts and make a decision
– There will be a more sophisticated process to confirm your choices
• See example teacher nomination form – Chapter 2 of book and on CD
Confirming Teacher Nominations
with Other Data
• Teacher, Parent, and Student Rating Scales
– BASC
– CBCL (Achenbach)
Example: Systematic Screening for
Behavior Disorders (SSBD)
• Critical Events Inventory:
– 33 severe behaviors (e.g., physical assault,
stealing) in checklist format
– Room for other behaviors not listed
• Adaptive Scale: Assesses socially appropriate
functional skills (e.g., following teacher
directions)
• Maladaptive Scale: Assesses risk for developing
antisocial behavior (e.g., testing teacher limits)
Functional Assessment and/or
Experimental Functional Analysis
• Set of procedures that requires extensive
training
• Functional Assessment: Results in a testable
hypothesis about reason for behaviors (e.g.,
social attention, escape, tangible
reinforcement, sensory reinforcement)
• Functional Analysis: Results in empirical
support for the tested hypothesis
Functional Assessment:
Remember to RIOT
• Record review
– ODRs, antecedent-behavior-consequence (A-B-C)
logs, teacher narratives
• Interview
– Teacher, child, parent, key personnel
• Observation
– A-B-C logs, frequency counts
– Classroom observations
• Test (not done at this stage): testing the hypothesis is what the experimental functional analysis is all about
Data-Based Decision Making Using DT-B:
Antecedent-Behavior-Consequence Logs
Behavior Recording Log
Directions: Please be as specific as possible.
Child’s Name: Karyn E.    Grade: 2nd    Date: 4/30
Teacher: Mrs. Becker    Observer: Ryan M.
Setting: School (library, classroom, recess)

Columns record the Setting (where did the behavior take place?), Task (what should the student be doing?), Behavior (what did the student do?), Consequences (how did you and/or students react?), and Effect (what happened after these reactions?).

| Date | Time | Setting | Task | Behavior | Consequences | Effect |
|---|---|---|---|---|---|---|
| 10/14 | 9:15 | Library | Picking out a book | Pushed a peer | I sent him to the office | Came back and was polite |
| 10/16 | 10:05 | Small-group art project | Working with peers | Threw glue bottle at peer | Given a time-out in the hall | Came back in calm |
| 10/17 | 9:45 | Recess | Free play | Hit peer in face with small pebble | Stood him against wall; peer cried | Went to class with bad attitude |
| 10/18 | 9:00 | Classroom | Transitioning between reading and specials (today was computer skills) | Did not transition quietly | Reminded him he must transition quietly | He continued singing “don’t you wish you girlfriend was hot like me” and asking a peer about American Idol; he even asked if I watched it |
| 10/19 | 10:45 | Classroom | Working with peers on piñata | Pushed peer’s work materials on the floor | Sent him to the office and called mother | His mother picked him up and took him home |

Comments: As you can see he is often rude, does not respond well to traditional discipline, and is aggressive towards peers.

1. What patterns do you see here?
2. What is the likely function of behavior?
Data-Based Decision Making Using DT-B:
Frequency Counts
1. What day does the behavior
most often occur? What day
is it least likely to occur?
2. What time of day does the
behavior most often occur?
Least often?
3. When should someone come
to visit if they wanted to
witness the behavior?
Note: It is just as important to look
at when the behavior occurs
as it is to look at when it doesn’t.
Data-Based Decision Making Using DT-B:
Direct Behavioral Observations
Behavioral Observation Form
Target Student Name: Larry F.    Birth date: 4/1/1998    Date: 5/30
School: Metcalf    Teacher: Havey    Observer: Blake M.

Behavior definitions:
• Behavior 1 – Aggression (A): Physical or verbal actions toward another person that have the potential for harm
• Behavior 2 – Talk-outs (TO): Verbalizations without permission
• Behavior 3 – On-task (OT): Oriented to academic task or appropriate engagement with materials

[Form body: a 40-interval recording grid marking each behavior (X) interval by interval for the target child and for a composite (comparison) child; the individual interval marks are not reproduced here.]

Summary — (# occurrences / # observations) × 100:
• Target child: Aggression 4/40, Talk-outs 12/40, On-task 22/40
• Composite child: Aggression 1/40, Talk-outs 5/40, On-task 35/40

1. What can you get from this?
2. Are all of these behaviors severe enough to warrant individualized intervention?
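Converting interval marks to percentages makes the target-versus-peer comparison explicit. The sketch below applies the form's (occurrences / observations) × 100 formula; the counts mirror the summary totals on the form above, but the code itself is only an illustrative sketch.

```python
# Minimal sketch: percent of intervals for the target student and a composite
# peer, using the (occurrences / observations) x 100 formula from the form.

TOTAL_INTERVALS = 40

target = {"Aggression": 4, "Talk-outs": 12, "On-task": 22}          # intervals marked
composite_peer = {"Aggression": 1, "Talk-outs": 5, "On-task": 35}

for behavior in target:
    t_pct = target[behavior] / TOTAL_INTERVALS * 100
    p_pct = composite_peer[behavior] / TOTAL_INTERVALS * 100
    print(f"{behavior:<10} target {t_pct:5.1f}%   peer {p_pct:5.1f}%   "
          f"difference {t_pct - p_pct:+6.1f}%")
```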
Experimental Functional Analysis
• Experimentally testing a hypothesis about why
a behavior occurs:
– Social attention
– Escape
– Tangible reinforcement
– Sensory reinforcement
• Requires expertise, cooperation, and time
• Strongest empirically supported method
available today for identifying cause(s) of
behavior
Example of Experimental Functional
Analysis: Talking Out in Class
| Potential Function | Test Condition |
|---|---|
| Tangible reinforcement | Contingent access to reinforcement |
| Attention | Contingent reprimand |
| Escape | Contingent break upon talking out after demand |
| Sensory stimulation | Leave isolated in room |
| Control condition | Free time with attention and no demands |
[Graph: Rate of talking-out behavior (y-axis 0–3) across 17 sessions, with separate data paths for the Attention, Escape, Tangible R+, and Toy Play (control) conditions; the Attention condition sits highest on the y-axis.]
What is the primary function of behavior?
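One simple way to read functional-analysis data is to compare mean rates across conditions. The sketch below does that; the session rates are made-up values, not the data from the graph above, and the condition names come from the test-condition table.

```python
# Minimal sketch: comparing mean rates of talking out across functional-analysis
# conditions to suggest the likely primary function. Rates are illustrative.

rates_by_condition = {
    "Attention (contingent reprimand)": [2.4, 2.6, 2.2, 2.8],
    "Escape (contingent break)":        [1.0, 1.2, 0.9, 1.1],
    "Tangible reinforcement":           [0.8, 1.0, 0.9, 0.7],
    "Toy play (control)":               [0.2, 0.1, 0.3, 0.2],
}

means = {cond: sum(r) / len(r) for cond, r in rates_by_condition.items()}
for cond, m in sorted(means.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cond:<35} mean rate {m:.2f}")

print("Condition with the highest rate:", max(means, key=means.get))
```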
Review of Important Points
• Three Purposes for Diagnostic Tools
– As a follow-up to USM
– To identify a specific skill that needs additional support
– To assist in linking students to intervention
• Four Characteristics of Diagnostic Tools
– Might be administered in a one-to-one format
– Require more time to administer than a USM
– Generally contain a larger sample of items than a USM
– Generally have a wider variety of items than a USM
Review of Important Points
• DT-A procedures may differ at Tiers II and III
• DT-B procedures may differ at Tiers II and III
• DT data are not the only data to consider
when developing an intervention
Progress Monitoring
Evaluating Intervention Effects
Purpose and Rationale
• Determine student responsiveness to
intervention at any tier
• Ensure that students are receiving an
appropriate level and type of instructional
support
• Identify problems early if performance “slips”
are observed
Characteristics of Progress
Monitoring Tools
• Similar to USM:
– Brief to administer
– Allow for multiple administrations and repeated
measurement of student performance
– Simple to score and interpret
• Can often be administered to groups of
students
Progress Monitoring Tools for
Academics (PMT-A)
• Curriculum-Based Measurement (CBM)
– Reading: DIBELS, AIMSweb, easyCBM
– Math: AIMSweb, easyCBM
• Progress should be presented on a graph to all
stakeholders (parent/guardian, student,
teacher, principal)
Progress Monitoring Tools for
Behavior (PMT-B)
• Completion of forms
– Review the data collection forms covered under diagnostic testing
• Collection of observation data
• Progress should be presented on a graph to all
stakeholders (parent/guardian, student,
teacher, principal)
• These graphed data should be similar to
baseline/diagnostic data
Frequency of Progress Monitoring:
A Tiered Approach
• Tier I
– Three times per year at grade level
• Tier II
– Once per week on grade-level probe
– Once per week on intervention effects
• Tier III
– Once per week at grade level
– Nearly daily monitoring of intervention effects
• Special Education
– Once per week at grade level
– Nearly daily monitoring of intervention effects
Data-Based Decision Making
with Progress Monitoring Tools
Evaluating Intervention Effectiveness
Rate of Improvement Relative to Peers
• Performing a gap analysis between target
student(s) and same-grade peers
• Goal of the intervention is to decrease gap
• Minimal desired outcome is to maintain gap
(i.e., keep student from falling farther behind)
• At least two measurements are needed
[Graph: Words Read Correctly Per Minute across 36 weeks. Average fall, winter, and spring peer performance are connected by a goal line; the student's baseline, aim line, and expectation sit well below it and rise roughly in parallel, so the gap is maintained.]
Gap Analysis
• The gap was maintained (as shown on the previous slide)
• We would prefer to see the gap decrease (as shown on the next slide)
• We need a more potent intervention
– More time
– Different intervention
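A minimal sketch of a gap analysis follows: compare the target student's benchmark scores and growth to peer averages and report whether the gap is widening, maintained, or closing. All numbers and the 18-week spacing are illustrative assumptions.

```python
# Minimal sketch of a gap analysis relative to peers. Values are made up.

peer_avg = {"fall": 72, "winter": 85}     # average peer WRC at each benchmark
student  = {"fall": 38, "winter": 47}     # target student's WRC
WEEKS = 18                                 # assumed weeks between benchmarks

gap_fall = peer_avg["fall"] - student["fall"]
gap_winter = peer_avg["winter"] - student["winter"]

peer_roi = (peer_avg["winter"] - peer_avg["fall"]) / WEEKS
student_roi = (student["winter"] - student["fall"]) / WEEKS

print(f"Gap at fall: {gap_fall} WRC; gap at winter: {gap_winter} WRC")
print(f"Peer ROI {peer_roi:.2f} vs. student ROI {student_roi:.2f} WRC/week")
if gap_winter > gap_fall:
    print("Gap is widening: a more potent intervention is indicated.")
elif gap_winter == gap_fall:
    print("Gap is being maintained: consider intensifying the intervention.")
else:
    print("Gap is closing: the intervention appears to be working.")
```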
[Graph: Words Read Correctly Per Minute across 36 weeks, same layout as the previous graph, but the student's aim line rises more steeply toward the student goal, so the gap between the student and average fall/winter/spring peer performance decreases over time.]
Rate of Improvement Relative to Criterion
• Focus on decreasing the gap between the student’s current performance and a specific criterion
– Example: a cut score that might predict the student meeting AYP
• This may be higher than the average peer
performance in low-functioning schools
• This may be lower than the average peer
performance in high-functioning schools
[Graph: Words Read Correctly Per Minute across 36 weeks. Fall, winter, and spring benchmark criteria are connected by a goal line; the winter benchmark also marks the student goal. The student's baseline and aim line rise from below toward that criterion rather than toward average peer performance.]
Evaluating Intervention Outcomes
Comparing Slopes
How long must an intervention be
implemented before calling it quits?
• Whatever the manual says
• 10-15 data points
• Quarter system?
• Do not stop an intervention until a pre-specified point based on one of the above has been reached!
– Stopping sooner violates the treatment integrity of the scientifically based/empirically supported intervention being implemented
Slope Rules
(“Changing Interventions”)
• Change means a new or severely intensified intervention
• Do not make any changes unless there is a difference between the slope (rate of improvement, ROI) of the target student(s) and that of the average peer or the criterion
• Three possible slope decision rules …
Slope Comparison Decision Rule #1
• If the slope of the trend line is flatter than the
slope of the aim/goal line (as shown on next
slide), then a change should be made
– Intensify the intervention or
– Start a new intervention based on assessment
data
[Graph: Words Read Correctly Per Minute across 36 weeks. The aim line rises steadily; the trend line through the student's intervention-phase data is flatter than the aim line, so performance falls progressively farther below it.]
Slope Comparison Decision Rule #2
• If the slope of the trend line is steeper than
the slope of the aim/goal line (as shown on
next slide), then a change in intensity can be
made
– Decrease the frequency of the current
intervention per week, or
– Decrease the duration of the current intervention
per week, or
– Fade out the intervention, but do not stop it all
together!
[Graph: Words Read Correctly Per Minute across 36 weeks. The trend line through the student's intervention-phase data is steeper than the aim line, so performance climbs progressively above it.]
Slope Comparison Decision Rule #3
• If the slope of the trend line is similar to the
slope of the aim/goal line (as shown on next
slide), then a change should be made
– Intensify the intervention, or
– Start a new intervention based on assessment
data
• The intervention did not close the gap (the
intervention was therefore ineffective)
• The student was unresponsive to the
intervention
[Graph: Words Read Correctly Per Minute across 36 weeks. The trend line through the student's intervention-phase data runs roughly parallel to the aim line but below it, so the gap is maintained rather than closed.]
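The three slope rules can be operationalized by fitting a trend line to the progress-monitoring probes and comparing its slope to the aim-line slope. The sketch below is illustrative only: the probe scores, the aim-line slope, and the tolerance band used to call two slopes "similar" are all assumptions.

```python
# Minimal sketch of the slope-comparison decision rules. The data, aim slope,
# and tolerance band are illustrative assumptions.

def least_squares_slope(scores):
    """Ordinary least-squares slope of scores across equally spaced weeks."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

aim_slope = 1.5                                     # WRC per week needed to reach the goal
weekly_scores = [40, 41, 43, 42, 44, 45, 47, 46]    # made-up CBM probes
trend_slope = least_squares_slope(weekly_scores)

TOLERANCE = 0.25                                    # assumed band for "similar" slopes
if trend_slope < aim_slope - TOLERANCE:
    decision = "Rule 1: trend flatter than the aim line -- intensify or change the intervention."
elif trend_slope > aim_slope + TOLERANCE:
    decision = "Rule 2: trend steeper than the aim line -- consider fading the intervention (do not stop it)."
else:
    decision = "Rule 3: trend similar to the aim line -- the gap is not closing; intensify or change."

print(f"Trend slope {trend_slope:.2f} vs. aim slope {aim_slope:.2f}: {decision}")
```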
Monitoring Progress Along the Way
Three-Point Decision Rules:
Adjustments
Three-Point Decision Rules
(Adjusting)
• Adjust does not mean change
– Adjust: accommodation (a slight change in the current intervention)
– Change: modification (a new intervention)
• Do not make any adjustments without having
three consecutive data points above or below
the goal/aim line.
• Three possible three-point decision rules …
Three Data-Point Decision Rule #1
• If you have three data points below the
aim/goal line (as shown on next slide), then
you can do something different
– Accommodations only
– Accommodation must be left in place for three
consecutive data points (above or below the line)
before removing or adding additional
accommodations
[Graph: Words Read Correctly Per Minute across 36 weeks with an aim line; the three most recent data points fall below the aim line.]
Three Data-Point Decision Rule #2
• If you have three data points above the
aim/goal line (as shown on next slide), then
you can do something different
– Accommodations only
– Accommodation must be left in place for three
consecutive data points (above or below the line)
before removing or adding other accommodations
– Keep in mind the goal is to facilitate growth. If you are above the line, you might consider doing nothing because you are on track to meet the criterion
[Graph: Words Read Correctly Per Minute across 36 weeks with an aim line; the three most recent data points fall above the aim line.]
Three Data-Point Decision Rule #3
• If you do not have three consecutive data points above or below the aim/goal line (as shown on next slide), then do nothing different
– Continue the intervention according to protocol
– Changing something here will violate intervention
integrity
[Graph: Words Read Correctly Per Minute across 36 weeks with an aim line; the data points straddle the aim line, with no run of three consecutive points on either side.]
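The three-point rules reduce to a single check on the most recent probes. Here is a minimal sketch; the aim-line values and probe scores are illustrative assumptions.

```python
# Minimal sketch of the three-data-point adjustment rules: check whether the
# last three progress-monitoring points fall entirely above or entirely below
# the aim line. All values below are made up.

def three_point_decision(scores, aim_values):
    """Compare the three most recent points to the aim line."""
    recent = list(zip(scores, aim_values))[-3:]
    if len(recent) < 3:
        return "Fewer than three points: keep collecting data."
    if all(score < aim for score, aim in recent):
        return "Three points below the aim line: consider an accommodation."
    if all(score > aim for score, aim in recent):
        return "Three points above the aim line: on track; an adjustment is optional."
    return "Points straddle the aim line: change nothing; continue the protocol."

aim_line = [40, 41.5, 43, 44.5, 46, 47.5]   # expected WRC each week
probes   = [41, 40, 42, 43, 44, 45]         # observed WRC each week
print(three_point_decision(probes, aim_line))
```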
HAWK Report (Helping A Winning Kid)
Date _________________  Teacher _______________________  Student _______________
Parent’s signature ______________________________
Rating scale: 0 = No, 1 = Good, 2 = Excellent

| Period | Be Safe: Keep hands, feet, and objects to self | Be Respectful: Use kind words and actions | Be Your Personal Best: Follow directions | Working in class | Teacher initials |
|---|---|---|---|---|---|
| Class | 0 1 2 | 0 1 2 | 0 1 2 | 0 1 2 | |
| Recess | 0 1 2 | 0 1 2 | 0 1 2 | | |
| Class | 0 1 2 | 0 1 2 | 0 1 2 | 0 1 2 | |
| Lunch | 0 1 2 | 0 1 2 | 0 1 2 | | |
| Class | 0 1 2 | 0 1 2 | 0 1 2 | 0 1 2 | |
| Recess | 0 1 2 | 0 1 2 | 0 1 2 | | |
| Class | 0 1 2 | 0 1 2 | 0 1 2 | 0 1 2 | |

Total Points = ______    Points Possible = 50    Today ______%    Goal ______%
Comments:
Monitoring Behavior with a Check-In/
Check-Out System
Analyzing Data from a Check-In/
Check-Out System
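Scoring a check-in/check-out day is a simple percentage calculation. The sketch below is illustrative: the ratings and the 80% goal are assumptions, not values from the example HAWK report.

```python
# Minimal sketch: scoring a daily check-in/check-out (HAWK-style) report and
# comparing it to the student's goal. Ratings and the goal are illustrative.

DAILY_GOAL_PCT = 80            # assumed goal (e.g., 80% of possible points)
MAX_PER_ITEM = 2               # 0 = no, 1 = good, 2 = excellent

# ratings for (Be Safe, Be Respectful, Be Your Personal Best) in each period
day = {"Class 1": (2, 1, 2), "Recess 1": (1, 1, 2), "Lunch": (2, 2, 1),
       "Class 2": (2, 2, 2), "Recess 2": (1, 2, 1), "Class 3": (2, 2, 2)}

earned = sum(sum(ratings) for ratings in day.values())
possible = MAX_PER_ITEM * 3 * len(day)
pct = earned / possible * 100

print(f"Earned {earned}/{possible} points ({pct:.0f}%) -- "
      f"{'met' if pct >= DAILY_GOAL_PCT else 'did not meet'} the {DAILY_GOAL_PCT}% goal")
```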
Evaluating the RTI Model
• Both formative and summative evaluation should be
conducted
– Annually for formative evaluation
– Every three to five years for summative evaluation
• Process variables
– Self-assessment
– External assessment
– Administrative feedback
– Parent satisfaction
• Outcome Variables
– High-stakes test scores, attendance, ODR
– Percentage of students receiving services at each tier
– Disaggregated data are important to AYP
Review of Important Points
• Progress monitoring is an essential component of RTI
– It is how you evaluate the effectiveness of the intervention
and determine RTI
• Rate of improvement (ROI)
– Can be calculated relative to peers or to a specific criterion
• Data-based decision making
– Three data points required before deciding whether to
adjust an intervention (i.e., make a small accommodation)
– At least 10 to 15 data points often suggested as a
minimum for decisions about making larger modifications
Review of Important Points
• Daily Behavior Report Cards
– Typically used at Tier II
– It is ideal to have the daily report card contain items that
reflect established schoolwide expectations.
• Program Evaluation
– Evaluated by team and by external observer
– Evaluate process variables and outcome variables
– Feedback should be provided to teams
• Parent/Guardian Involvement and Satisfaction
– Often can be gathered in a questionnaire at the end of
problem-solving team meetings and/or parent-teacher
conferences
Questions?
Ben Ditkowsky
[email protected]
http://measuredeffects.com
Gary Cates
[email protected]
http://www.garycates.net