Is Demonstrating Understanding Rewarding? The Relation Between Student Academic
Performance, Student Expression Mode, and Micro Differentiation
Bachelorthesis Onderwijskunde
Universiteit Utrecht
Marije van Braak (3953718)
Group number: 04
Supervisor: J. E. van de Pol
June 09, 2015
Abstract
The present study investigates the relationship between students’ academic performance and
student expression mode in student-teacher interactions. Research found that academically high
performers receive higher levels of micro differentiation (i.e., higher quality of support). The present
study aimed to investigate an explanation for this finding and hypothesized that high-performing
students utilize more demonstrations of understanding than claims of understanding compared to low-performing students. Thereby, they would elicit higher levels of micro differentiation by enabling teacher assessment of students' needs and facilitating dialogic exchange. The method of investigation
included 1) development of a coding scheme for student expression of understanding and 2) analysis
of student and teacher responses in teacher-student interactions in relation to the students’ academic
performance using the developed coding scheme. Results showed that the hypothesized correlation
between student academic performance and student expression mode was absent. Despite the study’s
limitations, findings of subsequently conducted exploratory analyses can serve as a starting point for
future work. In addition, the coding scheme development has contributed to the methodological array
of student-teacher interaction assessment.
Is Demonstrating Understanding Rewarding? The Relation Between Student Academic
Performance, Student Expression Mode and Micro Differentiation
Recent developments in education have focused the attention of politicians, professional
educators, scientists, school boards and parents alike on student diversity in a variety of classrooms
and school types. Since the introduction of the Inclusive Education Law (Wet Passend Onderwijs,
August 1, 2014), adaptive education has become the norm in Dutch educational institutions.
Moreover, the call for adaptive education is prevalent in school reform literature across a broad range
of Western countries (Subban, 2006; Tomlinson, 2003).
Adaptive education refers to education in which the broad range of students’ diversity is taken
into account by tailoring the curriculum and instruction to the learners’ readiness, interest, and
learning profile (Tomlinson, 2003) and can take multiple forms. Park and Lee (2003) distinguish
adaptive approaches at the macro level (e.g., ability grouping) and micro level (e.g., scaffolding, cf.
Van de Pol, Volman, & Beishuizen, 2010). For several reasons, the present research focuses on
interactions taking place in the context of micro differentiation (also internal differentiation, Bosker,
2005; Lowyck & Terwel, 2009; and within-class differentiation [binnenklasdifferentiatie], Coubergs,
Struyven, Engels, Cools, & De Martelaer, 2013).
First, although macro level adaptations seem to be the first adaptations implemented by
schools utilizing adaptive education, meta-analytic comparison of the effectiveness of macro level (d <
0.28) and micro level (d > 0.57) adaptations on student learning (Hattie, 2009) suggests that micro
differentiation is most promising (cf. Fuchs et al., 2008; Maxey, 2013). Other research (Deunk,
Doolaard, Smale-Jacobse, & Bosker, 2015) points out that macro and micro differentiation should be
applied to gain favorable effects, which also underscores the importance of micro differentiation.
Second, research into micro differentiation is relevant because of the crucial but underexposed
role students play in the process (in contrast to macro differentiation). Traditionally, design and
implementation of micro differentiation are carried out by teachers (Van de Pol, Poorthuis, Mainhard,
& Brekelmans, submitted). Teachers assess their students’ ability levels and are responsible for
adapting their instruction according to the students’ needs. Students’ roles in micro differentiation
have received much less attention in micro differentiation research (Van de Pol et al., submitted). Yet,
taking the students’ role into account in teaching is a precondition for successful micro differentiation
(Stone, 1998) and effective instruction (Hattie, 2009). Students influence teachers’ instruction by
asking questions and giving answers; so do student characteristics like academic performance,
motivation, and engagement (Nurmi, 2012). The influence of student academic performance is
especially relevant in the context of micro differentiation, since Nurmi (2012) suggests that teachers
construct specific perceptions of students’ academic performance “as a first step to tailor instruction
for that student” (p. 192 [emphasis added]).
Taking recent research into the relation between student academic performance and micro
differentiation (e.g., Nurmi, 2012; Van de Pol, Poorthuis, Mainhard, & Brekelmans, submitted) as the
starting point for the present study, I raise the question why academic performance influences micro
differentiation. I hypothesize that higher academic performance is positively related to student
expression of understanding during student-teacher interactions; in turn, more informative student
expression of understanding enables better teacher assessment of student level of understanding – and
thus higher levels of micro differentiation (cf. Van de Pol et al., submitted). I aim to provide evidence
for differential expression modes for students with different levels of academic performance by
developing a coding scheme for the analysis of student utterances during student-teacher interactions. I
use the coding scheme to analyze transcripts of student-teacher interactions and relate the participating
students’ grades to the coding results; teacher elicitations of student expression modes are also
analyzed, since these are dependent on and influence student utterances during interaction (Nystrand,
Wu, Gamoran, Zeiser, & Long, 2003; Van de Pol, Volman, Oort, & Beishuizen, 2014). The
development of a coding scheme for student utterance analysis adds to the methodological array of
tools to investigate individual contributions to student-teacher interactions. By analyzing both student
and teacher contributions, I aim to gain further insight into the complex reciprocal nature of
student-teacher interactions and thus contribute to the disentanglement of student and teacher roles in
interaction.
Micro Differentiation
The practice of micro differentiation is rooted in Vygotsky’s concept ‘zone of proximal
development’. This concept is defined as the distance between the actual developmental level and the
level of potential development (Vygotsky, 1987 in Subban, 2006). Adaptive teachers promote
interaction and aim to provide instruction that will extend the learner to a level just beyond the current
developmental level (Subban, 2006). Social interaction thus is an important element of micro
differentiation, since it is interaction that allows learners to progress to the zone of proximal
development (Vygotsky, 1978 in Subban, 2006). Although teacher guidance is a key aspect of these
interactions, the student role is equally essential (cf. Nurmi, 2012; Stone, 1998). In Vygotsky’s theory,
therefore, the student-teacher interaction has a reciprocal nature (Subban, 2006).
The process of micro differentiation involves providing contingent support to students during
instruction, gradually providing less support as students demonstrate increasing task control
(Corno, 2008; Van de Pol & Elbers, 2013). Support is considered contingent when teacher control is
increased in case students fail and decreased in case students succeed – thus contingent support is
adaptive to students’ understanding (Van de Pol & Elbers, 2013).
An example of micro differentiation is scaffolding (Subban, 2006; Wood, Bruner, & Ross,
1976). During the interactive process of scaffolding, responsibility is transferred from the teacher to
the student; the teacher provides contingent, fading support to a student “when performing a task that
the student might otherwise not be able to accomplish” (Van de Pol et al., 2010, p. 274). Note that the
reciprocal nature of the interactive process implies that scaffolding and micro differentiation in
general partly depend on the learner (cf. Van de Pol et al., 2010; Nystrand, Wu, Gamoran, Zeiser, &
Long, 2003). This raises the question whether micro differentiation levels vary between learners. Do
students who perform better academically also receive higher levels of micro differentiation?
Micro Differentiation and Academic Performance
Nurmi (2012) suggests that quantitative and qualitative differences exist between instruction
(including support) given to academically low- versus high-performing students. Low-performing
students need and receive a greater amount (quantity) of instruction and support to proceed with tasks
(Babad, 1993; Brophy & Good, 1984; cf. Nurmi, 2012). Interestingly, more instruction does not
always guarantee more effective instruction: “low achievers receive more instruction and more
learning support than high achievers” (cf. Babad, 1990; Nurmi, Viljarant, Tolvanen, & Aunola, 2012),
but this “instruction is of lower quality” (Babad, 1993, p. 355 [emphasis in original]). Teacher
instruction and support need to be contingent to be effective (Van de Pol et al., submitted) – in other
words: instruction needs to be ‘micro differentiated’ (quality) to be effective.
In light of these findings, the notion that variations in student academic performance evoke
varying levels of micro differentiation is of great importance. High-performing students were found to
experience higher degrees of micro differentiation (i.e., contingent support) than low-performing
students (Van de Pol et al., submitted). Students’ academic performance, therefore, has an “evocative
impact” (Nurmi et al., 2012, p. 572) on the quality of teacher instruction – which in turn influences
student learning outcomes (Nurmi et al., 2012).
Academic Performance and Expression of Understanding
Noting the described relationship between academic performance and micro differentiation,
the question arises why high academic performance evokes higher quality (i.e., higher level of micro
differentiation) instruction. Research suggests that high-performing students are better able to explain
what they do (not) understand (De Bruin & Van Gog, 2012; Van de Pol et al., submitted). Harris and
Rosenthal (1985) state that feedback is – albeit to a limited extent – “more contingent upon the
correctness or incorrectness of responses of [high-performing] students and is directly related to what
the student said” (p. 365, emphasis added; cf. Van de Pol et al., 2010). Student verbal participation
and expression of understanding, therefore, are crucial to the process of micro differentiation. Active
student verbal participation allows for “online diagnosis and accompanying calibration of support
carried out by the teacher” (Stone, 1998, p. 348). Furthermore, student participation turns teacher-student interactions into dialogic exchanges, which are crucial to the emergence of (academic)
understanding: “The transfer of information does not lead to understanding. There has to be a dialogic
exchange, and the response from the other creates the basis for understanding.” (Bakhtin, 1981, in
Flem, Moen, & Gudmundsdottir, 2000, Theoretical Framework, para. 9).
Following the reasoning above, the possibility of assessing student needs and the emergence
of dialogic exchanges – which are important features of micro differentiation (Stone, 1998) – depend
on the content (i.e., what is said) and reciprocity (i.e., to what extent both partners contribute) of the
student-teacher interaction. Teacher contributions to the interaction have been studied before (e.g.,
Van de Pol, Volman, & Beishuizen, 2014), but research concerning student contributions (or student
and teacher contributions) is sparse. Yet, we have seen that the student role in micro differentiation is
highly valuable in its own right and can shed light on the relation between student academic
performance and micro differentiation.
Student contributions to interactions (i.e., what students utter during interaction) in the context
of micro differentiation can be related to student utterances during self-explanation sessions (e.g., Chi
& Bassok, 1989), student-teacher interactions in general (e.g., Koole, 2010), and small group-work
situations (e.g., Van de Pol et al., 2014). Although students do not respond to other interaction partners
when self-explaining, all three situations involve student contributions as a ‘mode of studying’ that
can mediate learning (cf. Chi & Bassok, 1989). Self-explanations express students’ level of
understanding and trigger students to “actively construct an interpretation” of the material to be
learned (Chi & Bassok, 1989); this same mechanism applies to students contributing to student-teacher
interactions in general and during small-group work. Most importantly, we saw that expressing levels
of understanding and actively constructing knowledge (with the help of more knowledgeable others)
are key concepts to micro differentiation (cf. Stone, 1998; Subban, 2006).
Based on Chi and Bassok (1989), Koole (2010) and Van de Pol et al. (2014), I relate student
interaction contributions to academic performance and argue that student contributions to micro
differentiation differ in quantity and quality between low- and high performers. Chi and Bassok (1989)
found that high-performing students generated a greater amount (quantity) of explanations than lowperforming students while studying work-out solution examples. Specifically, both groups of
performers showed a similar distribution of types of elaboration (explanations, monitoring statements,
and miscellaneous others), but high-performing students generated on average four times as many
explanations compared to low-performing students. In addition, qualitative differences between their
explanations were found: high-performing students more often explicated tacit knowledge relevant to
the task and their monitoring statements provided “specific inquiries through which they could search
for an answer” (Chi & Bassok, 1989, p. 27); low-performing students often merely paraphrased given
information and their self-inquiries were general undirected questions.
Koole (2010) makes a similar distinction in student contributions, namely between claims of
understanding (e.g. “Yes, I get it.”, which is a display of knowing) and demonstrations of
understanding (e.g., explanations of a concept, which are displays of understanding; Koole, 2010). The
distinction between claims and demonstrations of understanding is relevant to the emergence of
dialogic exchange mentioned earlier. Student content questions, which might be considered
demonstrations of (non)-understanding, are found to “heighten the dialogic potential” of the
interaction (Nystrand et al., 2003). Furthermore, both modes of expression can aid the teacher in
assessing students’ needs. Demonstrations of understanding, however, “give teachers more detailed
information about the students’ understanding” than claims of understanding (Van de Pol, Volman,
Oort, & Beishuizen, 2014, p. 603). Also, claims of understanding are not necessarily valid or correct
(Van Loon, De Bruin, Van Gog, & Van Merriënboer, 2013). Students often misjudge their
understanding of task content. Demonstrations of understanding, therefore, seem to be most effective.
Expression of Understanding and Teacher Elicitations
By analyzing teachers’ utterances during interaction (in addition to student utterances), further
insight is gained into the essence of the reciprocal nature of interaction and its consequences for
effective implementation of micro differentiation (cf. Bakhtin, 1981, in Flem et al., 2000; Nystrand et
al., 2003). Just as students elicit varying levels of teacher micro differentiation through various modes of expressing their understanding (either as a claim or as a demonstration), teachers elicit varying
student expression modes. Effective teacher contributions in the context of micro differentiation
involve eliciting demonstrations of understanding (e.g., “Can you explain why …?”) instead of mere, less effective claims of understanding (e.g., “Do you understand that?”), resulting in a) better
assessment of student performance and needs, b) improved quality of “their checks of students’
learning”, and c) higher quality of support (i.e., adapting contingently and relating to students’
contributions) (Van de Pol et al., 2014). The interdependent relation between teacher elicitations,
student expression mode, and student academic performance is depicted in Figure 1. Relations
between the concepts are two-way, since they reflect correlations between the concepts: as explained,
adjacent concepts influence each other in both directions.
Figure 1. Schematic representation of the correlations between teacher elicitations, student expression mode (claim versus demonstration of understanding), and student academic performance.
The Present Study
In the present study, I investigate the relation between student academic performance and
student expression mode (as elicited by and facilitating teacher micro differentiation). Integrating the
described research evidence, I hypothesize that the higher a student’s performance level, the more
demonstrations of understanding (s)he will utter during the interaction (in contrast to claims of
understanding). High percentages of demonstrations of understanding would elicit higher degrees of
micro differentiation (i.e., higher quality of support). Since teacher elicitation mode also influences
student expression mode in interactions, I control for teacher elicitation mode during analysis. The
form of teacher and student utterances (e.g., question, response, statement) is not taken into account,
as labeling (elicitations of) claims and demonstrations depends on the content – not form – of what is
said. I assume that utterance form does not influence utterance content.
To test the hypothesis, I developed a coding scheme for analysis of student utterances during
student-teacher interactions in the context of micro differentiation. I used the developed coding
scheme together with an existing coding scheme for analysis of teacher elicitations (Van de Pol et al.,
2014) to analyze student-teacher interactions and relate the results to students’ academic performance.
Analyzed interactions took place during mathematics classes; math interactions were chosen because
of their correspondence with the interactions analyzed by Koole (2010).
Method
Participants
Participants whose interactions were recorded were 45 first-year VMBO-T (pre-vocational secondary education, theoretical track) students from four classes at four secondary schools, aged 12 to 14 years (M = 12.77 years, SD = .56). Four teachers (two male, two female) were recorded in interaction with these students; their teaching experience ranged from 2 to 12 years.
Data
For the analysis of student and teacher contributions to student-teacher interactions, I used
transcripts of student-teacher interaction video fragments recorded during math instruction. These
video fragments and transcripts were part of a larger data set (Van de Pol, Poorthuis, Mainhard, &
Brekelmans, submitted). The video fragments were recordings of content-related dyadic interactions
between one teacher and one student. Van de Pol et al. (submitted) video recorded the interactions by
installing two video cameras in participating teachers’ classrooms for two weeks; teachers wore
microphones; voice recorders were given to every two or four students. Teachers did not receive
teaching instructions. The interactions took place during seatwork time; students could ask for help
with the task when needed. In some cases, provision of help was teacher-initiated.
For the present study, interactions during math classes (recorded between March and June
2013) were analyzed. For technical reasons, 12 transcripts were excluded from the data set, resulting
in 153 transcripts with a total duration of 122.08 minutes. Of these transcripts, 40.66 minutes of
material (33.3%, 51 transcripts) were used for coding scheme development. The remaining 102
transcripts (81.43 minutes) were coded for analysis of student and teacher interaction contributions.
Instruments
Student expression of understanding. Student expression of understanding was represented
by the continuous variable ‘percentage of demonstrations of understanding’. I analyzed students’
utterances during the recorded interactions using a developed coding scheme, since no coding scheme
for student expression of understanding during student-teacher interaction existed. While developing
the present coding scheme, I followed the principles of directed content analysis, in which initial key
coding categories are identified using existing theory or prior research and secondary codes are
developed using data (Hsieh & Shannon, 2005). The codes thus emerged from an interaction of theory
and data (cf. Miles & Huberman, 1994).
The frame of the coding scheme was formed by the two major categories used by Koole
(2010) to describe student expression of understanding: demonstration of understanding and claim of
understanding. Demonstrations of understanding are defined as student utterances that show how a
student understands or interprets the subject matter, e.g., “Oh, five times fifteen is what you do” as a
reformulation of ‘five quarters’ (Koole, 2010; cf. Van de Pol, Volman, Oort & Beishuizen, 2014).
Claims of understanding are student utterances containing a confirmation whether or not the student
has understood, e.g. “Yes, I get it” (Koole, 2010; cf. Van de Pol et al., 2014). These two coding
categories, both derived from theory and defined before data analysis, form the top-down codes of the
coding scheme (Hsieh & Shannon, 2005).
Using the coding categories demonstration and claim of understanding, I read through 51
randomly selected transcripts of 40.66 minutes (total duration). During coding, I also used the video
fragments to assist interpretation of utterances. I chose the utterance (i.e. the speech produced by a
student and delimited by pauses or teacher contributions) as my unit of analysis, since a demonstration
or claim of understanding can be expressed in any physical linguistic unit (word, sentence, etc.; Zhang
& Wildemuth, 2009). I coded student utterances into the demonstration or claim categories and sub-labeled them (cf. Hsieh & Shannon, 2005) to further refine and explain the major coding categories
(Weston et al., 2001). To differentiate between codes, I identified words that tended to be markers of
these codes or coding categories during the process (e.g., ‘moet’ [Dutch for ‘must’] as a marker of request confirmation approach, a word that would not be found in utterances to be coded as approach; cf. Weston et al.,
2001). These words, as well as operational definitions of the codes, exemplars of student utterances
per code, and transcript sentences where the codes originated from, were documented in a codebook to
enhance consistent coding and serve as a written representation of the development process (cf. Hsieh
& Shannon, 2005; Weston et al., 2001).
After having coded the selected transcripts, I clustered codes by means of comparison (“what
things are like each other/unlike each other?”, Miles & Huberman, 1994; e.g. the code request
explanation was merged with claim state+content, see Table 1). I aimed to preserve the detailed
refinement of demonstrations and claims of understanding, but at the same time pursued parsimony.
The latter aim was also addressed during pilot analysis of the resulting coding scheme. The resulting
coding scheme is depicted in Figure 2, see also Appendix I for the complete code book.
Figure 2. Developed coding scheme for analysis of student expression of understanding in student-teacher interaction.
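For readers who want to tally codes automatically, the hierarchy of Figure 2 can be represented as a simple nested mapping. The sketch below is illustrative only and assumes the category labels reported in Table 2; it is not part of the study's NVivo workflow.

```python
# Illustrative representation of the coding-scheme hierarchy (cf. Figure 2 and
# Appendix I); labels follow the categories reported in Table 2.
CODING_SCHEME = {
    "demonstration of understanding": [
        "request confirmation approach",
        "request confirmation solution",
        "request confirmation meaning",
        "solution",
        "approach",
        "comment approach/solution",
        "meaning",
        "approach + solution",
    ],
    "claim of understanding": [
        "state",
        "state + content",
        "state + explanation",
        "state + content + explanation",
    ],
    "miscellaneous": [],  # non-content related student utterances
}


def main_category(sub_code: str) -> str:
    """Return the major category a (sub) code belongs to."""
    for major, subs in CODING_SCHEME.items():
        if sub_code == major or sub_code in subs:
            return major
    raise ValueError(f"Unknown code: {sub_code!r}")
```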
During the development of the coding scheme, I aimed to ensure content validity by using
concept labels that were grounded (Sacks, 1992), tested (Koole, 2010), and related to micro differentiation (Van
de Pol et al., 2014) in earlier research (cf. Field, 2013). Data-based concept labels were compared to
codes developed in the context of student utterances in general classroom interactions (Nassaji &
Wells, 2000; Wells, 2010) to further ensure validity.
Student academic performance. Data on student academic performance were available from
Van de Pol et al. (submitted). The level of student academic performance was represented by the
continuous variable student grade. Participants’ grade marks for the subject mathematics (M = 6.73,
SD = 1.06) ranged from 3.25 to 9.14.
Teacher elicitations. Teacher elicitations were coded using an existing coding scheme (Van
de Pol et al., 2014). The unit of analysis was a teacher utterance (i.e., the speech produced by a teacher
and delimited by pauses or student contributions), which was coded for the “mode of response elicited
from the student”. Teacher utterances were coded as a) a claim of understanding (e.g., “but the reason
why they go together is clear right?”), b) a demonstration of understanding (e.g., “okay, so now you
try to explain why they go together and you use that word”), or c) no elicitation (e.g., “yes, but how are
they put together? those different institutions? are they being elected or are they appointed, have a look
at that. then you can come up with another reason.”). In the study, ‘percentage of demonstrations of
understanding elicitations’ was used as a continuous variable representing teacher elicitation mode.
Design and Procedure
Coding of the transcripts was done using the QSR NVIVO 10 software for qualitative analysis
of (textual) information. Using the described coding schemes, I first pilot coded 20% of the remaining
math instruction transcripts to ensure reliable coding. Sixteen minutes of data (16 transcripts, 20% of
remaining transcripts, 13.4% of total) were coded independently by a second coder and me. Again,
video fragments were used to assist interpretation of utterances. Student and teacher utterances were
coded separately. Cohen’s kappa over all codes was .70 after the first coding round, which indicates
substantial agreement (following the norms of Landis & Koch, 1977). Kappas for student utterance
coding and teacher utterance coding were .66 and .87, respectively, which indicate substantial student
code agreement and almost perfect teacher code agreement (Landis & Koch, 1977). Four (sub) codes
in our pilot analysis showed less than substantial intercoder agreement (κ < .60): demonstration
comment on approach/solution, demonstration meaning, demonstration approach, and demonstration
request confirmation meaning. Utterances assigned to these codes were analyzed, after which I further
explicated their operational definitions in the code book. In addition, a rule was added to the code
book before further coding was done (marked in red in Appendix I).
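The agreement figures above were obtained with the study's own tooling; purely as an illustration, Cohen's kappa for two coders' labels over the same pilot utterances could be computed as follows (the label lists are invented).

```python
# Illustrative computation of Cohen's kappa for two coders; the labels below
# are made up and merely stand in for the pilot-coded utterances.
from sklearn.metrics import cohen_kappa_score

coder_1 = ["demonstration", "claim", "miscellaneous", "demonstration", "claim"]
coder_2 = ["demonstration", "claim", "demonstration", "demonstration", "claim"]

kappa = cohen_kappa_score(coder_1, coder_2)

# Landis and Koch (1977): .61-.80 = substantial, .81-1.00 = almost perfect.
print(f"Cohen's kappa = {kappa:.2f}")
```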
After pilot coding, I coded the transcripts that had not yet been coded (65.36 minutes, 86
transcripts, 53.5% of total) using the teacher elicitation codes and the final version of the student
utterance coding scheme. I coded the transcripts twice (once for student utterances, once for teacher
elicitations) to reduce coding bias and used the corresponding video fragments to assist coding.
Following the final coding, I calculated percentages of demonstrations of understanding (of
total number of student utterances) and percentages of elicitations of demonstrations of understanding
(of total number of teacher utterances) as well as percentages of every sub code (in case of student
utterances). Finally, I used student numbers to link transcripts and student grades, enabling analysis of
the relation between student expression of understanding and academic performance.
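As a rough sketch of this aggregation step (file and column names are assumptions, not the study's actual data files), the per-student percentages and the grade link could be computed as follows.

```python
# Hypothetical aggregation sketch; 'coded_utterances.csv' and 'grades.csv'
# are assumed file layouts, not the study's actual data set.
import pandas as pd

codes = pd.read_csv("coded_utterances.csv")   # columns: student_id, speaker, major_code
grades = pd.read_csv("grades.csv")            # columns: student_id, grade

students = codes[codes["speaker"] == "student"]
teachers = codes[codes["speaker"] == "teacher"]

# Percentage of demonstrations among all coded student utterances, per student.
pct_dem = (students.groupby("student_id")["major_code"]
           .apply(lambda c: (c == "demonstration").mean() * 100)
           .rename("pct_student_demonstrations")
           .reset_index())

# Percentage of demonstration elicitations among all teacher utterances, per student.
pct_elic = (teachers.groupby("student_id")["major_code"]
            .apply(lambda c: (c == "elicitation of demonstration").mean() * 100)
            .rename("pct_teacher_demonstration_elicitations")
            .reset_index())

data = grades.merge(pct_dem, on="student_id").merge(pct_elic, on="student_id")
```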
Analyses
Analysis was done using SPSS Statistics 20. I investigated the relation between student
expression of understanding and student academic performance by conducting a partial correlation
between the continuous variables ‘percentage of demonstrations of understanding’ and ‘grade mark’,
while controlling for teacher influence by adding ‘percentage of demonstrations of understanding
elicitations’ as a control variable. In addition, I exploratively examined the distributional differences between (sub) codes.
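A minimal sketch of such a first-order partial correlation (computed here by residualizing both variables on the control variable; the study itself used SPSS) is given below. Variable names follow the hypothetical data frame from the aggregation sketch above.

```python
# Minimal sketch of a partial correlation via residuals; variable names are
# assumptions. Note that pearsonr's p-value is based on n - 2 degrees of
# freedom, whereas a dedicated partial-correlation routine (e.g., SPSS) uses
# n - 3 for one control variable, so the p-value differs slightly.
import numpy as np
from scipy import stats


def residualize(y, control):
    """Residuals of y after removing its linear dependence on the control."""
    slope, intercept = np.polyfit(control, y, deg=1)
    return y - (slope * control + intercept)


def partial_corr(x, y, control):
    x, y, control = (np.asarray(v, dtype=float) for v in (x, y, control))
    return stats.pearsonr(residualize(x, control), residualize(y, control))


# Example call with the (assumed) aggregated data frame:
# r, p = partial_corr(data["pct_student_demonstrations"], data["grade"],
#                     data["pct_teacher_demonstration_elicitations"])
```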
Results
Distribution of Assigned Codes
In the current study, transcripts of math lessons were coded for data analysis using a
developed coding scheme for coding of student utterances and an existent coding scheme for teacher
utterances. A coded example is displayed in Table 1 in Appendix II. The resulting distribution of
codes over the total of transcript utterances is presented in Table 2 below, which depicts the
percentage of total codes assigned to particular codes.
As listed in Table 2, student utterances were most often coded as demonstration (of
understanding), closely followed by miscellaneous. Less than 25% of the codes were coded claim (of
understanding). Relative frequencies of sub codes detailed under demonstration differ considerably and reveal the three most frequently assigned codes: request confirmation, solution, and approach. The
codes comment approach/solution, meaning, and approach + solution occur considerably less
frequently. Inspection of the request confirmation categories underscores the relatively low use of
codes related to meaning, although the request confirmation solution code was not extensively used
either. Sub code percentages for claim categories show that utterances containing explanations of
claims were not widely attested. Approximately one third of the total number of claims are statements
of (non-)understanding, whereas two thirds are a statement and the content of (non-)understanding.
Teacher elicitations were unevenly distributed over the codes no elicitation, demonstration
and claim. Teacher utterances regarded as elicitations of claims were the least frequent, even more markedly so than student claim utterances. It appears that 6.2% of the
teacher elicitations were not content-related. The demonstration code was assigned to almost 20% of
the utterances, but the large majority of the utterances (70.9%) was coded as no elicitation.
Table 2
Code percentages (percentages of total codes assigned to each code) in descending order

Code type        Code                                        % of codes assigned to this code
student codes    demonstration of understanding                          39.6
                   request confirmation                                  12.4
                     request confirmation approach                        8.3
                     request confirmation solution                        2.4
                     request confirmation meaning                         1.7
                   solution                                               9.3
                   approach                                               9.0
                   comment approach/solution                              4.7
                   meaning                                                2.7
                   approach + solution                                    1.5
                 miscellaneous                                           37.3
                 claim of understanding                                  23.1
                   state + content                                       14.3
                   state                                                  7.2
                   state + content + explanation                          0.9
                   state + explanation                                    0.4
teacher codes    no elicitation                                          70.9
                 elicitation of demonstration                            18.7
                 elicitation of claim                                     4.2
                 miscellaneous                                            6.2

Note. req. conf. = request confirmation. Non-content related teacher elicitations were not coded, following Van de Pol et al. (2014). Non-content related student utterances were coded under miscellaneous and thus were included in the total number of coded student utterances; similarly, non-coded (non-content related) teacher utterances were also included in the total number of teacher utterances, namely under miscellaneous.
Relation between Student Expression of Understanding and Student Academic Performance
To assess the relationship between the percentage of student demonstrations of understanding
and student grades, a partial Pearson’s correlation was calculated. The assumptions of normality and
linearity were not confirmed by a Shapiro-Wilk normality test and visual inspection of a
scatterplot of percentage of demonstrations against grade. Therefore, I ran a bootstrapping procedure
using 2000 bootstrap samples (cf. Field, 2013, p. 199) before analyzing the data.
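By way of illustration, a percentile bootstrap of a bivariate correlation with 2000 resamples (cf. Field, 2013) could look as follows; the seed and function name are arbitrary choices, not the study's actual procedure.

```python
# Illustrative percentile bootstrap (2000 resamples) of a Pearson correlation.
import numpy as np
from scipy import stats


def bootstrap_corr(x, y, n_boot=2000, seed=0):
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    rng = np.random.default_rng(seed)
    n = len(x)
    rs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)            # resample cases with replacement
        rs[b] = stats.pearsonr(x[idx], y[idx])[0]
    lower, upper = np.percentile(rs, [2.5, 97.5])   # 95% percentile interval
    bias = rs.mean() - stats.pearsonr(x, y)[0]
    return {"bias": bias, "ci_95": (lower, upper)}
```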
Descriptive statistics of the variables percentage of demonstrations of understanding, grade
and percentage of elicitations of demonstrations are given in Table 3 below. The correlation between
grade and percentage of student demonstrations was almost zero, r(81) = .022, p = .422 (one-tailed).
Adding the control variable percentage of teacher demonstration elicitations slightly increased the
correlation size, r(80) = .088, p = .215 (one-tailed). Neither correlation reached significance (α = .05).
Table 3
Descriptive statistics and bootstrap data on variables ‘grade’, ‘percentage of demonstrations of
understanding’, and ‘percentage of elicitations of demonstrations’

Variable                                  n     M      SD     range        bootstrap bias (SE) of M    bootstrap bias (SE) of SD
grade                                     83    6.73   1.06   3.25-9.14          .003 (.11)                  -.008 (.09)
% student demonstrations                  83    0.39   0.25   0.00-1.00         -.000 (.03)                  -.002 (.02)
% teacher demonstration elicitations      83    0.19   0.20   0.00-0.67         -.000 (.02)                  -.002 (.01)
Explorative Analyses
Although the described data analysis did not reveal a relationship between student academic
performance and student expression mode, several considerations triggered further analyses. First,
since incidences of sub codes of the main student code demonstration appeared to vary greatly (both
with respect to frequency and distribution over transcripts), I suspected that the resulting correlation
between sub codes and student grades might vary as well. Second, although higher performing
students did not appear to use a higher percentage of demonstrations, it might still be expected that they would use fewer claims (which are less informative to the teacher with regard to assessment of
student needs) than lower performing students. In addition, I had noticed several small but striking
differences between transcripts during the coding process. Shorter transcripts of student-teacher
interactions seemed to include irrelevant (miscellaneous) utterances more often than longer transcripts.
In general, I was inclined to think that code incidences differed across transcripts with different
lengths (i.e. different durations of the transcribed interactions). Furthermore, I noticed marked
variation between transcripts featuring different teachers.
These considerations together induced me to carry out several explorative analyses,
to obtain better insight into how the data were structured and determine whether the expectations
would be met. The results of these analyses are displayed below. Bootstrap procedures were applied
for all analyses if relevant; unless otherwise noted, analyses are two-tailed at α = .05. Only correlations
of at least r = .1 (i.e., at least ‘small’; cf. Cohen, 1988) are reported.
Student grade - student demonstration sub codes. Student grade and percentage of student
demonstrations (the latter being the aggregate of all demonstration codes) were not significantly
correlated. Despite that, and considering the variation in sub code frequencies, the unequal
distribution of sub codes over transcripts, and the variation in content of sub codes, it might be
expected that correlations between sub codes and student grade differ between sub codes of the main
demonstration code. Even so, additional analyses of student demonstration sub codes revealed just two
small but non-significant correlations: between student grade and percentage of approach/solution
(i.e. the code assigned to utterances that demonstrate (non-)understanding by explaining or giving
information on an approach/procedure and a solution; r(81) = .112, p = .312) and between student
grade and percentage of request confirmation solution (i.e. the code assigned to utterances that
demonstrate (non-)understanding by asking whether a solution is correct or intended; r(81) = .130, p =
.243). Adding the control variable percentage of teacher demonstration elicitations to these two
correlations yielded comparable and still non-significant results.
Student grade - claim sub codes. Codes assigned to student claims of (non-)understanding
were analyzed similarly to the codes assigned to student demonstrations of (non-)understanding. The
correlation between student grade and percentage of claims of (non-)understanding appeared close to
zero and was non-significant, both with and without controlling for the percentage of teacher claim
elicitations. Furthermore, all correlations between student grade and student claim sub codes were
smaller than r = .10; none reached significance.
Duration of interactions. Upon inspection of the coded transcripts, many student utterances in very short transcripts (i.e., short interactions) appeared to be coded as miscellaneous (non-content related). Longer transcripts, on the other hand, often seemed to contain more student demonstrations and claims than shorter transcripts. Furthermore, variations in transcript length seemed to be associated with corresponding variations in students’ grades. Together, these observations triggered inclusion of the variable duration (operationalized as the duration of the full interaction in seconds). If students’ grades correlated with duration, and duration in turn correlated with particular (sub) codes, students’ grades could be indirectly linked to those codes via duration.
The correlation between students’ grades and duration was non-significant (and smaller than r = .1).
Correlations between (student and teacher) code percentages and duration are provided in Table 4, which shows that longer interactions (i.e., high duration) are associated with significantly
lower percentages of demonstrations coded as comment on approach/solution, but higher percentages
of demonstrations coded as solution or approach. No correlation was found for percentage of
demonstrations in general. In contrast, a significant negative correlation was found between
duration and percentage of claims in general. Of the claim subtypes, only state + content appeared to
be significantly and negatively related to duration.
Longer transcripts (i.e., high duration) also appeared to include higher percentages
of utterances coded as elicitation of demonstration, but lower percentages of utterances coded as no
elicitation.
As can be observed from Table 4, significant correlations between student codes and
duration became non-significant when percentages of teacher elicitations were controlled for.
Table 4
Bivariate and partial correlations (controlled for percentage of teacher elicitations) between
percentages of assigned codes and duration of interaction

Correlation between duration and …        r        p        controlled r    controlled p

student codes
  demonstration
  comment approach/solution             -.219     .047*        -.205           .069
  meaning                                .141     .204          .146           .185
  solution                               .246     .025*         .197           .101
  approach                               .226     .040*        -.145           .190
  approach/solution                     -.110     .329
  request confirmation                  -.136     .228
  request confirmation meaning
  request confirmation approach         -.140     .206
  request confirmation solution
  claim                                 -.225     .041*        -.123           .277
  state                                 -.130     .251
  state + content                       -.182     .099*
  state + explanation
  state + content + explanation
teacher codes
  elicitation of demonstration           .275     .012*         n.a.           n.a.
  elicitation of claim                                          n.a.           n.a.
  no elicitation                        -.247     .024*         n.a.           n.a.

Note. Correlations smaller than r = .1 are not presented. n.a. = not applicable. * p < .05.
Teacher differences. Upon visual inspection of students’ grades and the students’ teachers, I
noticed considerable variation in the grades of students of different teachers. Since dissimilar
variances and location of scores would possibly negatively influence the comparability of student
grades across teachers (and thus the comparability of students in the sample), I carried out a
one-way ANOVA with teacher number as independent variable and student grade as dependent
variable. The assumption of equal variances was not violated, as indicated by Levene’s test (F(3, 79) =
1.345, p = .266). The normality assumption was violated for teachers 12, 20 and 38. Although
ANOVA is sometimes seen as relatively robust against normality violations (Field, 2013), a
bootstrapping procedure was applied.
Student grades differed significantly between teachers, F(3, 79) = 14.881, p < .001. Post hoc
analyses using Tukey HSD indicated that teacher 25 differed significantly from teachers 12, 20 and
38; teacher 38 differed significantly from teachers 12, 20 and 25. Thus, three ‘subsets’ of teachers
could be constructed: subset 1 containing teacher 25 (M = 7.86, SD = 0.60), subset 2 containing
teacher 38 (M = 5.76, SD = 1.21), and subset 3 containing teachers 12 and 20 (M = 6.68, SD = 0.82).
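A sketch of this teacher-differences check with common Python tooling is shown below (the study itself used SPSS; the file and column names are assumptions).

```python
# Illustrative check of teacher differences in student grades: Levene's test,
# one-way ANOVA and Tukey HSD post hoc comparisons. File/column names assumed.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

data = pd.read_csv("student_level_data.csv")   # columns: teacher_nr, grade (assumed)
groups = [g["grade"].to_numpy() for _, g in data.groupby("teacher_nr")]

levene_stat, levene_p = stats.levene(*groups)  # equal-variances assumption
anova_F, anova_p = stats.f_oneway(*groups)     # do mean grades differ between teachers?

tukey = pairwise_tukeyhsd(endog=data["grade"], groups=data["teacher_nr"], alpha=0.05)
print(f"Levene p = {levene_p:.3f}, ANOVA F = {anova_F:.3f}, p = {anova_p:.3f}")
print(tukey.summary())
```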
Separate correlation analyses were conducted for each teacher subset. The resulting
correlations are presented in Table 5, together with the correlation for the complete sample.
Table 5
Correlation between student grade and percentage of student demonstrations, controlled for
percentage of teacher elicitations of demonstrations, for the entire sample and three sub samples

sample                               N     r       p       controlled r    controlled p
full sample                          83    .022    .422        .088            .215
sub sample 1 (teacher 25)            15    .040    .888       -.159            .588
sub sample 2 (teacher 38)            14    .041    .889        .034            .912
sub sample 3 (teachers 12 and 20)    54    .050    .720       -.017            .904
The presented subsets reveal dissimilar correlation patterns for the separate sub samples; the controlled correlations for the sub samples are even in opposite directions. Note, however, that none reached
significance. Based on the dissimilarity between correlation patterns for different teachers, one could
expect correlations between student grade and sub codes to differ as well. To test this expectation, I
analyzed these correlations for each separate sub sample after applying a bootstrap procedure. The
results are displayed in Table 6 (see Appendix III).
A major observation from Table 6, focusing on the significant correlations (which are all medium sized), is again the presence of opposite correlation directions for teachers 38 and 12/20 (see code state + content + explanation, which can be seen as the most informative type of claim). For teacher 38, higher performing students used significantly more claims that included a statement of, the content of, and the reason for (non-)understanding. High performing students taught by teachers 12 and 20, on the contrary, used significantly fewer of these.
Extreme cases. Still, the described explorative analyses did not substantially clarify the differences in code use between high and low performing students. Therefore, I decided to select students with grades deviating more than one standard deviation (SD = 1.06) from the mean student grade (M = 6.73). Scatterplots of data from these students (N = 13, Nlow = 7, Nhigh = 6) are
depicted in Figure 3 below. Taking the already described teacher grading differences into account,
data points in each plot are marked by teacher number.
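The selection rule can be written out in a few lines; the sketch below uses assumed column names and is not the study's actual script.

```python
# Illustrative selection of 'extreme cases': grades more than one SD from the mean.
import pandas as pd

data = pd.read_csv("student_level_data.csv")   # columns: student_id, grade, teacher_nr (assumed)
m, sd = data["grade"].mean(), data["grade"].std()

extreme = data[(data["grade"] < m - sd) | (data["grade"] > m + sd)]
low = extreme[extreme["grade"] < m]
high = extreme[extreme["grade"] > m]
# In the present data this yields 13 students (7 low and 6 high performing).
print(len(extreme), len(low), len(high))
```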
None of the four plots shows a clear relation between the variables plotted. Note, though, that
data points associated with one teacher often cluster together in one plot. Importantly, as can be
observed in all plots, each student (except for one) scoring more than one standard deviation above
average is taught (and graded) by one particular teacher. This observation is in line with the already
described differences between grades of students taught by different teachers.
Figure 3. Scatterplots of students with extreme grades (M +/- 1 SD): a. student grade against
percentages of student demonstrations, b. student grade against percentages of student claims, c.
student grade against percentages of teacher elicitations of demonstration, and d. student grade against
percentages of teacher elicitations of claims. Data points are marked by teacher number.
Note. Spe_dem = percentage of student demonstrations; Tpe_dem = percentage of teacher elicitations of demonstrations;
Spe_cl = percentage of student claims; Tpe_cl = percentage of teacher elicitations of claims; teachernr = teacher number;
grade = student grade.
When data points from students taught by teacher 38 are considered in isolation, two remarkable
patterns can be observed. First, in Figure 3a., the higher a student’s grade, the higher their percentages
of demonstrations seem to be. Second, in Figure 3d., higher student grades appear to be associated
with lower percentages of teacher elicitations of claims. Furthermore, isolated inspection of grades of
students taught by teacher 20 in Figure 3c. seems to indicate that higher student grades are
associated with lower percentages of teacher elicitations of demonstration. The latter association was
also found for the entire data set: a small but non-significant correlation between student grade and percentage of teacher elicitations of demonstrations, r(81) = -.156, p = .159.
Although further analyses of extreme cases using sub code distributions would potentially
provide additional information about the differences in student expression modes between high and
low performing students, I decided not to conduct any further explorative analyses. I expected these
analyses to merely provide information about the differences between students in different classes
(since extremely high scoring students were almost exclusively from one class or teacher), thus not
reliably reflecting differences between high and low scoring students.
Discussion
The aim of the present study was to investigate the relation between student academic performance
and student expression mode (as elicited by and facilitating teacher micro differentiation). I expected
student-teacher interactions to reflect varying degrees of micro differentiation depending on the level
of students’ academic performance. I thus hypothesized that the higher a student’s performance level,
the more demonstrations of understanding (s)he would utter during the interaction (in contrast to claims
of understanding). Since teacher elicitation mode was expected to influence student expression mode, I
controlled for teacher elicitation mode during analysis. I developed a coding scheme to enable
assessment of the student utterances during student-teacher interactions; teacher utterances were coded
using an existing coding scheme (cf. Van de Pol et al., 2014). Before turning to an evaluation of the
formulated hypothesis, I briefly discuss the distribution of codes assigned to student and teacher
utterances in the analyzed transcripts.
First, analysis of the distribution of assigned codes revealed that almost 40 percent of student
utterances were demonstrations of understanding, one third of which were requests of confirmation
(mostly of approaches taken, which were demonstrated while requesting confirmation). Claims of
understanding were used far less frequently by students (slightly more than 20 percent). The
majority of these claims consisted of a statement and the content of (non-)understanding. These results
suggest that student utterances that enable assessment of students’ needs (i.e. demonstrations) are
definitely not scarce. This finding, however, is slightly nuanced by the prevalence rates of non-content
related student utterances (37.3 percent, which is comparable to demonstration prevalence rates).
In addition, taking the percentages of teacher elicitations of demonstrations (18.7) and claims
(4.2) into account, the proportion of teacher elicitations to student utterances is much smaller for claims (1:5.5) than for demonstrations (1:2.1; the arithmetic is worked out after this paragraph). Students thus seemed to produce markedly more
non-elicited claims than non-elicited demonstrations. Note, though, that the relation between teacher
elicitations and student utterances cannot causally or correlationally be inferred from these data, since
adjacent teacher and student utterances were coded separately and in isolation. Further analysis should
take into account the functional relation between teacher and student codes (by recording and
analyzing whether particular student utterances are elicited by particular teacher utterances) to confirm
the provisional hypothesis. Analyses that focus on sequential features of the data, for example by
“close, turn-by-turn sequential analysis of the talk” (Schegloff, Koshik, Jacoby, & Olsher, 2002, p. 18)
or applying the principles of Conversation Analysis (cf. Schegloff, Koshik, Jacoby, & Olsher, 2002),
offer “the best chance for illuminating dynamic processes of interaction” (Bakeman & Gottman,
1997). Considering the reciprocal nature of student-teacher interactions (cf. Van de Pol et al.,
submitted), conducting these analyses would enable clarification of the hypothesized interdependency
between student and teacher utterances.
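As indicated earlier in this section, the elicitation-to-utterance proportions follow directly from the Table 2 percentages:

claims: 4.2 : 23.1 = 1 : (23.1 / 4.2) ≈ 1 : 5.5
demonstrations: 18.7 : 39.6 = 1 : (39.6 / 18.7) ≈ 1 : 2.1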
Noting the high percentage of non-eliciting teacher utterances and considering the positive
effects of promoting elicitations of demonstrations for teacher diagnostic purposes (cf. Van de Pol et
al., 2014), I think that the presented distribution (of student and teacher codes together) illustrates
room for improvement of teacher diagnostic strategies. This is in line with earlier research, which has
indicated that teachers consider diagnosis of student understanding very difficult and has suggested
diagnosis-based teacher support (Wittwer, Nückles, Landmann, & Renkl, 2010). The distribution of
codes obtained within the present study provides further insight into the structure of student-teacher
interactions. At the same time, future work employing already mentioned sequential analyses of
interactions is needed to validate and expand knowledge about student-teacher interaction patterns –
especially with a focus on the still inconclusively answered question whether teacher utterances and
student utterances indeed directly influence each other and how this facilitates student understanding.
A major finding of the current study is the absence of a significant correlation between student
academic performance and student expression of understanding (both for student use of
demonstrations, and for student use of claims). Thus, the hypothesis that higher grades would be
correlated with higher percentages of student demonstrations (controlling for teacher elicitations of
demonstrations) was not supported by the obtained results. Although the opposite was expected, these
findings correspond to some findings of Chi and Bassok (1989). They report similar distributions of
content-related utterances during self-explanation for high- and low-performing students (although
absolute figures for the two groups of students were not comparable; cf. Chi & Bassok,
1989). On the other hand, Chi and Bassok did find qualitative differences between the contents of
content-related utterances of high- and low-performing students. In the present study, however,
subtypes of student demonstration and claim utterances did not correlate with student academic
performance.
Although the present findings seem to challenge earlier described inferences from literature
regarding the relation between student expression mode and academic performance (cf. Chi & Bassok,
1989; Koole, 2010; Van de Pol et al., 2014), careful interpretation of the results is required for several
reasons. First, the analyzed data did not optimally fit the requirements for conducting correlation
analyses. Bootstrap procedures were carried out to cope with this drawback. Yet, it is not clear to what extent (if any) not fully meeting the requirements has influenced the validity of the correlations (cf.
Field, 2013). Second, although steps were taken to ensure validity and reliability of the coding, the
entire data set (i.e., all transcripts) was coded only once and by a single coder. Ideally, double coding would
have functioned as an extra reliability check, as would have comparison of teacher utterance coding
with excerpts originally coded in Van de Pol et al. (2014). Potential differences between the present
teacher utterance coding and teacher utterance coding in Van de Pol et al. (2014), from which the teacher coding scheme originated, call for prudent comparison of certain results and conclusions.
Another point to take into account while interpreting the present findings is the substantial dissimilarity between grades of students taught by different teachers. As reported,
grading patterns differed significantly between teachers. Subsequently, students taught by different
teachers might have received identical grades (for instance, a 7.00) and still perform at different levels
of academic performance (which is not unlikely when teacher A’s students’ grades average around
5.76, while teacher B’s students’ grades average around 7.86). This would imply that the validity
of student grades as a measure of student academic performance would be dubious. Questioning this
validity, however, might prove problematic. Student grades were assumed to be valid measures of
academic performance by Van de Pol et al. (submitted), whose conclusions constitute part of
the research basis of the present study. Questioning grade validity would consequently question prior
conclusions that led to the current research question. On the other hand, taking the issue of grades as a
valid measure of academic performance more broadly, research using different types of grades (for
standardized tests, seatwork, exams, essays) in regression analyses on motivation and self-regulated
learning components and academic performance shows that results can be dissimilar, depending on the
grade type used (cf. Pintrich & De Groot, 1990).
Although one could argue that grading patterns differed between teachers, one could also assert that teachers simply teach differently (i.e., one teacher teaches better than another), resulting in higher grades for students of good teachers. Furthermore, student performance could have been unevenly distributed over classes, resulting in a class with relatively many high performing students. Notwithstanding these explanations, the differences between grades of different teachers’ students highlight the relevance of using analyses that take the nested character of the data into account. Correlations
between student expression mode and academic performance were of different size and directions
when data were analyzed separately for subsets of teachers. This might explain why the expected
relation was absent when the entire data set was analyzed. To gain further insight into the
hypothesized relation, including a larger number of teachers (and thus increasing sample size) and
conducting analyses that consider the nesting of data, e.g. multilevel analyses, might prove fruitful (cf.
Van de Pol et al., submitted).
A third major point of note is that exploratory analyses of extreme cases hinted at the
absence of the assumed positive correlation between academic performance and teacher elicitations of
demonstrations (these elicitations were argued to be most effective and beneficial to teacher
adaptivity, cf. Van de Pol et al., 2014). One of the present study’s aims was to test whether the relation
between academic performance and teacher adaptivity could be explained by a relation between
academic performance and student expression of understanding. Yet, contrary to findings in Van de
Pol et al. (submitted), in our study higher student academic performance seemed to be associated with
lower levels of teacher elicitations of demonstrations (and claims, too). I interpret this discrepancy
between both studies as another indication of the potential usefulness of expanding the present data set
for future research. Encompassing data on subjects other than mathematics could be part of that,
although it is unclear whether (and if so, why) student-teacher interactions during mathematics classes
would yield different results compared to interactions during other classes.
In sum, the present study did not find evidence for the hypothesis that the higher students
perform, the more demonstrations of understanding they use. The result should be interpreted with
caution due to the nature of the data, i.e. its nested character and limited scope. Despite the limitations,
the present study’s development and evaluation of a coding scheme for student utterances contributes
to the methodological array of student-teacher interaction assessment. Furthermore, the study has
provided valuable insight into the content of student-teacher interactions. The study constitutes a first
step toward research into the actual composition and role of student contributions to micro differentiation.
As such, findings can serve as a starting point for future work.
Additionally, the current research is also of practical relevance. Although earlier research has
shown eliciting demonstrations to be most effective in providing adaptive support (Van de Pol et
al., 2014), the majority of teacher utterances analyzed in the present research were not elicitations.
Support for teachers focused on increasing the proportion of demonstration-eliciting teacher contributions
might help to improve the level of teacher adaptivity during student-teacher interactions. This
suggestion can serve as part of educationalists’ reply to the recent national, political and educational
call for (improvement of) adaptive education.
References
Babad, E. (1990). Measuring and changing teachers’ differential behavior as perceived by students and
teachers. Journal of Educational Psychology, 82, 683–690. doi:10.1037/0022-0663.82.4.683
Babad, E. (1993). Teachers’ differential behavior. Educational Psychology Review, 5, 347-376.
doi:10.1007/BF01320223
Bakeman, R., & Gottman, J. M. (1997). Observing interaction: An introduction to sequential analysis.
Cambridge: Cambridge University Press.
Bosker, R. J. (2005). De grenzen van gedifferentieerd onderwijs (Inaugurele rede). Rijksuniversiteit
Groningen.
Brophy, J., & Good, T. L. (1984). Teacher behavior and student achievement [Occasional Paper No.
73]. East Lansing: Institute for Research on Teaching.
Chi, M. T. H., & Bassok, M. (1989). Learning from examples via self-explanations. In L. B.
Resnick (Ed.), Knowing, learning and instruction: Essays in honor of Robert Glaser (pp. 251-282). Hillsdale, NJ: Lawrence Erlbaum Associates.
Corno, L. (2008). On teaching adaptively. Educational Psychologist, 43, 161-173. doi:10.1080/00461520802178466
Coubergs, C., Struyven, K., Engels, N., Cools, W., & De Martelaer, K. (2013).
Binnenklasdifferentiatie: leerkansen voor alle leerlingen. Leuven: Acco.
De Bruin, A. B., & Van Gog, T. (2012). Improving self-monitoring and self-regulation: From
cognitive psychology to the classroom. Learning and Instruction, 22, 245-252. doi:10.1016/j.learninstruc.2012.01.003
Deunk, M., Doolaard, S., Smale-Jacobse, A., & Bosker, R. (2015). Differentiation within and across
classrooms: A systematic review of studies into the cognitive effects of differentiation
practices [Report]. Retrieved from http://www.nro.nl/wp-content/uploads/2015/03/RoelBosker-Effectief-omgaan-met-verschillen-in-het-onderwijs-review.pdf
Field, A. (2013). Discovering statistics using IBM SPSS Statistics. London: SAGE Publications Ltd.
Flem, A., Moen, T., & Gudmundsdottir, S. (2000). Towards inclusive schools: a study of how a
teacher facilitated differentiated instruction. Paper presented at the ECER Conference,
Edinburgh.
Fuchs, L. S., Fuchs, D., Craddock, C., Hollenbeck, K. N., Hamlett, C. L., & Schatschneider, C. (2008).
Effects of small-group tutoring with and without validated classroom instruction on at-risk
students’ math problem solving: Are two tiers of prevention better than one? Journal of
Educational Psychology, 100, 491-509. doi:10.1037/0022-0663.100.3.491
Harris, M. J., & Rosenthal, R. (1985). Mediation of interpersonal expectancy effects: 31 meta-analyses. Psychological Bulletin, 97, 363-386. doi:10.1037/0033-2909.97.3.363
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement.
New York, NY: Routledge.
Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative
Health Research, 15, 1277-1288. doi:10.1177/1049732305276687
Koole, T. (2010). Displays of epistemic access: Student responses to teacher explanations.
Research on Language & Social Interaction, 43, 183-209. doi:10.1080/08351811003737846
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data.
Biometrics, 33, 159–174. doi:10.2307/2529310
Lowyck, J., & Terwel, J. (2009). Ontwerpen van leeromgevingen. In N. Verloop & Lowyck, J. (Eds.),
Onderwijskunde (pp. 284-329). Groningen: Noordhoff Uitgevers.
Maxey, K. S. (2013). Differentiated instruction: Effects on primary students’ mathematics
achievement (Doctoral dissertation). Available from ProQuest Dissertations and Theses
database. (UMI No. 3573708)
Miles, M. & Huberman, M. (1994). Qualitative data analysis: An expanded sourcebook. London:
Sage.
Mustonen, A., & Pulkkinen, L. (1997). Television violence: A development of a coding scheme.
Journal of Broadcasting & Electronic Media, 41, 168-189. doi:10.1080/08838159709364399
Nassaji, H., & Wells, G. (2000). What’s the use of ‘Triadic Dialogue’?: An investigation of teacher-student interaction. Applied Linguistics, 21, 376-406. doi:10.1093/applin/21.3.376
Nurmi, J. (2012). Students’ characteristics and teacher–child relationships in instruction: A meta-analysis. Educational Research Review, 7, 177-197. doi:10.1016/j.edurev.2012.03.001
Nurmi, J. E., Viljaranta, J., Tolvanen, A., & Aunola, K. (2012). Teachers adapt their instruction
according to students’ academic performance. Educational Psychology, 32, 571-588. doi:10.1080/01443410.2012.675645
Nystrand, N., Wu, L. L., Gamoran, A., Zeiser, S., & Long, D. A. (2003). Questions in time:
Investigating the structure and dynamics of unfolding classroom discourse. Discourse
Processes, 35, 135–198. doi:10.1207/S15326950DP3502_3
Park, O-c., & Lee, J. (2003). Adaptive instructional systems. Educational technology research and
development, 25, 651-684.
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of
classroom academic performance. Journal of Educational Psychology, 82, 33-40. doi:10.1037/0022-0663.82.1.33
Sacks, H. (1992). Lectures on conversation. Oxford, England: Blackwell.
Schegloff, E. A., Koshik, I., Jacoby, S., & Olsher, D. (2002). Conversational analysis and applied
linguistics. Annual Review of Applied Linguistics, 22, 3-31.
Stone, C. A. (1998). The metaphor of scaffolding: Its utility for the field of learning
disabilities. Journal of Learning Disabilities, 31, 344–364. doi:10.1177/002221949803100404
Subban, P. (2006). Differentiated instruction: A research basis. International Education Journal, 7,
935-947.
Tomlinson, C. A., Brighton, C., Hertberg, H., Callahan, C. M., Moon, T. R., Brimijoin, K., Conover,
L. A., & Reynolds, T. (2003). Differentiating instruction in response to student readiness,
interest, and learning profile in academically diverse classrooms: A review of literature.
Journal for the Education of the Gifted, 27, 119-145.
Van de Pol, J., & Elbers, E. (2013). Scaffolding student learning: A micro-analysis of teacher–student
interaction. Learning, Culture and Social Interaction, 2, 32-41. doi:10.1016/j.lcsi.2012.12.001
Van de Pol, J., Poorthuis, Mainhard, T., & Brekelmans, M. (submitted). Smart students receive more
adaptive support.
Van de Pol, J., Volman, M., & Beishuizen, J. (2010). Scaffolding in teacher-student interaction: A
decade of research. Educational Psychology Review, 22, 271-297. doi:10.1007/s10648-010-9127-6
Van de Pol, J., Volman, M., Oort, F., & Beishuizen, J. (2014). Teacher scaffolding in small-group
work: An intervention study. Journal of the Learning Sciences, 23, 600-650. doi:
10.1080/10508406.2013.805300
Van Loon, M. H., de Bruin, A. B., van Gog, T., & Van Merriënboer, J. J. (2013). The effect of
delayed-JOLs and sentence generation on children’s monitoring accuracy and regulation of
idiom study. Metacognition and learning, 8, 173-191. doi:10.1007/s11409-013-9100-0
Wells, G. (2010). Coding scheme for the analysis of classroom discourse. Retrieved from
http://people.ucsc.edu/~gwells/Files/Courses_Folder/documents/CodingManual.pdf
Weston, C., Gandell, T., Beauchamp, J., McAlpine, L., Wiseman, C., & Beauchamp, C. (2001).
Analyzing interview data: The development and evolution of a coding system. Qualitative
Sociology, 24, 381-400.
Wittwer, J., Nückles, M., Landmann, N., & Renkl, A. (2010). Can tutors be supported in giving
effective explanations? Journal of Educational Psychology, 102, 74-89.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem-solving. Journal of Child
Psychology and Psychiatry and Allied Disciplines, 17, 89–100. doi:10.1111/j.1469-7610.1976.tb00381.x
Zhang, Y., & Wildemuth, B. M. (2009). Qualitative analysis of content. In B. Wildemuth (Ed.),
Applications of social research methods to questions in information and library science (pp.
308-319). Portland: Book News.
Appendix I: CODE BOOK ‘STUDENT EXPRESSION OF UNDERSTANDING’
1. Coding scheme

[student code]
  [content-related]
    claim
      state
      state + explanation
      state + content
      state + content + explanation
    demonstration
      comment solution + approach
      meaning
      solution
      approach
      approach + solution
      request confirmation
        request confirmation approach
        request confirmation solution
        request confirmation meaning
  [not content-related]
    miscellaneous
2. Codes and Their Operational Definitions, Signaling Words and Exemplars

claim_oU
Operational definition: confirmation of (non-)understanding

state
Operational definition: Student states THAT (s)he (does not) understand something (either when asked or spontaneous). * sometimes also: “oke”/”ja” (test: is it possible to replace ‘oke’/’ja’ by ‘ik snap het’?) * also: answer to questions like “ja?”/“yes?” * (oh is often miscellaneous)
Signaling words: snap, ja, oke
Exemplars: * (o ja nou snap ik het ja.) * ↑JA dus ik weet= * oh dat komt [wel goed.
Transcript: 809, 1052, 2194

state + explanation
Operational definition: Student states THAT (s)he does (not) understand something and explains WHY (s)he does (not) understand it.
Signaling words: snap, want, er staat geen
Exemplars: * o ja dit ja maar ik had dit wel gedaan dus ik snap het eigenlijk wel * ik snap deze vraag niet want hoeveel euro: is haar beltegoed maar je weet het niet omdat het beltegoed in euro:s er niet staat en beltijd in minuten er niet sta[at * [ja heb ik ook * Ja weet ik [maar die gaat dus ook steeds fout * o ik wou net zeggen want dan [()
Transcript: 1052

state + content
Operational definition: Student states THAT (s)he does (not) understand something and adds WHAT (s)he does (not) understand. * Pointing (or something similar to pointing) to something in book is also seen as telling what the content of (non-)understanding is. ! It should be clear what exactly the student is referring to in case of ‘dit’/’hier’/etc.
Signaling words: som …, hier, dit
Exemplars: * ik snap die een komma zeven in het kwadraat snap ik die zeven(°niet°) * wat wat bedoelen ze hiermee. * [wat is nu het begin getal in de formule? * maar hoe weet je dan dat het per dag is?

state + content + explanation
Operational definition: Student states THAT (s)he does (not) understand something and adds WHAT (s)he does (not) understand and adds WHY (s)he does (not) understand it.
Signaling words: snap, want, (getal)
Exemplar: * kijk ik snap (.) >eenentwintig< niet want er staat vijf [in
Transcript (state + explanation, state + content, and state + content + explanation): 1528, 1883, 2592, 2902, 1906, 1052, 1509, 1540, 2408

demonstration_oU
Operational definition: display of (absence of) content knowledge/understanding
comment solution + approach
Operational definition: Student demonstrates (non-)understanding by COMMENTING on answer/solution/approach found/applied/used earlier. ! not: “ik snap niet waarom dit X is”, since this comment mainly claims (non-)understanding
Signaling words: klopt, maar
Exemplars: * u::hm::: i ja ik dacht vierenzestig maar >dat is< een beetje veel * ja ik heb denk- heb deze gebruikt= * PLUS zeven maar dat kan dan toch helemaal niet meer,
Transcript: 809

meaning
Operational definition: Student demonstrates (non-)understanding by giving the MEANING of something, typically after questions like “waar staat X voor?” “wat betekent X?”, and in the form of “X is het aantal ...”, “X houdt … in”, “X staat voor …”.
Signaling words: is, staat voor, betekent
Exemplar: * (LKR: het startbedrag is als je?) aantal kaarten nul is,
Transcript: 1077

solution
Operational definition: Student demonstrates (non-)understanding by giving the SOLUTION to calculation or read from graph. ! not: answers to questions like those mentioned under ‘meaning’.
Signaling words: numbers
Exemplars: * vijftig cent * acht >nee< ja acht * na enige tijd heeft am::i:n:a: zeven alb(h)ums gedo(h)wn(h)load * =ik heb b [vij:f
Transcript: 809, 809, 1052

approach
Operational definition: Student demonstrates (non-)understanding by explaining/giving information about an APPROACH/procedure (i.e., how something is/was/should be done). * also: answer to questions like ‘wat doe je (dus)?’ * also: student calculates something out loud (see Rules if solution to calculation is given in same turn).
Signaling words: (werkwoord), keer, min, etc.
Exemplars: * ja dus die tabel invullen met dat prepaidte= * ↑vijftien (.) keer de prijs van die kaarten * =en er wordt niet echt iets gezegd over,= * (.) ja maar kijk uhm (.) nou bij deze is het tussen haakjes dat heb ik uitgerekend, * >maar er staat geen tussen haakjes< * je hebt eh drie keer drie is eh ( ) * o keer * ja >maar ik bedoel< u zei net dat e:hm e:h het het ging dan eerst met aantal
Transcript: 1052

approach + solution
Operational definition: Student demonstrates (non-)understanding by explaining/giving information about an APPROACH/procedure and SOLUTION (cf. codes ‘approach’ and ‘solution’).
Signaling words: (calculation + answer)
Exemplar: * plus twee euro vijftig is twee euro vijfti[g
Transcript: 1077

request confirmation
Operational definition: Student demonstrates (non-)understanding by ASKING WHETHER X is indeed intended/asked/the right answer.
Signaling words: oo, dus, ?, moet, toch?
Exemplars: * oo bij acht euro? * o:h die↑
Transcript: 809, 2906

Further transcript excerpts for the codes above: 3277, 3298, 1880, 1502, 1534, 1875, 1540, 2404, 2896, 2896
req_conf_approach
Operational definition: Student demonstrates (non-)understanding by ASKING WHETHER X is intended/asked/the right answer, whereby X is an APPROACH (cf. code ‘approach’).
Signaling words: bedoelen, moet, doen
Exemplars: * =maar moet je dan ook helemaal >die die< uh (co-) die dingen dan invullen?= * =[oh dus dat is gewoon elf keer elf?] * huh twee keer min twee
Transcript: 1906, 2182, 2764

req_conf_solution
Operational definition: Student demonstrates (non-)understanding by ASKING WHETHER X is intended/asked/the right answer, whereby X is a SOLUTION (cf. code ‘solution’).
Signaling words: is, wordt
Exemplars: * (.) dus elf keer elf is eigenlijk min honderdeenentwintig? * =min keer plus wordt toch min? * [meneer heb ik em zo goed?
Transcript: 2182

req_conf_meaning
Operational definition: Student demonstrates (non-)understanding by ASKING WHETHER X is intended/asked/the right answer, whereby X is a MEANING (cf. code ‘meaning’).
Signaling words: is, staat voor, betekent
Exemplar: * dus dat ↑keertje staat dan voor hoeveel dagen het is

miscellaneous
Operational definition: Student utterances which are EMPTY regarding content or are UNRELATED to the task. * also: Student tells on which task/question (s)he is working (provided that phrases like ‘ik snap som X niet’ are not mentioned). * also: Student repeats (part of) teacher’s question/answer. * also: Student reads something from book. * also: Student starts to formulate something, but the utterance is without content (e.g. when a student stops formulating, or pauses without having said something with content).
Signaling words: oh
Exemplars: * nou * oke * (.) ↑dertig eu↑ro: echt? * =[ja (.) maar uhm nou heb ik] * ([nee nee kijk ]) * hier staat min * dan heb je eigenlijk weer gewoon
Further transcript excerpts for the codes above: 2616, 3277, 1540, 809, 1052, 1509, 1875, 1880, 2182, 2896
3. Coding rules
Universal
A. ALWAYS code utterances for sub AND main categories.
E.g.
An utterance judged to contain a request for confirmation regarding a solution should be coded in the categories
req_conf_solution, request confirmation, and demonstration.
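To illustrate rule A, one possible (hypothetical) representation of the code tree from section 1 maps every sub code to its main categories, so that the full set of categories follows automatically from the sub code; this sketch is an illustration only, not the coding software actually used.

    # Hypothetical representation of the code tree in section 1: each sub code is
    # mapped to its path of main categories, following universal rule A.
    CODE_TREE = {
        "state": ["claim"],
        "state + explanation": ["claim"],
        "state + content": ["claim"],
        "state + content + explanation": ["claim"],
        "comment solution + approach": ["demonstration"],
        "meaning": ["demonstration"],
        "solution": ["demonstration"],
        "approach": ["demonstration"],
        "approach + solution": ["demonstration"],
        "req_conf_approach": ["demonstration", "request confirmation"],
        "req_conf_solution": ["demonstration", "request confirmation"],
        "req_conf_meaning": ["demonstration", "request confirmation"],
        "miscellaneous": [],
    }

    def full_coding(sub_code):
        """Return all categories an utterance receives for a given sub code."""
        return CODE_TREE[sub_code] + [sub_code]

    print(full_coding("req_conf_solution"))
    # ['demonstration', 'request confirmation', 'req_conf_solution']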
B. When coding, do not select pauses (indicated in seconds between brackets) and student numbers.
N.B. Selection of line numbers is inevitable when coding adjacent lines as one utterance.
N.B. Line number of the first line selected should not be coded.
E.g.
1   3402:   maar uh ik heb een ↑vraag (.) bijvoorbeeld bij de eers↑te
2           dan doe je toch gewoon (.) min een komma zeven
C. The unit of analysis is the student utterance, demarcated by pauses (indicated in seconds between brackets) and/or teacher utterances. IF an utterance
contains (.) and the separated parts of the utterance have different content, THEN code the parts of the utterance separately as two utterances. IF the content
of the separated parts is similar, THEN code the utterance as one utterance.
D. IF it is unclear what an utterance means or how it should be interpreted, THEN consult the surrounding (teacher) utterances and the video recordings to
clarify or disambiguate the utterance content.
Specific
A. IF utterance contains ‘dit’/’hiermee’/’deze’/’die’/etc., but content thereof is explained in next utterance, THEN do not code as claim; state + content.
E.g.
38   11102:  (vul de tabel in) o ja en dit snapte ik ook niet
39           (2.8)
40           wat wat bedoelen ze hiermee.
41           (1.1)
42   LKR12:  wat bedoelen ze waarmee,=
43   11102:  hiermee met (b)
The content of the student’s non-understanding is conveyed in line 43 (see reaction/question teacher). Line 38 should be coded claim; state,
since we know from this utterance THAT the student does not understand something. The same applies for line 40. Line 43 should be coded
claim; state + content, since we know from this utterance THAT and WHAT (s)he does not understand.
B1. IF multiple codes could apply to one utterance, THEN ask yourself what the main function of the utterance is and code the utterance in accordance with
that main function (but see rule C).
E.g.
1   3402:   maar uh ik heb een ↑vraag (.) bijvoorbeeld bij de eers↑te
2           dan doe je toch gewoon (.) min een komma zeven
3   LKR20:  ja:,=
It becomes clear from lines 1 and 2 that the student does not understand something, what the student does not understand, and that the student
is asking for confirmation. However, the THAT and WHAT information is subordinate to the request-confirmation function (and almost
irrelevant when taking the teacher’s answer into account). Therefore, lines 1 and 2 (together, since this is one utterance) should be coded as
demonstration; request confirmation; req_conf_approach.
B2. IF utterance clearly does not have one, but multiple main functions, THEN code utterance with codes in the code tree that contain double content (if relevant:
demonstration; approach + solution, or claim; state + content + explanation).
E.g.
47   11118:  plus twee euro vijftig is twee euro
48           vijfti[g
This utterance is a demonstration of understanding from which we learn something about the student’s knowledge of the approach and
solution to this particular question. The utterance (lines 47 and 48 together) should therefore be coded demonstration; approach + solution.
E.g.
1   3402:   kijk ik snap (.) >eenentwintig< niet want er staat vijf
2           [in
This utterance gives information about the THAT, WHAT and WHY of non-understanding, and all three functions are main functions of the
utterance. The utterance (lines 1 and 2 together) should be coded claim; state + content + explanation.
B3. IF utterance includes a demonstration followed or preceded by a claim, THEN code utterance as demonstration.
E.g.
29   11118:  vierendertig (o ja nou snap ik het ja.)
The student mainly demonstrates that (s)he understands something, after which a “side note” is made containing a claim of understanding. The
main function of the utterance is therefore demonstrating understanding, which is why the utterance should be coded as demonstration;
solution.
C1. IF “oke” is uttered to bring across that something is understood, THEN code as claim; state. IF this is not the case, THEN code as miscellaneous.
C2. IF “ja” can be interpreted as “ik snap het, ga door”, THEN code as claim; state. IF this is not the case, THEN code as miscellaneous.
D. IF utterance is an answer to questions like “wat moet je daar invullen?”, THEN answer is coded as approach.
E. IF student is interrupted by teacher and thus has not yet fully formulated the content of a question/demonstration/etc., THEN code as miscellaneous.
E.g.
8   2402:   o:h dus dan wordt het=
4. Some Complicated Examples with Explanation
16   LKR12:  hehe hh hoe hoe kom je daarop?
17           (0.4)
18   11118:  ja: >geen idee< (°gewoon een gok°)
19   LKR12:  ja maar
The student claims THAT (s)he does not understand something and explains WHY (s)he does not understand it: it was “gewoon een gok”. Should be
coded as claim; state + explanation.
1   LKR20:  uh hier bij jou ook ik zie de x-as niet (2.0)
2   3410:   oja ik dacht dat ik die getekend had
The student demonstrates understanding by commenting on an ‘answer’ found. Should be coded as demonstration; comment on answer found.
7           e::n nu zie ik dit ((docent wijst in boek)) maar deze lijn klopt
8           dat die lijn? (.)
9   3404:   ik dacht het (.) van wel
The student reacts to a teacher utterance, which cannot be a claim of understanding. The student demonstrates his or her understanding by commenting on an
‘answer’ found. Should be coded as demonstration; comment on answer found.
101  LKR25:  ja
102          (1.4)
103  5432:   oke (.) maar het makkelijkste is gewoon het antwoord opschrijven
104          want die wist ik gelijk al
The student claims THAT (s)he understands something (cf. “oke” and “makkelijkste”) and then explains WHY (s)he understands it (“want…”). Should be
coded as claim; state + explanation.
66   5420:   [ik had deze geleerd
67           (.)
68           (ik) had ↑deze be↑keken om m hier op te lossen en ik heb die
69           die heb ik toen wat fout gedaan
The student claims that (s)he thinks THAT (s)he did something wrong, refers to WHAT (s)he thinks is wrong (“deze”) and explains WHY (s)he thinks it is
wrong. Should be coded claim; state + content + explanation.
21   5402:   ja maar nou=
22   LKR25:  =als je [komt te]kort blijkbaar=
23   5402:   [kom je]
24           =kom je nooit op ↑antwoord [uit
25   LKR25:  [wordt dit ↑dertig.
The student utters something without explicitly relating it to content; this is probably an utterance of frustration. It should therefore be coded as miscellaneous.
Appendix II: Coded example

Table 1
Coded example (including student and teacher codes)

((LKR en LL praten gedurende hele gesprek op fluistertoon.))

Transcript                                                     Assigned codes
 1   6513:   ik snap niet hoe ik die e:h                       claim; state + content
 2           (.)
 3           tabel van (elf) door moet krijgen=
 4   LKR38:  =kij- uh spiek je >stiekem< even verder.          no elicitation
 5           (0.4)
 6           misschien heb je hier wat aan.                    no elicitation
 7           (0.5)
 8           of misschien heb je hier wat aan.                 no elicitation
 9           (0.8)
10           hier zijn het ↑andere letters.                    no elicitation
11           (0.7)
12           welke letters ga jij hie::r gebruiken,°           demonstration
13           (1.8)
14           [bij dit sommetje,                                demonstration
15   6513:   [u:h                                              miscellaneous
16           (4.6)
17   6513:   bee?                                              demonstration; request confirmation;
                                                               request confirmation approach
18           (0.8)
19   LKR38:  en de?                                            demonstration
20           (1.1)
21   6513:   wee,                                              demonstration; approach
22           (.)
23   LKR38:  en de wee.                                        no elicitation
24           (.)
25           en welke letter moet bovenaan,                    demonstration
26           (0.5)
27   6513:   de wee?                                           demonstration; request confirmation;
                                                               request confirmation approach
28           (.)
29   LKR38:  >oke<                                             no elicitation
30           (1.4)
31           aantal we:ken perfect                             no elicitation
32           (0.7)
33           en je ↑begint niet met aantal weken is éé:n maar het aantal    no elicitation
34           weken is nul.
35           (1.0)
36           ja,                                               no elicitation
37           (1.5)
38           weet je zo ↑wat je moet doen?                     claim
39           (0.9)
40   6513:   jhha::                                            claim; state
41           (.)
42           want wat ga jij onder de nul zo meteen invullen bij de bee,    demonstration
43           (0.3)
44   6513:   tien euro=                                        demonstration; solution
45           =( (LKR38 steekt duim op en loopt weg) )
46           (3.2)
47   LKR38:  [en bij week één?                                 demonstration
48           [( (LKR draait zich terug om naar LL) )
49           (1.4)
50   6513:   twaalf vijftig.=                                  demonstration; solution
51           =( (LKR steekt weer duim op en loopt nu echt weg) )
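For illustration, the coded student utterances from a fragment such as the one above could be tallied into main-category counts and proportions; the list of codes below mirrors the student codes in Table 1, but the script itself is a hypothetical sketch, not the analysis actually used in this study.

    # Hypothetical tally of student main codes from a coded fragment such as the one
    # above, yielding the proportion of demonstrations versus claims.
    from collections import Counter

    student_codes = [
        "claim; state + content",
        "miscellaneous",
        "demonstration; request confirmation; request confirmation approach",
        "demonstration; approach",
        "demonstration; request confirmation; request confirmation approach",
        "claim; state",
        "demonstration; solution",
        "demonstration; solution",
    ]

    main_counts = Counter(code.split(";")[0].strip() for code in student_codes)
    content_related = main_counts["claim"] + main_counts["demonstration"]
    print(main_counts)
    print(f"proportion demonstrations: {main_counts['demonstration'] / content_related:.2f}")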
Appendix III: Correlation matrix

Table 6
Correlation between student grade and student (sub)codes, controlled for the percentage of teacher elicitations, for the three teacher subsamples: sub1 (teacher 25), sub2 (teacher 38), and sub3 (teachers 12 and 20). Values are r (p).

Controlled for % of teacher elicitations of demonstrations. Correlation between student grade and: demonstration; comment approach/solution; meaning; solution; approach; approach/solution; request confirmation; request confirmation meaning; request confirmation approach; request confirmation solution — for sub1 (25), sub2 (38), and sub3 (12, 20):
-.159 (.588)
-.137 (.626)
-.129 (.354)
.160 (.585)
.263 (.343)
.129 (.661)
.166 (.232)
-.185 (.509)
-.132 (.347)
.111 (.706)
.159 (.603)
.385 (.175)
.127 (.680)
-.109 (.437)
.172 (.557)
.199 (.476)
.125 (.670)
-.159 (.587)
.216 (.439)
-.266 (.357)
-.186 (.524)
*-.576 (.031)
-.184 (.511)
-.176 (.566)
*-.594 (.033)
.129 (.354)
-.399 (.157)
.111 (.428)
Controlled for % of teacher elicitations of claims. Correlation between student grade and: claim; state; state + content; state + explanation; state + content + explanation — for sub1 (25), sub2 (38), and sub3 (12, 20):
.259 (.351)
.371 (.192)
.417 (.156)
-.141 (.616)
-.346 (.225)
.343 (.251)
.441 (.100)
.272 (.347)
.151 (.623)
.252 (.364) *.575 (.031)
*-.436 (.001)
.408 (.148)
*.621 (.023)
*-.436 (.001)
Note. Correlations smaller than r = .1 are not presented. subx (xx) = subsample x (teacher number). * p < .05.
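For illustration, a partial correlation such as those reported in Table 6 can be computed by correlating the residuals that remain after regressing both variables on the control variable; the function below is a hypothetical sketch (the variable names are assumptions), not the procedure used to produce the table.

    # Sketch of a partial correlation between student grade and a code proportion,
    # controlling for the percentage of teacher elicitations of demonstrations.
    # Variable names and the residual-based approach are illustrative.
    import numpy as np
    from scipy import stats

    def partial_corr(x, y, covar):
        """Correlate the residuals of x and y after regressing each on covar."""
        x, y, covar = map(np.asarray, (x, y, covar))
        design = np.column_stack([np.ones_like(covar), covar])
        res_x = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
        res_y = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
        return stats.pearsonr(res_x, res_y)  # (r, p), cf. the r (p) cells in Table 6

    # Example usage with per-student values:
    # r, p = partial_corr(grade, prop_demonstration, pct_teacher_elicitations)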