Computers & Education 94 (2016) 241–251
Contents lists available at ScienceDirect
Computers & Education
journal homepage: www.elsevier.com/locate/compedu
Building models explaining student participation behavior in
asynchronous online discussion
Sean Goggins*, Wanli Xing
School of Information Science & Learning Technologies, University of Missouri, Columbia, MO, 65211, USA
Article info
Article history: Received 19 February 2015; received in revised form 29 October 2015; accepted 3 November 2015; available online 24 November 2015.

Abstract
Previous studies have invested much effort in understanding how participation in asynchronous online discussion affects student learning, and what factors influence student
participation behavior. Results of these studies have been inconclusive and these investigations are often conducted from isolated perspectives. Relying on social cognitive
theory, this study proposes two dynamic models of student participation in online dialogue, particularly highlighting understudied factors (collective efficacy, social ability, reading behavior, and the time dimension of participation) to examine the mediation and causal relationships among those factors and their influence on learning. The models are tested
utilizing data collected from a large US university. Specifically, while the predictive constructs are operationalized through the survey instruments, the outcome measures are
modeled using electronic trace data and actual evaluation information. Data is analyzed
using the Partial Least Squares modeling method. Results demonstrate the intertwined
relationship among constructs and a different influencing mechanism for each construct
on participation behavior and learning. By comparing these two built models, the time
dimension of participation is shown to be more influential in predicting student learning
than posting and reading actions. The paper concludes with a discussion of the implications of this study.
© 2015 Elsevier Ltd. All rights reserved.
Keywords:
Asynchronous online discussion
Social cognitive theory
Social ability
Collective efficacy
Partial least squares
1. Introduction
Computer-mediated communication (CMC) has been widely applied as a teaching and learning tool for both blended and
online courses in numerous disciplines (An, Shin, & Lim, 2009). These technologies offer new instructional possibilities and unprecedented opportunities for educational interactivity (Cheng, Paré, Collimore, & Joordens, 2011). Web-based asynchronous discussion forums are one example of these CMC tools that have been extensively used to support and complement
current educational practice. The most common discussion forum is usually a text-based environment that allows individuals
to interact with one another without the constraint of time and place (Vonderwell & Zachariah, 2005). Students discuss by
writing posts and responding to the posts of others. Though the prevalence of online discussion is well-established, clear,
empirical evidence of how online discussion affects student learning is not. Examining this asynchronous tool closely will
allow instructors to understand how their students participate and learn via this medium, improve design and facilitate the
asynchronous interactions.
* Corresponding author.
E-mail addresses: [email protected] (S. Goggins), [email protected] (W. Xing).
http://dx.doi.org/10.1016/j.compedu.2015.11.002
0360-1315/© 2015 Elsevier Ltd. All rights reserved.
Theoretically, it is reasonable to expect that online discussion has the potential to promote collaborative learning and
facilitate knowledge acquisition as manifested by enhanced student academic outcomes. To illustrate, Vygotsky (1978)
pioneered in investigating the role of language in thought and proposed that conceptual learning was a collaborative
effort requiring supportive dialogue. Slavin (1995) provided substantial research support, showing the positive effects of
collaborative learning on achievement. While this all seems sensible in theory, empirical evidence emphasizing the beneficial
impact of online forum participation is limited. Research on online discussion has tended to center on issues like learner
completion rates, learner satisfaction, and differences between online learning and its face-to-face counterpart (Dennen,
2008a). Some studies specifically evaluated the effect of participating in online discussion on students’ learning and
generated mixed results. For example, Cheng et al. (2011) found a significant relationship between students' final grades in an online class and the number of discussion board postings they made during the semester. However, in other studies,
students reported that discussion board postings had little value and did not benefit their understanding of the course
content (Reisetter & Boris, 2004). Given the amount of time and energy required to develop, manage and maintain the online
discussion board, it is crucial to continue testing its value to student learning.
Moreover, posting behavior has dominated previous research on the evaluation of online board participation on student
learning. Those who contribute too few posts are often labeled as “lurkers” or “passive recipients” and are not assumed to be
actively engaged in learning (Dennen, 2008b). However, taking part in a true learning dialogue requires students to read and reflect, not merely to post. Students who participate by reading rather than posting are analogous to students who listen to lessons but rarely ask questions (Dennen, 2008b). Focusing only on posts may overlook the pedagogical benefits of discussion boards associated with students' participation through reading. Furthermore, posting and reading constitute only one dimension for measuring students' involvement in online discussion. Another useful dimension for gauging students' participation is time (Goggins, Xing, Chen, Chen, & Wadholm, 2015). Since online discussion is broadly considered a form of collaborative learning (Cheng et al., 2011), time, interaction and performance (TIP) theory (McGrath, 1991) can be
applied here. This theory highlights the temporal processes in group interaction and its impact on performance. Indeed,
asynchronous online dialogue generates inevitable delays between posting and replying to messages within a thread. Reading
and reflection time invested also varies among students. Son (2002) has speculated that this time attribute of participation
may impact student learning. Like reading behavior, however, the time attribute of participation has yet to receive formal investigation.
In addition to studies exploring how participation in online discussion influences student learning, a great deal of research
has also inspected factors impacting student participation behavior (Wang, Laffey, Xing, & Ma, in press). After all, a necessary,
if not sufficient, condition for discussion to aid learning is to participate in the discussion. A review of previous literature
reveals that interface characteristics, content and material, student roles, instructional tasks, and information overload were
identified as factors influencing student participation in online dialogue (Ma, Friel, & Xing, 2014; Vonderwell & Zachariah,
2005). However, little research has been conducted to examine the influence of collaboration-related factors in affecting
students’ participation in online discussion (Xing, Kim, & Goggins, 2015). Bandura (1997) proposed that collective efficacy, the perception of a group's capability to achieve a goal, is one of the most powerful motivational beliefs, with positive effects on various aspects of group collaboration. Similarly, social ability, the measure of how able students are in using the resources of
their social context to achieve goals, is also suggested to impact collaborative work (Laffey, Lin, & Lin, 2006). The current study
plans to investigate how collective efficacy and social ability impact student behavior and learning in online discussion. Moreover, previous studies have typically examined how different variables affect student participation and how participation influences student learning in separate models, losing insight into the intertwined relationships of these factors
with participation and learning. It is desirable to investigate how online discussion activity can contribute to student learning
from a more integrative perspective.
Overall, the present study seeks to build a model to understand how students participate and learn in online discussion.
Inspired by social cognitive theory, two competing models are constructed using the Partial Least Squares modeling technique to examine how system functionality, collective efficacy and social ability are embodied in two dimensions of participation behavior (posting and reading actions; time) in online discussion and influence student learning outcomes. While the first model focuses on how posting and reading actions influence learning performance, the second model centers on how the time dimension of posting and reading affects learning performance. Our results present models that explain how these
constructs influence student learning in the asynchronous online discussion as a whole.
2. Theoretical framework and research models
Social Cognitive Theory (Bandura, 1986) relies on the premise of triadic reciprocal causation, which shows that cognitive
factors, environmental factors and human behavioral factors interact with and influence each other. Cognitive factors, also
known as personal factors, include personal cognition, emotion, efficacy and biological events. In our study, the psychological
characteristics of collective efficacy and social ability are contextualized as cognitive factors. Environmental factors refer to
social and physical traits reflected by system functionality in this study. Behavioral factors refer to people involved in
cognitive and social events. The concept of behavior can be viewed in many ways (Bandura, 2001). For example, people can acquire new behaviors and knowledge simply by observing a model (another person). There is also an assumption of goal-directed behavior, where people set goals for themselves and direct their behaviors accordingly. Finally, behavior may ultimately become self-regulated, in that people begin to regulate their own learning and behavior. In the context of online discussion, these behaviors
can be denoted mainly in two aspects: action (posting and reading), and time. The time dimension can be more specifically
classified as posting delay time and reading time.
In order to better examine and compare posting and reading as well as the typically overlooked time aspect of behavior
and their influence on student learning, we build two models. Reading and posting are put into one model and time-aspect
behavior is put into another one. These models are inspired by social cognitive theory, in which system functionality (SF),
collective efficacy (CE), social ability (SA), posting action (PA), reading action (RA), posting delay time (PT), and reading time
(RT) are exogenous and/or mediation variables. Learning performance (LP) is an endogenous variable. Fig. 1 shows the two
models along with the research hypotheses (H1 to H14), where Model 1 focuses on action and Model 2 focuses on time. These
models not only show the significant relationship between two related constructs (e.g., SF significantly affects CE, or RA significantly influences LP) but also the overall structure (Model 1 or Model 2) of student participation in asynchronous online discussion. This structure informs the later mediation analysis, which examines mediating effects, e.g., whether SA mediates SF's influence on RA or whether RT mediates CE's influence on LP. Details of the hypotheses are explained in the following sections.
2.1. Behavioral factors
Research has found that participation in asynchronous online discussion can promote knowledge construction, critical
thinking and problem solving through interaction with other students and instructors (Kay, 2006). Without student
participation, a discussion cannot occur in the first place. Participation in online discussion boards can be measured from
multiple dimensions. Specifically, posting and reading actions measure the number of posts a student writes and reads.
Posting time in this study gauges the time between a message being posted and a student's response to that message. Reading
time measures the time that the student spends viewing posts. Various studies have found that these actions and times can
affect student learning performance. For example, Patel and Aghayere (2006) studied the relation between forum participation and learning performance in two undergraduate civil engineering courses. They found that the number of posts (posting
action) and post views (reading action) in the discussion forum were positively correlated with student final grade. Similar
findings were also obtained in a study by Ramos and Yudko (2008). In terms of the time attribute, even though not many
analyses have been performed, a greater delay in response is generally considered to have a negative impact on learning (Kay,
2006; Son, 2002). Kuboni and Martin (2004) asked students to estimate the frequency and average time of each visit. They
suggested that more frequent visits and more time spent on each visit can increase student learning performance in the
course. Therefore, reading, posting and time are all expected to influence student learning. We propose the following
hypotheses:
H1. Posting action is positively correlated with student learning performance.
H2. Reading action is positively correlated with student learning performance.
H3. A shorter response delay is positively correlated with student learning performance.
H4. Reading time is positively correlated with student learning performance.
2.2. Cognitive factors
Collective efficacy represents a group's shared beliefs that it can accomplish a task successfully, and is essential to
motivation and effectiveness (Bandura, 1997). Previous studies have focused on individual motivation and efficacy despite the
Fig. 1. Hypothesized Research models: Does student action (posting and reading) or time (posting delay time and reading time) in online dialogue have greater
influence on learning performance? Abbreviations: system functionality (SF), collective efficacy (CE), social ability (SA), posting action (PA), reading action (RA), posting delay time (PT), reading time (RT), learning performance (LP). H abbreviates Hypothesis; there are 14 hypotheses in total (H1 to H14) across the two models.
fact that online discussion occurs in a collaborative setting. What a student learns from an online discussion board is not static
knowledge but is a creative cognitive process of coming up with ideas and reshaping them in light of peer interactions.
Collaborative knowledge construction takes place through such cognitive processes. When students are involved in an online
discussion, the stronger the sense of collective efficacy the student has, the more challenging the goals the student may set and persist in achieving in the face of difficulty (Bandura, 1997). Students with this belief are ultimately more likely to participate
and interact more in online dialogue. In fact, research has also demonstrated that collective efficacy has a significant effect on
student collaborative processes, such as levels of effort and persistence. For instance, Lee and Farh (2004) indicate that
collective efficacy plays a significant role in group cohesion. Task cohesion, concentrating on task commitment, also has a
strong relationship with collective efficacy (Wang & Hwang, 2012). Both group cohesion and task cohesion require active participation in the group activity. Therefore, a positive relationship between collective efficacy and student participation in
online discussion is expected and the following hypotheses are proposed.
H5. Collective efficacy is positively correlated with student posting action.
H6. Collective efficacy is positively correlated with student reading action.
H7. Collective efficacy is positively correlated with a shorter response delay.
H8. Collective efficacy is positively correlated with student reading time.
Social ability refers to how capable students are in employing resources in a social context to accomplish goals (Laffey et al.,
2006). Social ability in an online learning situation is defined as the fit among people, tasks and tools. It is mainly constructed
from three aspects: social navigation, social presence and connectedness (Laffey et al., 2006). Social navigation indicates “a
phenomenon, in which a user's navigation through an information space was primarily guided and structured by the activities of others within that space” (Dourish, 1999, p. 18). It represents being aware of what others are doing as guidance for one's own actions. In a social context, much of what people do derives from learning and observing the actions of others
(Laffey et al., 2006). For instance, when students participate in online dialogue, they can identify hot topics by looking at the
number of posts under each heading. Social presence reflects the capacity to connect and keep students engaged in interaction. In networked virtual environments such as online discussion boards, social presence is advanced as the sense of “being
there” and the sense of “being there with others”. Picciano (2002) found positive relationships between students' perceived
social presence and the perceived learning experience. Social connectedness is the social ties among people and the value
they perceive in being connected in a social context. Social ability is a synthetic representation of individuals' beliefs about
their capacity to learn within the social affordances of their environment (Tsai et al., 2008). In other words, students who have
greater social ability tend to be more aware of and act upon peer and instructors' actions (Tsai et al., 2008). Research has
shown that social ability is a critical construct for moving students from peripheral to central roles in a community (Tsai et al., 2008), and that it depends heavily on participation in the social activity. Hence, a positive relationship between social ability and student participation is expected:
H9. Social ability is positively correlated with student posting action.
H10. Social ability is positively correlated with student reading action.
H11. Social ability is positively correlated with a shorter response delay.
H12. Social ability is positively correlated with student reading time.
2.3. Environmental factors
System functionality of the online discussion systems is a fundamental determinant of effective online collaboration
(Schweir, 2002). It measures the perceived ability of the environment for task-specific support, problem solving, and
collaboration. Similar to the construct of self-efficacy, the environment of a student group provides sources of information for
collective efficacy. Bandura (1997) suggested that collective efficacy is very likely conveyed by the interaction process affected
by the environment and context. That is, as a concept that develops over time, collective efficacy is impacted by the environment and the interacting entities within it. Moreover, empirical studies have found that system functionality strongly influences user beliefs and attitudes in multiple distance learning contexts. For example, Hara and Kling (2000) found that
students' feelings of how they are supported by the communication system affect their satisfaction with the online learning
and their motivation to use the system. As an environmental factor, system functionality is supposed to influence students'
collective efficacy. On the other hand, without quality system provisions (e.g., communication channels), students may feel
isolated and disempowered to accomplish the scheduled tasks. In Schweir's (2002) study, system functionality was crucial to
building online learning communities as a foundational infrastructure for students to be social. We hypothesize that system
functionality will affect students’ social ability as well.
H13. System functionality is positively correlated with student collective efficacy.
H14. System functionality is positively correlated with student social ability.
3. Research methodology
3.1. Research context
The data reported in this paper represent a subset of data gathered in a larger study conducted in the context of an online
graduate student course on Computer Support for Collaborative Learning, offered in a large mid-western US university.
Twenty-four students were involved in this study and completed all course activities. Sakai was used as a collaborative
environment that supported the CSCL course. The CANS (http://www.cansaware.org) system was applied to provide activity
awareness information. When a student logged into Sakai and posted a message or read a message, CANS made a note of it
and saved the information into the logs automatically. All discussions in the course were facilitated through Sakai using a
JForum discussion board. JForum is integrated with both CANS and Sakai. Specifically, we analyzed data from one two-week module, during which students worked collaboratively in groups using the discussion board to design a two-day online learning module. Survey data were collected before the module started, and the log data from CANS were gathered from all the students when the module was completed.
3.2. Measure development
All instruments were adapted from existing literature to increase validity. Social ability was measured via the 12-item social ability survey developed by Laffey et al. (2006). The initial alpha coefficient for these 12 items was 0.811, suggesting relatively high internal consistency. Collective efficacy was operationalized through a 4-item survey constructed by Hardin, Fuller, and Valacich (2006) for virtual teams. The initial alpha coefficient was 0.893, also indicating high internal consistency. System functionality was informed by Sonnenwald's (2005) information horizons concept regarding how a person perceives the usefulness of an environment, and was adapted to the Sakai discussion context with 3 items. The initial alpha coefficient was 0.67, suggesting acceptable internal consistency. These constructs were presented to students with a Likert scale.
Unlike previous studies using self-report questionnaires to reflect student learning experience or performance, we applied
actual assignment evaluation data to generate measures for student learning. The purpose of the task analyzed was for group
members to develop an online module that can be implemented in a real-life environment. Two raters proceeded to evaluate
the work products and generate a group grade based on the rubric. Each individual student was then asked to rate and
nominate their group members and the whole class. The frequency of ratings from other group members or other students generated the individual grade. Therefore, student learning performance is represented by two indicators: group grade and individual grade.
Similar to the measure of learning performance, student participation behaviors are measured based on the actual usage
data instead of perceived information from the questionnaire. Because CANS already records specific information about
student writing and reading behavior on the discussion board, the number of posts a student writes and reads can be
calculated directly from the CANS log data. Specifically, posting action (PA) is represented as the number of posts a student wrote during this module. Reading action is calculated as the total frequency of reading actions recorded by CANS during this time. Posting time was first modeled as the sum of the time delays in responding to posts. An inverse value was then calculated, because a low number indicates a quick response. Finally, because students had different numbers of posts, posting time (PT) was computed as the inverse of the average delay, i.e., the inverse of the total response delay divided by the number of posts.
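The PT transformation just described (the inverse of the average response delay, so that faster responders score higher) can be sketched as follows; the function name and delay values are hypothetical:

```python
def posting_time_score(delays_minutes):
    """PT indicator: inverse of the average delay between a message being
    posted and the student's response to it. Larger value = faster responses."""
    if not delays_minutes:
        return 0.0  # assumption: students with no responses get the minimum score
    avg_delay = sum(delays_minutes) / len(delays_minutes)
    return 1.0 / avg_delay

# Hypothetical response delays (in minutes) for two students
fast = posting_time_score([30, 60, 30])   # average delay: 40 min
slow = posting_time_score([240, 360])     # average delay: 300 min
print(fast > slow)  # True: the quicker responder receives the higher PT value
```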
Reading time was estimated as the total time student used for consecutive reading actions. For example, as shown in Fig. 2,
if the log data for a student showed the sequence reading, reading, reading, writing, reading, reading, the reading time is calculated as the time from the first reading to the third reading added to the time from the fourth reading to the fifth reading.
(Fig. 2. Reading time estimation method.)
The
interval between the reading and writing actions was removed. Also, since most of the posts are relatively short, it is unlikely that a graduate student would need a long time to finish reading one post. Therefore, if the time between one reading action and the next was more than 30 min, it was considered inaccurate data and was not incorporated into the reading time calculation, on the grounds that the student had likely turned to another activity instead of reading posts. This is our estimation and assumption.
Other measurement strategies such as using screen capture technologies and eye-tracking may generate more accurate
measures of reading time (see Fig. 3).
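The estimation rule described above (sum the gaps between consecutive reading actions, exclude intervals touching a write, and discard gaps over 30 min) can be sketched as follows, with hypothetical log entries:

```python
from datetime import datetime, timedelta

GAP_LIMIT = timedelta(minutes=30)  # longer gaps are treated as off-task time

def reading_time(events):
    """events: time-ordered list of (action, timestamp) pairs.
    Sums the gaps between consecutive 'read' actions, excluding any gap
    adjacent to a 'write' and any gap longer than 30 minutes."""
    total = timedelta()
    for (a1, t1), (a2, t2) in zip(events, events[1:]):
        gap = t2 - t1
        if a1 == "read" and a2 == "read" and gap <= GAP_LIMIT:
            total += gap
    return total

t = lambda m: datetime(2015, 3, 1, 10, 0) + timedelta(minutes=m)
log = [("read", t(0)), ("read", t(5)), ("read", t(12)),
       ("write", t(20)), ("read", t(25)), ("read", t(33))]
print(reading_time(log))  # 0:20:00 -> 12 min (first run) + 8 min (second run)
```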
3.3. Partial Least Squares (PLS) modeling
The PLS method is a multivariate statistical modeling technique used to test the relationships between a set of independent variables and dependent variables. It is considered to be the second generation of multivariate analysis (Fornell & Larcker, 1981), integrating multiple regression, path analysis, principal component analysis, and multiple discriminant analysis. As a component-based structural equation modeling technique, PLS is particularly suitable for predictive applications, while the linear structural relations (LISREL) model is more oriented towards theory testing and development. The LISREL model must estimate model parameters in order to reproduce the covariance matrix of the measures to see how well the hypothesized model "fits" the supplied data. By contrast, PLS aims to maximize and explain variance in a regression sense. Hence, R2 and the significance of relationships among measures are indicative of how well a model performs.
In addition, the assumption of normality is not required for PLS, and the technique shows utility even with small sample
sizes (Chin, Marcolin, & Newsted, 1996). By conducting a Monte Carlo simulation study on PLS with small samples, Chin et al.
(1996) found that the PLS approach can offer information about the appropriateness of indicators for sample sizes as low as 20
and even under conditions in which there are more variables than observations. Fornell and Bookstein (1982) stated that “PLS
involves no assumptions about the population or scale of measurement and consequently works without distributional
assumptions and with nominal, ordinal, and interval scaled variables” (p. 443). Considering the sample size and the
explanatory nature of the study, the PLS method is preferred for model testing.
4. Results
4.1. Measurement model
The PLS measurement model was assessed by evaluating the reliability, internal consistency, convergent validity and
discriminant validity through a confirmatory factor analysis. In terms of individual item reliability, Chin (1998) indicated that
items should load highly (greater than 0.7) on their intended constructs. Loadings of 0.5 or 0.6 are still acceptable provided that there are additional indicators in the construct for comparison (Chin, 1998). As a result, four items (social ability items 1, 5, 7, and 10) were removed because their factor loadings were smaller than 0.5. This allowed all factor
loadings of the measurements (Table 1) to meet the suggested condition.
To evaluate the internal consistency and construct reliability, composite reliability was calculated. As suggested by
Nunnally (1978), composite reliability should be greater than 0.7 and most constructs in this study satisfied the condition
(Table 1). The composite reliability of the system functionality construct is very close to 0.7, which is generally considered acceptable in social science.
Regarding convergent validity, it is suggested that average variance extracted (AVE) for each factor should be larger than 0.5.
Table 1 shows that the AVE values meet this recommendation. Discriminant validity demonstrates that each measurement item correlates weakly with all constructs except the one of interest. It is established if the square root of the AVE is greater than the correlations between latent variables (Fornell & Larcker, 1981). Table 2 indicates that all the scales meet the suggested requirement, manifesting adequate validity of the measurements.
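The composite reliability and AVE criteria just described follow standard formulas over standardized loadings: CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum(1 - loading^2)), and AVE = mean of the squared loadings, with the Fornell and Larcker criterion requiring the square root of the AVE to exceed the construct's correlations with the others. A sketch with hypothetical loadings (not the study's exact estimation weights):

```python
import math

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings) ** 2
    e = sum(1 - l * l for l in loadings)  # error variance per standardized item
    return s / (s + e)

def average_variance_extracted(loadings):
    """AVE = mean of squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

lam = [0.8, 0.7, 0.9]  # hypothetical standardized loadings for one construct
ave = average_variance_extracted(lam)
print(round(composite_reliability(lam), 2))  # 0.84
print(round(ave, 2))                         # 0.65
# Fornell-Larcker check: sqrt(AVE) must exceed correlations with other constructs
print(math.sqrt(ave) > 0.56)                 # True for a correlation of 0.56
```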
Fig. 3. The PLS model results.
Table 1
Individual item loadings, construct reliability and convergent validity.

Construct              Composite reliability  AVE   Item  Loading
System functionality   0.66                   0.50  SF1   0.85
                                                    SF2   0.80
                                                    SF3   0.70
Collective efficacy    0.84                   0.69  CE1   0.61
                                                    CE2   0.94
                                                    CE3   0.86
                                                    CE4   0.88
Social ability         0.88                   0.50  SA2   0.72
                                                    SA3   0.76
                                                    SA4   0.63
                                                    SA6   0.65
                                                    SA8   0.84
                                                    SA9   0.80
                                                    SA11  0.87
                                                    SA12  0.57
Posting action         1.00                   1.00  PA1   1.00
Reading action         1.00                   1.00  RA1   1.00
Posting time           1.00                   1.00  PT1   1.00
Reading time           1.00                   1.00  RT1   1.00
Learning performance   0.75                   0.77  LP1   0.97
                                                    LP2   0.78

AVE = Average Variance Extracted.
4.2. Structural models: direct effects
The structural PLS model was estimated using the maximum likelihood method. In addition, bootstrap resampling was
conducted to assess the statistical significance of path coefficients. Compared with traditional t-tests, bootstrapping
resampling allows the significance of parameter estimation from the data to be tested without assuming multivariate
normality (Chin, 1998) and also remediates the sample size in this study to some extent. Table 3 shows the hypotheses, path
coefficients (b), and t-values. Fig. 3 presents a graphical representation with path coefficients and R2.
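The bootstrap significance test can be illustrated with a case-resampling sketch for a single standardized path. This is not the full PLS-SEM estimation; the data, seed, and function names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_t(x, y, n_boot=1000):
    """Resample cases with replacement, re-estimate a standardized path
    (here a simple two-variable path), and return estimate / bootstrap SE."""
    def beta(xs, ys):
        xs = (xs - xs.mean()) / xs.std()
        ys = (ys - ys.mean()) / ys.std()
        return float(np.mean(xs * ys))  # standardized slope = Pearson r
    est = beta(x, y)
    n = len(x)
    boots = [beta(x[idx], y[idx])
             for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return est / np.std(boots)

# Hypothetical data with a genuine underlying path, n = 24 as in this study
x = rng.normal(size=24)
y = 0.8 * x + rng.normal(scale=0.5, size=24)
print(bootstrap_t(x, y) > 1.96)  # True: significant at p < 0.05, two-tailed
```

No multivariate-normality assumption is needed; significance comes entirely from the empirical resampling distribution, which is the property the study relies on given its sample size.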
As shown in Table 3, most of the hypotheses are supported. H1 and H2 posit that both posting action and reading action
positively impact student's learning performance. According to Table 3, posting action has a positive effect on learning
performance (0.50, p < 0.05) as does reading action (0.81, p < 0.001). Therefore H1 and H2 are supported. Positive conclusions
are also reached for H3 and H4, which state that posting delay time (0.48, p < 0.01) and reading time (0.68, p < 0.001)
positively affect student learning performance. In general, participation in online discussion does influence student learning.
Such influence does not simply come from the number of posts that a student makes, but also includes a non-visible behavior: reading posts. Furthermore, participation also has a time dimension, reflected by the posting delay time and reading time. In
fact, based on the path coefficient (b), reading behavior and time attribute of participation are more influential predictors for
learning than the typically considered posting behavior.
H5 and H6 are not confirmed, showing that collective efficacy is not significantly correlated with students' posting and reading actions in online discussion. By contrast, H7 and H8 are supported, indicating that collective efficacy significantly influences both posting delay time (0.42, p < 0.05) and reading time (0.55, p < 0.01). The positive effect of social ability on posting action is not confirmed, so H9 is not supported, but social ability is shown to influence reading action significantly (0.48, p < 0.05), offering support for H10. H11 is not supported owing to the insignificant result, meaning that social ability is not positively correlated with posting delay time. However, the positive effect of social ability on reading time is confirmed (0.33, p < 0.05), which supports H12. Briefly, collective efficacy affects students' participation with regard to posting delay and reading time but does not relate to their concrete actions (posting action and reading action), whereas social ability impacts students' reading participation (reading action and reading time) but not posting (posting action and posting delay time).
Table 2
Discriminant validity.

Construct                             SF     CE     SA     PA/PT  RA/RT  SP
System functionality (SF)             0.71
Collective efficacy (CE)              0.23   0.83
Social ability (SA)                   0.56   0.05   0.71
Posting action/posting time (PA/PT)   0.27   0.13   0.12   1.00
Reading action/reading time (RA/RT)   0.04   0.16   0.45   0.68   1.00
Learning performance (SP)             0.25   0.10   0.17   0.05   0.47   0.88

Diagonal elements are the square root of the AVE of each construct.
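The discriminant-validity criterion behind Table 2 can be verified mechanically. The sketch below (ours, not from the study) encodes the lower-triangular matrix from Table 2 and applies the Fornell and Larcker (1981) rule: each construct's square root of AVE (the diagonal) must exceed its correlations with every other construct.

```python
# Sketch of the Fornell-Larcker check behind Table 2. Values are taken
# from Table 2; the check itself is our illustration, not study code.
names = ["SF", "CE", "SA", "PA/PT", "RA/RT", "SP"]
# Lower-triangular matrix; diagonal entries are sqrt(AVE).
lower = [
    [0.71],
    [0.23, 0.83],
    [0.56, 0.05, 0.71],
    [0.27, 0.13, 0.12, 1.00],
    [0.04, 0.16, 0.45, 0.68, 1.00],
    [0.25, 0.10, 0.17, 0.05, 0.47, 0.88],
]

def discriminant_valid(lower):
    """True if every diagonal sqrt(AVE) exceeds all correlations
    involving that construct (Fornell & Larcker, 1981)."""
    n = len(lower)
    # Expand to a full symmetric matrix for easy row/column access.
    full = [[lower[max(i, j)][min(i, j)] for j in range(n)] for i in range(n)]
    return all(
        full[i][i] > full[i][j]
        for i in range(n) for j in range(n) if i != j
    )

print(discriminant_valid(lower))
```

Running this on the Table 2 values confirms that discriminant validity holds for all six constructs.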
Table 3
Hypothesis testing results.

H     Path       Path coefficient (b)   t-value (bootstrap)   Decision
H1    PA -> LP   0.50*                  2.12                  Supported
H2    RA -> LP   0.81***                3.44                  Supported
H3    PT -> LP   0.48**                 2.76                  Supported
H4    RT -> LP   0.68***                3.93                  Supported
H5    CE -> PA   0.12                   0.55                  Not supported
H6    CE -> RA   0.24                   1.19                  Not supported
H7    CE -> PT   0.42*                  2.12                  Supported
H8    CE -> RT   0.55**                 3.22                  Supported
H9    SA -> PA   0.10                   0.48                  Not supported
H10   SA -> RA   0.48*                  2.51                  Supported
H11   SA -> PT   0.05                   0.23                  Not supported
H12   SA -> RT   0.33*                  1.96                  Supported
H13   SF -> CE   0.23                   1.10                  Not supported
H14   SF -> SA   0.56**                 3.13                  Supported

*: p < 0.05; **: p < 0.01; ***: p < 0.001.
System functionality is not significantly correlated with collective efficacy, and hence H13 is rejected. However, system functionality significantly affects students' social ability in the online discussion board (0.56, p < 0.01), supporting H14. When comparing Model 1 and Model 2, the goodness-of-fit measure for Model 2 (0.40) is better than that for Model 1 (0.33). In terms of predictive power, Fig. 2 shows that Model 2 outperforms Model 1: Model 2 explains 45% of the variance in learning performance, whereas Model 1 explains only 36%. This provides evidence that the time dimension is even more important than simple posting and reading actions in understanding and explaining student learning in online discussion forums.
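Assuming the goodness-of-fit index here is the global GoF commonly used in PLS, the geometric mean of average communality and average R² (an assumption on our part, since the formula is not stated), the comparison logic can be sketched as:

```python
# Sketch of the model-comparison quantities. The GoF formula is our
# assumption (global PLS GoF); the R^2 values are as reported: 0.36
# for Model 1 and 0.45 for Model 2.
import math

def r_squared(actual, predicted):
    """Share of variance in `actual` explained by `predicted`."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

def pls_gof(communalities, r2_values):
    """Global goodness of fit: sqrt(avg communality * avg R^2)."""
    avg_com = sum(communalities) / len(communalities)
    avg_r2 = sum(r2_values) / len(r2_values)
    return math.sqrt(avg_com * avg_r2)

# Reported variance explained in learning performance:
model1_r2, model2_r2 = 0.36, 0.45
print(model2_r2 > model1_r2)  # Model 2 (time dimension) explains more
```

The GoF is bounded by [0, 1]; higher values indicate a model that better reproduces both the measurement and structural relationships.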
4.3. Structural models: Mediating effects
We further examined indirect, mediating effects in these two models. The rationale behind this analysis is that social cognitive theory assigns a central position to cognitive processes (Bandura, 1997), which indicates that the environment influences a person's behavior through cognitive mechanisms. Therefore, social ability is expected to mediate the relationship between system functionality and students' participation behavior. Whether student participation behavior in turn mediates the effects of the cognitive factors on learning performance is tested as well. Baron and Kenny (1986) suggested three steps for examining a mediator: a) the independent variable must significantly influence the mediator; b) the independent variable must significantly influence the dependent variable; c) when both the independent variable and the mediator are employed to predict the dependent variable, if both significantly affect the dependent variable, the mediator partially mediates the impact of the independent variable on the dependent variable; if the influence of the mediator is significant but that of the independent variable is not, the mediator fully mediates this impact.
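The three-step decision rule can be expressed directly as code. The sketch below (ours, not the authors') encodes only the significance-based classification applied in Table 4, not the underlying PLS estimation; the significance flags for each row are read off Table 4.

```python
# Sketch of the Baron & Kenny (1986) decision rule applied in Table 4.
# Only the three-step classification logic is implemented.
def mediation_type(iv_to_m_sig, iv_to_dv_sig, iv_sig_with_m, m_sig_with_iv):
    """Classify mediation from the significance of the tested paths."""
    if not (iv_to_m_sig and iv_to_dv_sig):
        return "Not supported"       # condition (a) or (b) fails
    if m_sig_with_iv and iv_sig_with_m:
        return "Partially mediated"  # both stay significant jointly
    if m_sig_with_iv and not iv_sig_with_m:
        return "Fully mediated"      # only the mediator stays significant
    return "Not supported"

# Rows of Table 4, encoded as significance flags:
rows = {
    ("SF", "SA", "RA"): (True, False, True, True),
    ("SA", "RA", "LP"): (True, True, True, True),
    ("SA", "RT", "LP"): (True, True, False, True),
    ("CE", "PT", "LP"): (True, True, True, True),
    ("CE", "RT", "LP"): (True, True, False, True),
}
for path, flags in rows.items():
    print(path, mediation_type(*flags))
```

Applied to the Table 4 rows, the rule reproduces the reported decisions: no mediation for system functionality, partial mediation via reading action and posting time, and full mediation via reading time.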
As shown in Table 4, the direct link between system functionality and social ability was significant, satisfying the first condition. However, the link between system functionality and reading action was not significant, so the second condition was not met; therefore, social ability did not mediate the influence of system functionality on reading action. By the same logic, social ability did not mediate the influence of system functionality on reading time either. The links between social ability and reading action and between social ability and learning performance were significant, so the first and second conditions were satisfied. Moreover, the direct relationship between social ability and student learning performance remained significant when the link between reading action and learning performance was added, and the latter link was also significant. Hence, according to condition three, reading action partially mediated the relationship between social ability and student learning performance. By similar reasoning, we could conclude that reading time fully mediated the influence of social ability on learning performance. Likewise, posting time partially mediated the effect of collective efficacy on learning performance, while reading time fully mediated that effect.
Table 4
Mediating effect tests.

                              IV + M -> DV
IV   M    DV   IV -> M   IV -> DV   IV        M         Mediating effect
SF   SA   RA   0.57***   0.13       0.34*     0.61***   Not supported
SF   SA   RT   0.57***   0.33       0.16      0.38*     Not supported
SA   RA   LP   0.48**    0.48***    0.32*     0.43**    Partially mediated
SA   RT   LP   0.45**    0.48**     0.29      0.38*     Fully mediated
CE   PT   LP   0.46**    0.36*      0.41*     0.39*     Partially mediated
CE   RT   LP   0.56***   0.36*      0.07      0.52**    Fully mediated

*: p < 0.05; **: p < 0.01; ***: p < 0.001.
5. Discussion
Informed by social cognitive theory, this study constructs two student participation models in an asynchronous online
discussion context. Our models contribute to the understanding of how participation in online discussion can benefit student
learning and how system functionality, collective efficacy and social ability affect students' participation in online dialogue.
The findings that participation in online discussion benefits learning not only reconfirm and remain consistent with prior studies, but also provide new empirical evidence on the association of reading behavior and the time dimension of participation with learning. Specifically, the number of posts students write is significantly correlated with students' learning performance. This finding is in line with previous research (Cheng et al., 2011; Patel & Aghayere, 2006). The results also indicate that the number of times a student reads posts significantly influences learning performance. In addition, we show that the shorter the delay before a student responds to others' posts, or the more time a student spends reading, the higher the learning performance the student can achieve.
Though the research community usually focuses on the number of posts students make when examining student learning with online discussion tools, our research demonstrates that the number of posts students read, the time delay in responding, and the time spent reading posts also matter. In fact, these additional factors are even more influential than the number of posts made in predicting learning achievement. In particular, the number of posts that a student reads is the best predictor of learning achievement, followed by reading time, time delay, and number of posts written. When we purposely put the time dimension in a separate model, the results show that the time dimension of online discussion is more important than the number of posts written or read in explaining variance in student learning performance. While many studies consider students who post little or not at all to be lurkers (Dennen, 2008b; Cheng et al., 2011), our study shows that students can participate through reading and that their participation time is also important. These findings have very practical implications for online discussion activities. Given that the common practice for teachers is to use the number of posts as the measure of student performance (Dennen, 2008a), the results of our study may inspire new perspectives on learning and assessment for online discussion activity beyond the number of posts.
This study also introduces collective efficacy and social ability to the online discussion research community. Previous studies have generally investigated "individual" factors affecting students' participation in online discussion, such as motivation, learning style, and gender (Vonderwell & Zachariah, 2005). Because online discussion is a complex collaborative knowledge-building process, collective efficacy and social ability capture the emergent dynamics created by multiple factors across people and sources in the online discussion context. The results show that collective efficacy significantly affects the time delay in responding to posts and the time spent reading. Additionally, social ability is positively correlated with the number of posts that a student reads and the time spent reading. Collective efficacy and social ability bring new understanding to how online discussion takes place and varies among students. Practically, an instructor who intends to promote participation can begin by finding ways to improve students' collective efficacy and social ability, as both constructs are related to students' innate motivation and ability. Moreover, we also examined the role of system functionality in online discussion. Although system characteristics and discussion interface design were found to influence student participation in previous studies (Vonderwell & Zachariah, 2005), the system functionality construct in our study does not influence student participation. Future studies can further investigate the effect of system characteristics on student participation.
In addition, prior research has centered on the direct influence of collective efficacy, social ability, and system functionality on online participation and learning, and has investigated them in separate models (Laffey et al., 2006; Vonderwell & Zachariah, 2005; Wang & Hwang, 2012). Hence it is difficult to gain an in-depth understanding of the influencing processes of these factors. The reported study enriches the literature on student online discussion by providing insight into the influencing mechanisms of these constructs in online discussion participation and learning. Specifically, we examined the mediating effects of social ability on participation and the mediating effects of the participation factors on student learning performance. The results reveal that the influence of these factors is more complex than suggested by prior studies of direct effects: mediating effects exist among these constructs, and the mediating power varies among them. To illustrate, reading action partially mediates the influence of social ability on student learning performance, and posting time partially mediates the influence of collective efficacy on learning performance. By contrast, reading time fully mediates the influence of both social ability and collective efficacy on student learning performance.
Methodologically, previous studies have typically used only self-reported data to measure the constructs in their models (Dennen, 2008a; Wang & Hwang, 2012). However, self-reported participation or evaluation instruments may not capture student participation and performance accurately (Xing, Wadholm, & Goggins, 2014) and can differ from the actual case (Tsai et al., 2008). Moreover, the potential for common method bias exists (Podsakoff, MacKenzie, Lee, & Podsakoff, 2003) when the independent and dependent variables are self-reported by the same individuals. Unlike those works, the present study used actual evaluations of student work as the learning performance indicator. In addition, this study took advantage of advances in information systems by modeling student participation in online discussion using actual electronic trace data of student behavior, as proposed in "learning analytics" (Xing, Guo, Petakovic, & Goggins, 2015; Xing, Wadholm, & Goggins, 2015). Our study thus presents a potential approach to the measurement of constructs in online education studies, possibly with greater validity and objectivity.
Moreover, social cognitive theory was not proposed for technology-mediated communication; it was defined and has usually been examined in traditional learning contexts (Bandura, 1986). While the current study agrees with the traditional perspective that inner forces and environmental stimuli drive people as combined influences, these forces and factors may embody new meanings in CMC. Previously, the personal cognitive factor has usually been confined to self-efficacy and motivation-related factors based on environmental sources of information such as performance accomplishment and vicarious experience (Bandura, 1997). In technology-supported situations, the environmental stimuli include technological factors, and the functionality of technology influences the communication and interaction between the cognitive and environmental factors (Xing & Goggins, 2015). In turn, the cognitive capabilities need to go beyond traditional domains to incorporate factors that denote students' competency in working in the socio-technical context, such as social ability. Moreover, this study expands the behavior dimension of social cognitive theory by incorporating the time aspect into the model. This temporal dimension is unique to CMC (Chen, Chen, & Xing, 2015) to some extent, since it can be easily observed and operationalized using electronic trace data. With the affordances of technology, behavior change through the lens of social cognitive theory can be explained in a more nuanced way. This work demonstrates the potential to redefine social cognitive theory in the context of people acting with technology.
6. Conclusion
The reported study developed and empirically explored two models of student participation behavior in online discussion through the lens of social cognitive theory. These two models demonstrated how different aspects of participation behavior influence student learning performance differently, and especially highlighted the often overlooked influence of reading behavior and the time dimension of participation. Moreover, the models investigated the roles of collective efficacy, social ability, and system functionality in online discussion and explained how these constructs interact with participation behavior to generate a significant impact on student learning.
Even though the findings of the reported study are significant for CMC and promising in igniting new discussion of how students participate in asynchronous online discussion, it should be acknowledged that these findings emerge from one specific context: a computer-supported collaborative learning course at a major research university. Researchers need to be careful not to overgeneralize the findings to other contexts without further consideration. Future studies can replicate this research in other disciplines and contexts with more participants to reach a more generalizable model of student participation in asynchronous online discussion. Moreover, in this study, posting and reading actions considered only the quantitative aspect of participation; the quality of posts was not addressed. Future research can examine how to incorporate quality measures into the models of participation and further examine the resulting changes to the models and their influence on student learning.
References

An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation approaches on students' interactions during asynchronous online discussions. Computers & Education, 53(3), 749–760.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman.
Bandura, A. (2001). Social cognitive theory: an agentic perspective. Annual Review of Psychology, 52(1), 1–26.
Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173.
Chen, B., Chen, X., & Xing, W. (2015, March). Twitter archeology of learning analytics and knowledge conferences. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 340–349). ACM.
Cheng, C. K., Paré, D. E., Collimore, L. M., & Joordens, S. (2011). Assessing the effectiveness of a voluntary online discussion forum on improving students' course performance. Computers & Education, 56(1), 253–261.
Chin, W. W. (1998). The partial least squares approach to structural equation modeling. Mahwah, NJ: Lawrence Erlbaum Associates.
Chin, W. W., Marcolin, B. L., & Newsted, P. R. (1996). A partial least squares latent variable modelling approach for measuring interaction effects: results from a Monte Carlo simulation study and voice mail emotion/adoption study. Paper presented at the 17th International Conference on Information Systems, Cleveland, OH.
Dennen, V. P. (2008a). Looking for evidence of learning: assessment and analysis methods for online discourse. Computers in Human Behavior, 24(2), 205–219.
Dennen, V. P. (2008b). Pedagogical lurking: student engagement in non-posting discussion behavior. Computers in Human Behavior, 24(4), 1624–1633.
Dourish, P. (1999). Where the footprints lead: tracking down other roles for social navigation. In A. J. Munro, K. Hook, & D. Benyon (Eds.), Social navigation of information space (pp. 15–34). London: Springer-Verlag London Limited.
Fornell, C., & Bookstein, F. L. (1982). Two structural equation models: LISREL and PLS applied to consumer exit-voice theory. Journal of Marketing Research, 19(4), 440–452.
Fornell, C., & Larcker, D. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18, 39–50.
Goggins, S., Xing, W., Chen, X., Chen, B., & Wadholm, B. (2015). Learning analytics at "small" scale: exploring a complexity-grounded model for assessment automation. Journal of Universal Computer Science, 21(1), 66–92.
Hara, N., & Kling, R. (2000). Students' distress with a web-based distance education course: an ethnographic study of participants' experiences. Information, Communication and Society, 3, 557–579.
Hardin, A. M., Fuller, M. A., & Valacich, J. S. (2006). Measuring group efficacy in virtual teams: new questions in an old debate. Small Group Research, 37(1), 65–85.
Kay, R. H. (2006). Developing a comprehensive metric for assessing discussion board effectiveness. British Journal of Educational Technology, 37(5), 761–783.
Kuboni, O., & Martin, A. (2004). An assessment of support strategies used to facilitate distance students' participation in a web-based learning environment in the University of the West Indies. Distance Education, 25(1), 7–29.
Laffey, J., Lin, G. Y., & Lin, Y. (2006). Assessing social ability in online learning environments. Journal of Interactive Learning Research, 17(2), 163–177.
Lee, C., & Farh, J. L. (2004). Joint effects of group efficacy and gender diversity on group cohesion and performance. Applied Psychology: An International Review, 53(1), 136–154.
Ma, Y., Friel, C., & Xing, W. (2014). Instructional activities in a discussion board forum of an e-learning management system. In HCI International 2014 – Posters' extended abstracts (pp. 112–116). Springer International Publishing.
McGrath, J. E. (1991). Time, interaction, and performance (TIP): a theory of groups. Small Group Research, 22(2), 147–174.
Nunnally, J. C. (1978). Psychometric theory. New York: McGraw-Hill.
Patel, J., & Aghayere, A. (2006). Students' perspective on the impact of a web-based discussion forum on student learning. Paper presented at the 36th Annual Frontiers in Education, San Diego, CA.
Picciano, A. G. (2002). Beyond student perceptions: issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–40.
Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: a critical review of the literature and recommended remedies. Journal of Applied Psychology, 88(5), 879.
Ramos, C., & Yudko, E. (2008). "Hits" (not "discussion posts") predict student success in online courses: a double cross-validation study. Computers & Education, 50(4), 1174–1182.
Reisetter, M., & Boris, G. (2004). What works: student perceptions of effective elements in online learning. The Quarterly Review of Distance Education, 5(4), 277–291.
Schweir, R. A. (2002). Shaping the metaphor of community in online learning environments. Retrieved on August 18, 2014 from http://davidwees.com/etec522/sites/default/files/schwier.pdf.
Slavin, R. E. (1995). Cooperative learning: Theory, research, and practice. New York, NY: MacMillan.
Son, J. (2002). Online discussion in a CALL course for distance language teachers. CALICO Journal, 20(1), 127–144.
Sonnenwald, D. H. (2005). Information horizons. In K. Fisher, S. Erdelez, & L. E. F. McKechnie (Eds.), Theories of information behavior (pp. 191–197). Medford, NJ: ASIS&T.
Tsai, I.-C., Kim, B., Liu, P.-J., Goggins, S. P., Kumalasari, C., & Laffey, J. M. (2008). Building a model explaining the social nature of online learning. Educational Technology & Society, 11(3), 198–215.
Vonderwell, S., & Zachariah, S. (2005). Factors that influence participation in online learning. Journal of Research on Technology in Education, 38(2), 213–230.
Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Wang, S. L., & Hwang, G. J. (2012). The role of collective efficacy, cognitive quality, and task cohesion in computer-supported collaborative learning (CSCL). Computers & Education, 58(2), 679–687.
Wang, X., Laffey, J., Xing, W., Ma, Y., & Stichter, J. (2016). Exploring embodied social presence of youth with autism in a 3D collaborative virtual learning environment: a case study. Computers in Human Behavior, 55, 310–321.
Xing, W., & Goggins, S. (2015, March). Learning analytics in outer space: a Hidden Naïve Bayes model for automatic student off-task behavior detection. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 176–183). ACM.
Xing, W., Guo, R., Petakovic, E., & Goggins, S. (2015). Participation-based student final performance prediction model through interpretable Genetic Programming: integrating learning analytics, educational data mining and theory. Computers in Human Behavior, 47, 168–181.
Xing, W., Kim, S., & Goggins, S. (2015). Modeling performance in asynchronous CSCL: an exploration of social ability, collective efficacy and social interaction. In O. Lindwall, P. Hakkinen, T. Koschman, P. Tchounikine, & S. Ludvigsen (Eds.), Exploring the material conditions of learning: Proceedings of the Computer Supported Collaborative Learning Conference (CSCL 2015) (pp. 276–283). Gothenburg, Sweden: International Society of the Learning Sciences.
Xing, W., Wadholm, B., & Goggins, S. (2014, March). Learning analytics in CSCL with a focus on assessment: an exploratory study of activity theory-informed cluster analysis. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 59–67). ACM.
Xing, W., Wadholm, B., & Goggins, S. (2015). Group learning assessment in CSCL: developing a theory-informed analytics. Educational Technology and Society, 18(2), 110–128.