
An Empirical Study of In-Class Labs on Student Learning of Linear Data Structures
Sarah Heckman
Teaching Associate Professor
Department of Computer Science
North Carolina State University
ICER 2015
Problem
• CSC116 (Lecture/Lab): 7-8 sections, 33 students per section, 1 instructor, 2 TAs
• CSC216 (Lecture): 1-2 sections, 70-90 students, 1 instructor, 2-3 TAs
– Transition! Lab?
• CSC316 (Lecture): 1-2 sections, 70-90 students, 1 instructor, 2-3 TAs
– Retention!
• For CSC216: Do Nothing? In-Class Labs?
Research Goal
• To increase student learning and engagement
through in-class laboratories on linear data
structures
• Hypothesis: active learning practices that involve
larger problems would increase student learning
and engagement
In-class Labs > Pair & Share
Research Questions
• Do in-class laboratories on linear data structures
increase student learning on linear data
structures exam questions when compared to
active-learning lectures?
• Do in-class laboratories on linear data structures
increase student engagement on linear data
structures exam questions when compared to
active-learning lectures?
Active Learning in CSC216
• “engaging the students in the process of learning
through activities and/or discussion in class, as
opposed to passively listening to an expert”
[Freeman, et al. 2014]
• Control: Active Learning Lectures
– 2-5 pair & share exercises per class
– Submitted through Google forms
• Treatment: In-class Labs
– Lab activity for the entire lecture period
– Pre-class videos introduced topic
Study Participants
Metric                                   Section 001      Section 002
# Enrolled                               85               102
Participants (completed course)          49               60
Dropped/Withdrawn (consenting only)      3                4
Women                                    9                10
Meeting Time                             TH 2:20-3:35p    MW 2:20-3:35p
• Self-selected into section during standard registration period
• Populations were similar as measured by a survey on
experience with tooling and self-efficacy.
Replication Materials:
http://people.engr.ncsu.edu/sesmith5/216-labs/csc216_labs.html
Methods
• Quasi-Experimental
– Counter-balanced design
– Learning measured through exams
– Engagement measured through observations of class
meetings
Observed Class Meetings
• Sections 001 and 002 were each observed during two array-based list class
meetings and two linked list class meetings (counter-balanced design)
• The course timeline around the observed meetings also covered Iterators
and Exam 1
Student Learning – Exam 1
• Part 4: Method Tracing with ArrayLists
• Part 5: Writing an ArrayList method
Item       Points   S001 Mean   S001 SD   S002 Mean   S002 SD   p-value
E1 P4#8    5        3.63        1.56      4.35        1.45      < 0.010
E1 P4#9    5        4.18        1.07      4.57        1.09      0.016
E1 P4#10   5        2.63        2.40      3.45        2.18      0.149
E1 P4      15       10.45       3.97      12.37       3.74      < 0.010
E1 P5      20       17.76       4.0       18.25       4.09      0.233
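As a rough illustration of the kind of code Exam 1 Part 5 asked students to write, here is a minimal, hypothetical sketch of an add-at-index method on an array-backed list. The class name, field names, and method signature are assumptions for illustration; this is not the actual exam prompt or solution.

/**
 * Hypothetical array-backed list fragment in the style of CSC216
 * linear data structure exercises; not the actual exam code.
 */
public class SimpleArrayList {
    private Object[] items = new Object[10]; // backing array
    private int size = 0;                    // number of elements in use

    /** Inserts element at index, shifting later elements to the right. */
    public void add(int index, Object element) {
        if (index < 0 || index > size) {
            throw new IndexOutOfBoundsException();
        }
        if (size == items.length) {           // grow the backing array
            Object[] bigger = new Object[items.length * 2];
            for (int i = 0; i < size; i++) {
                bigger[i] = items[i];
            }
            items = bigger;
        }
        for (int i = size; i > index; i--) {  // shift right to open a slot
            items[i] = items[i - 1];
        }
        items[index] = element;
        size++;
    }
}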
Student Learning – Exam 2
• Part 3 – Linked Node Transformation
• Part 5 – Writing a LinkedList Method
Item    Points   S001 Mean   S001 SD   S002 Mean   S002 SD   p-value
E2 P3   16       9.43        5.85      11.80       6.41      < 0.010
E2 P5   20       11.80       4.14      12.58       4.21      0.412
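For context on the Exam 2 linked-node questions (Parts 3 and 5), here is a minimal, hypothetical sketch of the kind of node-relinking code involved. The class design and method names are assumptions for illustration, not the exam's actual class.

/**
 * Hypothetical singly linked list fragment in the style of CSC216
 * linked-node exercises; not the actual exam code.
 */
public class SimpleLinkedList {
    private Node front; // first node, or null when the list is empty
    private int size;

    private static class Node {
        Object data;
        Node next;

        Node(Object data, Node next) {
            this.data = data;
            this.next = next;
        }
    }

    /** Adds an element to the front by relinking the front reference. */
    public void addFirst(Object element) {
        front = new Node(element, front);
        size++;
    }

    /** Removes and returns the first element. */
    public Object removeFirst() {
        if (front == null) {
            throw new IllegalStateException("List is empty");
        }
        Object data = front.data;
        front = front.next; // unlink the old front node
        size--;
        return data;
    }
}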
Student Learning – Exam 3
• Comprehensive 3 hour final exam
• Stack Using an ArrayList
• Queue Using a LinkedList
Item        Points   S001 Mean   S001 SD   S002 Mean   S002 SD   p-value
E3 Array    10       8.31        2.45      8.46        2.49      0.313
E3 Linked   10       8.36        2.53      8.81        2.45      0.221
E3 Score    105      85.02       29.17     87.23       28.92     0.372
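To ground the final-exam topics, here is a minimal, hypothetical sketch of a stack built on java.util.ArrayList, the general shape of the "Stack Using an ArrayList" question; the actual exam interface and the companion "Queue Using a LinkedList" question are not reproduced here.

import java.util.ArrayList;

/**
 * Hypothetical stack backed by java.util.ArrayList, illustrating the
 * "Stack Using an ArrayList" final-exam topic; not the actual exam code.
 */
public class ArrayListStack<E> {
    private final ArrayList<E> list = new ArrayList<>();

    /** Pushes an element onto the top of the stack (end of the list). */
    public void push(E element) {
        list.add(element);
    }

    /** Removes and returns the top element. */
    public E pop() {
        if (list.isEmpty()) {
            throw new IllegalStateException("Stack is empty");
        }
        return list.remove(list.size() - 1);
    }

    /** Returns true when the stack holds no elements. */
    public boolean isEmpty() {
        return list.isEmpty();
    }
}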
Student Engagement
• Observations for ArrayList and LinkedList class
meetings
• Observers were graduate students and a
colleague participating in a Teaching and
Learning seminar
• Counts of students off topic during lecture and
exercise portions of the class
• Some inconsistent use of the observation
protocol
Student Engagement
Observation   Class Type   Off Topic (Lecture)   Off Topic (Exercise)   Questions of Teaching Staff
1             Lab          5                     7                      32
2             Lecture      62                    49                     12
3             Lab          10                    43                     50
4             Lecture      46                    16                     ---
5             Lecture      ---                   ---                    ---
6             Lab          5                     10                     33
7             Lecture      52                    54                     2
8             Lab          16                    5                      ---

Lab Average                9                     16.3                   38.3
Lecture Average            53.3                  39.7                   7
Lecture / Lab              5.9                   2.4                    0.2
Threats to Validity
• External Validity
– Two sections of the same course, taught by the same
instructor, in the same semester, and same time of
day
– Replication needed in other contexts to generalize
further
– Could provide additional data points in future meta-analyses
Threats to Validity
• Internal Validity
– Selection bias: students selected their own sections
• Initial surveys showed the groups were similar
– Confounding factors
• Materials shared between groups
• Effect size – only 6 in-class labs
– Differential Attrition Bias
• Considered “soft-drops” in the study
– Experimenter Bias
• Participants were not revealed until after the semester was
over
Threats to Validity
• Construct Validity
– Exams as Measures of Learning
• Exam 1 and Exam 2 were similar, but not the same,
between sections
• Exam 3 was common
• Do the exams really measure student learning?
– Survey
• Wording of the prior tool experience questions may be confusing
• Efficacy questions not a validated instrument
– Observation Protocol as Measure of Engagement
• Inconsistent use by observers
Discussion
• Did in-class labs increase student learning?
– No, at least not as measured by exam questions
– Both control and intervention were active learning
• Maybe a simple active learning intervention is enough
– Comparisons with earlier semesters may show more
• Did in-class labs increase student engagement?
– Yes and No
– The atmosphere in the classroom was fantastic
– But many of the questions were about the technology rather than the concepts
• Completion – 72% of students earned a C or higher
– Not reaching the higher levels of completion expected from the
active learning literature
Future Work
Replication Materials:
http://people.engr.ncsu.edu/sesmith5/216-labs/csc216_labs.html
• Additional Work on Fall 2014 Data
– Compare results on final exam with previous courses
– Incorporate analysis of other measures of learning –
projects, exercises, etc.
• Starting in Fall 2015
– Additional in-class labs → Lab-based course
– Measure types of questions asked during in-class labs
– Use labs as a way to encourage best practices
(frequent commits to version control, TDD)
Thank You!
Questions?
Comments?
Concerns?
Suggestions?