Assessing Student Learning Outcomes for the Required L1 Competence

ASSESSMENT CERTIFICATE
CULMINATING PROJECT: ASSESSING
STUDENT LEARNING OUTCOMES
Presented by: Shujaat Ahmed and Kaitlin Fitzsimons
Objective 1: Assess student learning outcomes
for required L1 competence
OBJECTIVE 1
L1 COMPETENCE AND CRITERIA
L1 Competence statement: Can use independent learning skills and
strategies to organize, initiate, and document prior, current, and
future college-level learning.
Competence Criteria
– Describe strategies for independent and experiential learning.
– Use strategies to surface prior experiential learning in personal,
professional, and academic settings and integrate these
experiences with new learning.
– Demonstrate skills in planning, organizing, assessing, and
documenting competence-based learning.
Objective 2: Compare three course
designs (1.0, 1.5, 2.0) of the Independent
Learning Seminar
OBJECTIVE 2
• Independent Learning Seminar (ILS) was a new course created to help
students structure their independent learning experiences (V. 1.0)
• After the initial offering of the course, one of the instructors decided
to adjust the original course design (V. 1.5)
• This Fall, a third course design was developed, replacing all of the 1.0
version offerings (V. 2.0)
• Our study focuses on determining whether the three course designs
achieve the L1 competence criteria, and whether one design meets
certain criteria better than the others
STEPS TAKEN TO ASSESS L1 COMPETENCE (OBJ 1)
1. Developed a rubric to assess ILS projects.
2. Identified 33 randomly selected assignments from versions 1.0, 1.5, and 2.0
(N = 11 per section) for assessing L1 (see the sampling sketch below).
3. Eight SNL TLA members rated the selected assignments.
4. Rubric data were analyzed by ILS section.
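A minimal sketch of how the stratified random draw in step 2 could be reproduced in Python (the file and column names here are hypothetical; the actual sampling was done by the project team):

    import pandas as pd

    # Hypothetical file with one row per ILS assignment and columns
    # 'assignment_id' and 'section' (1.0, 1.5, or 2.0).
    assignments = pd.read_csv("ils_assignments.csv")

    # Draw 11 assignments at random from each ILS version, with a fixed
    # seed so the draw is reproducible.
    sample = assignments.groupby("section").sample(n=11, random_state=42)

    print(sample["section"].value_counts())  # expect 11 per section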
ILP RUBRIC (OBJ 1)
Each L1 competence criterion was rated on a four-point scale: Below Average (1), Satisfactory (2), Above Average (3), Excellent (4).

Describes strategies for independent and experiential learning.
– Below Average (1): Does not identify, or only minimally identifies, strategies for independent and experiential learning.
– Satisfactory (2): Superficially describes strategies for independent and experiential learning.
– Above Average (3): Clearly describes strategies for independent and experiential learning.
– Excellent (4): Clearly describes, in detail, an understanding of strategies of independent and experiential learning across multiple contexts.

Uses strategies to surface prior experiential learning in personal, professional, or academic settings.
– Below Average (1): Does not apply strategies to surface prior learning in personal, professional, or academic settings.
– Satisfactory (2): Minimally applies strategies to surface prior learning in personal, professional, or academic settings.
– Above Average (3): Clearly applies strategies to surface prior learning in personal, professional, or academic settings.
– Excellent (4): Reflects upon the value of strategies to surface prior learning in personal, professional, or academic settings.

Integrates learning experiences with new learning.
– Below Average (1): Does not connect or integrate learning experiences.
– Satisfactory (2): Minimally connects learning experiences with new learning.
– Above Average (3): Clearly connects learning experiences with new learning.
– Excellent (4): Draws multiple connections between learning experiences across contexts and time.

Demonstrates skills in documenting learning.
– Below Average (1): Documentation of learning is unclear or incomplete.
– Satisfactory (2): Documents learning but does not emphasize learning strategies or future learning.
– Above Average (3): Documents learning with an emphasis on learning strategies OR an emphasis on future learning.
– Excellent (4): Documents learning effectively with an emphasis on learning strategies AND their application to future learning.
ANALYSIS OF RUBRIC DATA (OBJ 1)
• Means and Standard Deviations for Versions 1.0, 1.5, and 2.0 (N = 11 per version)

L1 Criteria                            Version 1.0     Version 1.5     Version 2.0
                                       Mean    SD      Mean    SD      Mean    SD
Learning Strategies                    1.82    1.07    2.36    1.21    2.63    0.90
Surfacing Prior Learning               2.18    0.75    3.09*   0.70    2.82*   0.60
Applying Learning to Future Learning   2.55*   0.82    2.72    1.19    2.55    0.69
Documenting Learning                   2.36    0.92    2.81    0.87    2.36    0.92

Note. An asterisk (*) marks the highest mean within each version.
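The descriptive statistics above were produced in SPSS; a rough Python equivalent, assuming a hypothetical long-format ratings file with columns 'section', 'criterion', and 'rating', could look like this:

    import pandas as pd

    # Hypothetical long-format table: one row per rater-assignment rating
    # on the 1-4 rubric scale.
    ratings = pd.read_csv("l1_rubric_ratings.csv")

    # Mean and standard deviation of ratings for each L1 criterion within
    # each ILS version.
    summary = (
        ratings.groupby(["criterion", "section"])["rating"]
        .agg(["mean", "std"])
        .round(2)
    )
    print(summary)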
ANALYSIS OF RUBRIC DATA (OBJ 1)
• A 3 x 4 MANOVA was conducted to determine whether the mean differences
between the three sections were statistically significant.
• Results revealed a significant multivariate main effect for section, Wilks' λ =
.540, F(8, 54) = 2.435, p < .10, η² = .27, with 27% of the multivariate variance in
the L1 criteria associated with the grouping factor, section.
• In examining the univariate between-subjects effects, only surfacing prior
learning differed significantly, F(2, 30) = 5.064, p = .013.
• Post hoc tests using the Bonferroni procedure indicated that, for surfacing
prior learning, only the comparison between version 1.0 (M = 2.18) and
version 1.5 (M = 3.09) produced a statistically significant mean difference
(p < .025).
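The MANOVA and follow-up tests were run in SPSS; a sketch of an equivalent analysis in Python with statsmodels, assuming a hypothetical wide-format frame with one column per L1 criterion score and a 'section' factor:

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols
    from statsmodels.multivariate.manova import MANOVA

    # Hypothetical wide-format data: one row per assignment, one column
    # per L1 criterion score, plus the grouping factor 'section'.
    df = pd.read_csv("l1_scores_wide.csv")

    # 3 (section) x 4 (criteria) MANOVA; mv_test() reports Wilks' lambda,
    # F, degrees of freedom, and p-values for the section effect.
    maov = MANOVA.from_formula(
        "strategies + surfacing + applying + documenting ~ section",
        data=df,
    )
    print(maov.mv_test())

    # Univariate between-subjects follow-up for a single criterion
    # (here, surfacing prior learning).
    fit = ols("surfacing ~ section", data=df).fit()
    print(sm.stats.anova_lm(fit, typ=2))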
Applying Assessment Concepts
• Assessment promotes continuous improvement
– Goal of the project is to determine if the different course designs
effectively assess the L1 competence, and provide information for
future decisions on selection or improvement of course designs.
• Measuring student learning requires direct assessment
– We chose a random sample of student work and assessed each
piece using a rubric
• A mixed methods approach to assessment provides a full picture,
bringing together the “What” and the “Why”
– We used quantitative methods to analyze the rubric ratings,
running a MANOVA to see whether there were significant differences
between the sections
– We used qualitative methods to analyze the open-ended student
comments from the student online teaching evaluations for each
section of Independent Learning Seminar.
What We Learned
• In the development of the L1 rubric, we used Bloom's Extended List of
Cognitive Action Verbs to fill in the cells that describe below average,
satisfactory, above average, and excellent demonstrations of the L1 criteria.
– As L1 competence criteria were sometimes "double-barreled," we
separated some of the criteria so that each measured a single
outcome. [Writing and Revising Learning Outcomes Workshop, Direct
v. Indirect Assessment Workshop]
• At this stage of the project, analysis of the L1 rubric ratings was exclusively
quantitative. We used SPSS to compare the different ILS sections by
determining means and standard deviations and running a MANOVA.
[Quantitative Workshop]
What We Learned
• In analyzing survey data from instructors who taught each version of the
Independent Learning Seminar, we wanted to gauge faculty perceptions of
their teaching experiences, as well as of student learning. We realize that
faculty perceptions constitute indirect evidence, so the survey was only one
piece of our mixed-methods approach, which also included direct evidence
from the rubric data analysis. [Direct vs. Indirect Assessment Workshop]
• In analyzing the student online teaching evaluations for each section of the
Independent Learning Seminar, we looked at open-ended student comments.
We identified several themes based upon the frequency of student
comments. Not surprisingly, students in the 1.0 version reported that the
course focused on the development of an ILP, while students in versions 1.5
and 2.0 emphasized the class structure. [Qualitative Workshop]
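A minimal sketch of the frequency-based theme tally described above, assuming comments have already been hand-coded (the comments and theme labels here are illustrative, not the study's actual data or codes):

    from collections import Counter

    # Hypothetical hand-coded (comment, theme) pairs; the real coding was
    # done qualitatively from the online teaching evaluation comments.
    coded_comments = [
        ("The whole course built toward my ILP.", "ILP development"),
        ("The weekly structure kept me on track.", "class structure"),
        ("Clear modules and deadlines each week.", "class structure"),
    ]

    # Tally how often each theme appears, most frequent first.
    theme_counts = Counter(theme for _, theme in coded_comments)
    for theme, count in theme_counts.most_common():
        print(f"{theme}: {count}")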
RELEVANCE TO WORK
• The Assessment Center provides analysis and expertise so that
SNL can make data-driven decisions
• For last year's Annual Assessment Project, SNL completed a similar
study of four different versions of the Advanced Project course
• The findings from that report were presented to College
leadership and were relied upon to make decisions about curricular
offerings
• SNL is increasing its reliance on rubrics as a direct assessment
tool to objectively measure student outcomes
REFLECTION AND MOVING FORWARD
• Working on the Annual Assessment project has given us the opportunity to incorporate
what we've learned through the ACP workshops to improve SNL curricular offerings.
• We will continue to apply assessment best practices in data collection, data
analysis, and reporting of findings in our positions at SNL.
• Employing methods of formal inquiry as a means of problem solving extends
beyond the area of assessment and can be used in various practice settings.
APPENDIX