Annual Assessment Report to the College 2008-2009
College: ______Humanities________________
Department: ____________________________
Program: ______Liberal Studies_____________
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30,
2009. You may submit a separate report for each program which conducted assessment activities.
Liaison: _______Antone Minard____________
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the oversight of one person or a committee?
Assessment activities are under the oversight of Dr. Wendy Birky, Program Chair. The Assessment Committee includes Dr. Wendy Birky; Ms.
Geraldine Sare, Assistant Director of the Liberal Studies Program; and Dr. Antone Minard, faculty member and Assessment Liaison.
Assessment activities for Liberal Studies in 2007–2008 were folded into a larger campus-wide project, the Teachers for a New Era initiative
(http://www.csun.edu/academic.affairs/tne/index.html). TNE called for ongoing study of ITEP students during their student teaching
experience in 2008–2009.
Because Liberal Studies is an interdisciplinary program, its students take required courses in up to forty-seven different departments. The program offers two majors, General Studies and Pre-Credential; the latter is further subdivided into Freshman ITEP, Junior ITEP, and non-ITEP tracks. In
anticipation of the upcoming program review, the committee decided to identify and synthesize data that had been collected in the past, study
other campus models for the assessment of interdisciplinary programs, launch new assessment activities targeted toward key junctures on the Liberal Studies major trajectories, and develop an ongoing program for assessing both the academic and advisement activities of the Liberal
Liberal Studies major trajectories, and to develop an ongoing program for assessing both the academic and advisement activities of the Liberal
Studies Program.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
This year's assessment process began with a long stage of information-gathering in the Fall Semester of 2008, and it followed the intended plan
fairly closely.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below.
2a. Which Student Learning Outcome was measured this year?
This year's process was not tied to SLOs in the manner envisioned by this question. For this reason, we decided to gather and compile data on
each of the seven departmental SLOs. Two of the surveys designed (see question 2b) incorporated the complete list of academic and advisement
SLOs, respectively; other assessment activities were tied to the Gateway and Capstone courses, which inherently involve the academic SLOs for
all Liberal Studies majors. The additional SLOs for Liberal Studies ITEP teacher candidates were not specifically targeted except as a part of the
exit survey.
2b. What assessment instrument(s) were used to measure this SLO?
Surveys were circulated in the spring semester among all students who spoke with an academic advisor, and surveys were sent to all seniors
with approximately 21% and 18% rates of return, respectively. Specific questions were written to target either single or multiple SLOs. The
Gateway course for all Pre-Credential students, LRS 300, has been surveyed each semester for nine semesters, and these results were compiled
and analyzed with an eye toward the course structure and content as well as departmental SLOs.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
The exit survey was sent to approximately 660 seniors, the totality of the group; 117 had responded by the deadline of 12 June 2009. The
advisement survey was given to 824 students between 4 March and 12 June 2009, and 172 responses have been received from this group. This
period of three months from the middle to the end of the spring term was identified as representative of the advisement experience as a whole,
though the survey is ongoing.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
The overall goal is to create an ongoing assessment procedure that can be both longitudinal and cross-sectional. To this end, we have identified
several "bottleneck" points in the curriculum which all of the students in a given major pass through. One of these courses, the Gateway course
LRS 300, is required of all majors except those in the General Studies option. Each semester, all instructors give the same end-of-semester survey to the
students; Dr. Cathy Costin has provided us with four years' worth of this data which we assessed to ascertain how well the course was meeting
the departmental SLOs over time. A second course reflects a recent change within the department: English 428, Children's Literature, is now the Capstone course. Accordingly, we designed a new rubric for the Capstone project, a final paper, and plan to begin assessing how well the course
meets departmental SLOs in 2009–2010. The exit survey asked the students for a longitudinal self-assessment.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
Many of the surveys asked students to rank their responses. Though the answers were verbal (e.g. "yes, strongly"; "no, not much") rather than numeric, they were all essentially scales of 1–3, 1–4, or 1–5. We converted the responses into an overall percentage of satisfaction using the formula

satisfaction = (5*R5 + 4*R4 + 3*R3 + 2*R2 + 1*R1) / (N * R')

where Ri equals the number of responses in category i (the R5 and R4 terms apply only on five- and four-point scales), R' equals the number of categories of response, and N equals the total number of respondents. The same methodology was used for each of the surveys.
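As a concrete illustration, the calculation can be expressed in a few lines of Python; the response counts below are hypothetical, not figures from the actual surveys.

    # Minimal sketch of the satisfaction calculation described above,
    # using hypothetical response counts (not the committee's actual script).
    def satisfaction(counts):
        """Convert per-category response counts into a satisfaction percentage.

        counts[i] is the number of responses in category i + 1 (1 = lowest),
        so len(counts) is R', the number of categories on the scale.
        """
        n = sum(counts)                    # N: total respondents
        r_prime = len(counts)              # R': number of response categories
        weighted = sum((i + 1) * c for i, c in enumerate(counts))
        return 100 * weighted / (n * r_prime)

    # Example: a 4-point scale with 10, 20, 40, and 30 responses in categories 1-4.
    print(f"{satisfaction([10, 20, 40, 30]):.1f}%")   # prints 72.5%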
Once these percentages were calculated, sample standard deviations were computed for both the longitudinal and cross-sectional comparisons to determine whether the observed variation was statistically significant. For the Gateway survey, data gathered over time were also plotted on a line graph to reveal patterns.
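The variation check can be sketched the same way; the per-semester percentages below are illustrative values only, not the real LRS 300 results.

    import statistics

    # Hypothetical per-semester satisfaction percentages for nine semesters
    # of LRS 300; illustrative values only.
    per_semester = [79.2, 80.1, 78.8, 80.5, 81.0, 79.6, 80.3, 79.9, 80.7]

    mean = statistics.mean(per_semester)
    sd = statistics.stdev(per_semester)   # sample (n - 1) standard deviation
    print(f"mean = {mean:.1f}%, sample SD = {sd:.2f} percentage points")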
Academically, the study revealed remarkable consistency in the LRS 300 Gateway course. Although the survey covered sections taught by
22 different instructors in five consecutive academic years, variation was negligible. The overall satisfaction rate stands at 80%. Each of the
seven SLOs assessed by the survey demonstrated a slight but perceptible upward trend. Only one assignment stood out as failing to meet its
goals, and that assignment had already been identified for overhaul by the steering committee.
The exit survey was analyzed both in aggregate and broken down by major track (General Studies; the three pre-K-12 teaching majors as a group; and Freshman ITEP, Junior ITEP, and non-ITEP Pre-Credential individually). The results suggest that the
program itself is achieving a satisfaction level of between 73% and 80% for the seven SLOs, but the lowest two have been targeted for
clarification in one case and further study in the other. Students also identified Child Development and the Fine Arts as areas in need of further
coverage.
Advisement activity was also assessed, with the result that student satisfaction was extremely high. Advisement SLO number 7 ("Be able to accurately read and effectively utilize a degree audit through the use of the Degree Progress Report (DPR) and My Academic Planner (MAP) in their [students'] educational planning") has been targeted for assessment next year.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
SLO #1, "breadth and depth of knowledge across a range of disciplines," does not mention the balance across disciplines we want students to achieve. This year's
activities have identified the Natural Sciences as an area in need of more coverage, and that recommendation has been forwarded to the
curriculum committee.
SLO #4, "the positive value and essential role of diversity," needs clarification. It refers to both the diversity inherent in an interdisciplinary
program and the diversity of human beings themselves. It has been recommended that the SLO be revised for clarity.
Some programs assess multiple SLOs each year. If your program assessed an additional SLO, report the process for that individual SLO below. If
you need additional SLO charts, please cut & paste the empty chart as many times as needed. If you did NOT assess another SLO, skip this
section.
3. How do your assessment activities connect with your department’s strategic plan?
Our strategic plan for this year involved increasing assessment in many areas of the program, and progress toward this goal has
been achieved. We also set goals of increasing recruitment and slowing the decline in program enrollment. While external factors are largely responsible for the decline in Liberal Studies enrollment, we implemented an Advisement Satisfaction Survey and an Exit Survey for graduating seniors to gauge student satisfaction with advisement and with the content of the program. Questions for these surveys were written in conjunction with both the advisement SLOs and the academic content SLOs. The results of these surveys, along with the analysis of data from LRS 300, have given us a better idea of where the program stands and of possible future directions in line with our strategic plan.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
Our current assessment indicates no immediate need for new resources.
5. Other information, assessment or reflective activities not captured above.
We have recently changed Capstone assessment for precredential students. We shifted from assessment in each student's
concentration to focusing on a single course that all precredential students take shortly before they graduate. This will allow us to
use a more uniform and comparable assessment instrument over time. However, the new assessment criteria have only recently been implemented and will not be used formally until Fall 2009, so this component has been delayed; we will begin gathering data that semester.
The advisement assessment confirmed the observation that many students drop ITEP in favor of a general pre-credential major, and listed some
of the possible reasons. A statistical analysis refined some of the trends, but further work will be conducted in 2009–2010 to determine why
students are dropping out of the ITEP program.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
No.