Annual Assessment Report to the College 2009-2010
March 30, 2009, prepared by Bonnie Paller

College: Humanities
Department: ____________________________
Program: Liberal Studies
Liaisons: Antone Minard and Tineke Scholten

Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2010. You may submit a separate report for each program which conducted assessment activities.

1. Overview of Annual Assessment Project(s)

1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the oversight of one person or a committee?

Assessment is under the oversight of a committee:
Antone Minard, Assessment Liaison (through July 2010)
Tineke Scholten, Assessment Liaison (from August 2010)
Geraldine Sare, Advisement
Wendy Birky, Program Review
Michael Neubauer, Program Director

The overall plan was to continue gathering data on both LRS courses and the advisement process and to revise SLO 4.

1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any modification to your assessment process and why it occurred.

At the beginning of the Fall semester, the Liberal Studies Assessment Liaison joined the Committee for Simplifying Assessment across the University as the representative of the College of the Humanities. This committee pushed the department into designating Gateway and Capstone courses for each of its four major tracks, and into identifying Signature Assignments and common Grading Rubrics in most of these.

2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an additional SLO, report in the next chart below.

Note: As the methodology is the same, I have combined the SLOs into one chart.

2a. Which Student Learning Outcome was measured this year?

SLO 1: "Students will acquire a breadth of knowledge across the range of disciplines included in the major and will pursue greater depth in their area of specialization."
SLO 2: "Students will explore how knowledge across multiple disciplines can be connected."
SLO 3: "Students will develop the ability to formulate their own goals for continued learning and inquiry based on a foundation of intellectual curiosity."
SLO 5: "Students will be able to think critically and creatively."
SLO 6: "Students will be able to write and speak clearly, coherently, and thoughtfully."
SLO 7: "Students will be able to read, understand, and evaluate all forms of text."

2b. What assessment instrument(s) were used to measure this SLO?

Student exit surveys of LRS 300 (Gateway) students and of graduating seniors.

2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of students, which courses, how decisions were made to include certain participants.

Gateway: n=92 (all students enrolled in LRS 300 in Spring and Fall were invited to participate)
Seniors: n=92 (all graduating seniors were invited to participate)
Note: that n=92 in both cases is a coincidence; the two populations need not have overlapped at all.

2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.

The data were assessed longitudinally. For Gateway, data have been gathered each semester since Spring 2005, and this year's results were compared against that data. For the seniors, data have been gathered for two years.

2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the data collected.
A matrix was designed to convert survey responses to numeric, quantifiable data. Results show broad consistency over time and a high overall level of satisfaction. The data did, however, highlight a drop in satisfaction for one portion of SLO 5, and it is recommended that the Program Review committee examine the ways in which this objective is met.

2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to program SLOs, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed description of how the assessment results were or will be used.

Our longest-running assessment method is a student exit survey for our LRS 300 Gateway course. Students have taken this survey every semester since Spring 2005, and the results have helped us gauge what is and is not working in the course and to make adjustments accordingly. As of June 2010, 1,702 students have taken the survey over 11 semesters, an average of about 155 students per semester. Over the years, the LRS 300 Gateway exit survey has allowed us to assess which assignments are working and which need modification, and it has been a useful tool in improving the course. Gateway instructors meet four to five times per semester to discuss the course and make changes based upon the data, an important part of the course's ongoing evolution. Since this course is meant to give students important tools for success in the program, it affects the overall effectiveness of the program.
For example, we know that both Liping Ma's Knowing and Teaching Elementary Mathematics and The Giver by Lois Lowry have been extraordinarily effective in meeting the SLOs of their respective units; the data we have collected have remained consistent year after year. Furthermore, the research project in the course was successfully re-envisioned. It is the centerpiece of the course and is frequently cited by students as a challenging but rewarding learning opportunity.

An anonymous exit survey of graduating seniors has been administered for the last two years. In both 2009 and 2010 we asked graduating Liberal Studies students to assess their learning and preparedness at graduation. We collected and analyzed 117 surveys in Spring 2009 and 92 in Spring 2010. These surveys reveal a high degree of satisfaction with the program overall: students feel that they have acquired breadth of knowledge across disciplines, and pre-credential students feel that they are prepared as teachers. The survey also indicates that students feel the program covered subjects related to humanities and language arts most thoroughly, while math, physical education, and fine arts were covered the least. These results will be considered as we revise the program curriculum.

A survey of students who use our advisement services was conducted in 2009. This anonymous survey was given to students after a visit with one of our three advisors, asking them to assess the effectiveness and helpfulness of their advising experience. 172 surveys were collected over several months. They show a high degree of satisfaction with advisement, with scores of over 90% on most measures, indicating that no changes in advisement are needed at this time. The lowest satisfaction, at 83%, was in regard to our online workshop, which students take as they enter the program.
This information is being considered as we make revisions to the workshop.

Some programs assess multiple SLOs each year. If your program assessed an additional SLO, report the process for that individual SLO below. If you need additional SLO charts, please cut & paste the empty chart as many times as needed. If you did NOT assess another SLO, skip this section.

2a. Which Student Learning Outcome was measured this year?

2b. What assessment instrument(s) were used to measure this SLO?

2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of students, which courses, how decisions were made to include certain participants.

2d. Describe the assessment design methodology: Was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.

2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the data collected.

2f. Use of Assessment Results of this SLO: Think about all the different ways the results were (or could be) used. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to program SLOs, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed description of each.

3. How do your assessment activities connect with your program's strategic plan?

Our strategic plan for this year involved increasing assessment in many areas of the program, and progress toward this goal has been achieved. We also had goals relating to increasing recruitment and slowing the decrease in enrollment in our program.
While external factors are largely responsible for decreasing enrollment in Liberal Studies, we implemented an Advisement Satisfaction Survey and an Exit Survey for graduating seniors to learn about student satisfaction with advisement and with the content of the program. Questions for these surveys were written in conjunction with both advisement SLOs and academic content SLOs. The results of these surveys, along with the analysis of data from LRS 300, have given us a better idea of where the program stands at this time and of some possible directions for the future that are in line with our strategic plan.

4. Overall, if this year's program assessment evidence indicates that new resources are needed in order to improve and support student learning, please discuss here.

The assessment does not show that any additional resources are needed to improve or support student learning.

5. Other information, assessment or reflective activities not captured above.

The primary activity this year has been the revision of the complete list of SLOs and the shift from reliance on student satisfaction surveys to, ideally, the collection and analysis of hard data samples.

6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your program? Please provide citation or discuss.

No.