Annual Assessment Report to the College 2009-2010, Revised (March, 2011)
College: College of Social and Behavioral Sciences
Department: Psychology
Program: Major
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2010. You may submit a separate report for each program which conducted assessment activities.
Liaison: Holli Tonyan, began Sept 2010

1. Overview of Annual Assessment Project(s)

1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the oversight of one person or a committee?

During 2009-10, the chair served as Assessment Liaison. An assessment committee had been formed during 2008-09, but the faculty voted to disband it in Fall 2009 to reduce workload during the furloughs of 2009-10. The plan for 2009-10 was to collect baseline data for the new major, which had been developed during 2008-09 and was finally approved during Fall 2009. Toward that goal, an internet-based survey was developed and administered during the Fall semester.

1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any modification to your assessment process and why it occurred.

We implemented the survey according to plan. Based on feedback, we revised our plan to also include direct assessment of Psychology Department SLO #7 (Students will demonstrate knowledge of the theories, concepts, and empirical approaches from diverse perspectives of psychology, including biological processes, developmental processes, individual and social processes, learning and cognitive processes). To do so, we examined records of student learning as assessed via embedded exam questions gathered across a number of courses, from the introductory level to the 300-level to the 400-level, and a paper evaluated via a rubric in one 300-level course.
March 30, 2009, prepared by Bonnie Paller

2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an additional SLO, report in the next chart below.

2a. Which Student Learning Outcome was measured this year?

SLO #7 (Students will demonstrate knowledge of the theories, concepts, and empirical approaches from diverse perspectives of psychology, including biological processes, developmental processes, individual and social processes, learning and cognitive processes)

2b. What assessment instrument(s) were used to measure this SLO?

Embedded exam questions.

2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of students, which courses, how decisions were made to include certain participants.

All students who completed the required activities in selected sections were included in the analysis. Sections were selected from courses taught by faculty who volunteered to participate and then, among those courses, by the position of the course within the program. Specific questions were chosen based on the extent to which they directly assessed knowledge applicable across a range of areas within the major (e.g., theoretical frameworks common across specializations within the field, or knowledge and approaches used across a range of specializations). Psy150 (n=216), Psy460 (n=113), and Psy422 (n=90) were considered because the faculty teaching those courses used the same quiz questions, so a cross-sectional analysis of changes over time in the proportion of students who could correctly answer the questions could be conducted. The resulting analysis comparing student knowledge at the 100-level with the 400-level examined quizzes that each asked 5 questions about one of three topics (the same questions were asked across two courses): classical conditioning, operant conditioning, and temperament theory.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.

A cross-sectional comparison was used, and assessment spanned 100-level, 300-level, and 400-level courses.

2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the data collected.

For each topic, the following table shows the percent of students who scored 3/5, 4/5, or 5/5, respectively.

Topic                   Course   Total   3/5   4/5   5/5
Classical Conditioning  Psy150     216    6%   38%   54%
                        Psy460     113    1%    6%   93%
Operant Conditioning    Psy150     237    2%    4%   92%
                        Psy460     120    1%   11%   83%
Temperament Theory      Psy150     240    4%   14%   74%
                        Psy442      90    0%   10%   90%

These results suggest that students are already mastering the assessed content at good levels at the 100-level and that, at least for the students assessed here, they show even higher rates of mastery at the 400-level, as we would hope.

2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to program SLO's, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed description of how the assessment results were or will be used.

This was our department's first attempt at documenting students' learning through embedded exam questions.
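The cross-sectional comparison in 2e amounts to tabulating, for each course, the share of students at each quiz score level. A minimal sketch of that tabulation in Python (the scores below are invented for illustration, not the actual course data):

```python
from collections import Counter

def score_distribution(scores):
    """Percent of students scoring 3, 4, or 5 out of 5 on an embedded quiz."""
    n = len(scores)
    counts = Counter(scores)
    return {k: round(100 * counts[k] / n) for k in (3, 4, 5)}

# Invented scores for a hypothetical section of 24 students.
section_scores = [5] * 12 + [4] * 8 + [3] * 2 + [2] * 2
dist = score_distribution(section_scores)
# dist maps each score level (3, 4, 5) to the percent of students at that level.
```

Running the same tabulation on matched quiz questions in a 100-level and a 400-level section yields rows directly comparable to the table above.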
The results of this analysis will help to show faculty that assessing student learning, compiling information about student learning across courses, and examining that learning across the program in strategic ways can be a regular part of what we do, as opposed to something added to course-based activities. These results suggest that in the courses sampled, students are mastering the material we hope they will master. We can continue to examine sections across years in the program to ensure that the results found here are representative of our students more generally.

3. How do your assessment activities connect with your program's strategic plan?

Assessment of SLO #7 is one of the last two SLOs to be assessed as part of our strategic plan. We are continuing to make progress toward direct assessment of each of our SLOs.

4. Overall, if this year's program assessment evidence indicates that new resources are needed in order to improve and support student learning, please discuss here.

As suggested in our Self-Study and Memorandum of Understanding for the Program Review completed in Fall 2009, we need to continue to hire new faculty to serve our large and growing major and to address students' concerns about scheduling courses. We plan to seek out college- and university-level support in designing our longitudinal assessments for the new major. In particular, we are considering ways to use Moodle for standardized online assessments of some of our program and course SLOs, so we will need to seek assistance from Academic Technology staff as well as our CSBS Online Instruction Coordinator.

5. Other information, assessment or reflective activities not captured above.
Indirect Assessment of Student Needs: Our major was revised so substantially that most courses taught in 09-10 would never be taught in the same form again (e.g., some courses would have laboratory requirements eliminated; other courses were removed or redesigned with new SLOs). As a result, we gathered "baseline," indirect data on who our students are. We developed an internet-based survey that gathered information about demographics, employment status, and type of employment. We asked about educational pathways, including transfer status, year in degree, expected graduation date, year of first enrollment at CSUN, enrollment in lower-division and upper-division required courses, and barriers to enrollment and to graduating in 4 years. All part- and full-time faculty teaching that semester were asked to provide the link to the survey to all students in their courses. Questions were completed by between 416 and 467 students from all academic years (from fewer than 30 units completed to more than 90 units completed). Of the respondents, 75% reported majoring in Psychology, 5% reported minoring in Psychology, and 21% reported being undecided. Sixty percent of respondents reported having transferred from another institution, primarily community college. We used a cross-sectional design and recruited students from all courses offered during that academic semester. Descriptive statistics were calculated. Results suggested that more than 30% of our respondents could be classified as of Latino/Hispanic descent based on self-report collapsed across categories, whereas 26% of respondents self-identified as Euro-American/Caucasian. The next largest groups outside of "Other" were Armenian-American (7%) and Middle-Eastern-American (6%). More than 50% of our students reported working between 11 and 30 hours per week, with most employed students working in a job not related to their major (76%).
We examined whether core course requirements had already been taken, were in progress, or were yet to be taken, as well as barriers to taking these required courses. Collapsing across individual courses, responses suggest that most students reported course progress appropriate to their reported number of units taken to date. Another important finding was that just under half of respondents reported taking lower-division required courses at CSUN, and nearly all students (approximately 99% for most courses) reported taking upper-division required courses at CSUN. We examined barriers to 4-year graduation and found that the top reasons students reported were:

• I work and it's hard to schedule classes around work (39%)
• Financial reasons (37%)
• I was unable to take required courses on time due to scheduling conflicts (36%)
• I received poor advisement and took unnecessary courses (22%)
• I took extra classes for additional knowledge (17%)
• I delayed graduating because I don't know what I want to do with my degree (16%)

We are using the results from this assessment to inform our advising staff, our faculty, and our student services staff of the characteristics of our majors. Of course, our sample was not random, so we have no reason to believe it to be representative, but the results are still informative. Based on the results of this survey, combined with the results of our self-study and program review, we have begun to implement new programs for our students. Central to our findings about progress through the major and barriers to graduating on time were difficulties in scheduling coursework around employment and financial pressures. We have begun an annual Talent Show and Silent Auction to raise funds for students. Of course, given the budgetary cuts and the dramatic reduction in sections offered, we have not been able to address the difficulties students face in scheduling classes around work.
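Because respondents could select more than one barrier, the percentages above need not sum to 100: each is simply the share of respondents who selected that option. A small sketch of that multi-select tabulation (the responses below are invented for illustration, not the actual survey data):

```python
# Invented multi-select responses: each respondent picks any number of barriers.
responses = [
    {"work schedule", "financial"},
    {"financial"},
    {"course scheduling", "work schedule"},
    {"advisement"},
]

def selection_rates(responses):
    """Percent of respondents selecting each option (sums can exceed 100%)."""
    n = len(responses)
    options = set().union(*responses)
    return {opt: round(100 * sum(opt in r for r in responses) / n)
            for opt in sorted(options)}

rates = selection_rates(responses)
# Each barrier is reported as the percent of respondents who selected it.
```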
In fact, we plan to survey students again this Fall and suspect that difficulties in that area will actually have increased, given the early dates by which our courses filled this semester. Our new major addresses a number of the reported barriers to graduating in four years over which we have more control. All majors will now be required to take a one-unit advising course in which all students will have uniform access to information about major requirements, pathways through the major, and career paths. We have also begun a new Peer Advising program in which Psychology majors hold regular office hours to provide peer advice complementary to that offered by advising staff. This should directly address the barriers concerning poor advisement and not knowing what to do with a Psychology degree. As described above, our assessment activities for 2009-10 were undertaken to provide a baseline sense of who our students were in the last year of our "old" major requirements. This year we plan to repeat the survey and again include some additional questions used in prior years to also assess educational background, satisfaction, and perceived competence as we roll out the new courses that are part of the new major requirements. Earlier planning and assessment efforts have resulted in the creation and evaluation of program-level SLOs, and we have now integrated course-level SLOs for all new courses with our program-level goals. Our next step will be to create a plan to assess course-level SLOs together with program-level SLOs in a more systematic manner. We hope to have a retreat, possibly in January, to create a new plan to assess students longitudinally as they progress through our new major.
Additional Program Development Activities (implementation of analyses from previous years)

Previous years' assessment activities and the report by the external reviewers during our program review suggested to us that we may be placing too heavy a burden on a few required core courses for the major. In 2010-11, we are piloting a new Psychology Department Writing Fellows program. We selected and trained 3 undergraduate Psychology majors to serve as peer tutors in writing; they will hold office hours and serve as "informed readers" of Psychology majors' writing. We created a Moodle site for the program, and all students taking our core required Research Methods in Psychology & Lab course will be automatically enrolled in the site and visited by the Fellows to introduce the pilot. In designing our program, we have drawn from CSUN's Writing and Reading Across Disciplines (WRAD) program as well as university-wide peer-tutoring programs at other institutions such as Barnard College and the University of Wisconsin-Madison. We are generating pilot data to evaluate the program's ability to reach majors and its effectiveness. In addition, our revisions to our major were quite substantial, and we have been quite busy with its implementation and with the development of new and revised courses and course material. The survey reported here, as well as those reported as part of the 2000-01 and 2007-08 reports, have all been used in developing the new major.

6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your program? Please provide citation or discuss.

Not during 2009-2010.