Citrus College Student Learning Outcomes and Assessment Reporting Form Instructions
Course Number/Title: Type your course name and title here
Assessment Cycle: 2014-2015 (Su14/Fa14/W15/Spr15)
This is always pre-populated for you!
Faculty who teach this course: This is to get an idea of how many different faculty could contribute to the information in this report, since we
aggregate data at the course level. It is recommended that faculty who teach the same course collaborate and complete one form. However, if
preferred, individual faculty can choose to complete their own forms and upload them separately in the course folder.
NOTE: Even if faculty use different assessment tools for the same course, you can still use ONE form as long as the agreed-upon benchmark is the same.
Identify the SLO(s) that you will be assessing this term:
This is always pre-populated for you! Simply check the box for the SLO(s) you assessed during the cycle above. HotShots (the SLO
Committee) recommends revising if you have more than 5 SLOs; 2-5 is the recommended number so that assessment remains manageable
over the 5-year cycle.
 SLO#1: Students shall demonstrate an understanding of the complete accounting cycle
 SLO#2: Students shall apply accounting principles and concepts to small business activities
 SLO#3: Students will be able to identify, analyze and record business transactions and interpret financial data.
Which Institutional Level Outcome does this course address?
These ILOs are based on the campus Mission Statement.
 Academic Excellence (General Education)
 Economic Opportunity (Career/Technical Education Programs)
 Foundational Skills for Student Success (Basic Skills)
See BLUE text for instructions, examples, and helpful tips as you complete this document.
Program Level Outcomes: Students completing this course will have acquired the following competencies. Check all that apply:
It is recommended by both HotShots and Curriculum that you have at least 1 SLO statement per core competency covered by the course.
Discipline/Subject Area Specific should be used only sparingly, when no outcome fits with the other 5 competencies. (FYI: Program level
outcomes for each program are in the catalog beginning on page 166. Please check your program level outcomes and see if you are satisfied
with what is written. Many of these can be refined or streamlined. Eventually, each program will complete an updated version of its
Curriculum Map (the matrix for mapping course level and program level SLOs) as part of the Core+ (Year 4) program review cycle. You will have
the opportunity to revise program level outcomes at that time.)
 Communication
 Creative, Critical, and Analytical Thinking/Info Comp
 Technology
 Computation
 Community, Global Consciousness and Responsibility
 Discipline/Subject Area Specific
Were any sections of the course taught Distance Education?
 Yes
 No
IF YES, was the assessment method different?  Yes*  No *If assessment methods were different, complete an additional form
If any sections were taught via distance education and a different method of assessment was used for online sections vs. on-ground sections,
please complete an additional form. Fill out that form, “Save As” with the course name plus DE (ex: ACCT100 DE), and then upload that
separate document to the folder for that course. You will then see ACCT100 and ACCT100 DE.
Describe the assessment tool. Check all that apply:
A suggestion is to use embedded assessment: an activity, assignment, or exam that you already use to evaluate students in the course. Do not
confuse this with achievement data (enrollment, completion, grades, and so on). What SLO assessment may capture that grades and other
achievement data alone do not is a description of the actual things a student has learned and can do as a result of being engaged in a course.
Even students who do not complete, or who do not receive passing grades, may well have learned things we are interested in noting.
According to the ASCCC SLO Terminology Glossary, grades are the faculty evaluation of a student’s performance in a class as a whole. Grades
represent an overall assessment of student class work, which sometimes involves factors unrelated to specific outcomes or student
knowledge, values, or abilities. For this reason, equating grades to SLO assessment must be done carefully.
 Exam (midterm, final, test)
 Class discussion or activity
 Research (lab reports)
 Quiz
 Written work (essay/assignment)
 Presentation (oral, Prezi, video)
 Homework
 Performance
 Portfolio or class project
 Skills demonstration
 Survey
 Other: [Don’t see it listed? Explain it]
EXAMPLES: Often assessment occurs as part of the course’s regular activities – tests, projects, papers, or demonstrations of skills that are normally
used for grading students. In addition to providing the information for grading, assessment results can provide a snapshot of how well students in
general are learning. This kind of information may confirm effective practices or may suggest areas for improvement through changes in curriculum,
materials, teaching methodology, sequence, or even the SLOs themselves.
 Final exams. These are especially effective when results for items related to specific SLOs are analyzed, discussed, and lead to conclusions
about how well students are achieving in that particular area. Example: A Statistics course might determine how well students handle
complex word problems (critical thinking) versus those that are more computation based. The results could either confirm effective
instruction or suggest areas where changes in teaching or materials might improve students’ learning.
 Projects. Many CTE and arts courses have a final project or performance that reflects the achievement of the course SLOs. The use of a
rubric to grade students can also provide data about overall student strengths and weaknesses, and faculty could respond accordingly.
 Comprehensive written assignments such as research papers, essays, or critiques. Most papers require application of skills stated in the
SLOs. Rubrics for evaluating the writing can provide more specific information about students’ strengths and weaknesses to inform teaching.
Select the criteria/criterion that determines success within the assessment tool(s):
This asks how the assessment tool was scored (e.g., rubric, percentage correct, total points, a combination, etc.).
 Blackboard alignment
 Points (for exam items)
 Regional/state exam or industry-based credential
 Specified rubric
 Tabulation of survey results
 Percentage (score or % of students)
 Quality of product
 Other: [Write it in…]
In your perception, to what extent did the students in the course meet the outcome(s) based on the tool(s) and criterion?
 Exceed
 Meet
 Somewhat Meet
 Not Meet
Explain the benchmark agreed upon for meeting the expectation:
This can be brief. You do not need to include all your data. Keep data in a file or on your computer for your records. Examples below…
Example #1: “Of the 68 students completing the project, 22 submitted an acceptable project with a score of 70-79%; 31 received a score of 80-89%; 15
received 90-100%.”
Example #2: “75% of the 68 students will achieve the level of ‘satisfactory’ or above as described in the rubric.”
Example #3: “Majority of students achieved 8/10 points on exam question about computing distance to Mars.”
Example #4: “3 out of 5 designated quiz questions correct”
Example #5: “85% of the students successfully achieved the learning outcome.”
Based on the results, might improvements be required at the course level?
 Yes
 No
The data sections above provide evidence for the most important part: the reflection in the program review document, where you explain how
you determined effective practice or why change might be necessary. That is the part with the biggest impact on planning and decision making
at the College. Ongoing assessment is part of effective teaching because it can reveal patterns, successes, and
new possibilities. When changes are initiated to address one or more SLOs, it’s logical to follow up with another assessment to determine
whether the changes made a difference. The conclusions drawn from this second assessment may indicate that concerns have been addressed
or that further changes are in order. Using results to determine the effect of changes, re-assessing, and deciding on the next step are often
referred to as “closing the loop” – that is, completing one assessment cycle to improve student learning.
If YES, check all that apply
You might decide to re-assess in a different way or to change the SLO intent or the statement itself. Perhaps you will explain something
differently, revise an assignment, or provide more clarity to students regarding your expectations.
 SLO modification
 Textbook revision
 More student engagement
 Program dialogue needed
 Curriculum revision
 Instruction method
 Assignment/activity revision
 Other: [Write it in…]
If NO, describe the effective practice(s) confirmed by the results:
If the results indicate that the SLO(s) has been met and that no changes seem to be necessary, briefly describe aspects of the course that have
supported student learning (examples: degree of student engagement in class activity; teaching methodology; technological support, student
access to instructor, etc.)
Please share any success stories about the impacts of SLO practices on student learning, achievement, and institutional effectiveness
Feel free to brag here! Has outcome assessment resulted in something great? Provide any example of how it really worked.
Example #1: The Student Success Committee and the English department analyzed assessment data from sequential courses and decided to collapse
basic skills course sequences. Effective fall 2011, English and Reading faculty developed a one unit English and reading blended skills course (Engl 98)
and a five unit English and reading blended course (Engl 99). The rationale came from consistent and informal assessment of SLOs for ENG 030, 040,
and 100. While students showed proficiency in meeting SLOs for each course, they struggled with competency in ENG 101; hence, the English department
revamped the basic skills sequence to include courses that correspond with the skills required for ENG 101. Consistent SLO assessments of the two new
English courses identify areas of weakness and strength, which guide curriculum changes, textbook adoptions, and assignments that further improve
these courses.
Example #2: Library outcomes address physical and virtual resources and the impact of those resources and library instruction on student success. In
SLOA, the library utilizes data from database vendors, feedback from bi-annual student and staff surveys, and informal student feedback. As a result,
librarians added streaming video collections and increased the number of electronic books. Electronic database full-text retrievals increased by 21%
since 2009. Librarians held 269 instruction sessions during 13-14 and have created over 90 online research guides accessible to students on and off
campus. They offer instructional sessions online for DE students. Virtual reference transactions like online chat and text have increased. Biannual library surveys from 11-14 reveal that 87% of students feel library instruction sessions enhance achievement.
Example #3: The Biology program acknowledged the positive impact 13-14 SLOA data have on student learning and development. As a result, they
improved pedagogical approaches for the diverse student population. SLOA data led to acquiring new spectrophotometer machines to aid student
reading of data.
Example #4: Due to SLOA, Math faculty restructured the curriculum to create an alternate algebra sequence, Beginning & Intermediate Algebra I/II to
reduce overlapping concepts and improve student success.
Example #5: Math 210 – Calculus and Analytic Geometry III analyzed the same SLO from spring 13 to fall 13, and noticed an improvement from 73.7%
to 88.5% on the assessment because the faculty devoted more time to real-world application problems during the class. The faculty found that a
smaller class size resulted in a significantly higher rate of students scoring a 3 or higher on the assessment rubric, thus they will continue to advocate
for such measures to be permanently implemented.
In what ways might dialogue about the outcome(s) assessment be shared with colleagues, program, division, or campus?
Check the boxes that represent any communication about the SLO results. This can include informal “sidewalk” or “hallway” conversations.
 Discussion with instructors who teach the same course
 Workshop or presentation
 Program Review (goals, reflections, resource requests)
 Department or division meetings
 Convocation/FLEX
 Class discussion with students
 Other: [Write it in…]
What, if any, assistance or resources may be needed to help you address your outcomes? (will help inform program review in Fall)
Need help from the SLO or Program Review Coordinator? Questions for Institutional Research? Data needs? This serves as a reminder for
how assessment results support your program review requests.
General comments/feedback about the SLO assessment results for this course (optional, but will help inform program review in Fall)
Anyone may add a comment here. Comments can include: notes for those who input the information, more detail about where the data is stored,
when additional assessments might be conducted, reminders, or analysis/reflection to include in the Program Review document, etc.