
Annual Assessment Report to the College 2009-2010
College: MCCAMC
Department: Journalism
Program: Bachelor of Arts
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2010.
You may submit a separate report for each program that conducted assessment activities.
Liaison: Linda Bowen
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee? (See next page for Part II of question.)
1) The assessment process for this academic year was focused on three distinct but related endeavors:
a) completing a self-study and report as part of the re-accreditation process of the Accrediting Council on Education in Journalism and
Mass Communication. ACEJMC evaluates nine standards, including “Assessment of Learning Outcomes.”
b) collaborating with the Department’s Curriculum Committee on appropriate curricular changes, i.e., developing new multimedia courses
and/or incorporating convergence-based tools and techniques instruction into existing courses. (Some faculty members advocated
“blowing up” the current curriculum in favor of a non-traditional “team-teaching” workshop model framed around skills development
and student media outlets, while others were concerned about how this approach could be implemented in light of weighted teaching
units and other administrative and logistical issues.)
c) participating in the AALC’s Simplifying Assessment Across the University project in which gateway and capstone courses and their
signature assignments were identified in anticipation of implementing the first part of the pilot in Fall 2010. For example, the gateway
course assignment involved development of a news exam (news judgment, style/copy editing and writing) for majors entering upper
division. The capstone, likely administered in the senior tutorials in Spring 2011, was expected to evaluate the overall curriculum and
instruction, as recommended by the ACEJMC site-visit team in Spring 2010.
March 30, 2009, prepared by Bonnie Paller
Is assessment under the oversight of one person or a committee?
Since Spring 2009, assessment has been the responsibility of a three-person committee, led by the AALC representative. However, major
decisions and most assessment activities have involved the entire full-time faculty, which is a relatively small unit (approximately 10), as well as a
few adjunct faculty members who teach the courses selected for assessment. Assessment is discussed at the monthly faculty meetings as a
regular agenda item. This has been the Department practice for more than 10 years.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
The Simplifying Assessment project became the assessment process in the Journalism Department, which has not previously had formally
designated gateway and capstone courses. The timing of the project coincided with suggestions by the accrediting agency site visit team that,
“After introducing new curriculum, the Department may wish to consider shifting the emphasis increasingly from course-level assessment to
assessment of capstone products at graduation that evaluate the Department’s overall curriculum and instruction.” At the same time, faculty
members have recognized the overriding need to evaluate overall curriculum and instruction beyond the typical analysis of data derived from
internship evaluation forms completed by site supervisors. About 10 percent of our students take internships and the average mid-term and
final evaluation rating is 3.75 on a 4-point scale. The 2008-09 results dovetailed with the Internships classification change from “S” Factor to “C”
Factor. The intent was to revamp the evaluation form to more effectively assess student learning and the internships’ relation to the
Department’s SLOs. The current form has 15 standards intended to gauge professional skills and work habits. As the Internship course evolves in
its new classification and format, the faculty should address creating a more useful evaluation instrument, particularly given the evolving nature
of the news media workplace. In addition, the required intern-generated written reports and logs have not been mined as intended. They
represent a rich source of data about the workplace and students’ experiences, and should be analyzed as part of ongoing assessment
committee activities.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below.
2a. Which Student Learning Outcome was measured this year?
The Department did not directly measure a particular SLO this year for several reasons:
1) The faculty is engaged in both major curriculum revision and strategic planning activities;
2) The Department has focused its assessment activities mainly on the course level rather than on direct measures of the curriculum at first-year and/or transfer entry as well as at graduation. As noted above, the accrediting council’s site visit summary suggested program-level
assessment of “capstone products” that would evaluate overall curriculum and instruction;
3) The assessment liaison has been involved in the Simplifying Assessment Across the University project, in which both gateway and
capstone courses will be assessed in 2010-11. The Department assessment committee also has been involved in this process.
2b. What assessment instrument(s) were used to measure this SLO?
N/A – As part of the current curriculum revision and strategic planning process in Fall 2010, the faculty recognized the need to rework the SLOs
that have not been revised since they were adopted in November 1997.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
N/A
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
N/A
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
N/A
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
While a traditional assessment project evaluating one or two SLOs was not conducted this year, the results of previous assessment activities and
projects informed the Department faculty’s curriculum revision discussions during Spring 2010. Those discussions led to meetings among
the full- and part-time faculty who teach key skills courses – JOUR 110 and JOUR 210. The focus: What is being taught in the classes, particularly
with regard to convergence/multimedia storytelling tools and techniques, and how course content adheres to learning objectives/outcomes
formulated from earlier assessment results. The meetings helped faculty reach consensus on the level (introductory) of multimedia storytelling
tools and techniques that should or should not be taught in these existing courses, and pointed to a variety of other topics to address in the
curriculum revision. Among them: What should be taught in JOUR 210, if the current public affairs component is shifted to upper division skills
courses? How much audio/video/visual/new/social media should be included at this level? What about consistency across various sections? In
addition, ongoing assessment projects, such as the Simplifying Assessment Across the University (2010-11), are expected to provide new
programmatic information that will be useful as faculty complete the curriculum revision and strategic planning processes.
3. How do your assessment activities connect with your program’s strategic plan?
This semester, the Department chair appointed several senior faculty members to the new Ad Hoc Planning Committee, in response to a
weakness identified in the ACEJMC’s “Report of On-Site Evaluation” following the re-accreditation site visit in Spring 2010. While the
Department has a long history as an accredited journalism program, the site team members noted “a lack of formality in strategic, long-range
planning.” Committee members recognize that a key element of this planning process is connected to assessment and curriculum.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
Last year’s report identified the need for a pre-test or qualifying exam as a resource to assess the skills of incoming students. Other CSU
Journalism programs employ these types of tests, using a standardized exam administered and tabulated through University test centers. Since
funding is unlikely to allow a similar test here, the assessment committee opted to develop a more comprehensive exam for the Simplifying
Assessment project (see above).
5. Other information, assessment or reflective activities not captured above.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
No