
Annual Assessment Report to the College 2008-2009
College: ____Social and Behavioral Sciences___
Department: _Social Work_________________
Program: _______________________________
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2009.
You may submit a separate report for each program which conducted assessment activities.
Liaison: ___Dr. Eli Bartle____________________
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
Assessment is currently under the oversight of one person, Dr. Eli Bartle. Beginning in August 2009, it will be under the oversight of a committee
consisting of a chairperson, Dr. Hyun-Sun Park, plus four additional committee members: two teaching faculty and two field faculty.
The 2008-2009 Academic Year was an experimental year in terms of assessment because the profession's accrediting body, the Council on
Social Work Education (CSWE), has put forth new Educational Policy and Accreditation Standards. These standards called for three changes: (1)
CHANGE FROM OBJECTIVES TO COMPETENCIES: The MSW Department converted its program objectives to program competencies. There are
now 10 foundation year program competencies (instead of the previous 12), and there are still 5 concentration year program competencies. (2)
FIELD IS SIGNATURE PEDAGOGY: The Department began the process of making field the signature pedagogy, or central form of instruction, and
thus the central form of assessment. (3) ASSESS BOTH THE IMPLICIT AND THE EXPLICIT CURRICULUM: The Department experimented with
different methods of assessing the explicit curriculum, mainly using student self-assessment, faculty assessment of student learning, or a
combination of both. Assessment of and changes to the implicit curriculum were also discussed, such as commitment to diversity; admissions policies
& procedures; advisement, retention, and termination policies; student participation in governance; faculty; administrative structure; and
resources.
Measurement instruments used for assessment this year include the following:
1. Student Self-Efficacy Scale (retained from previous years): a pre/post-test 55-item scale asking students to assess their learning of all
15 program competencies on a 1 to 7 scale when they enter the program and when they graduate (and again at two years post-graduation
to measure retention).
March 30, 2009, prepared by Bonnie Paller
2. Student Self-Report of Course Competency Learning: a post-test asking students to assess their learning on course competencies, which
are in turn related to program competencies. A 1 to 5 scale to measure degree of accomplishment is used.
3. Faculty Grid Ratings on One Common Assignment: In committees consisting of faculty in each core area (policy, practice, research,
human behavior/social environment, and field seminar), one in-common assignment was chosen for each course, along with a grid measuring
the accomplishment of assignment aspects related to as many of the course competencies as possible. This grid is completed by each
faculty member teaching the course for each student in the course. A 1 to 5 scale to measure degree of accomplishment is used.
4. Field Skill Sets: Skill sets were identified by each field placement agency and then assigned to be taught in the appropriate course. A
tool may be developed to assess how well the skill sets were taught in each course.
5. Focus Group: One focus group was held for graduates. For the first time, the moderator was a field liaison who had not had any contact
with the graduating students, thereby reducing social desirability bias to some extent and enabling the students to talk more freely about
what they learned or did not learn.
6. Alumni Survey: The alumni survey and the second administration of the student self-efficacy instrument were administered for the first time this
year. They were administered to alumni who graduated two years ago and included a second posttest of the self-efficacy scale.
7. Final Graduate Project: Faculty and field instructors rated the final project poster session based on a grid developed for it. A 1 to 5 scale
to measure degree of accomplishment is used.
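As a minimal illustrative sketch of how the pre/post self-efficacy data described in item 1 could be analyzed, the snippet below computes mean (post minus pre) gains on a 1-to-7 scale, grouped by program competency. The item names, item-to-competency mapping, and scores are hypothetical; they are not taken from the actual instrument.

```python
# Hypothetical sketch: mean pre/post gains on a 1-to-7 self-efficacy
# scale, grouped by program competency. All item names, the mapping,
# and the scores below are illustrative, not real instrument data.
from statistics import mean

# Each scale item maps to one program competency (illustrative subset).
ITEM_COMPETENCY = {"item_01": "C1", "item_02": "C1", "item_03": "C2"}

def competency_gains(pre, post):
    """Return the mean (post - pre) gain per competency.

    pre/post: {item_id: [scores on a 1-7 scale, one per student]}
    """
    gains = {}
    for item, comp in ITEM_COMPETENCY.items():
        diffs = [b - a for a, b in zip(pre[item], post[item])]
        gains.setdefault(comp, []).extend(diffs)
    return {comp: mean(d) for comp, d in gains.items()}

# Illustrative scores for three students on entry and at graduation.
pre = {"item_01": [3, 4, 2], "item_02": [4, 4, 3], "item_03": [5, 5, 4]}
post = {"item_01": [6, 6, 5], "item_02": [6, 5, 6], "item_03": [6, 6, 6]}
print(competency_gains(pre, post))
```

The same function could be reused for the two-year post-graduation administration by passing the alumni scores as the second argument.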
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
No, it did not deviate from what was intended.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below. BECAUSE THE DEPARTMENT IS REQUIRED TO ASSESS ALL 17 (17 THIS YEAR, 15 NEXT YEAR)
PROGRAM COMPETENCIES (THE SOCIAL WORK PROFESSION'S TERM FOR SLO), I AM GOING TO REPORT RESULTS ONLY FOR COMPETENCIES
THAT NEED CHANGES.
2a. Which Student Learning Outcome was measured this year?
All 15 program competencies (our term for SLOs) must be measured each year as required by our accrediting body, CSWE.
2b. What assessment instrument(s) were used to measure this SLO?
See Section 1 above; items 1-7 describe the assessment instruments used.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
All students and all faculty participated in the assessment. It is traditional nationwide to include all students in the program in the assessment
process. Only graduating students participated in the focus group. The response rate for the self-efficacy scale was 90%, and the response rate
for each of the courses ranged from approximately 50% to 90%. Considering this was an experimental year, these response rates are satisfactory.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
The overall assessment tool, the self-efficacy scale, is administered as a pretest upon entry to the graduate program, as a posttest upon
exit/graduation, and again two years post-master's. The other assessment tools are administered at one point in time only, at the end of each
semester. The focus group is held one to two weeks prior to the end of classes for graduating students.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
A baseline was set for each quantitative instrument. Any score below baseline was linked with the appropriate program competency.
Focus group results were also linked with the appropriate program competency. Data below baseline, along with program competency charts,
were given to faculty. Faculty began the discussion of necessary changes at the final spring faculty meeting and in email and small-group
discussions with the assessment liaison. Faculty began implementing changes to their course competencies and their common assignment grids
over the summer as they prepared for the fall courses, and completed the discussion at the fall retreat in August 2009. Faculty also met in their
core curriculum areas in May and August to discuss changes to the course competencies.
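The baseline-flagging step described above can be sketched as follows; this is a hypothetical illustration, and the item names, competency labels, cut point, and scores are invented for the example, not drawn from the actual instruments.

```python
# Hypothetical sketch: flag mean item scores that fall below a baseline
# and link each flagged item to its program competency. The baseline,
# items, competency labels, and scores are all illustrative.

BASELINE = 3.0  # assumed cut point on a 1-to-5 scale

# Illustrative mapping from instrument items to program competencies.
ITEM_COMPETENCY = {
    "ethics_q": "Foundation #2",
    "diversity_q": "Foundation #4",
    "research_q": "Foundation #6",
}

def flag_below_baseline(mean_scores, baseline=BASELINE):
    """Return {competency: [items whose mean score fell below baseline]}."""
    flagged = {}
    for item, score in mean_scores.items():
        if score < baseline:
            comp = ITEM_COMPETENCY[item]
            flagged.setdefault(comp, []).append(item)
    return flagged

# Illustrative mean scores from one instrument administration.
scores = {"ethics_q": 2.7, "diversity_q": 2.9, "research_q": 4.1}
print(flag_below_baseline(scores))
```

A chart like the one given to faculty could then be built directly from the returned dictionary, one row per flagged competency.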
Key Findings: The concentration-year micro practice course (SWRK 601) scored lowest in terms of meeting its objectives. Faculty who
taught that course met and revised the objectives to match the current course content, which had already been updated but no longer
matched the old objectives. The objectives were also converted to competencies.
Only 2 of the 15 competencies needed to be addressed:
Foundation Competency #2: Apply social work ethical principles to guide professional practice.
Foundation Competency #4: Engage diversity and difference in practice.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
Program Changes: Conversion from 17 program objectives to 15 program competencies was achieved.
Self-Efficacy Scale Results: Revisions were made to improve two foundation competencies: (1) more emphasis in all classes on social work ethics,
both nationally and internationally, relating the National and International Codes of Ethics to each course; (2) hold a diversity day each academic
year for all students, with a focus on sexual orientation this year.
Focus Group Results: Continue holding the focus group for graduating students.
Course Changes: No major changes needed.
Changes in Assessment: Only assignment GRIDs (instead of student self-assessment of their learning of each course's competencies) will be
used now, as they measure outcomes. Field will lead the way in terms of assessment; how it will do so will be determined by the
Assessment Committee.
Some programs assess multiple SLOs each year. If your program assessed an additional SLO, report the process for that individual SLO below. If
you need additional SLO charts, please cut & paste the empty chart as many times as needed. If you did NOT assess another SLO, skip this
section.
2a. Which Student Learning Outcome was measured this year?
2b. What assessment instrument(s) were used to measure this SLO?
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
2d. Describe the assessment design methodology: Was this SLO assessed longitudinally (same students at different points) or was a
cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were (or could be) used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of each.
3. How do your assessment activities connect with your department’s strategic plan?
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
Yes. Resources needed include: a computer lab for research work, such as SPSS for quantitative analysis and software for qualitative analysis;
training on technology for faculty to expedite the increasing amount of data entry; and training on technology for students to improve their
final projects.
Additional faculty will be hired due to the increase in the number of students, especially in the 3-year program.
5. Other information, assessment or reflective activities not captured above.
Implicit curriculum discussions based on assessment of the new admissions criteria (refinement of the rating instrument and possible addition of
a writing sample); a writing development course to enhance critical thinking skills; and use of field skill sets in the assessment process.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
Dr. Bartle completed an assessment report for faculty use. The report also serves as a model for the final self-study assessment chapter that will
be provided to our accrediting body for the next 4-year accreditation cycle.