Student Affairs Assessment Council Minutes
January 8, 2014

Attendance: Ozge Akcali, Richard Arquette, Sherrie Butler, Tina Clawson, Maureen Cochran, Dave Craig, Rick DeBellis, Kami Hammerschmith, Marigold Holmes, Lisa Hoogesteger, Pat Ketcham, Carolyn Killefer, Remi Nagata, Jodi Nelson, Daniel Newhart, Juliana Recio, Michele Ribeiro, Linda Reid, Rick Stoddart, Kent Sumner, Jennica Vincent

Formative Assessment
During the planning portion of the retreat, the group decided it would be helpful to do a formative assessment halfway through the year. This discussion began with a review of what has been accomplished this year:
- We submitted a book proposal
- Assessment Council introduced and welcomed several new members
- Assessment Council introduced and welcomed a new chair and SAREP Director
- Daniel has explored the landscape of the division and discussed next steps for the division
- We examined ourselves as a group through the membership description and discussed how it might be used (upcoming: orientation development)
- Charge revision (in progress)
- Input given to Strategic Planning Initiative 6
- Discussed the changes made to the peer review process since last year and provided feedback
- Tour de Qualtrics: what can it do and where can you find help?
- We were introduced to the idea of Academic Grit
- Moved away from Compliance Assist
- We've had some good cake

Discussion: Looking back on the first half of this year, folks were excited about:
- The speed with which the decision to move away from Campus Labs was made
- Culture of Inquiry and its future connection to the curriculum; this will serve as a foundation to build other resources and tools
- Qualtrics being open to the university
- Cake

Things the Council would like to spend more time on; what excites you and/or is relevant to your work?
- There is a need on campus to establish some common tools.
  o Common learning outcomes for student workers across the Division
  o Marigold shared that the Graduate School is striving to take more of a Student Affairs approach by providing support services beyond just academic affairs.
  o Resurrect the Cultural Knowledge & Effectiveness Rubric; we put a lot of work into that and don't know that it's been utilized very much. How can we use this work?
  o Common tools help brand identity (Student Affairs) and consistency, marketing to our students based on items such as common instruments (e.g., the multicultural competence rubric)
  o Use common tools and rubrics for program/project development; similar to what is happening with the Professional Faculty job grouping project
  o We need a common definition of what constitutes student success
  o Connect to the various groups that already have common threads

Curriculum
Daniel and Maureen have been working in partnership with Stan Dura from the University of Oregon to create a curriculum for Assessment, Evaluation, and Research which reflects professional standards from NASPA, ACPA, the American Evaluation Association, and other relevant resources. The intention is to use this curriculum to inform Assessment Council and division-wide educational efforts. In its current draft form, eight major areas have been identified:
- Program Review & Evaluation
- Assessment
- Research
- Ethics
- Effective Reporting & Use of Results
- Politics of Assessment
- Assessment Education
- Culture of Inquiry

Daniel led attendees through an activity in which those present broke into 6 groups, 2 of which each took on 2 different areas.
They were asked to examine the content of the area they reviewed and provide feedback on the content, what's missing, and what they're especially interested in learning about. Some groups had more time to discuss their thoughts at length than others, but each group had an opportunity to briefly report back to the larger group. Their main points are outlined below.

Overall feedback: within each area, make sure that the advanced Knowledge, Skills, & Abilities (KSAs) build upon the basic and intermediate sections.

Program Review & Evaluation
Feedback on content:
- On item 2.b.ii., move training and participation at the Division level to advanced and articulate the development of this more clearly.
- Division-level activity should begin at the basic level, with responsibilities increasing through the stages.

Assessment and Research (two separate sections, but reported on together at the meeting)
Feedback on content:
- There are a lot of technical skills in these sections; do we really need to understand all these statistics?
- Is there a space for us to define these terms within a Student Affairs context?
Especially of interest:
- Utilizing folks who have been in the Assessment Council for a while, and people who have a lot of experience, to present
- Articulating both the differences and similarities between assessment and research; getting on the same page with a common set of definitions that would work across the university
- What is the Student Affairs narrative on top of basic terms?

Ethics
Feedback on content:
- It appears that basic = awareness, intermediate = compliance, and advanced = evaluating
- One must also consider the ethics of your associations and of the Assessment Council as a group
What's missing:
- Ethics of student data; we need to articulate how we're handling student data, how we're comfortable using it as a group, and find out what students are ok with. How do we involve students in defining our ethics?
- We each may have different boundaries in terms of use of data and reporting: what do you make public and how do you get it out there?
Especially of interest:
- Creating an awareness of ethics; explaining the necessity of following ethics
- One exercise: make sure all have gone through FERPA, Human Subjects training, etc.
- Speakers: Rebecca Matherns (Registrar), Lisa Leventhal (IRB Coordinator)

Effective Reporting & Use of Results
Feedback on content:
- It appears that basic = interpretation, intermediate = sharing, and advanced = using results
- Consistently sharing with members within the department (basic), across the division (intermediate), and at conferences (advanced)
- Item 2.e.: the same sentence could be used in the basic section as a building block
- Item 3.c.: the same sentence could be used in the intermediate section as a building block
Especially of interest:
- A consistent framework for reporting; what are the important components to include in a report?
- Reminders to define acronyms

Politics of Assessment
Feedback on content:
- Is there a better way to frame this section? Advocacy? Negotiation?
- This seems to speak to the importance of always being the assessment voice in the room
- In the intermediate section, we should be "teaching people to fish"; how can they become politically savvy?
What's missing:
- Convincing people that assessment is necessary (related to 2.c.)
- Overcoming barriers through education
- Add to advanced: able to communicate political issues in a basic way, with a commitment to developing these skills in others

Assessment Education
Feedback on content:
- The title needs work; are we talking about education? Influence? Advocacy?
- Is it purely about teaching others? The content is clearly called out in the Assessment section.
- We need to ask ourselves: what are the KSAs needed to be an educator at the basic, intermediate, and advanced levels?
- Basic = understanding the audience, development theory, etc.; intermediate = designing and implementing; advanced = influencing, planning

Culture of Inquiry
Feedback on content:
- Is 1.e. ok in basic, or should it be moved to intermediate?
- This group started to summarize "what's the point" of each of the competencies in this area:
  o Experience: 1.a., 2.a., and 3.a.
  o Competence: 1.b.
  o Civic Welfare: 1.c.
  o Inclusive: 1.d., 2.c.
  o Context: 1.e., 2.b.
What's missing:
- Add a social justice component to the advanced section.

Next Meeting: January 15, 2014, 9:00 am-10:30 am, MU Council Room