Session Topic: How can the assessment of student learning be used to help faculty learn how to more effectively teach difficult areas such as critical thinking?

For this session, please arrange yourselves by "size of institution" (see table markers):
• below 1,000 students
• 1,000-2,500 students
• 2,500-5,000 students
• above 5,000 students

Who We Are
• Two schools, one academic program
• Liberal Arts, Residential, Catholic, Benedictine
• 3,900+ undergraduates
• 300+ FTE faculty
• 80% of faculty tenured or tenure-track

CSB/SJU First-Year Seminar
• Required of all first-year students (approximately 1,000 per year)
• Year-long course
• 16 students in each class; they stay in the same cohort for the year
• Goals
  - Critical Thinking, Reading, Writing, Discussion, Public Speaking, Information Literacy
• First semester emphasizes essays
• Second semester emphasizes the research paper
• 65 sections, 50 faculty
• Faculty-determined topical content
• Faculty recruited from across disciplines (HUM/SS/NS/FA)
• Both tenured and contingent faculty

FYS Assessment, 1990s-2008
• Assessment of student work done annually
  - Pre and post essays
  - Holistic grading
• Little/no incentive for student effort
• Assessment not clearly tied to directives in the Faculty Assembly motion
• Results not shared with faculty or used

New Vision
• Meaningful assessment
  - to shape faculty development efforts
  - to improve student learning
• Create a clearer understanding of the whole faculty's expectations for the course

New Aspirations
• Create a culture of assessment
  - Don't just do assessment, but see it as useful
• Enhance the sense of a joint endeavor
  - Create a sense of community among the 50+ faculty teaching in the program

One Result
• Decision on key areas for assessment
• Rubric
  - Used to evaluate research papers produced in the second semester

Activity: Table Conversations
• Please take a few moments to have a brief conversation at your table about the following:
  - What do you do at your school?
  - What do you see in our program?
  - What would you like us to explore/explain in our remaining time together?

Possible Areas
• Creating a Culture of Assessment
• Process
• Results
• On-Going Issues

Creating a Culture of Assessment: Community Understanding
• Faculty conversations on goals for FYS
• Faculty conversations on how best to assess in ways that fit the Faculty mandate
• Faculty conversations on creation of the key areas

Creating a Culture of Assessment: Essential Step
• FYS faculty buy-in on three key areas for assessment
  - Ability to Present a Clear Argument
  - Ability to Address Different Points of View
  - Ability to Use Evidence in a Convincing Manner
• Understood from the beginning that we were "aiming high"
• Creation of rubric
  - FYS faculty discussions and revisions

Creating a Culture of Assessment: Re-branding the "A" Word
• Emphasizing that it is a faculty-driven process
  - Not punitive, but helping teachers develop the skills necessary to improve student learning in the key areas
• Reminders that assessment is not the same as grading
• Worked from the assumption that if good students weren't meeting our goals, then we needed to find more effective teaching strategies (or change our goals)
  - Not really interested in results from students who didn't put in much effort

Creating a Culture of Assessment: Disseminating Results
• FYS faculty receive student scores from their section (numerical ratings and written comments)
• Aggregate results are shared at the FYS department meeting
• Multiple conversations on "best practice" in areas of greater difficulty
  - Led by faculty with ideas that have been successful
• Training sessions led by the director and team
  - Week-long May workshop plus several in-semester meetings

Assessment Process
• Evaluate research papers done in the second semester
  - Draws together multiple strands taught over the year
  - Major portion of the grade, so a high degree of student effort
• Select the three papers with the highest grades from each section
  - Do our "best" students meet our assessment goals?

Assessment Teams
• Experienced FYS faculty
  - Most are graduates of the Teagle Grant Assessment Training
• Day-long training/conversation for inter-rater reliability
  - Revisions of the rubric considered
• Four teams of two
  - Read individually, score, and compare ratings
  - Provide a numerical rating and a 4-5 sentence explanation of why
  - When readers disagree, the paper goes to a third reader

What We Found Out
• Gradual improvement in ALL categories of assessment (combining Exceptional and Acceptable categories)

                                              2009     2012
  Ability to Present a Clear Argument         56.4%    70.8%
  Ability to Use Different Points of View     49.3%    61.4%
  Ability to Use Evidence                     60.8%    79.1%

Thinking About Results
• Scores in "Ability to Address Different Points of View" (Critical Thinking) were lower
• Recognition that level of intellectual development makes this harder for traditional-age first-year students
  - Tend to rely on authority or "everyone has a right to their own opinion"
  - Makes it difficult to wrestle with an argument that doesn't fit the student's view

Response to Student Results
• Promote understanding of typical responses at the first-year level
  - Rather than seeing this as a permanent state
• Seek ways of "nudging" students forward
  - Emphasis on small steps
  - Structured teaching of critical thinking over the year

Assessment Impact on FYS Faculty
• Accountability focused attention on core learning goals
  - Vast majority of faculty began to adhere to page/source requirements, ending wide variability between sections
• Increased sense of common purpose
  - Greater attendance at FYS department meetings
  - Greater attendance at the workshop and other training
• More conversations among faculty ACROSS disciplines
  - More FYS sections where faculty were doing "paired" work

Greater Willingness to Cooperate

                                                          2009    2014
  Percent of faculty who turned in research papers
  for scoring                                             45%     98%

On-Going Concerns: Senior Faculty
• Some tenured faculty not as engaged
  - Tend to teach one FYS every 3-4 years
  - Departmental affiliation takes precedence
• Don't attend training or meetings as frequently
  - Therefore less likely to share wisdom
• Don't modify expectations to fit the goals
  - Assessment results often lower

On-Going Questions: 2014 Results

                                              2009     2012     2014
  Ability to Present a Clear Argument         56.4%    70.8%    60.8%
  Ability to Use Different Points of View     49.3%    61.4%    52.4%
  Ability to Use Evidence                     60.8%    79.1%    73.6%

Thoughts on 2014 Results
• No significant change in the profile of the class
• Inter-rater reliability over time
  - Most of the team has read for 4-5 years
  - Have made adjustments in the gloss on the rubric
  - Immersed in the goals, so unconsciously expect more?
  - Need to include earlier papers in the annual meeting to establish inter-rater consistency
  - See whether there has been a shift in standards

Examination of 2014 Data
• May be due to a slight change in who was teaching
  - Experience matters
• Sections taught by "veteran" faculty averaged
  - 2.2 "exceptional" ratings per class, out of 9 possible
  - 2.2 "unsatisfactory" ratings per class, out of 9 possible
• Sections taught by "inexperienced" faculty averaged
  - 1.1 "exceptional" ratings
  - 4.2 "unsatisfactory" ratings

For Further Information
• Ken Jones ([email protected])
• John Kendall ([email protected])
• To download a copy of this presentation or the related materials, please visit the following webpage at CSB/SJU: http://employees.csbsju.edu/jkendall/