Did They Really Learn Anything? Transforming First-Year Seminar Assessment to Measure Student Learning Outcomes
Stephanie M. Foote and Braden J. Hosch, Ph.D.
Annual Conference on The First-Year Experience
February 27, 2006

Presentation Overview
I. Institutional and Course Background
II. First-Year Seminar Assessment Efforts
III. Assessment and Evaluation Findings
IV. Areas for Potential Improvement
V. Q&A

Part I: Institutional Profile
• Baccalaureate general, public
• 3,300 headcount; 670 residential
• NCAA Division II
• Mean SAT: 990s
• Mean high school class rank: 70th-75th percentile

Part I (continued)
• Approximately 3,200 undergraduates; 610 are new freshmen
• Factors freshmen indicated as "very important" in choosing USCA (2005 CIRP Survey):
  – Academic reputation (55%)
  – Cost (49%)
  – Size (43%)
  – Graduates get good jobs (40%)

Part I (continued)
• 51% of USCA freshmen rated their overall academic ability as "above average" or "outstanding," compared to 58% at peer institutions (2005 CIRP Survey)
• In a survey of residential students in November 2005, 75% of freshmen reported studying or doing homework 10 hours per week or less

Part I: ASUP 101 (The Previous Course)
• "Strategies for college achievement"
• 3-4 sections (60-80 students)
• Primarily taught by one part-time adjunct and a handful of staff (no faculty)
• One credit hour (first 8 weeks)
• No assessment strategy → we assumed this would work
(ASUP 101 was offered from 1996 to 2005.)

Part I: Identifying the Problem
• High freshman failure rate (one third of freshmen earn a first-semester GPA below 2.0)
• Declining retention rates
• Perception of weakening academic skills in entering classes

Part I: Organizational Response
• Strategic planning
• New Vice Chancellor (2002)
• New IE Director (2003)
• New FYE Director (2005; a new position)

Part II
• Structural
  – FYE Office with budget and hiring power
  – Reinvigorated FYE Committee
  – More sections
    • More instructors, including faculty
    • Formal communication among instructors
• Procedural
  – Training
  – Common syllabus
  – Outcomes developed
  – Outcomes measured in assignments

Part II: The Committee
• Reinvigoration of the FYE Committee, led by the new Director
• Director did all of the work
• Balanced, cross-representational group
• Met once a month
• Identified outcomes
• Provided input on course content and textbook
Objective and result: broad buy-in

Part II: What the Committee Did
• Inherited a laundry list of things various people wanted the course to be (the kitchen sink)
• Eventually developed course goals and learning outcomes (3/16 draft)
  – Semester-long process
  – Two pages of goals and outcomes

Part II: Change in Course Identity
• Name changed
• Made congruent with similar courses nationwide
• Faculty buy-in fostered; faculty recruited to teach the course
• Common textbook adopted
• Course structured around the revised outcomes and goals

Part II: AFYS 101 (The First-Year Seminar)

               Fall 2004   Fall 2005
  Sections         5          10
  Enrollment      98         177

Common content includes time management, study skills, learning styles, reading and memory, note taking, test-taking, diversity, etc.

Part II: Effective Assessment
• Produce meaningful results about student learning: How well did students learn what we wanted them to learn?
• Maintain faculty ownership: Do faculty accept the results?
• Communicate and use the results: Do faculty know about the results?
• Simple: Is this process manageable and sustainable?

Part II: Transformation of Goals
• Goals weren't easily measurable
• Goals were too comprehensive
• What can we realistically do?
[Diagram: a continuous cycle of Implement, Improve, and Evaluate]

Part II: The Learning Outcomes We Selected
1. Students will understand the differences between high school and higher education.
2. Students will develop and use effective time management, note taking, and study strategies.
3. Students will identify their learning styles, create a learning plan, and apply it.

Part III: Rubric Construction Linked to Outcomes
Outcome: Students will develop and use effective time management, note taking, and study strategies.
  1. Time management rubric
  2. Note taking rubric
  3. Study strategies rubric
Outcome: Students will identify their learning styles, create a learning plan, and apply it.
  4. Learning styles rubric

Part III: Rubric (Partial Example)
Learning outcome: Students will develop and use effective time management, note taking, and study strategies.
Note Taking Rubric
  Rating scale: Excellent (5) | Satisfactory (4-3) | Needs Improvement (2-1) | Incomplete or Not Achieved (0)
  Outcome and characteristics: Identify Relevant Information
    Excellent (5): Notes capture all main points; notes summarize/synthesize rather than retell; notes provide quick, memorable examples for main points.
    (Only the Excellent column is shown in this partial example.)

Part III: Adjusted First Semester GPA by SAT Score
[Chart: adjusted first-semester GPA† by SAT score band (Below 900, 900-990, 1000-1090, 1100-1190) for three groups: FY Seminar sections using the rubric, all FY Seminar students, and all first-year students. Adjusted GPAs range from roughly 2.1 to 3.2. * Significant at p < 0.05. † Does not include the grade earned in the FY Seminar.]

Part III: Self Report vs. Direct Measurement

                     Self Report                     Direct Assessment
                     (% Agree + % Strongly Agree)    (Mean)*
  Note Taking             91.2%                           4.16
  Time Management         94.4%                           4.09
  Learning Styles         92.8%                           3.84

  * 1-2 = Needs Improvement, 3-4 = Satisfactory, 5 = Excellent (0 = Missing, not included in the mean calculation)

Part III: Objectives Ranked by Mean
[Chart: the nine rubric objectives ranked by mean score, with means ranging from 2.81 to 4.78. Objectives: TM1 Identify Personal Time Usage; TM2 Identify Personal Priorities; TM3 Analyze Relationship Between Time Usage and Priorities; NS1 Identify Relevant Information; NS2 Organize Information; NS3 Evaluate Personal Note Taking Skills; LS1 Identify Personal Learning Styles; LS2 Create Learning Plan; LS2 Apply Learning Plan.]
1-2 = Needs Improvement, 3-4 = Satisfactory, 5 = Excellent (0 = Missing, not included in the mean calculation)

Part III: Significant Correlations with Other Areas
1a. Learning Style: the learning style inventory was used correctly
• Correlates with the English 101 grade
  – Pearson's r = 0.402, R² = 0.162, sig. = 0.028
  – Explains about 16% of the variance in the course grade (R² = 0.402² ≈ 0.162)

Part III: Rubric Construction: What We Did Right
• Involved campus constituents (FYE Committee)
• Focused on realistic and measurable outcomes associated with the course
• Created rubrics that were easy to use

Part III: Rubric Construction: What We Did Wrong
• Created the rubrics late (seven days before classes began)
• Failed to get sufficient input from AFYS 101 instructors
• Included some characteristics that were challenging to evaluate

Part IV: Plan for Improvement
• Revise the rubrics, involving AFYS 101 instructors
• Re-evaluate methods of measurement and assignments
• Using the collected data, determine which learning outcomes best represent the course goals

Contact Information
Stephanie M. Foote
Director, Academic Success Center and the First-Year Experience
University of South Carolina Aiken
[email protected]
(803) 641-3321

Braden J. Hosch, Ph.D.
Director, Institutional Effectiveness
University of South Carolina Aiken
[email protected]
(803) 641-3338