VALUE/Multi-State Collaborative (MSC) to Advance Learning Outcomes Assessment
Pilot Year Study: Findings and Summary

These slides summarize results from a proof-of-concept pilot study in which 59 institutions in nine states used common rubrics to assess more than 7,000 student work products. The sample of student work represents only near-graduation students at the participating institutions in the nine states; the results are therefore not generalizable to all students in each participating state or nationwide, and should be used accordingly.

MSC Pilot by the Numbers
• MSC states: Connecticut, Indiana, Kentucky, Massachusetts, Minnesota, Missouri, Oregon, Rhode Island, Utah
• 59 public institutions uploaded artifacts
• By sector:
  – 28 four-year institutions, including 8 research institutions
  – 31 two-year institutions
• 7,215 pieces of student work were submitted (the number of work products approximates the number of student participants)
  – Students had to have completed at least 75% of their institution's degree requirements
  – 2,642 artifacts (36.6%) were scored twice in order to measure inter-rater reliability (see the sketch following the demographic summaries below)
• 1,166 assignments were submitted (the number of assignments approximates the number of faculty participants)

MSC Pilot Study Sample Demographics (chart slides; captions and highlights only)
• Student sample by gender relative to graduating students at participating institutions
• Pell-eligible vs. non-Pell-eligible students in the pilot sample
• Four-year sample by race: 82% of students in the MSC sample were White; 80% of students at participating institutions were White
• Two-year sample by race: 77% of the MSC sample were White; 81% of students at participating institutions were White
• Student sample by age relative to students at participating institutions
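The double-scoring figure above (2,642 of 7,215 artifacts, or 36.6%) underpins the pilot's inter-rater reliability check. The deck does not say which reliability statistic was used; the short Python sketch below simply recomputes the double-scoring rate and then shows exact and adjacent agreement on a hypothetical set of paired scores, as one minimal way such agreement could be summarized. The paired-score rows are illustrative placeholders, not MSC data.

# Minimal sketch: double-scoring rate plus exact/adjacent agreement between two scorers.
# The paired ratings below are hypothetical, and exact agreement is only one of several
# possible inter-rater reliability measures; the deck does not specify which was used.

total_artifacts = 7215
double_scored = 2642

double_scoring_rate = double_scored / total_artifacts
print(f"Double-scored: {double_scoring_rate:.1%}")  # ~36.6%, matching the slide

# Hypothetical paired ratings (artifact_id, scorer_1, scorer_2) on the 0-4 rubric scale
paired_scores = [
    ("A1", 3, 3),
    ("A2", 2, 3),
    ("A3", 4, 4),
    ("A4", 1, 1),
]

exact_matches = sum(1 for _, s1, s2 in paired_scores if s1 == s2)
adjacent_matches = sum(1 for _, s1, s2 in paired_scores if abs(s1 - s2) <= 1)

print(f"Exact agreement:    {exact_matches / len(paired_scores):.1%}")
print(f"Adjacent agreement: {adjacent_matches / len(paired_scores):.1%}")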
VALUE Rubrics: Valid Assessment of Learning in Undergraduate Education
• Assess student work from the curriculum and co-curriculum
• Provide detailed assessment of key dimensions/criteria of learning for each learning outcome
• Are scored by faculty and other educational professionals
• Yield visible evidence of learning that can be applied and used in multiple settings
• Offer a shared framework and standards of judgment for quality without standardization

Faculty and Staff Responses to the Usefulness of VALUE Rubrics for Assessing Student Work (chart slide: percent of scorers who reported "Strongly Agree" or "Agree" with each aspect of rubric use)

Quantitative Literacy Rubric Dimensions
• Interpretation: Ability to explain information presented in mathematical forms (e.g., equations, graphs, diagrams, tables, words)
• Representation: Ability to convert relevant information into various mathematical forms (e.g., equations, graphs, diagrams, tables, words)
• Calculation
• Application/Analysis: Ability to make judgments and draw appropriate conclusions based on the quantitative analysis of data, while recognizing the limits of this analysis
• Assumptions: Ability to make and evaluate important assumptions in estimation, modeling, and data analysis
• Communication: Expressing quantitative evidence in support of the argument or purpose of the work (in terms of what evidence is used and how it is formatted, presented, and contextualized)

MSC Pilot Study Results: Quantitative Literacy, 4-Year Institutions
Percent of student work products scored 0–4 by faculty scorers on each dimension of quantitative literacy

Dimension               Score 0   Score 1   Score 2   Score 3   Score 4
Application/Analysis    7.3%      10.7%     33.3%     31.3%     17.3%
Assumptions             27.6%     12.8%     31.5%     19.7%     8.4%
Communication           9.1%      5.4%      19.4%     47.8%     18.3%
Interpretation          11.2%     8.0%      22.5%     35.8%     22.5%
Representation          11.2%     5.1%      26.4%     38.2%     19.1%
Calculation             13.1%     7.1%      26.2%     38.3%     15.3%
Total (all dimensions)  13.7%     8.2%      26.4%     35.1%     16.7%

MSC Pilot Study Results: Quantitative Literacy, 2-Year Institutions
Percent of student work products scored 0–4 by faculty scorers on each dimension of quantitative literacy

Dimension               Score 0   Score 1   Score 2   Score 3   Score 4
Application/Analysis    12.4%     10.0%     45.0%     27.8%     4.8%
Assumptions             35.4%     17.5%     24.5%     18.4%     4.2%
Communication           5.9%      6.9%      25.0%     53.2%     9.0%
Interpretation          14.5%     14.1%     22.7%     40.5%     8.2%
Representation          15.5%     11.2%     29.6%     35.4%     8.3%
Calculation             7.0%      7.5%      34.0%     45.5%     6.0%
Total (all dimensions)  15.4%     11.3%     30.1%     36.4%     6.7%

Note: Each work product was scored on the six dimensions of quantitative literacy using the common AAC&U VALUE rubric (dimension criteria above). VALUE rubrics are available at www.aacu.org/value. This was a pilot study; results are not generalizable across participating states or the nation.

Selected Findings – Quantitative Literacy
• Most students at both two-year and four-year institutions were rated "3" or "4" on "calculation" skills.
• Less than half of work products at four-year institutions, and about one-third at two-year institutions, were rated "3" or "4" on the ability to "make judgments and draw appropriate conclusions based on the quantitative analysis of data."

Written Communication Rubric Dimensions
• Context of and Purpose for Writing: Includes considerations of audience, purpose, and the circumstances surrounding the writing task(s)
• Content Development
• Genre and Disciplinary Conventions: Formal and informal rules inherent in the expectations for writing in particular forms and/or academic fields (please see glossary)
• Sources and Evidence
• Control of Syntax and Mechanics

MSC Pilot Study Results: Written Communication, 4-Year Institutions
Percent of student work products scored 0–4 by faculty scorers on each dimension of written communication

Dimension               Score 0   Score 1   Score 2   Score 3   Score 4
Content Development     0.4%      14.5%     26.9%     35.2%     22.9%
Context/Purpose         –         11.6%     29.5%     34.8%     24.1%
Syntax/Mechanics        1.4%      11.8%     33.3%     34.8%     18.7%
Genre/Conventions       1.3%      14.7%     30.7%     33.3%     19.9%
Sources/Evidence        10.6%     16.1%     28.7%     27.4%     17.1%
Total (all dimensions)  3.1%      13.6%     30.2%     33.0%     20.1%

MSC Pilot Study Results: Written Communication, 2-Year Institutions
Percent of student work products scored 0–4 by faculty scorers on each dimension of written communication

Dimension               Score 0   Score 1   Score 2   Score 3   Score 4
Content Development     2.7%      20.0%     30.2%     36.5%     10.6%
Context/Purpose         1.4%      12.1%     38.8%     30.4%     17.3%
Syntax/Mechanics        0.2%      20.9%     38.4%     32.4%     8.1%
Genre/Conventions       3.2%      24.6%     41.5%     20.6%     10.1%
Sources/Evidence        12.8%     27.5%     27.1%     22.3%     10.3%
Total (all dimensions)  3.7%      21.3%     35.4%     28.9%     10.6%

Note: Each work product was scored on the five dimensions of written communication using the common AAC&U VALUE rubric (dimension criteria above). VALUE rubrics are available at www.aacu.org/value. This was a pilot study; results are not generalizable across participating states or the nation.

Selected Findings – Written Communication
• Students at two-year colleges performed well in several areas. For example:
  – Nearly 50 percent of work products collected from two-year institutions were scored "3" or "4" on "content development" in writing.
  – About one-third of work products collected from two-year institutions were scored "3" or "4" on the use of "sources and evidence" in writing.
• Students at four-year colleges and universities performed at higher levels overall.
  – More than 50 percent of student work at four-year institutions was scored "3" or "4" overall on written communication.
  – There is still much room for improvement, especially in students' capacity to use "sources and evidence" to make arguments in their writing.
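The percentages quoted in the "Selected Findings" are read off the score-distribution tables by summing the shares of work products scored "3" or "4" on a given dimension. As a worked check, the Python sketch below recomputes several of the figures cited above; the dictionaries simply restate values that already appear in the tables.

# Recompute the "scored 3 or 4" shares cited in the Selected Findings
# from the score-distribution tables (values copied from the tables above).

# Quantitative literacy: % of work at each score 0-4, by dimension
ql_4yr = {"Calculation": [13.1, 7.1, 26.2, 38.3, 15.3],
          "Application/Analysis": [7.3, 10.7, 33.3, 31.3, 17.3]}
ql_2yr = {"Calculation": [7.0, 7.5, 34.0, 45.5, 6.0],
          "Application/Analysis": [12.4, 10.0, 45.0, 27.8, 4.8]}

# Written communication, 2-year institutions
wc_2yr = {"Content Development": [2.7, 20.0, 30.2, 36.5, 10.6],
          "Sources/Evidence": [12.8, 27.5, 27.1, 22.3, 10.3]}

def scored_3_or_4(dist):
    """Share of work products rated 3 or 4 (last two entries of the 0-4 distribution)."""
    return dist[3] + dist[4]

for label, table in [("QL 4-year", ql_4yr), ("QL 2-year", ql_2yr), ("WC 2-year", wc_2yr)]:
    for dim, dist in table.items():
        print(f"{label} {dim}: {scored_3_or_4(dist):.1f}% scored 3 or 4")

# Output, matching the findings:
#   QL 4-year Calculation: 53.6%           -> "most" rated 3 or 4
#   QL 4-year Application/Analysis: 48.6%  -> "less than half"
#   QL 2-year Calculation: 51.5%           -> "most" rated 3 or 4
#   QL 2-year Application/Analysis: 32.6%  -> "about one-third"
#   WC 2-year Content Development: 47.1%   -> "nearly 50 percent"
#   WC 2-year Sources/Evidence: 32.6%      -> "about one-third"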
Critical Thinking Rubric Dimensions
• Explanation of issues
• Evidence: Selecting and using information to investigate a point of view or conclusion
• Influence of context and assumptions
• Student's position (perspective, thesis/hypothesis)
• Conclusions and related outcomes (implications and consequences)

MSC Pilot Study Results: Critical Thinking, 4-Year Institutions
Percent of student work products scored 0–4 by faculty scorers on each dimension of critical thinking

Dimension               Score 0   Score 1   Score 2   Score 3   Score 4
Conclusions/Outcomes    6.9%      25.5%     38.7%     21.9%     6.9%
Evidence                4.9%      27.0%     35.2%     26.2%     6.6%
Explanation of Issues   2.2%      16.3%     37.0%     30.4%     14.1%
Context/Assumptions     12.1%     25.8%     38.6%     14.4%     9.1%
Student's Position      4.8%      33.2%     32.6%     24.6%     4.8%
Total (all dimensions)  6.2%      26.0%     36.6%     23.3%     7.9%

MSC Pilot Study Results: Critical Thinking, 2-Year Institutions
Percent of student work products scored 0–4 by faculty scorers on each dimension of critical thinking

Dimension               Score 0   Score 1   Score 2   Score 3   Score 4
Conclusions/Outcomes    9.8%      37.9%     38.3%     10.2%     3.8%
Evidence                10.1%     31.2%     37.7%     15.2%     5.8%
Explanation of Issues   8.8%      26.3%     36.8%     24.6%     3.5%
Context/Assumptions     11.6%     40.1%     32.7%     15.0%     0.7%
Student's Position      9.7%      41.8%     27.9%     17.6%     3.0%
Total (all dimensions)  10.0%     36.3%     34.8%     15.5%     3.4%

Note: Each work product was scored on the five dimensions of critical thinking using the common AAC&U VALUE rubric (dimension criteria above). VALUE rubrics are available at www.aacu.org/value. This was a pilot study; results are not generalizable across participating states or the nation.

Selected Findings – Critical Thinking
• Consistent with other national studies, many students who had earned at least 75 percent of the credits toward their degrees were still not achieving high levels on important critical thinking skills. Of the three outcomes assessed, scores were lowest on critical thinking.
• Less than one-third of student work products collected from four-year institutions were scored "3" or "4" on using evidence to "investigate a point of view or conclusion."
• Nearly 40 percent of work samples assessed at four-year institutions were rated only "0" or "1" on how well students analyzed the "influence of context and assumptions" in drawing conclusions.
• On the other hand, nearly 45 percent of work products from four-year institutions were rated "3" or "4" on how well students explained issues in their analysis.
• In short, students can describe and explain, but they lack proficiency in critically examining the assumptions, hypotheses, evidence, and implications of issues and positions, i.e., in applying their knowledge.

VALUE/Multi-State Collaborative (MSC): Key Conclusions
• Central to student achievement is the quality of the assignments that allow us to gather evidence of learning.
• Faculty collaboration and agreement on the quality of student work, within and across institutions, is possible, and assessment results can be useful for improving learning.
• We can, and need to, improve what we are doing to achieve heightened proficiency among all students.