
Analysis of 2010-2011 Assessment Results
for General Education
Student Learning Outcomes Goals 2-6
Morehead State University
DRAFT
August 29, 2011
This is a summary of the General Education 2010-2011 assessment of student learning outcomes for
Goals 2 through 6. The report supplements the Response Report to the SACS Reaffirmation Committee,
which reported on the assessment of student learning outcomes for Goal 1 of the general education
program. The report is organized in five (5) major sections that respond to five of the six major goal areas
around which the student learning outcomes are defined.
Student Learning Outcomes
The general education program reflects increased accountability across levels with student learning
outcomes (SLOs) grouped into six major goal areas:
 Communication skills
 Intellectual skills
 Quantitative skills
 Knowledge of human cultures
 Knowledge of the natural world
 Knowledge of aesthetics
Analysis of 2010-11 Assessment Results
The GEC defined and the Faculty Senate approved six goals with 2 to 5 student learning outcomes
(SLOs) per goal for a total of 22 outcomes. Faculty and staff assess each SLO through a combination of
direct measures through program level and course-embedded assessments, and indirect measures through
national and local surveys. Fall 2010 – Spring 2011 is the initial year of implementation of the new GE
paradigm. Data collected this year through core courses and transitional distribution courses will serve as
baseline data to show progress over time and to assist in developing data collection procedures and
standards for SLO success. This section of the report presents the results for each SLO grouped by Goals
2-6 using a consistent set of subsections. The subsections accomplish the following tasks:
 Data Source Table: presents the course measures as data sources for the SLO in tabular form
including capstone and indirect measure sources not yet collected
 Measures: describes the measures used by each source to assess the outcome
 Findings: summarizes the findings submitted and any outcomes from 2009 NSSE data
 Summary Table: presents the percentage of students identified by each assessment as achieving
the SLO (at least a 70% possible score)
 Conclusions: draws a set of conclusions based on the findings
 Recommendations: makes suggestions for instruction, revised assessment procedures and
decision making needs
NOTE: For ease of navigation, activate hyperlinks to references in the appendices by use of
CTRL+click. Return to your place in the document by activating hyperlinks at the bottom of each
reference in the same way.
Goal 2 – Effective Intellectual Skills
Outcome 2a: Employ current technologies to locate, analyze, evaluate and use information
in multiple contexts and for a variety of purposes
Data Source Table:
Table 2-1 indicates that three core courses have responsibility for assessing this outcome. Data
submitted in this initial year were from sections of the oral communications course and sections
of two English courses. Additionally, data were available from NSSE.
Table 2-1: Direct and Indirect Measures for Outcome 2a*

Type of Measure | Oral: COMM 108             | Write 1: ENG 100           | Write 2: ENG 200                | Indirect: NSSE
Test            |                            | Library Quiz               | Library Quiz                    |
Rubric 1        | NCA Competent Speaker form | Core Writing Scoring Guide |                                 |
Rubric 2        |                            |                            |                                 |
Survey          |                            |                            | Dept. Developed Resource Survey | 2b, 11e, 1l, 11g

*Grayed measures were not available
Measures (See Appendix A):
Core courses provided both direct and indirect measures for outcome 2a. ENG 100 courses
described two direct measures to assess outcome 2a. The first measure was one category of the
Core Writing I and II Scoring Guide measuring the effective use of source material in support of
thesis. English faculty applied the scoring guide to an essay and final exam.
The second measure implemented by ENG 100 was a library research quiz. Upon completion of
instruction in use of the online library catalog, library databases, and internet search engines and
the submission of at least one research-based writing assignment, students took a timed computer-mediated
library research quiz. The quiz presented 20 multiple-choice items with 4 options each
that focused on the use of current technology to locate information. The General Education
Writing Committee (GEWC) of the English department established no targets for student
achievement for the first year, opting to pilot the research quiz and the assessment process for two
semesters to have a broader sense of potential areas of concern affecting assessment.
ENG 200 courses described one direct and one indirect measure for outcome 2a assessment. The
direct measure was similar to the library research quiz described for ENG 100. It differed from
the ENG 100 quiz in two ways: it had only 10 items, and the first 6 items focused on locating
information while the last 4 focused on evaluating it. As with the ENG 100
research quiz, the GEWC established no targets for student achievement for the first year, opting
to pilot the research quiz and the assessment process for two semesters to gain a broader sense of
potential areas of concern affecting assessment. In addition, “the student group assessed this first
semester is not the population [the department] believes will best present assessment outcomes
for the revised Core Writing I and Core Writing II sequence.”
The second measure implemented by ENG 200 was a Resource Survey. The first part of this
course survey indicates students’ perceptions of their ability to use electronic research tools to
locate information, evaluate potential sources, and use sources in writing.
Findings:
ENG 100 results provided for both fall and spring semesters based on the first measure, as
reported by the English department, were “folded into the general assessment of the essay and
final overall.” Data provided were not disaggregated to show results for the use of source
materials, so the results provided no useful data from which to assess the outcome.
A total of 489 fall students and 351 spring students enrolled and actively working towards the
successful completion of ENG 100 took the library research quiz administered through
Blackboard. Based on summative results, the average fall score was 15.4 and the average spring
score was 15.1. Spring data provided individual scores allowing a frequency distribution (table
2-2) to be calculated which showed a range of scores from 8 to 19. Table 2-2 shows that 77.8%
of spring students scored at least a 14 (70%). The table also shows that, of the 556 students
originally enrolled, 205 (36.9%) did not complete the quiz.
Table 2-2: Counts for ENG 100 Library Research Quiz

Score    | Count | Pct   | Cum Pct
20       | 0     | 0.0   | 0.0
19       | 12    | 3.4   | 3.4
18       | 25    | 7.1   | 10.5
17       | 71    | 20.2  | 30.8
16       | 62    | 17.7  | 48.4
15       | 62    | 17.7  | 66.1
14       | 41    | 11.7  | 77.8
13       | 32    | 9.1   | 86.9
12       | 24    | 6.8   | 93.7
11       | 9     | 2.6   | 96.3
10       | 11    | 3.1   | 99.4
9        | 1     | 0.3   | 99.7
8        | 1     | 0.3   | 100.0
Assessed | 351   | 100.0 |
Missing  | 205   |       |
Total    | 556   |       |
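The percentages in Table 2-2 follow directly from the raw counts; a minimal sketch (score counts taken from the table, with 14 of 20 items as the 70% threshold) shows how the 77.8% success figure is obtained:

```python
# Reproduce the Table 2-2 summary from the raw ENG 100 quiz counts.
# Keys are quiz scores (20 items); a score of 14 is the 70% threshold.
counts = {20: 0, 19: 12, 18: 25, 17: 71, 16: 62, 15: 62, 14: 41,
          13: 32, 12: 24, 11: 9, 10: 11, 9: 1, 8: 1}

assessed = sum(counts.values())                          # students assessed
passing = sum(n for score, n in counts.items() if score >= 14)
pass_rate = 100 * passing / assessed

print(assessed)              # 351
print(round(pass_rate, 1))   # 77.8
```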
Students enrolled and actively working towards the successful completion of ENG 200 took a
specialized library research quiz administered through Blackboard. Fall results revealed that the
average score for 359 students of the 369 who attempted the quiz was 5.83. Spring results
showed that the average score for the 444 students of the 450 who attempted the quiz was 5.94.
Data did not allow calculation of subscores based on the 6 items focusing on locating information
and the last 4 items focusing on evaluating the information. The outcome, therefore, could only
be assessed as a whole. Individual data for spring students allowed the frequency distribution in
Table 2-3 to be calculated, showing that 38.3% of students scored at least a 7 (70%). The table also
shows that, of the 636 students originally enrolled, 192 (30.2%) did not complete the quiz.
Table 2-3: Counts for ENG 200 Library Quiz

Score    | Count | Pct   | Cum Pct
10       | 0     | 0.0   | 0.0
9        | 14    | 3.2   | 3.2
8        | 64    | 14.4  | 17.6
7        | 92    | 20.7  | 38.3
6        | 117   | 26.4  | 64.6
5        | 75    | 16.9  | 81.5
4        | 44    | 9.9   | 91.4
3        | 26    | 5.9   | 97.3
2        | 11    | 2.5   | 99.8
1        | 1     | 0.2   | 100.0
Assessed | 444   | 100.0 |
Missing  | 192   |       |
Total    | 636   |       |
A total of 336 of 345 fall students (97.4%) and 346 of 350 spring students (98.9%) enrolled and
actively working towards the successful completion of ENG 200 took a two-part survey
administered through Blackboard, with the first part seeking feedback on their ability to use
specialized databases and web sources after research instruction in ENG 200. Responses to five
items pertinent to outcome 2a, shown in Table 2-4, revealed the following five findings relevant to
outcome accomplishment.
1) After taking ENG 200, 69.6% and 55.5% of fall and spring students, respectively felt at
least more comfortable using specialized electronic research tools to locate information.
2) 79.1% and 74.8% (fall and spring semesters, respectively) said their experiences and
practice using online databases or electronic resources improved their ability to evaluate
sources for use in their writing.
3) Use of online databases and electronic research tools helped 70.2% of fall and spring
students increase their understanding of how to evaluate sources for use in their writing.
Table 2-4: Results from ENG 200 Survey Items Pertinent to Outcome 2a

Survey Item | Response Options | Fall 2010 | Spring 2011
#5 - After taking ENG 200, how do you rate your ability to use online databases? (locate info) | Felt more or significantly more comfortable using online databases and electronic research sources | 69.6% | 55.5%
#8 - Which statement best describes how your experiences and practice with using online databases and electronic resources affected your ability to evaluate sources for use in your writing? (evaluate sources) | My practice with both online databases and electronic research tools had a positive impact on my ability to evaluate sources. | 61.6% | 46.8%
 | My practice with electronic resources improved my ability. | 10.4% | 10.1%
 | My practice with online databases improved my ability to evaluate sources, but electronic resources had no impact. | 7.1% | 17.9%
#9 - Which statement below most closely resembles the specific way your use of online databases and electronic research tools affected your ability to evaluate sources for use in your writing? (evaluate) | My practice enabled me to understand how an electronic citation is formatted and what information a citation supplies, as well as understanding a source’s credibility based on the type of source and its merits. | 39.3% | 41.0%
 | My practice enabled me to understand how to read an abstract from an electronic citation and to cut research time, providing me more time to concentrate. | 30.9% | 29.2%
#10 - Which statement most closely describes the effect of using online databases and electronic research tools when using actual source material in your writing? | Using these research databases and tools helped me feel more confident in using sources in my writing because I know how to research more effectively now, providing me skills to complete additional research if necessary. | 36.9% | 31.2%
 | I understand the value of using outside source material in my writing because I understand how others use it from having located, evaluated, and read a diversity of sources found through online databases and/or electronic tools. | 27.1% | 21.7%
 | Using these resources helps me avoid plagiarism because I feel more confident in locating, evaluating, and using source materials effectively in my writing, rather than merely phrasing someone else’s ideas in my own words without proper attribution. | 19.6% | 24.6%
#11 - Which statement best describes your perceptions of your future as a user of online databases/electronic research tools? | I will be more willing to use more specialized online databases and electronic research tools. | 67.6% | 67.0%
4) The three benefits of using online databases and electronic research tools that students most
closely associated with in fall and spring, respectively, included feeling more confident in
using sources in writing (36.9%, 31.2%), understanding the value of using outside
sources (27.1%, 21.7%), and helping to avoid plagiarism (19.6%, 24.6%).
5) About two-thirds of the students (67.6% in the fall, 67.0% in the spring) indicated an
increased willingness to use more specialized online databases and electronic research
tools in the future.
The 2009 NSSE data show MSU seniors’ perceptions to be comparable to the NSSE mean for
analyzing the basic elements of an idea, experience, or theory . . . (2b), and significantly greater
(p < .05) than the NSSE mean for thinking critically and analytically (11e). Senior perceptions
specific to SLO 2a were significantly greater (p < .001) than the NSSE mean for used an
electronic medium . . . to discuss or complete an assignment (1l) and for using computing and
information technology (11g) (NSSE Table 2, Appendix B).
Summary Table:
Table 2-5: Percent of students reported successfully achieving Outcome 2a*

Assessment | Oral: COMM 108 | Write 1: ENG 100 | Write 2: ENG 200 | Indirect: NSSE
Quiz       |                | 78%              |                  |
Rubric 1   | NA             | UTD              |                  |
Rubric 2   | NA             |                  |                  |
Survey     |                |                  | 67%              | > NSSE Mean

*NA - not available; UTD - unable to determine
Conclusions for SLO 2a:
Based on the 2010-2011 findings for SLO 2a, the Assessment Coordinator draws the following
conclusions.
 78% of 351 ENG 100 spring students reached at least a 14 (70%) on a library quiz to
show they understood the use of current technology to locate information. Because mean
scores for fall and spring (15.4 and 15.1, respectively) students were comparable, it is
likely that a similar percentage of the 489 fall students also demonstrated understanding
(table 2-2).
 Less than half (38.3%) of 444 ENG 200 spring students scored at least a 7 (70%) on a library
quiz to show they understood the use of current technology to locate and evaluate
information. Because mean scores for fall and spring students were comparable (5.83
and 5.94, respectively), it is likely that a similarly small percentage of the 359 fall students
also demonstrated understanding (table 2-3).
 ENG 200 data did not allow subscore calculations for locating information and evaluating
the information, making it impossible to determine if one of these skills presented a
greater challenge to the students than the other.
 Spring data show that, of the 556 students originally enrolled in ENG 100, 205 (36.9%) did
not complete the quiz, indicating a student retention rate for Spring ENG 100 courses of 63%.
 Spring data show that, of the 636 students originally enrolled in ENG 200, 192 (30.2%)
did not complete the quiz, indicating a student retention rate for Spring ENG 200 courses
of 70%.
 Seniors responding to the 2009 NSSE survey were significantly more confident in their
abilities to use computing and information technology than other students represented.
Outcome 2b: Recognize and effectively utilize both deductive and inductive reasoning
Data Source Table:
Table 2-6 indicates that two distribution courses have responsibility for assessing this outcome.
Data submitted in this initial year were from two science courses, Biology and Astronomy.
Table 2-6: Direct and Indirect Measures for Outcome 2b*

Type of Measure | NSC 1: BIOL 105 | NSC 2: ASTR 112
Test            | 2 Exam items    | 6 pt item, Mid-semester & End of semester
Quiz            |                 | Chapter
Application     |                 | Formula

*Grayed data were not available
Measures (See Appendix A):
Two distribution courses used direct measures to assess outcome 2b (students recognize and
effectively utilize both deductive and inductive reasoning). An exam in BIOL 105 assessed the
recognition task through two multiple-choice exam items which asked students to identify
examples of deductive and inductive reasoning. Two sections of Astronomy used two short
answer exam questions to assess indirectly whether students could effectively utilize both types
of reasoning by asking them to respond to questions that required the use of each type of
reasoning. The two sections coordinated on the inductive reasoning question, but used separate
items for deductive reasoning. Section 2 provided the student with interpretive information and
directions that helped lead the student through the deductive reasoning process. Instructors also
scored the answers differently, with the answer in section 1 worth 6 points and in section 2 worth
4 points. One section used additional items to further pursue student use of both types of
reasoning. Aggregating the data was not possible.
Findings:
Biology results showed that the average score for 266 students was 54%. On average,
students answered one of the two items correctly. Reported results offered no additional
analyses, so the percentage of students answering one or both items correctly cannot be
determined. Results did not present an item analysis, so it cannot be determined whether
students had more difficulty identifying inductive or deductive reasoning.
Astronomy instructors in two spring sections used the same question requiring inductive reasoning
to respond correctly. Section 1 had 26 students and section 2 had 49 students. The results varied
between sections, with section 1 students averaging a score of 4.2 out of 6 (70%) and section 2
students averaging 3 out of 6 (50%). Instructors assessed deductive reasoning with different
versions of the same question. The question in section 2 provided additional information, which
could lead the student through the reasoning. Thirty-four (34) students in section 1 averaged 4 out
of 6 (66.7%), while 48 students in section 2 averaged 2.5 out of 4 (62.5%).
The first section added a second question requiring deductive reasoning, but at a greater level of
content difficulty. Students averaged 2.36 out of 6 (39%) on this item. Since the scores do not
assess use of deductive reasoning itself, the lower average most likely reflects the content
difficulty and not the lack of use of deductive reasoning. The report also showed several
additional items that required one of the two forms of reasoning to answer. The percentage of
correct responses ranged from 62% to 77%. However, none of the items assessed use of
reasoning directly and correct response at times was highly dependent on mathematics skills. The
percentage of students correctly applying either type of reasoning could not be determined.
Summary Table:
Table 2-7: Percent of students reported successfully achieving Outcome 2b*

Assessment  | NSC 1: BIOL 105 | NSC 2: ASTR 112
Test        | UTD             | UTD
Quiz        |                 | UTD
Application |                 | UTD

*NA - not available; UTD - unable to determine
Conclusions for SLO 2b:
Based on the 2010-2011 findings for SLO 2b, the Assessment Coordinator draws the following
conclusions.
 The percentage of students successfully recognizing or using reasoning skills could not be
determined.
 Items used to assess reasoning skills resulted in an indirect measure.
 Faculty from the same course failed to coordinate efforts sufficiently to draw conclusions for
the success of students in the course.
Outcome 2c: Thoughtfully analyze and evaluate diverse points of view
Data Source Table:
Table 2-8 indicates that three core courses and two distribution courses have responsibility for
assessing this outcome. Core course data submitted in this initial year were from sections of the oral
communications course and sections of two English courses. Distribution course data submitted
were from sections of psychology and imaging science courses. Additionally, data were
available from NSSE.
Table 2-8: Direct and Indirect Measures for Outcome 2c*

Type of Measure | Oral: COMM 108 | Write 1: ENG 100                 | Write 2: ENG 200 | SBS 2: PSY 154 | TSBS 2: TC IMS 300-301 | Indirect: NSSE
Test            |                | Exam-1b                          |                  | Pre-Post       |                        |
Rubric 1        |                | Dept. Developed Essay            | Essays           |                | Discussion Board       |
Rubric 2        |                | Dept. Developed Reading response |                  |                |                        |
Survey          |                |                                  | Survey           |                |                        | 1e

*Grayed measures were not available
Measures (See Appendix A):
English 100 and 200 courses in writing used essays and a final exam to determine if students
could thoughtfully analyze and evaluate diverse points of view. Faculty used the Rubric for
Assessing Writing Effectiveness to assess an essay based on the focus for the outcome.
Typically, faculty drew a sample of 12-15% of essays from all sections of a course to assess.
The rubric defined a 4-point scale (fails criterion, meets criterion minimally, meets criterion with
minor exceptions, and fully meets criterion) across five categories for a total of 20 points. Two
experienced readers read texts holistically. When scores varied by more than two points, a third
reading took place. Scoring guides (Appendix A) for all readings were identical. English 100
students wrote an essay “shaped by a controlling claim that integrates matter from a range of
credible sources,” while English 200 students wrote an essay “that connects multiple texts across
disciplines.”
The second assessment to evaluate this outcome for writing students varied. English 100,
writing I students completed a final examination in which they responded to a series of 10
objective questions over a brief reading and then wrote a response to that reading. Writing II
students in English 200 responded to a course survey in which they reflected upon the value of
class discussions after engaging in class discussion of multiple texts on a single cultural issue.
PSY 154 conducted a pretest-posttest as a measure of outcome 2c. Students were pretested early
in the semester and post tested late in the semester. The pretest and posttest were identical.
Students read two passages of moderate length that each presented diverse points of view. After
reading a passage, students answered 6 multiple-choice items with 5 options each designed to
assess the students' ability to evaluate the diverse points of view presented in the passage. Thus,
the test was composed of 12 multiple-choice items (2 passages x 6 items/passage), each found by
the assessment coordinator’s review to amply address the outcome.
A transitional distribution course from Imaging Science (IMS 300-301) assessed this outcome
through a group project. Each group researched, analyzed, and evaluated a diverse legal and
ethical issue in health care. Faculty used a 10-point Discussion Board Grading Rubric to grade
the group projects. A review of the methods by the assessment coordinator found the assignment
to be highly representative and the measure to address the outcome amply. Because IMS 300-301
is a transitional course and not part of the future General Education program, the usefulness of its
data will be limited.
Findings:
Two faculty readers assessed samples of 45 essays representing about 12% - 15% of ENG 200
students. When a difference of more than two points occurred between the scores, a third reading
was performed. When that occurred, the outlier score for one reader needed to be adjusted up or
down based on the third reading. A review of the data identified several cases where the appropriate
adjustment was not applied. For example, a first reading of 19, a second of 16, and a third of 16
was scored as a total of 17.5 instead of 16. All such errors favored higher scores. The process of
reconciling scores with the use of a third reader had a positive effect on inter-rater reliability.
Initial scores of first and second readers had a moderate correlation of .29. When the third
reader's score was used correctly to reconcile differences of more than two points, it effectively
raised the inter-rater reliability to an acceptable level of .77.
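The reconciliation rule described above can be expressed concretely. This is an illustrative sketch, assuming (as the example of 19/16/16 scoring 16 suggests) that the reading farthest from the third reading is dropped and the two agreeing scores are averaged; it is not the English department's documented procedure:

```python
def reconcile(r1, r2, r3=None):
    """Final essay score under the sketched rule: average the two
    readings when they agree within two points; otherwise use the
    third reading to identify the outlier, drop it, and average the
    two closest scores. (Hypothetical reconstruction, not the
    department's official procedure.)"""
    if abs(r1 - r2) <= 2:
        return (r1 + r2) / 2
    if r3 is None:
        raise ValueError("third reading required when scores differ by more than two points")
    # Keep whichever first/second reading lies closer to the third.
    keep = r1 if abs(r1 - r3) <= abs(r2 - r3) else r2
    return (keep + r3) / 2

print(reconcile(19, 16, 16))  # 16.0 -- the correct total, not 17.5
```

Averaging the first two readings regardless (19 and 16 giving 17.5) reproduces the error the review identified.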
The English department set two achievement targets on each assessment: 80% of students would
receive a score of 12/20 and 60% of students would receive a score of 14/20. Corrected results
show achievement of the first criterion level with 91% of students scoring at least a 12. Results
for the second target, which aligns with our reporting target of 70% (14) for success, show a
student success rate of 56% that misses the intended target. Prior to forming any conclusions
regarding these findings, a review must be made of the rubric relative to its assessment of the
outcome. This review must be driven by one question, “Does the rubric assess thoughtful
analysis and evaluation of diverse points of view?”
English 100 in the spring planned a similar process to assess an essay shaped by a controlling
claim that integrates matter from a range of credible sources, and a written response to a reading
that was part two of the second proposed assessment. The holistic nature of the assessment
delayed the results until early fall of 2011.
The report from English did not summarize the results for the first part of the second assessment
used in ENG 100 for this outcome. The department indicated that the results were listed under 1b,
but those results were for a reading comprehension outcome. The connection between the two
outcomes (1b and 2c) was not explicitly stated. It is reasonable to believe that a student who
comprehends what they read would be better prepared to thoughtfully analyze and evaluate
the diverse points of view expressed. Table 1-11 (Appendix B) clearly demonstrated that at least
80% of the 353 students who took the quiz comprehended the information read.
The second part of a survey completed by ENG 200 asked students to reflect upon the value of
class discussion. Reported results identified responses to four questions (15, 17, 20 and 22) as
being the most pertinent to this outcome (See Appendix B). The response to question 20 had the
most direct relationship to the outcome and showed 82% of students perceived class discussion
to be an effective to highly effective way of understanding multiple points of view.
PSY 154 conducted a pretest-posttest as a measure of outcome 2c. Table 2-9 shows that 516
students averaged 5.4 on the pretest and 480 averaged 6.2 on the posttest. The results did not
report the significance of the change, nor were the raw data provided to allow a test of
significance. The percentage of students retained between administrations (93%) makes
comparisons reasonable. Both averages were below 70% of the total possible points (8.4), which
represents the report standard.
Table 2-9: Average class scores on PSY 154 Pretest-Posttest

Test     | N   | Possible Points | Average | Percent
Pretest  | 516 | 12              | 5.4     | 45.0%
Posttest | 480 | 12              | 6.2     | 52.0%
Results for an imaging science class (IMS 300-301) displayed in Table 2-10 show an average
score of 18.1 out of 20 on the analysis and evaluation of a diverse legal and ethical issue in
health care using a Discussion Board Grading Rubric. The summary did not report total
students, but did provide sufficient summary statistics to calculate the percentage of students that
would have met the 70% criterion level set for this report, or a score of 14 for this measure.
Table 2-10 shows that the mean was 18.1 with a standard deviation of 2.49. Under a normal
distribution, about 84% of scores fall above one standard deviation (2.49) below the mean (18.1),
that is, above a score of 15.6. (See the Normal Curve Chart in Appendix B.) Because 15.6 exceeds
the criterion score of 14, more than 84% of the students scored above the 70% criterion level.
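The 84% figure is the empirical rule for a normal distribution; a quick check with the reported mean and standard deviation (the normality assumption is the report's, not a property of the data):

```python
from statistics import NormalDist

mean, sd = 18.12, 2.49   # reported Discussion Board mean and std. dev.
threshold = 0.70 * 20    # 70% criterion = a score of 14

dist = NormalDist(mean, sd)

# Share of a normal distribution above one SD below the mean: ~84%.
above_one_sd = 1 - dist.cdf(mean - sd)
print(round(100 * above_one_sd, 1))   # 84.1

# Since 14 < mean - sd (about 15.63), the share clearing the
# criterion itself is even larger.
print(round(100 * (1 - dist.cdf(threshold)), 1))
```

The report's "more than 84%" is therefore a conservative lower bound under the normality assumption.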
Table 2-10: Average class scores on IMS 300-301 Discussion Board Grading

Test                          | N | Possible Points | Average | Std. Dev. | Min | Max
Discussion Board Rubric Grade |   | 20              | 18.12   | 2.49      | 13  | 20
Senior perceptions on the 2009 NSSE specific to SLO 2c were significantly (p<.001) greater
than the NSSE mean for included diverse perspectives (different races, religions, genders,
political beliefs, etc.) in class discussions or writing assignments (NSSE Table 2, Appendix B).
Summary Table:
Table 2-11: Percent of students reported successfully achieving Outcome 2c*

Assessment | Oral: COMM 108 | Write 1: ENG 100 | Write 2: ENG 200 | SBS 2: PSY 154 | TSBS 2: TC IMS 300-301 | Indirect: NSSE
Test       |                | NA               |                  | UTD            |                        |
Rubric 1   | NA             | NA               | 56%              |                | > 84%                  |
Rubric 2   |                | NA               |                  |                |                        |
Survey     |                |                  | UTD              |                |                        | > NSSE

*NA - not available; UTD - unable to determine
Conclusions for SLO 2c:
Based on the 2010-2011 findings for SLO 2c, the Assessment Coordinator draws the following
conclusions.
 More than 84% of MSU students taking a course in Imaging Science thoughtfully analyzed
and evaluated diverse points of view at a level that met expectations, based on assessments of
group discussion board assignments.
 Prior to forming any conclusions regarding findings based on results from the English
writing classes, a review should be made of the rubric relative to its assessment of the
outcome. This review must be driven by one question, “Does the rubric assess thoughtful
analysis and evaluation of diverse points of view?”
 Guidance needs to be made available to faculty to assist in generating more useful data.
Outcome 2d: Perceive and articulate ethical consequences of decisions and actions
Data Source Table:
Table 2-12 indicates that only one core course, the First Year Seminar, has responsibility for
assessing this outcome.
Table 2-12: Direct and Indirect Measures for Outcome 2d

Type of Measure | Core Courses: FYS
Rubric 1        | Presentation
Rubric 2        | Writing
Measures (See Appendix A):
FYS faculty used two rubrics to assess outcome 2d: the Presentation Assessment Scoring Guide
and the Writing Assessment Scoring Guide. Each rubric assessed this outcome as part of the
content assessment with the form of assessment varying from fall to spring. Both rubrics in the
fall assessed whether “articulation of ethical consequences is evident” on a weighted 5-point
scale (failure, below average, average, good, and excellent) providing a total possible score of 20
which was converted to a percentage for grading purposes. The spring presentation rubric
assessed whether the content “articulated ethical consequences of an identified problem” on a
3-point scale (fails to meet criterion, meets criterion, exceeds criterion) providing a possible score
of 3. The spring writing rubric assessed both components of the outcome by assessing first
whether the student “recognized an ethical problem related to the topic” and second whether the
student “clearly explained ethical principles related to [the] problem” on a 3-point scale (fails to
meet criterion, meets criterion, exceeds criterion) providing a possible score of 6.
Findings:
Table 2-13 shows that, in the fall, 87.9% of 719 students giving presentations and 85.0% of 710
students’ writing assignments received at least an average assessment for the articulation of
ethical consequences. Clearly, students had less difficulty giving evidence during a
presentation than in writing, as evidenced by the higher percentage of good and excellent
assessments for presentations (76.1%) than for writing (40.3%).
Table 2-13: FYS Fall Rubrics Score Distribution for Outcome 2d

SLO 2d       | Failure (<5) | Below Avg (5-9) | Avg (10-14) | Good (15-19) | Excellent (20) | Total
             | N    Pct     | N    Pct        | N     Pct   | N     Pct    | N     Pct      | N     Pct
Presentation | 65   9.0%    | 21   2.9%       | 85    11.8% | 326   45.3%  | 222   30.8%    | 719   99.9%
Writing      | 93   13.1%   | 13   1.8%       | 317   44.6% | 99    13.9%  | 188   26.4%    | 710   99.9%
Table 2-14 shows that, in the spring, 74.2% of 90 students giving presentations and 68.8% of 73
students’ writing assignments at least met the criterion for the articulation of ethical
consequences. Students, once again, had less difficulty giving evidence during a presentation.
Writing results also reveal that more students achieved the lower order thinking skill of
recognizing an ethical problem (84%) than achieved the higher order skill of clearly explaining
one (69%).
Assuming that criteria of “average” performance and “meets criterion” describe adequate
performance in both cases, a simple calculation can be used to combine fall and spring semester
results. This calculation, which takes the number of students assessed into account, shows 86%
of 809 students adequately articulated ethical consequences of an identified problem during a
presentation. A similar calculation shows that 83% of 783 students adequately articulated ethical
consequences in a written assignment.
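The "simple calculation" described above is a weighted average over the two semesters; a minimal sketch using the reported counts and percentages:

```python
# Combine fall and spring success rates, weighting by the number of
# students assessed in each semester.
def combined_rate(groups):
    """groups: list of (n_assessed, pct_successful) pairs."""
    total = sum(n for n, _ in groups)
    successes = sum(n * pct / 100 for n, pct in groups)
    return 100 * successes / total

presentation = [(719, 87.9), (90, 74.2)]   # fall, spring
writing = [(710, 85.0), (73, 68.8)]

print(round(combined_rate(presentation)))  # 86  (of 809 students)
print(round(combined_rate(writing)))       # 83  (of 783 students)
```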
Table 2-14: FYS Spring Rubrics Score Distribution for Outcome 2d

Perceive and articulate ethical consequences of decisions and actions (SLO 2d)

                                              Fails to Meet   Meets           Exceeds         At least   Avg
                                        N     Criterion       Criterion       Criterion       Met SLO
                                              Count  Pct      Count  Pct      Count  Pct      Pct
Presentation:
IV. Articulated ethical consequences
    of an identified problem            90    17     25.8%    27     40.9%    22     33.3%    74.2%      2.08
Writing:                                73
A. Recognized an ethical problem              10     15.6%    23     35.9%    31     48.4%    84.4%      2.33
B. Clearly explained ethical
    principles related to problem             20     31.3%    24     37.5%    20     31.3%    68.8%      2.00
Writing: Met both criteria                    44     68.8%
Additional analyses comparing the presentation and writing total scores on SLO 2d showed
moderate though significant correlations of 0.49 and 0.57 (p < .001) for the fall and spring
measures, respectively (Table 2-15). These moderate correlations help to establish the validity
of these two measures.
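A validity check of this kind is a Pearson correlation between students' paired totals on the two measures. A minimal standard-library sketch, with illustrative scores rather than the actual FYS data:

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation between paired scores on two measures."""
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    return cov / (pstdev(xs) * pstdev(ys))

# Illustrative presentation and writing totals for six students.
presentation = [12, 15, 9, 18, 14, 11]
writing = [10, 16, 8, 15, 15, 9]
print(round(pearson_r(presentation, writing), 2))  # 0.88
```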
Summary Table:

Table 2-15: Percent of students reported successfully achieving Outcome 2d

                          Core Courses
Assessment                FYS - Fall       FYS - Spring     Total
Rubric 1: Presentation    87.9% (N=719)    74.2% (N=90)     86% (N=809)
Rubric 2: Writing         85% (N=710)      68.8% (N=73)     83% (N=783)
Conclusions for SLO 2d:
Based on the 2010-2011 findings for SLO 2d, the Assessment Coordinator draws the following
conclusions.
 More than 80% of MSU students adequately articulated ethical consequences during oral
presentations and written assignments.
 A greater percentage of students successfully recognized an ethical problem (84%) than
clearly explained ethical principles related to the problem (69%).
Outcome 2e: Apply knowledge and skills to new settings and complex problems
Data Source Table:
Table 2-16 indicates that two core courses have responsibility for assessing this outcome. Data
submitted in this initial year were from the First Year Seminar; in subsequent years, data
submitted by Capstone courses will supplement these data. Additionally, the CAAP Critical
Thinking Test from 2009 provides baseline data for institutional level assessment, and the NSSE
provides indirect data.
Table 2-16: Direct and Indirect Measures for Outcome 2e*

                  Core Courses                 External            Indirect
Type of Measure   FYS        Capstone          CAAP                NSSE
Test              Pre-Post                     Critical Thinking
Rubric 1          Writing    Presentation
Rubric 2                     Project
Survey                                                             2b, 2c, 2e, 11e, 11m

* Grayed courses did not submit data in 2010-2011
Measures (See Appendix A):
The GEC scheduled two core courses to provide internal direct measures of intellectual skills
associated with outcome 2e: the FYS and the Capstone course. FYS administered two
assessments designed to assess the application of knowledge and skills to new settings and
complex problems (SLO 2e). The pretest-posttest conducted in fall 2010 and spring 2011
addressed three SLOs: reading comprehension (1b), and the intellectual skills of problem-solving
(2e) and esoteric, critical and creative thinking (2f). The twenty-five-item, multiple-choice
pretest-posttest instrument used 9 items to measure reading comprehension based on the current
required reading1 assignment, and the remaining items to measure understanding of six online
course modules. Of the nine items for reading comprehension, only two aligned with outcome 2e
by applying knowledge to new settings. Faculty administered the pretest in the first week of the
course and the posttest in the last week of the course.
1 Jackson, Brooks, and Kathleen Hall Jamieson. unSpun: Finding Facts in a World of Disinformation.
The FYS Writing Assessment Scoring Guide/Rubric used in fall 2010 assessed one criterion for
outcome 2e on a weighted 5-point scale (failure, below average, average, above average, and
excellent), providing a total possible score of 20. Faculty reported the total percentage achieved
by each student. To improve the feedback to faculty, the Writing Assessment Scoring
Guide/Rubric used in spring 2011 changed two key features. Faculty assessed 4 criteria for SLO 2e
on a 3-point scale (fails to meet criterion, meets criterion, and exceeds criterion). Scores for the
outcome were the sum of the four criteria that defined its success, allowing for a possible score of
12. Faculty entered the score for each task and criterion onto a spreadsheet designed for this
purpose.
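Under the spring scheme, an outcome score is simply the sum of the four criterion ratings. A small sketch, assuming the three levels are coded 1 (fails to meet), 2 (meets), and 3 (exceeds criterion), which yields the possible score of 12 noted above:

```python
def outcome_score(ratings):
    """Sum four criterion ratings (coded 1-3) into an outcome score of 4-12."""
    assert len(ratings) == 4 and all(1 <= r <= 3 for r in ratings)
    return sum(ratings)

# A student who meets two criteria and exceeds the other two scores 10 of 12.
print(outcome_score([2, 2, 3, 3]))  # 10
```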
The GEC designed two rubrics for faculty to implement in each Capstone 499c course. While
implementation of the capstone course rubrics will not occur until the 2011-2012 academic year,
a review of the rubrics will assist in understanding the scope of the learning process for this SLO
as designed into the GE program. The two rubrics are the Capstone Presentation Assessment
Rubric and the Capstone Project Rubric. With the presentation rubric, faculty assess two
criteria for SLO 2e: summarized most meaningful ideas or findings, and discussed possible
applications. These align with items V-C and V-D of the FYS Writing Assessment Scoring
Guide/Rubric. The project rubric has 38 criteria, 15 of which assess SLO 2e and align with the
four criteria from the FYS writing rubric, as seen in Table 2-17. The alignment between the FYS
and Capstone rubrics provides opportunities to assess learning of these skill sets.
Faculty will enter data for both capstone rubrics onto spreadsheets designed for this purpose.
The Assessment Coordinator designed an “Excel Management” spreadsheet for the Capstone
Project Rubric to provide faculty and the GEC with a copy of the rubric, point scale
interpretations, and immediate analyses showing summary data and charts for category
averages, and counts & percentages for scale ratings by criteria and by category. A sample data
set may be viewed at this link: Capstone Project Excel Management.
Two external measures address SLO 2e: the CAAP Critical Thinking module and the NSSE.
The CAAP Critical Thinking Test is "a 32-item, 40-minute test that measures students'
skills in clarifying, analyzing, evaluating, and extending arguments."2 This skill set most closely
aligns with the problem solving skills and measures required for achieving MSU outcome 2e.

2 ACT, Inc. (2011, July 27). Retrieved from http://www.act.org/caap/test_thinking.html
Table 2-17: Alignment of Capstone Project Rubric SLO 2e criteria to FYS Writing Rubric

FYS Writing   Capstone Project Rubric Chapter Criteria for SLO 2e

              Statement of the Purpose
VA             Justifies the need through relevant, current literature
VA             Specifies key research objectives to accomplish
VA             Defines the scope and limitations to focus the project
               Defines unique terminology

              Review of Resources
VB             Chooses sources (literature, records, interviews, focus groups) relevant to support the need
VC             Summarizes significant ideas derived from the reviewed references/resources
VA             Synthesizes concepts to show relationships and patterns of knowledge

              Methods
               Identifies objectives to accomplish the project's purpose
               Develops an implementation plan as a guide to project completion
               Presents an evaluation plan to determine level of project accomplishment
VC             Describes the data/findings collection methods
VD             Describes the data analysis (quantitative and qualitative) methods utilized

              Conclusions & Recommendations
V              Accurately synthesizes the information generated from the project to draw conclusions and
               recommendations
               Draws conclusions about the effectiveness of the project to accomplish the purpose
               Recommends at least one application of the results
Findings:
The test instrument used in the First Year Seminar course addressed student learning outcomes in
the categories of reading comprehension skills (1b) and intellectual skills (2e & 2f). The testing
process only captured total scores. So, while t-Test results (Table 1-7, Appendix B) comparing
pretest to posttest scores show significant (p < .001) learning, it is not possible to generalize to
SLO achievement.
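The pretest-posttest comparison referred to here is a paired t-test on per-student gains. A minimal standard-library sketch of the statistic, using illustrative scores rather than the actual FYS data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """t statistic for paired pretest-posttest gains (post minus pre)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Illustrative scores for five students on a 25-item test.
pre = [12, 14, 11, 15, 13]
post = [18, 17, 16, 20, 15]
print(round(paired_t(pre, post), 2))  # 5.72
```

The resulting t is compared against the t distribution with n - 1 degrees of freedom to obtain the reported p-value.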
Table 2-18 shows that, in the fall, 85% of 710 students' writing assignments received at least an
average assessment for the "application of knowledge and skills to new settings and/or complex
problems."
Table 2-18: FYS Fall Writing Rubrics Score Distribution for Outcome 2e

SLO 2e    Failure (<5)   Below Avg (5-9)   Avg (10-14)   Good (15-19)   Excellent (20)   Total
          N     Pct      N     Pct         N     Pct     N     Pct      N     Pct        N     Pct
Writing   88    12.4%    17    2.4%        315   44.3%   110   15.5%    180   25.3%      710   99.9%
Table 2-19 shows that, in the spring, the criteria for the application of knowledge and skills to
new settings and/or complex problems were met by at least 78% of 73 students' writing
assignments. More students met the criteria based on identification (83% and 86%) than the
higher order skills of synthesizing (78%) and drawing conclusions (78%). The percentage of
students who met all criteria was 62%, and 70% met at least three of the four criteria.
Synthesizing key ideas had the smallest percentage of students exceeding the criterion.
Table 2-19: FYS Spring Rubrics Score Distribution for Outcome 2e (N=73)

V. Applied knowledge and skills to new settings and/or complex problems (SLO 2e)

                                              Fails to Meet   Meets           Exceeds         At least   Avg
                                              Criterion       Criterion       Criterion       Met
                                              Count  Pct      Count  Pct      Count  Pct      Pct
A. Applied knowledge to identify
   information needed to address the
   situation or problem                       9      14.1%    32     50.0%    23     35.9%    85.9%      2.22
B. Identified concepts & principles
   relevant to the topic                      11     17.2%    35     54.7%    18     28.1%    82.8%      2.11
C. Synthesized key ideas                      14     21.9%    37     57.8%    13     20.3%    78.1%      1.98
D. Drew conclusions related to the problem    14     21.9%    29     45.3%    21     32.8%    78.1%      2.11
Overall: Met all 4 criteria                   40     62.5%                                               7.38
         Met 3 of 4 criteria                  45     70.3%
Assuming that criteria of “average” performance and “meets criterion” describe adequate
performance in both cases, a simple calculation can be used to combine fall and spring semester
results. This calculation, which takes the number of students assessed into account, shows 83%
of 783 students adequately applied knowledge and skills to new settings and/or complex
problems in a written assignment.
The GEC approved about 20 capstone courses to commence in 2011-2012. Others are scheduled
for approval at the first fall 2011 meeting to bring the total capstone courses close to 40. No
results based on these assessments will be available until summer 2012.
The 2009 NSSE data show average scores for MSU seniors' perceptions to be significantly
greater (p < .05) than NSSE averages on two items aligned with outcome 2e: thinking critically
and analytically (11e) and solving complex real-world problems (11m). Three additional items
aligned with SLO 2e showed perceptions of MSU seniors to be comparable to the NSSE mean.
These included analyzing basic elements of an idea, experience, or theory . . . (2b); synthesizing
and organizing ideas, information, or experiences into new, more complex interpretations and
relationships (2c); and applying theories or concepts to practical problems in new situations (2e)
(NSSE Table 2, Appendix B).
CAAP data from 2009, shown in Table 2-20, provide baseline data from which to assess progress
of the new GE paradigm categories of intellectual skills and quantitative skills. These data show
the mean for 205 MSU seniors (64.3) to be greater than the national mean (62.0). Data also
identified 204 of the seniors as having enrolled at MSU as freshmen. Future CAAP data will
provide an indicator of general education program effectiveness.
Table 2-20: CAAP 2009-10 Baseline Data for SLO 2e

SLO Category Assessed: 2. Intellectual Skills: CAAP Critical Thinking

        10/2009     2/2010     2009-10
        Freshmen    Seniors    Natl
Mean    60.0        64.3       62.0
SD      4.6         4.6        5.4
N       208         205        9531
Senior perceptions on the 2009 NSSE specific to SLO 2e were significantly (p < .05) greater than
the NSSE mean for solving complex real-world problems (11m). Perceptions were comparable
to NSSE means for synthesizing and organizing ideas, information, or experiences into new,
more complex interpretations and relationships (2c); and applying theories or concepts to
practical problems in new situations (2e) (NSSE Table 2, Appendix B).
Summary Table:

Table 2-21: Percent of students reported successfully achieving Outcome 2e*

              Core Courses               External            Indirect
Assessment    FYS        Capstone        CAAP                NSSE
Test          Pre-Post                   Critical Thinking
Rubric 1      Writing    Presentation
Rubric 2                 Project
Survey                                                       2c, 2e, 11m

* Grayed data were not available
Conclusions for SLO 2e:
Based on the 2010-2011 findings for SLO 2e, the Assessment Coordinator draws the following
conclusions.
 While t-Test results comparing pretest to posttest scores reveal significant (p<.001) learning,
the FYS data collection process was insufficient to allow a pretest-posttest assessment by
SLO.
 83% of 783 students adequately applied knowledge and skills to new settings and/or complex
problems in a written assignment.
 A sample of 205 MSU 2010 seniors performed better than the national mean on the CAAP
Critical Thinking Test.
 Average scores for MSU seniors' perceptions were comparable to or significantly greater
(p < .05) than NSSE averages on five items aligned with outcome 2e, including thinking
critically and analytically (11e) and solving complex real-world problems (11m).
Outcome 2f: Explore the connections among practical, esoteric, critical and creative
thinking
Data Source Table:
Table 2-22 indicates that two core courses have responsibility for assessing this outcome. Due to
problems associated with test construction, data were not submitted in this initial year from the
First Year Seminar or the Capstone courses.
Table 2-22: Direct and Indirect Measures for Outcome 2f*

                  Core Courses
Type of Measure   FYS        Capstone
Test              Pre-Post
Rubric                       Project

* Grayed courses did not submit data in 2010-2011
Measures (See Appendix A):
The GEC scheduled two core courses to provide internal direct measures of intellectual skills
associated with outcome 2f: the FYS and the Capstone course. FYS administered a pretest-posttest
in the fall and spring semesters of 2010-2011 to address three SLOs: reading comprehension
(1b), and the intellectual skills of problem-solving (2e) and esoteric, critical and creative thinking
(2f). The twenty-five-item, multiple-choice pretest-posttest instrument used 9 items to measure
reading comprehension based on the current required reading3 assignment, and the remaining
items to measure understanding of six online course modules. Of the nine items for reading
comprehension, two aligned with SLO 2e and none aligned with SLO 2f. Faculty administered
the pretest in the first week of the course and the posttest in the last week of the course.
The Capstone Project Rubric has 38 criteria, 3 of which assess SLO 2f, explore the connections
among practical, esoteric, critical and creative thinking. These three criteria are bulleted below.
The Excel Management spreadsheet does not currently show assessment by SLO.
• Identifies project strengths: characteristics that positively impact its efficiency or
effectiveness
• Identifies project weaknesses: characteristics that negatively impact its efficiency or
effectiveness
• Identifies unresolved issues or areas for further research

3 Jackson, Brooks, and Kathleen Hall Jamieson. unSpun: Finding Facts in a World of Disinformation.
The notable absence of the CAAP Critical Thinking Test deserves mention. While it would
seem logical that this test would be included as a measure for an SLO associated with critical
thinking, the test content is actually more closely aligned with SLO 2e.
Findings:
The test instrument used in the First Year Seminar course was intended to address student learning
outcomes in the categories of reading comprehension skills (1b) and intellectual skills (2e & 2f).
Though results of a t-Test comparing pretest and posttest scores show significant (p < .001)
learning, the lack of alignment of test items to SLO 2f fails to support achievement of this SLO.
Summary Table:
Table 2-23: Percent of students reported successfully achieving Outcome 2f*

                  Core Courses
Type of Measure   FYS        Capstone
Test              Pre-Post
Rubric                       Project

* Grayed data were not available
Conclusions for SLO 2f:
Based on the lack of 2010-2011 data for SLO 2f, the Assessment Coordinator can draw no
conclusions.
Goal 2 Recommendations
Recommendations specific to SLOs 2a-2f:
1) While most (89%) of ENG 100 students achieved a 70% score on the library quiz, less than
half of ENG 200 students successfully achieved this level on a similar though much shorter
quiz. It is recommended that subscores be developed for the ENG 200 quiz to determine
which SLO 2a task, locating or evaluating information, presented the most difficulty for
students. (2a)
2) While about three-fourths of students said their experiences and practice using online
databases or electronic resources improved, only about two-thirds of the students indicated
increased willingness to use these skills in the future. It is recommended that this type of
library experience with specialized online databases and electronic research tools be
continued and encouraged for achieving SLO 2a.
3) As we move toward improving the process, future assessments must capture all of the
assessment data to allow a more detailed breakdown of the FYS pretest-posttest scores by
SLO. (2e)
General Recommendations based on SLOs 2a-2f:
4) To increase usefulness and applicability of data submitted, it is recommended that
instruction be provided to faculty on the purpose and need for disaggregating data by
the elements of an outcome. This could be in the form of a workshop on the reporting
of assessment findings. (2a) (2c)
5) It is recommended that strategies be developed for improving retention rates. (2a)
6) Faculty need to apply analyses that adequately address levels of student outcome
success. While the items used to assess SLO 2b reasoning skills were logical, the percentage
of students successfully recognizing or using reasoning skills could not be determined. (2b)
7) Faculty need to coordinate efforts in assessing the same outcome. (2b)
8) The new writing rubric looked at two performance indicators for an SLO previously assessed
as a whole. This approach provided valuable information about student performance. As
MSU moves toward process improvement, future data collection should attempt to
increase the performance indicators for SLOs. (2d)
Goal 3 – Adequate Quantitative Skills
Each of the Data Source Tables for the student learning outcomes related to this goal show
supporting data from two external measures; the CAAP Mathematics Test, a direct measure, and
the NSSE, an indirect measure. Neither measure has a score nor subscore related to an
individual SLO, but rather assesses this goal as a whole. As such, descriptions under the goal
heading provide the relevant information related to these measures to eliminate unnecessary
replication of information.
CAAP Mathematics Test4. The CAAP (Collegiate Assessment of Academic Proficiency)
Mathematics Test is a 35-item, standardized, nationally normed direct measure used by MSU to
assess students' proficiency in using quantitative reasoning to solve mathematical problems. The
overall test represents six content areas: prealgebra, elementary algebra, intermediate algebra,
coordinate geometry, college algebra, and trigonometry. The test has a 40-minute time limit.
CAAP data from 2009 shown in Table 3-1 provide baseline data from which to assess progress
of the new GE paradigm category of quantitative skills. Future CAAP data will provide an
indicator of core course effectiveness. This table represents only 21 seniors.
Table 3-1: CAAP 2009-10 Baseline Data for the Quantitative Skills SLO Category

SLO Category Assessed: 3. Quantitative Skills: CAAP Math (3/2009)

                                  Freshmen  Sophomore  Junior  All Students         Natl PB
                                                               (Includes Seniors)
Mean                              57.4      58.1       56.9    57.5                 58.5
SD                                3.5       3.3        3.6     3.5                  4.0
N                                 103       80         31      235                  21678
% Completed GE Math requirement   35.0      66.3       77.4    53.6
NSSE offers two items related to quantitative skills. The first item (11f - Analyzing quantitative
problems) is general to goal 3. The second item (2d), which more closely aligns with SLO 3c,
addresses the concept of data verification at a more global level.
Senior mean scores for perceptions on the 2009 NSSE specific to Goal 3, adequate quantitative
skills, were significantly greater than (p < .001) the NSSE mean for analyzing quantitative
problems (11f), and for making judgments about the value of information, arguments, or
methods, such as examining how others gathered and interpreted data and assessing the
soundness of their conclusions (2d). (NSSE Table 2, Appendix B).
4 ACT, Inc. (2011, Aug 4). CAAP Mathematics Test. Retrieved from http://www.act.org/caap/test_math.html
Outcome 3a. Analyze situations and/or problems using arithmetic, geometric, algebraic,
and statistical methods.
Data Source Table:
Table 3-2 shows that five core math courses have responsibility for assessing this outcome. Data
submitted in this initial year were from two sections of Math 131 and three sections of Math 135.
Additionally, the external CAAP Mathematics Test will provide data to support this goal.
Table 3-2: Direct and Indirect Measures for Outcome 3a*

Type of    Core Courses                                                              External      Indirect
Measure    MATH 131          MATH 135        MATH 152   MATH 174   MATH 175          CAAP          NSSE
Rubric 1   Sampling Theory   Authentic       NA         NA         NA
           Group Project     Group Project
Rubric 2   Lab Report
Rubric 3   Exam Question     Exam Question
Test                                                                                 Mathematics
Survey                                                                                             11f

*NA – Not available
Measures (See Appendix A):
Two core math courses (Math 131 and Math 135) provided data to support the three SLOs
associated with Goal 3. They each used a group project rubric and an exam question rubric to
assess SLO 3a. Math 131 switched to a laboratory report rubric in the spring.
A fall faculty member in Math 131 used the Sampling Theory Group Project Rubric to assess
outcome 3a. This rubric defined six tasks scored on a scale from 0 to 4, including requirements to
define, describe, graph, discuss, and explain variables associated with sampling procedures. A
spring faculty member for this course used a Laboratory Report Common Rubric to assess a
homework and/or lab assignment in which students compared two sets of data and described
their distributions. This rubric defined five factors scored on a scale from 0 to 3, including
understanding the project's purpose, carrying out computer program steps, publishing the steps,
keeping well organized, and arriving at correct conclusions about stratified random samples.
Neither of these rubrics defined the scale, so it was not possible to discern the level of
achievement assessed.
Math 135 faculty used the Authentic Group Project rubric to assess a project on Buying a House.
This rubric comprised five criteria (organized, complete, knowledge of expenses and the
mathematics, and amortization table production) across three levels. While each level was
characterized for each criterion, the levels themselves were not defined. Hence, the reviewer
could not determine the level necessary for outcome achievement.
Faculty in both Math 131 and Math 135 used a rubric to score exam questions in determining
achievement of this outcome. The Exam Question Rubric 1 for Math 131 had three criteria,
while Exam Question Rubric 2 for Math 135 had four criteria. The nature of the exam item may
affect the number and type of criteria. None of the criteria seems to address the outcome directly.
Findings:
Four tables (3-3 through 3-6) present the results from two math courses (Math 131 and Math
135) for SLO 3a. The first two tables show results based on projects for each course followed
by two tables showing results based on exam questions. More students in Math 131
demonstrated analyzing situations and/or problems using arithmetic, geometric, algebraic, and
statistical methods through projects (73.5%) compared to exam questions (68.9%). Just the
opposite was true for students in Math 135 where more students demonstrated SLO achievement
through exam questions (76.8%) compared to a project (61.0%). At least 69% of the students
assessed in Math 131 and 61% of those assessed in Math 135 achieved outcome 3a.
Table 3-3 also revealed a lack of complete participation in the assessment process by Math 131
faculty. Faculty in each core course were to implement two assessments for each outcome. Less
than half of these faculty completed both assessments.
Table 3-3: Percent of Math 131 students reported successfully achieving Outcome 3a
based on Lab Reports and a Group Project

                                         Students   Students Meeting Outcome
Section            Measure               Number     Number    Percentage
006 - Fall 2010    Lab Report            30         28        93.33%
003 - Fall 2010    Did not complete
004 - Fall 2010    Did not complete
005 - Fall 2010    Did not complete
005 - Spring 2011  Group Project         19         8         42.11%
006 - Spring 2011  Did not complete
Totals                                   49         36        73.5%
Table 3-4: Percent of Math 135 students reported successfully achieving Outcome 3a
based on a Group Project

                                             Students   Students Meeting Outcome
Section            Measure                   Number     Number    Percentage
001 - Fall 2010    Authentic Group Project   26         14        53.85%
004 - Fall 2010    Authentic Group Project   25         12        48.00%
093 - Spring 2011  Authentic Group Project   31         24        77.42%
Totals                                       82         50        61.0%
Table 3-5: Percent of Math 131 students reported successfully achieving Outcome 3a
based on Exam Questions

                                          Students   Students Meeting Outcome
Section            Measure                Number     Number    Percentage
006 - Fall 2010    Exam Question Rubric   30         13        43.33%
003 - Fall 2010    Exam Question Rubric   32         20        62.50%
004 - Fall 2010    Exam Question Rubric   30         19        63.33%
005 - Fall 2010    Exam Question Rubric   33         24        72.73%
005 - Spring 2011  Exam Question Rubric   19         18        94.74%
006 - Spring 2011  Exam Question Rubric   36         30        83.33%
Totals                                    180        124       68.9%
Table 3-6: Percent of Math 135 students reported successfully achieving Outcome 3a
based on Exam Questions

                                          Students   Students Meeting Outcome
Section            Measure                Number     Number    Percentage
001 - Fall 2010    Exam Question Rubric   26         22        84.62%
004 - Fall 2010    Exam Question Rubric   25         18        72%
093 - Spring 2011  Exam Question Rubric   31         23        74.19%
Totals                                    82         63        76.8%
Summary Table:

Table 3-7: Percent of students reported successfully achieving Outcome 3a*

                           Core Courses                                            External      Indirect
Assessments                MATH 131   MATH 135   MATH 152   MATH 174   MATH 175    CAAP          NSSE
Rubric 1: Group Project    42.1%      61%        NA         NA         NA
Rubric 2: Lab Report       93.3%
Rubric 3: Exam Question    68.9%      76.8%
Test                                                                               Mathematics
Survey                                                                                           > NSSE mean

*NA – Not available
Conclusions for SLO 3a:
Based on the 2010-2011 findings for SLO 3a, the university draws the following conclusions.
 At least 69% of the students assessed in Math 131 and 61% of those assessed in Math 135
achieved outcome 3a.
 Math rubrics did not have defined achievement scales, making interpretation of results
difficult.
 Math results were provided for the overall student sample, which is beneficial for reporting
outcome accomplishment, but effectively loses the data on task performance that could
provide useful information for improving classroom instruction.
3b. Use deductive reasoning in a formal, symbolic, axiomatic system.
Data Source Table:
Table 3-8 shows that five core math courses have responsibility for assessing this outcome. Data
submitted in this initial year were from two sections of Math 131 and three sections of Math 135.
Additionally, the external CAAP Mathematics Test will provide data to support this goal.
Table 3-8: Direct and Indirect Measures for Outcome 3b*

Type of    Core Courses                                                              External      Indirect
Measure    MATH 131            MATH 135         MATH 152   MATH 174   MATH 175       CAAP          NSSE
Rubric 1   Exam Question       Exam Questions   NA         NA         NA
Rubric 2   In-Class Activity
Test                                                                                 Mathematics
Survey                                                                                             11f

*NA – Not available
Measures (See Appendix A):
Faculty in both Math 131 and Math 135 used a rubric to score exam questions in determining
achievement of this outcome. The Exam Question Rubric 1 for Math 131 had three criteria and
was used in one assessment, while Exam Question Rubric 3 for Math 135 had four criteria and
was used in two assessments. The nature of the exam item may affect the number and type of
criteria. None of the criteria seems to address the outcome directly.
Faculty of Math 131 used an In-Class Activity Rubric to assess students' demonstration of
deductive reasoning as they assigned order requirements to a series of tasks involved in
completing a project, created a graph to represent these requirements, created a schedule,
and determined whether this schedule was optimal. The rubric had four tasks assessed on a
4-point scale ranging from 3 to 0, with 0-1 treated as a single range. While each level
was characterized for each task, the levels themselves were not defined. Hence, the reviewer
could not determine the level necessary for outcome achievement.
Findings:
Four tables (3-9 through 3-12) present the results from two math courses (Math 131 and Math
135) for SLO 3b. The first table shows results based on an in-class activity for students in
Math 131, followed by three tables showing results based on exam questions for both courses.
More students in Math 131 demonstrated using deductive reasoning in a formal, symbolic,
axiomatic system through the in-class activity (87.5%) than through exam questions (57.2%).
Students in Math 135 had two assessments through exam questions, and more of them succeeded
on the first assessment (78%) than on the second (51%). Responses by criterion of the
rubric would be valuable for showing where students had difficulty with the second task.
Table 3-9: Percent of Math 131 students reported successfully achieving Outcome 3b
based on an In-Class Activity

                                        Students   Students Meeting Outcome
Section            Measure              Number     Number    Percentage
006 - Fall 2010    In-Class Activity    30         27        90%
003 - Fall 2010    In-Class Activity    32         28        87.5%
004 - Fall 2010    In-Class Activity    30         26        86.67%
005 - Fall 2010    In-Class Activity    33         31        93.94%
005 - Spring 2011  In-Class Activity    19         14        73.68%
006 - Spring 2011  Did not complete
Totals                                  144        126       87.5%
Table 3-10: Percent of Math 131 students reported successfully achieving Outcome 3b
based on Exam Questions

                                          Students   Students Meeting Outcome
Section            Measure                Number     Number    Percentage
006 - Fall 2010    Exam Question Rubric   30         13        43.33%
003 - Fall 2010    Exam Question Rubric   32         22        66.67%
004 - Fall 2010    Exam Question Rubric   30         13        43.33%
005 - Fall 2010    Exam Question Rubric   33         19        57.58%
005 - Spring 2011  Exam Question Rubric   19         18        94.74%
006 - Spring 2011  Exam Question Rubric   36         18        50.0%
Totals                                    180        103       57.2%
Table 3-11: Percent of Math 135 students reported successfully achieving Outcome 3b
based on Exam Questions

                                            Students   Students Meeting Outcome
Section            Measure                  Number     Number    Percentage
001 - Fall 2010    Exam Question Rubric 1   26         24        92.31%
004 - Fall 2010    Exam Question Rubric 1   25         17        68.0%
093 - Spring 2011  Exam Question Rubric 1   31         23        74.19%
Totals                                      82         64        78.0%
Table 3-12: Percent of Math 135 students reported successfully achieving Outcome 3b
based on Exam Questions

                                            Students   Students Meeting Outcome
Section            Measure                  Number     Number    Percentage
001 - Fall 2010    Exam Question Rubric 2   26         10        38.46%
004 - Fall 2010    Exam Question Rubric 2   25         10        40.0%
093 - Spring 2011  Exam Question Rubric 2   31         22        70.97%
Totals                                      82         42        51.2%
Summary Table:

Table 3-13: Percent of students reported successfully achieving Outcome 3b*

                              Core Courses                                           External      Indirect
Assessments                   MATH 131   MATH 135   MATH 152   MATH 174   MATH 175   CAAP          NSSE
Rubric 1: In-Class Activity   87.5%                 NA         NA         NA
Rubric 2: Exam Question 1     57.2%      78.0%
Rubric 2: Exam Question 2                51.2%
Test                                                                                 Mathematics
Survey                                                                                             > NSSE mean

*NA – Not available
Conclusions for SLO 3b:
Based on the 2010-2011 findings for SLO 3b, the university draws the following conclusions.
 At least 57% of the students assessed in Math 131 and 51% of those assessed in Math 135
achieved outcome 3b.
 Math rubrics did not have defined achievement scales, making interpretation of results
difficult.
 Math results were provided for the overall student sample, which is beneficial for reporting
outcome accomplishment, but effectively loses the data on task performance that could
provide useful information for determining students' areas of weakness and strength.
3c. Verify answers to mathematical and scientific problems in order to determine
reasonableness, identify alternative methods of solution, and select the most reliable results.
Data Source Table:
Table 3-14 shows that five core math courses have responsibility for assessing this outcome. Data
submitted in this initial year were from two sections of Math 131 and three sections of Math 135.
Additionally, the external CAAP Mathematics Test will provide data to support this goal.
Table 3-14: Direct and Indirect Measures for Outcome 3c*

Type of     Core Courses                                                             External           Indirect
Measure     MATH 131        MATH 135        MATH 152   MATH 174   MATH 175
Rubric 1    Authentic       Authentic       NA         NA         NA
            Group Project   Group Project
Rubric 2    Exam Question   Exam Question
Test                                                                                 CAAP Mathematics
Survey                                                                                                  NSSE 11f, 2d
*Grayed measures were not available
Measures (See Appendix A):
Faculty of Math 131 and Math 135 used the Authentic Group Project rubric, first described for
Math 135 faculty in the assessment of SLO 3a, to assess a project on Buying a House. This rubric
comprised five criteria (organized, complete, knowledge of expenses, knowledge of the
mathematics, and amortization table production) across three levels. While each criterion was
characterized at each of the three levels, the levels themselves were not defined. Hence, the reviewer
could not determine the level necessary for outcome achievement. More importantly, it is
difficult to ascertain the relationship between these criteria and the SLO. There is no criterion
for verifying answers from which to make a direct assessment.
Math 131 faculty planned to use an Exam Question Rubric as a second assessment for this
outcome, but did not identify it. Faculty from Math 135 used Exam Question Rubric 3 defined
for SLO 3b. The reviewer had similar difficulties in relating the four criteria of this rubric
(answers complete, organized, work shown, and correct) to the outcome assessed (SLO 3c).
Findings:
Four tables (3-15 through 3-18) present the results from two math courses (Math 131 and Math
135) assessing students' performance in verifying answers for outcome 3c. The first two tables
show results based on an authentic group project built on mathematical concepts and principles
applicable to buying a house. The next two tables show results based on the assessment of exam
questions for both courses. Only one section of Math 131 reported results using either
assessment for this outcome, showing that less than half (47%) of the students verified answers. A
greater proportion of students in Math 135 demonstrated verifying answers to a mathematical
problem through an exam question (77%) than through a project (71%). Responses to each criterion
of the rubric would be valuable for showing where students had difficulty with the second task.
A greater proportion of students in Math 135 demonstrated verifying answers to a mathematical
problem through the authentic project (71%) than students in Math 131 (47%). Collectively, at
least 67 of 101 students (66.3%) verified answers to a mathematical problem.
Table 3-15: Percent of Math 131 students reported successfully achieving Outcome 3c
based on a Group Project

Section             Measure                    Students    Students Meeting Outcome
                                               Number      Number      Percentage
006 - Fall 2010     Authentic Group Project    Did not complete
003 - Fall 2010                                Did not complete
004 - Fall 2010                                Did not complete
005 - Fall 2010                                Did not complete
005 - Spring 2011                              Did not complete
006 - Spring 2011                              19          9           47.39%
Totals                                         19          9           47.39%
Table 3-16: Percent of Math 135 students reported successfully achieving Outcome 3c
based on a Group Project

Section             Measure                    Students    Students Meeting Outcome
                                               Number      Number      Percentage
001 - Fall 2010     Authentic Group Project    26          19          73.08%
004 - Fall 2010                                25          15          60.00%
093 - Spring 2011                              31          24          77.42%
Totals                                         82          58          70.7%
Table 3-17: Percent of Math 131 students reported successfully achieving Outcome 3c
based on Exam Questions

Section             Measure                    Students    Students Meeting Outcome
                                               Number      Number      Percentage
006 - Fall 2010     Exam Question              NA          NA          NA
003 - Fall 2010     Rubric - Undefined         NA          NA          NA
004 - Fall 2010                                NA          NA          NA
005 - Fall 2010                                NA          NA          NA
005 - Spring 2011                              NA          NA          NA
006 - Spring 2011                              NA          NA          NA
Totals                                         NA          NA          NA

NA = Not Available
Table 3-18: Percent of Math 135 students reported successfully achieving Outcome 3c
based on Exam Questions

Section             Measure                   Students    Students Meeting Outcome
                                              Number      Number      Percentage
001 - Fall 2010     Exam Question Rubric 3    26          24          92.31%
004 - Fall 2010                               25          17          68.00%
093 - Spring 2011                             31          23          74.19%
Totals                                        82          63          76.8%
Summary Table:

Table 3-19: Percent of students reported successfully achieving Outcome 3c*

                            Core Courses                                           External           Indirect
Assessments                 MATH 131   MATH 135   MATH 152   MATH 174   MATH 175
Rubric 1: Group Project     47%        71%        NA         NA         NA
Rubric 2: Exam Question     NA         77%        NA         NA         NA
Test                                                                               CAAP Mathematics
Survey                                                                                                > NSSE
*NA - not available
Conclusions for SLO 3c:
Based on the 2010-2011 findings for SLO 3c, the university draws the following conclusions.
 At least 66.3% of 101 students verified answers to a mathematical problem in order to
determine reasonableness, identify alternative methods of solution, and select the most
reliable results.
Goal 3 Recommendations
Recommendations specific to SLOs 3a-3c:
1) Math rubrics need to be revised to attach meaning to the assessment scale.
2) Math faculty need to summarize data for all students by course across the year.
General Recommendations based on SLOs 3a-3c:
3) As we move toward improving the process, future assessments must capture all of the
assessment data to allow a more detailed breakdown of the FYS pretest-posttest scores by
SLO.
4) Comparable data, and the ability to interpret the data consistently, are essential.
The reviewer recommends that the GEC work with faculty to ensure that rubrics represent
the outcome, are consistent, and have adequately defined scales.
Goal 4 – Applicable Knowledge of Human Cultures
Each of the Data Source Tables for the student learning outcomes related to this goal shows
supporting data from an external, indirect measure: the National Survey of Student Engagement
(NSSE). One measure on the NSSE (11l – Understanding people of other racial and ethnic
backgrounds) relates to this goal as a whole. As such, descriptions under the goal heading
provide the relevant information related to these measures to eliminate unnecessary replication of
information.
The senior mean score for perceptions on the 2009 NSSE specific to Goal 4 – Applicable Knowledge
of Human Cultures was comparable to the NSSE mean for understanding people of other racial
and ethnic backgrounds (11l). (NSSE Table 2, Appendix B.)
4a. Examine the history of the United States and explain the basic principles and operation
of the United States government with a view to being a responsible citizen
Data Source Table:
Table 4-1 indicates that only one distribution course has responsibility for assessing this
outcome. Additionally, data were available from NSSE.
Table 4-1: Direct and Indirect Measures for Outcome 4a*

Type of Measure    Distribution course    Indirect
                   SBS 1
                   GOVT 362               NSSE
Test               Exam Questions
Rubric             Research Paper
Survey                                    11i, 11l
*Grayed measures were not available
Measures (See Appendix A):
Students in GOVT 362, a transitional distribution course, were to be evaluated on their ability to
identify the U.S. position on a variety of critical foreign policy issues through various in-class
assignments and discussions. Exam questions would require the student to analyze U.S. national
interests in various regions of the world and prescribe state behavior that best protects those
interests. In addition, students would write a research paper requiring them to analyze U.S.
interests with respect to the topic chosen.
The 2009 NSSE data show MSU seniors' perceptions to be comparable to the NSSE mean for
voting in local, state, or national elections (11i). (NSSE Table 2, Appendix B.)
Findings:
The assessment coordinator received no results.
Summary Table:

Table 4-2: Percent of students reported successfully achieving Outcome 4a*

Assessment    Distribution course    Indirect
              SBS 1
              GOVT 362               NSSE
Test          NA
Rubric        NA
Survey                               = 11i, 11l
*NA = not available
Conclusions for SLO 4a:
Due to the lack of data, no conclusions could be drawn.
4b. Investigate the worldview and/or history of cultures outside the United States
Data Source Table:
Table 4-3 indicates that one core course, the First Year Seminar, and three distributional courses
from the humanities have responsibility for assessing this outcome. Data submitted in this initial
year were from the First Year Seminar and two transitional distribution courses from Humanities
and English. Additionally, data were available from NSSE.
Table 4-3: Direct and Indirect Measures for Outcome 4b*

Type of     Core Courses    Distribution Courses                              Indirect
Measure     FYS             HUM 1         HUM 1       HUM 2
                            TC HUM 170    CMEM 210    TC ENG 211              NSSE
Test                        Exam 1                    Essay Questions
Rubric 1    Presentation    Paper         Essay 1     Essay
Rubric 2                    Essay
Survey                                                                        11l
*Grayed measures were not available
Measures (See Appendix A):
The FYS Presentation Rubric used in fall 2010 assessed one criterion for outcome 4b on a
weighted 5-point scale (failure, below average, average, good, and excellent), providing a total
possible score of 20. To improve the feedback to faculty, the spring 2011 FYS Presentation
Rubric had faculty indicate whether the presentation showed evidence of investigation
of a worldview (global perspective) or of a history of non-US culture on a 3-point scale. Faculty
indicated whether the evidence fails to meet criterion, meets criterion, or exceeds criterion,
providing a total possible score of 3. Faculty then checked which of the two parts students
presented to achieve the outcome: the worldview or the history.
Three distribution courses from humanities (TC HUM 170, CMEM 210, and TC English 211)
assessed outcome 4b. HUM 170 faculty assessed the outcome through a paper and an essay. The
common paper produced by students, according to the proposal, “analyzes a significant work of
world cinema in its proper cultural, historical and aesthetic content.” Neither the proposal nor
the results provided details regarding the elements assessed or the total points possible. The
exam essay, according to the proposal, covered “general terminology relevant to the
study of all film as well as terms, concepts, and classifications that are particular to highly
specific yet widely influential movements in the international history of cinema.” The faculty
developed a HUM 170 Rubric for 4b to assess essays. Five criteria were defined and evaluated on a
scale of 0 – 4 (0=missing, 1=not much in evidence, incomplete and/or erroneous, 2=evident and
factually accurate, 3=evident and well defined, 4=evident and superbly stated). It was clear from
the scoring guide that the essay was worth 20 points. Faculty also reported results for essay
questions, but it was unclear how these differed from the essays and how they were evaluated.
Based on results, the essays appeared to be worth 20 points each. Faculty set a score of nine (9)
as an acceptable achievement level.
English 211 used two measures in the fall: an essay exam and a written essay to demonstrate
reading comprehension and investigation of the worldview and/or history of cultures outside the
United States. Students wrote unit essay exam answers to demonstrate their skill in selecting
details from the readings to contrast similar main characters and value systems (1b, 4b) in three
assigned readings from different eras and/or cultures. Students also wrote a clearly organized
essay based on research of published views to convey the diversity of perspectives found on
one character/theme/title/major issue from the readings in the course (1b, 1c, 4b). No rubric was
provided which distinguished outcome accomplishments. ENG 211 was a transitional course
and will not be a part of the future General Education program, thus limiting the usefulness of
the data from these courses.
Findings:
Faculty from 31 First Year Seminar Fall 2010 sections reported results for 719 students, or
94% of the 765 students reported. Table 4-4 shows that 87.4% (N=629) of the students giving a
presentation performed at least at an average level on the criteria assessed for investigating the
worldview and/or history of cultures outside the United States. This information does not
indicate which part of the outcome students achieved.
Table 4-4: Presentation Rubric Score Distribution

           Failure       Below Avg     Avg           Good           Excellent     Total
Outcome    N      Pct    N      Pct    N      Pct    N      Pct     N      Pct    N      Pct
4b         61     8.5%   29     4.0%   77     10.7%  388    53.9%   164    22.8%  719    99.9%
Table 4-5 shows that, in the spring, faculty reported results for 66 students or 73% of the 90
reported. Of 66 students giving presentations, 84.8% (N=56) at least met the criteria for
investigating the worldview and/or history of cultures outside the United States. Results also
revealed that 91% of students investigated a worldview (global perspective) compared to 9% that
investigated the history of a non-US culture.
Assuming that ratings of “average” performance and “meets criterion” both describe adequate
performance, a simple weighted calculation can combine the fall and spring semester
results. This calculation, which takes the number of students assessed into account, shows that
87.3% of students giving a presentation successfully achieved outcome 4b.
Table 4-5: FYS Spring Rubrics Score Distribution for Outcome 4b

SLO 4b: Investigate the worldview and/or history of cultures outside the United States

Presentation item:                    N    Fails to Meet     Meets Criterion    Exceeds Criterion    At Least    Avg
V. Investigation was evident               Criterion                                                 Met SLO
of a (mark area investigated)              Count    Pct      Count    Pct       Count    Pct         Pct
                                      90   10       15.2%    27       40.9%     29       43.9%       84.8%       2.29
   A. worldview (global
      perspective) OR                      60       90.9%
   B. history of non-US culture            6        9.1%
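The weighted combination described above is a pooled proportion across the two semesters. A minimal sketch, using the counts reported in Tables 4-4 and 4-5, reproduces the 87.3% figure:

```python
# Pooled proportion combining fall and spring FYS presentation results
# for SLO 4b; counts are those reported in Tables 4-4 and 4-5.
fall_met, fall_assessed = 629, 719      # fall: at least "average"
spring_met, spring_assessed = 56, 66    # spring: at least "meets criterion"

pooled = (fall_met + spring_met) / (fall_assessed + spring_assessed)
print(f"{pooled:.1%}")  # 87.3%
```

The pooled value weights each semester by the number of students assessed, which is why it differs slightly from a simple average of 87.4% and 84.8%.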
Tables 4-6 and 4-7 present data from different faculty for the transitional course HUM 170. The
two tables present results for two different sets of assessments. It was not clear how they related:
whether the Paper in Table 4-6 was the Essay in Table 4-7, and whether the Essay Questions were the
same for both sets of data. Results did not report the total number of students assessed, so
consolidation of data was not possible. The percentage of students reaching the defined target is
only reported for the data in Table 4-6, but these data do not indicate whether students had to reach a
9 on one assignment or both to be counted.
Table 4-6: TC HUM 170 Spring Rubrics Score Distribution for Outcome 4b

              HUM 170 301                 HUM 170 302
Scores        Paper     Essay Qs         Paper     Essay Qs
High          20        20               20        20
Low           10        6                211       8
Average       12.15     8.7              13.2      9.4
Target: ≥ 9   77% of students            79% of students
Table 4-7: TC HUM 170 Spring Rubrics Score Distribution for Outcome 4b

              HUM 170
Scores        Essay     Essay Qs
High          20        20
Low           9         5
Average       15        14
Target: ≥ 9   Not Reported
English 211 provided summary data using measures of central tendency. Twenty-four (24)
students wrote essay exam answers comparing three epics to demonstrate reading
comprehension and investigation of the worldview and/or history of cultures outside the United
States. The instructor assigned grades ranging from E (1) to A (5) with an average score of 3.46.
These students also wrote an essay based on published research, with assigned grades ranging
from C (3) to A (5) and averaging 4.12. This finding represented accomplishment of multiple
outcomes, but the measurement method did not distinguish between outcome accomplishments.
The summary did not report total students. Summary data such as these have limited value in
determining student learning outcome achievement.
Summary Table:

Table 4-8: Percent of students reported successfully achieving Outcome 4b*

              Core Courses    Distribution Courses                          Indirect
Assessment    FYS             HUM 1         HUM 1       HUM 2
              (n=719)         TC HUM 170    CMEM 210    TC ENG 211          NSSE
Test                          NA                        UTD
Rubric        83.7%           UTD
Survey                                                                      = 11l
*NA – not available, UTD – unable to determine
Conclusions for SLO 4b:
Based on the 2010-2011 findings for SLO 4b, the university draws the following conclusions.
 At least 83% of MSU students adequately communicated evidence of investigating the
worldview and/or history of cultures outside the United States, based on two First Year Seminar
rubrics for oral communication skills. (4b)
 Results presented did not always provide sufficient data upon which to base conclusions. (4b)
4c. Analyze cultural, social, economic, geographic and historical dynamics that influence
individuals and groups
Data Source Table:
Table 4-9 indicates that one core course and two distribution courses have responsibility for
assessing this outcome. Core course data submitted in this initial year were from sections of the
First Year Seminar. Distribution course data submitted were from sections of imaging
science. Additionally, data were available from NSSE.
Table 4-9: Direct and Indirect Measures for Outcome 4c*

Type of     Core Courses    Distribution Courses               Indirect
Measure     FYS             SBS 2        SBS 2
                            PSY 154      TC IMS 300-301        NSSE
Test                                     Final Exam
Rubric      Presentation
Survey                                                         11l
*Data for grayed courses not available until 2011-12
Measures (See Appendix A):
The FYS Presentation Rubric used in fall 2010 assessed one criterion for outcome 4c on a
weighted 5-point scale (failure, below average, average, good, and excellent), providing a total
possible score of 20. To improve the feedback to faculty, the spring 2011 FYS Presentation
Rubric had faculty indicate whether the presentation showed evidence of analysis of one or
more of the dynamics that influence individuals and groups (marking all that
apply) on a 3-point scale. Faculty indicated whether the evidence fails to meet criterion, meets
criterion, or exceeds criterion, providing a total possible score of 3. Faculty then checked which
of the five dynamics (cultural, social, economic, geographic, or historical) students presented to
achieve the outcome.
A transitional distribution course from Imaging Science (IMS 300-301) assessed this outcome
through a 100-point final exam that required the student to analyze factors that influence the
nature of human value development to determine the answers to the questions. As a transitional
course not part of the future General Education program, the usefulness of the IMS 300-301
data will be limited.
Findings:
Faculty from 31 First Year Seminar Fall 2010 sections reported results for 719 students. Table
4-10 shows that 88.3% (n=636) of the students performed at least at an average level on the
criteria assessed. This information does not indicate which part of the outcome students
achieved.
Table 4-10: Presentation Rubric Score Distribution

           Failure       Below Avg     Avg           Good           Excellent     Total
Outcome    N      Pct    N      Pct    N      Pct    N      Pct     N      Pct    N      Pct
4c         64     8.9%   19     2.6%   70     9.7%   325    45.1%   241    33.5%  719    99.9%
Table 4-11 shows that, in the spring, 59 of the 90 (66%) FYS students gave a presentation. Of the
59 students giving presentations, 94.9% (n=56) at least met the criterion for analyzing
dynamics that influence individuals and groups. The table also shows that the social dynamic was
chosen most often (38%) for analysis, followed by geographic (28%), cultural (15%), economic (13%),
and historical (6%). Assuming that ratings of “average” performance and “meets criterion” describe
adequate performance in both cases, a simple weighted calculation can combine fall and spring semester
results. This calculation, which takes the number of students assessed into account, shows that 89%
of students adequately analyzed dynamics that influence individuals and groups. A secondary
analysis depicted in Table 4-12 shows that 67% of the 69 students assessed analyzed two or more
dynamics. It also reveals that not all of the students assessed received a rating for the criterion.
Table 4-11: FYS Spring Rubrics Score Distribution for Outcome 4c

SLO 4c: Analyze cultural, social, economic, geographic and historical dynamics that
influence individuals and groups

Presentation item:                     N    Fails to Meet    Meets Criterion    Exceeds Criterion    At Least    Avg
VI. Analysis of one or more of the          Criterion                                                Met SLO
following dynamics that influence           Count    Pct     Count    Pct       Count    Pct         Pct
individuals and groups is evident      90   3        5.1%    28       47.5%     28       47.5%       94.9%       2.42
   A. Cultural                              19       15.2%
   B. Social                                47       37.6%
   C. Economic                              16       12.8%
   D. Geographic                            35       28.0%
   E. Historical                            8        6.4%
Table 4-12: FYS Spring Presentation Rubric Dynamics Analyzed for SLO 4c

Number Analyzed    Count    Pct
5                  0        0.0%
4                  3        3.3%
3                  4        4.4%
2                  39       43.3%
1                  23       25.6%
0                  21       23.3%
Total assessed     90       100.0%
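The secondary analysis in Table 4-12 follows directly from the reported score distribution; this sketch recomputes the 67% figure:

```python
# Number of dynamics analyzed per student, as reported in Table 4-12
# ({dynamics analyzed: number of students}).
distribution = {5: 0, 4: 3, 3: 4, 2: 39, 1: 23, 0: 21}

assessed = sum(n for k, n in distribution.items() if k > 0)      # rated on at least one dynamic
two_or_more = sum(n for k, n in distribution.items() if k >= 2)  # analyzed two or more

print(assessed, two_or_more, round(100 * two_or_more / assessed))  # 69 46 67
```

Note that the 67% is computed over the 69 students rated on at least one dynamic, not the 90 students reported.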
Results for an imaging science class (IMS 300-301) displayed in Table 4-13 show an average
score of 89.8 out of 100 on analyzing factors that influence the nature of human value
development to determine the answers to the questions. The summary did not report total
students, but did provide sufficient summary statistics to calculate the percentage of students that
would have met the 70% criterion level set for this report, or a score of 70 for this measure.
Table 4-13 shows that the minimum score was 74. (See Normal Curve Chart in Appendix B.)
So, for these data, 100% of the students scored above the 70% criterion level.
Table 4-13: Average class scores on IMS 300-301 Final Exam

Test          N     Possible Points    Average    Std. Dev.    Min    Max
Final Exam    NA    100                89.8       5.13         74     90
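The normal-curve estimate described above can be reproduced with Python's standard library. This is only a sketch: it models the exam scores with the summary statistics reported in Table 4-13 and assumes they are normally distributed:

```python
from statistics import NormalDist

# Model IMS 300-301 final exam scores with the reported mean and
# standard deviation (Table 4-13).
exam = NormalDist(mu=89.8, sigma=5.13)

criterion = 70  # the 70% criterion level set for this report

# Estimated fraction of the modeled distribution at or above the criterion
frac_above = 1 - exam.cdf(criterion)
print(f"{frac_above:.2%}")  # effectively 100%
```

Because the reported minimum score was 74, the empirical result is exactly 100%; the normal-curve model agrees to within a few hundredths of a percent.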
Summary Table:

Table 4-14: Percent of students reported successfully achieving Outcome 4c*

              Core Courses    Distribution Courses              Indirect
Assessment    FYS             SBS 2       SBS 2
              (n=778)         PSY 154     TC IMS 300-301        NSSE
Test                                      100%
Rubric        89%
Survey                                                          = 11l
*Data for grayed courses not available until 2011-12
Conclusions for SLO 4c:
Based on the 2010-2011 findings for SLO 4c, the university draws the following conclusions.
 Of 778 students giving an oral presentation, 89% adequately analyzed at
least one cultural, social, economic, geographic, or historical dynamic that influences individuals
and groups, based on two rubrics for oral communications. (4c)
 100% of all students in an imaging science course scored above a 70% on an exam that
required students to analyze factors that influence the nature of human value development to
determine the answers to the questions. (4c)
 Not all students' results were reported, or reported accurately. (4c)
4d. Comprehend the cycle of human growth necessary to provide sustained health and
individual well-being.
Data Source Table:
Table 4-15 indicates that three distributional courses have responsibility for assessing this
outcome. However, data from the three transitional distribution courses from Biology,
Chemistry, and Science were not scheduled for submission in this initial year. Data were
available from NSSE.
Table 4-15: Direct and Indirect Measures for Outcome 4d*

Type of     Distribution Courses                         Indirect
Measure     NSC 1         NSC 1          NSC 1
            TC BIO 105    TC CHEM 101    TC SCI 104     NSSE
Test        NS            NS             NS
Rubric
Survey                                                   6b
*NS – not scheduled
Measures (See Appendix A):
Of the three transitional courses approved for the Gen Ed program (BIO 105, CHEM 101, SCI 104),
none scheduled assessments of this outcome for this year.
Findings:
The 2009 NSSE data show MSU seniors' perceptions to be significantly less than (p < .001) the
NSSE mean for exercised or participated in physical fitness activities (6b). (NSSE Table 2,
Appendix B.)
Summary Table:

Table 4-16: Percent of students reported successfully achieving Outcome 4d*

              Distribution Courses                        Indirect
Assessment    NSC 1         NSC 1          NSC 1
              TC BIO 105    TC CHEM 101    TC SCI 104    NSSE
Test          NA            NS             NS
Rubric
Survey                                                   = 11l
                                                         > 6b
*NA – not available, NS – not scheduled
Conclusions for SLO 4d:
 Based on the lack of 2010-2011 data for SLO 4d, the university can draw no conclusions.
Goal 4 Recommendations
The assessment coordinator makes the following recommendations.
Recommendations specific to SLOs 4a-4d:
1) Both courses assessing outcome 4c showed high percentages of student accomplishment.
First Year Seminar and Imaging Science courses should continue their efforts in
teaching students to analyze cultural, social, economic, geographic and historical
dynamics that influence individuals and groups. (4c)
2) MSU NSSE results were substantially behind the NSSE mean for exercised or participated
in physical fitness activities (6b). This item holds implications not just for student learning,
but for life-long learning and health. This item should be monitored closely. (4d)
General Recommendations based on SLOs 4a-4d:
1) Design a template for data submission that provides the minimum data necessary for
reporting meaningful results and ensures that faculty provide results for all students. (4b)
(4c)
2) Review the requirements of the presentation rubric item for SLO 4c and consider
revisions that clarify for faculty the expectations for completing the FYS rubric. (4c)
Goal 5 – Applicable Knowledge of the Natural World
Each of the Data Source Tables for the student learning outcomes related to this goal shows
supporting data from two external measures: the CAAP (Collegiate Assessment of Academic
Proficiency) Science Test, a direct measure, and the NSSE (National Survey of Student
Engagement), an indirect measure. Both measures have one score that assesses this goal as a
whole, and the NSSE has one related to an individual SLO. As such, descriptions under the goal
heading provide the relevant information related to these measures to eliminate unnecessary
replication of information.
CAAP Science Test⁵. The CAAP Science Test is a 45-item standardized, nationally normed
direct measure used by MSU to assess students' skills in scientific reasoning. The test covers
content from the science disciplines of biology, chemistry, physical science, and physics, and
consists of three formats: data representation, research summaries, and conflicting viewpoints.
Results do not provide subscores. The test has a 40-minute time limit.
MSU has not administered the CAAP Science Test and is waiting for the State's Council on
Postsecondary Education (KY CPE) to announce the schedule for test administration.
NSSE offers two items related to Knowledge of the Natural World. One measure on the NSSE
(11m – Solving complex real-world problems (d)) relates to this goal as a whole. Another
measure (11g – Using computing and information technology (d)) relates to SLO 5a –
Comprehend and apply basic scientific, quantitative, and technological methods and knowledge
of natural systems to the solution of scientific problems.
The senior mean score for perceptions on the item specific to Goal 5 was significantly greater
than (p < .05) the NSSE mean for solving complex real-world problems (11m). The senior mean
score for perceptions on the item specific to SLO 5a was significantly greater than (p < .001) the
NSSE mean for using computing and information technology (11g-d). (NSSE Table 2, Appendix
B.)
Data Sources for SLOs 5a – 5c
Distribution courses each had four to six student learning outcomes to measure. The GEC
allowed faculty to divide the reporting on SLOs between two years, provided they reported
data for at least two outcomes per year and covered all outcomes within the two years. Each
course from the categories of Natural Science 1 (NSC 1) and Natural Science 2 (NSC 2) worked
down the list of SLOs, beginning with SLOs 1 and 2. None of the courses scheduled any of the
SLOs for Goal 5 within the first year. Consequently, there are no data for SLOs 5a – 5c.

⁵ ACT, Inc. (2011, Aug 8). CAAP Science Test. Retrieved from http://www.act.org/caap/test_science.html
Goal 5 Recommendations
Recommendations based on SLOs 5a – 5c:
The GEC needs to establish a scheduling procedure to ensure that data received each year allows
for monitoring of progress on each SLO.
Goal 6 – Knowledge of Aesthetics
Each of the Data Source Tables for the student learning outcomes related to Goal 6 shows
supporting data from an external, indirect measure: the National Survey of Student Engagement
(NSSE). One measure on the NSSE (11a – Acquiring a broad general education) relates to this
goal as a whole. As such, descriptions under the goal heading provide the relevant information
related to these measures to eliminate unnecessary replication of information.
The senior mean score for perceptions on the 2009 NSSE specific to Goal 6 – Knowledge of
Aesthetics was significantly greater than (p < .01) the NSSE mean for
acquiring a broad general education (11a). (NSSE Table 2, Appendix B.)
6a. Analyze the significance of diverse creative productions and explain how ideas are
communicated effectively through the expressive arts (literature, theatre, dance, music,
and visual arts)
Data Source Table:
Table 6-1 indicates that three distribution courses have responsibility for assessing this outcome.
Data submitted in this initial year were from sections of an art course and a humanities course.
Additionally, data were available from NSSE.
Table 6-1: Direct and Indirect Measures for Outcome 6a*

Type of     Distribution Courses                      Indirect
Measure     HUM 1      HUM 1         HUM 1
            Art 160    TC HUM 170    CMEM 210         NSSE
Test                   Pre-Post
Rubric      Project    Essay
Survey                                                11a
*Data for grayed courses not available in 2010-2011
Measures (See Appendix A):
Two of three distribution courses assessed SLOs for Goal 6 in 2010-2011 (see Table 6-1). One
course applied a project rubric for assessing writing, and the second developed and administered
a pretest-posttest. ART 160 faculty assessed students' comprehension of the meaning and
relevance of artworks dealing with contemporary issues using a Final Writing Project ~ Art 160
rubric. The rubric defines criteria for five levels of grade achievement: failing, below average,
average, good, and excellent. While the rubric assigns a point range to each level of the scale, it
does not define criteria for assigning points. Faculty applied the rubric to written essays on how
various artists approach topics of interest to contemporary society.
The second course, HUM 170, developed an objective exam covering general terminology
relevant to the study of all film as well as terms, concepts, and classifications that are particular
to widely influential movements in the international history of cinema. It was proposed to
include essay questions that would ask students to define the historical and/or cultural
particularity of a significant movement in global cinema and analyze what is aesthetically
distinct about the seemingly hybrid form of film. The item instead stated the following:
4. Write short essays on both the German Expressionist and the French New Wave film
movements. For each, touch on the following points:
 Movement dates
 Any historical influences
 Technical and visual characteristics
 Typical subject matter and themes
 Important key/representative films
 Legacy and influence of the movement on later films and filmmakers
Two bullets seem to address the first half of outcome 6a: the second bullet, on historical influences,
and the last bullet, on legacy and influence. The latter half of outcome 6a and outcome 6b are not
covered.
HUM 170 faculty used the Rubric for 1c to assess the common essay questions. The rubric lays out
five criteria assessed on a 5-point scale: 0=missing, 1=not much in evidence, incomplete and/or
flawed, 2=evident and competently done, 3=evident and well done, 4=evident and superbly done.
The rubric also provides an achievement target of “70% of the students in each class will score 9 or
higher.” This is below the target score of 14 (70%) on a 20-point measure assigned for this report.
Findings:
Art faculty presented the data in Table 6-2, demonstrating that 97% of students scored at least a
70% on a project showing students' comprehension of the meaning and relevance of artworks
dealing with contemporary issues. This is the same rubric used for assessing writing proficiency,
and because results did not disaggregate the scores for the two SLOs, the results are the same.
Table 6-2: Range of points scored by ART 160 students on a Project rubric

Rating           Score Range    N     Pct
Excellent        90-100         40    54%
Good             80-90          23    31%
Average          70-79          9     12%
Below Average    60-69          0     0%
Failing                         2     3%
Total                           74    100%
The next two tables show combined fall and spring results of a pretest-posttest administered by
HUM 170 to assess SLO 6a. Table 6-3 shows that the mean of the pretest (26.11) was less than
that of the posttest (40.47) for 38 students. As seen in Table 6-4, a paired samples t-test revealed
the mean change in scores (14.37) from pretest to posttest to be significant (p < .001).
Faculty provided individual pre- and posttest scores by student, which allowed frequency
distributions of posttest scores and of change scores from pretest to posttest to be calculated. The
70% target score for this test would be a 35. Support Table 6a-1 in Appendix B shows that 74.4% of
students scored above 35. The change score is an indication of the learning that
occurred. Support Table 6a-2 in Appendix B shows that, of students who recorded both a pretest
and a posttest score, 100% demonstrated learning, with change scores ranging
from a minimum of 8 points to a maximum of 32 points.
Table 6-3: 2010-11 HUM 170 Pre-posttest Paired Samples Statistics

            Mean    N    Std. Dev.   Std. Error Mean
Posttest    40.47   38   6.141       .996
Pretest     26.11   38   6.770       1.098
Table 6-4: 2010-11 HUM 170 Pre-posttest Paired Samples t-Test

Paired Differences

                     Mean    Std. Dev.   Std. Error Mean   95% CI Lower   95% CI Upper   t        df   Sig. (2-tailed)
Posttest - Pretest   14.37   5.897       .957              12.43          16.31          15.019   37   .000
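The reported t statistic can be sanity-checked from the summary statistics alone: the standard error is the standard deviation of the change scores divided by the square root of n, and t is the mean change divided by that standard error. A minimal check (illustrative only, not part of the original SPSS analysis):

```python
# Arithmetic check of the paired-samples t-test figures in Table 6-4.
# The summary statistics (mean change, SD, n) are taken from the report.
import math

n = 38               # students with both pretest and posttest scores
mean_change = 14.37  # mean posttest - pretest difference
sd_change = 5.897    # standard deviation of the differences

se = sd_change / math.sqrt(n)   # standard error of the mean change
t = mean_change / se            # t statistic, df = n - 1 = 37

print(round(se, 3), round(t, 2))   # 0.957 15.02
```

The small difference from the reported t of 15.019 reflects rounding in the published summary statistics.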
Summary Table:

Table 6-5: Percent of students reported successfully achieving Outcome 6a

Assessment   Course/Source                                        Result
Test         HUM 170                                              100% learning; 74.4% at criterion (n=38)
Rubric       Art 160                                              70% (n=74)
Essays       Distribution courses: THUM 170, THUM 1, CMEM 210*
Survey       NSSE 11a (indirect)

*Data for grayed courses not available until 2011-12
Conclusions for SLO 6a:
Based on the 2010-2011 findings for SLO 6a, the university draws the following conclusions.
 More than 70% of 112 MSU students completed essays in which they were to touch on historical influences and the legacy and influence of two film movements on later films and filmmakers.
 All students (100%) taking both the pretest and posttest in HUM 170 demonstrated positive learning.
6b. Describe and analyze the aesthetic value of creative productions in cultural and
historical context
Data Source Table:
Table 6-1 indicates that two of the distribution courses with responsibility for assessing this outcome also had responsibility for SLO 6a. Data submitted in this initial year were from sections of an art course and a humanities course. Additionally, data were available from NSSE.
Table 6-6: Direct and Indirect Measures for Outcome 6b

Type of      Distribution Courses*
Measure      THUM 170   THUM 1   CMEM 210   HUM 170
Test                                        Pre-Post
Rubric                                      Exam Essay
Survey       Indirect: NSSE 11a

*Data for grayed courses not provided
Measures:
Measures used for SLO 6b in HUM 170 were the same as those used and reported for SLO 6a.
Directions for the essay, however, did not support this outcome and the pretest-posttest results
could not be disaggregated by outcome.
Findings:
No findings resulted for SLO 6b.
Goal 6 Recommendations
Recommendation based on SLO 6a:
The methods used show multiple outcomes assessed with the same measure but without disaggregation of results. A review of the rubrics shows that they are not constructed to allow disaggregation of results by outcome. Assist faculty in understanding the importance and usefulness of disaggregating results by outcome.
Appendix A: Instruments
Core Writing I and II Scoring Guide Descriptor Chart
Eng 100 & 200
Criterion: Thesis (implicit or explicit): The essay/exam has a claim or thesis. Is such a declarative statement present, or are combined statements articulating a central claim? A "controlling claim" or governing idea that elements of the text contribute to "proving" or supporting, declaratively stated. This claim is worth proving and plausible for readers. Yes (4 pts.) or no (1 pt.)
  Fails Criterion (1): The thesis is poorly phrased as a result of wordiness, poor word choice, or other stylistic or usage irregularities; no discernible focus suggested in thesis; competing thesis statements.
  Meets Minimally (2): Thesis is clear. A thesis and focus can be discerned, but poor syntax, imprecise word choices, or other stylistic/usage irregularities interfere with complete understanding.
  Meets with Minor Exceptions (3): Thesis and focus are discernible, but not as precise or direct as possible.
  Fully Meets (4): Thesis is precisely stated, reflecting an explicit focus.

Criterion: Essay/exam uses a coherent organizational structure that supports the thesis.
  Fails Criterion (1): No discernible sequencing of paragraphs within the text emerges. Focus and purpose of paragraphs and ideas within paragraphs are indiscernible. Failed efforts or no efforts to use topic sentences or transitional devices.
  Meets Minimally (2): Paragraph focus and sequencing sometimes is inconsistent. Ideas are sometimes presented logically and consistently. The writer attempts to use topic sentences and transitional devices, but sometimes does so inconsistently or ineffectively.
  Meets with Minor Exceptions (3): Most paragraphs remain on focus, relating back to the thesis. Paragraphs are generally presented in the best, most logical order to support the thesis. Ideas are mostly presented logically and consistently within paragraphs. Topic sentences and transitions are present, but are not consistently used within the overall text and within paragraphs.
  Fully Meets (4): Paragraphs clearly refer back to the thesis. The sequence of paragraphs is in the best, most logical order to support the thesis. Ideas are presented logically and consistently within paragraphs. Topic sentences and transitional devices maintain focus within the overall text and within paragraphs.
Criterion: Essay/exam uses source material effectively in support of thesis.
  Fails Criterion (1): Source material is not included or not relevant to the purpose of the text. Source materials are misinterpreted. Source materials are ineffectively presented, interfering with meaning and calling credibility into question.
  Meets Minimally (2): Source material may be relevant to the purpose and focus of the essay, but is not clearly presented. Interpretation of source material may be inconsistent. Integration may cause logical errors and interference with reading. The writer demonstrates principles of academic honesty.
  Meets with Minor Exceptions (3): Generally, source material is carefully chosen; smoothly and accurately integrated; accurately interpreted; and made relevant as evidence and illustration to support claims, but efforts are not as sophisticated or superb as possible. The writer clearly demonstrates principles of academic honesty.
  Fully Meets (4): Source material is carefully chosen; smoothly and accurately integrated; accurately interpreted; and made relevant as evidence and illustration to support claims. The writer clearly demonstrates principles of academic honesty.

Criterion: Sentences conform to standards of formal edited American English (subject-verb agreement, plurals vs. possessives, etc.).
  Fails Criterion (1): The writer lacks control of the conventions of standard edited American English, serious enough to interfere consistently with reading comprehension.
  Meets Minimally (2): The writer lacks control of the conventions of standard edited American English, sometimes serious enough to interfere with reading comprehension.
  Meets with Minor Exceptions (3): The writer has basic control of the conventions of standard edited American English, with a few errors present.
  Fully Meets (4): The writer demonstrates clear control of the conventions of standard edited American English.
Back to Measure 2a
FYS 101: First Year Seminar
Presentation Assessment Scoring Guide/Rubric
Fall 2010
Presenter: ____________________________________________________________________
Topic of Presentation: __________________________________________________________
Date: _______________________________
Mark score for each category: Failure = 0, Below Average = 1, Average = 2, Good = 3, Excellent = 4

I. Speaking (SLO 1a): score each item 0-4
   A. Used clear enunciation
   B. Used appropriate grammar
   C. Effectively communicated goals of presentation
   D. Addressed target audience effectively
   E. Listened effectively to audience questions

II. Organization & medium of delivery (SLO 1a): score each item 0-4
   A. Evidence of planning
   B. Organization effectively executed
   C. Optimized use of time
   D. Appropriate delivery medium (media) chosen
   E. Judicious use of delivery medium (media)

SLO 1a score: ___ / 40 = ____

III. Content: score each item 0, 5, 10, 15, or 20
   A. Articulation of ethical consequences is evident (SLO 2d)
   B. Investigation of worldview evident OR investigation of history of non-US culture is evident (SLO 4b)
   C. Analysis of one or more of the following dynamics that influence individuals and groups is evident: Cultural, Social, Economic, Geographic, Historical (SLO 4c)

SLO 2d score: ___ / 20 = ____
SLO 4b score: ___ / 20 = ____
SLO 4c score: ___ / 20 = ____

Total Score (100 possible)
Excellent = 90-100, Good = 80-89, Average = 70-79, Below Average = 60-69, Failure = Below 59
Back to Measure 4b
Back to Measure 4c
FYS 101: First Year Seminar
Presentation Assessment Scoring Guide/Rubric
Spring 2011
Presenter: __________________________________________ Date: _______________________
Topic of Presentation: ______________________________________________________________
Target Audience: __________________________________________________________________
Directions:
1) Mark assessed level.
2) Sum the total for each section (I, II, ...) in the Score column.
3) Add all scores to get Total Score.

SLO 1a: Listen and speak effectively in conversational, small group, public and intercultural contexts

Sections I-III: check each criterion met; Score = number checked.

I. Speaking (SLO 1a)
   A. Speaks clearly and distinctly throughout
   B. Uses appropriate language for the discipline
   C. Vocalized pauses (um, uh, er, etc.) are not distracting
   D. Speaks with confidence; neither too quickly, nor too slowly

II. Organization/Preparation (SLO 1a)
   A. Introduction effectively communicated presentation goals
   B. Topic was well focused & appropriate
   C. Clear evidence of planning, obviously rehearsed
   D. Conclusion summarized ideas well

III. Delivery (SLO 1a)
   A. Maintains eye contact with the audience
   B. Facial expression and body language convey strong enthusiasm & interest
   C. Delivery medium was appropriate
   D. Listens effectively to adequately address questions

Content: rate each section 1 = Fails to Meet Criterion, 2 = Meets Criterion, 3 = Exceeds Criterion.

IV. Articulated ethical consequences of an identified problem (SLO 2d)

V. Investigation was evident of a (mark area investigated) (SLO 4b):
   A. worldview (global perspective) OR
   B. history of non-US culture

VI. Analysis of one or more of the following dynamics that influence individuals and groups is evident (mark all that apply) (SLO 4c):
   A. Cultural
   B. Social
   C. Economic
   D. Geographic
   E. Historical

Total Score (21 possible)
Back to Measure 4b
Back to Measure 4c
Capstone Presentation Assessment Rubric
Spring 2011
Presenter: _______________________________________________ Date: ________________
Topic of Presentation: _____________________________ Target Audience: _________________
Directions:
1) Mark assessed level.
2) Sum the total for each section (I, II, ...) in the Score column.
3) Add all scores to get Total Score.

SLO 1a: Listen and speak effectively in conversational, small group, public and intercultural contexts

Rate each item 1 = Fails to Meet Criterion, 2 = Meets Criterion, 3 = Exceeds Criterion.

I. Speaking (SLO 1a)
   A. Speaks clearly and distinctly throughout
   B. Uses appropriate language for the discipline
   C. Vocalized pauses (um, uh, er, etc.) are not distracting
   D. Speaks with confidence; neither too quickly, nor too slowly

II. Organization/Preparation (SLO 1a)
   A. Introduction effectively communicated presentation goals
   B. Topic was well focused & appropriate
   C. Clear evidence of planning, obviously rehearsed
   D. Conclusion summarized ideas well

III. Delivery (SLO 1a)
   A. Maintains eye contact with the audience
   B. Facial expression and body language convey strong enthusiasm & interest
   C. Delivery medium was appropriate
   D. Listens effectively to adequately address questions

Content:

IV. Apply knowledge and skills to new settings and complex problems (SLO 2e)
   A. Summarized most meaningful ideas or findings
   B. Discussed possible applications

Total Score (42 possible)
Back to IMS 300 Findings
FYS 101: First Year Seminar
Writing Assessment Scoring Guide/Rubric
Fall 2010
Name: ___________________________________________________ Date: ______________
Target Audience: ___________________________ Writing Topic ______________________
Mark score for each category: Failure = 0, Below Average = 1, Average = 2, Above Average = 3, Excellent = 4

I. Used conventions associated with standard English (SLO 1c): score each item 0-4
   A. Appropriate grammar
   B. Appropriate punctuation
   C. Appropriate spelling
   D. Appropriate wording
   E. Frequent and varied sentence structure

II. Organized writing (SLO 1c): score each item 0-4
   A. Effective opening
   B. Logical organization of information
   C. Effective conclusion
   D. Effective transitions
   E. Writing is purposeful and focused

III. Applied critical elements of writing (SLO 1c): score each item 0-4
   A. Cited sources appropriately
   B. Judicious use of quoted work
   C. Writing engages the target audience
   D. Appropriate writing style/voice chosen
   E. Used examples and/or details to enrich writing

SLO 1c score: ___ / 60 = ____

IV. Content: score each item 0, 5, 10, 15, or 20
   A. Articulation of ethical consequences is evident (SLO 2d)
   B. Application of knowledge and skills to new settings and/or complex problems (SLO 2e)

SLO 2d score: ___ / 20 = ____
SLO 2e score: ___ / 20 = ____

Total Score (100 possible)
Excellent = 90-100, Good = 80-89, Average = 70-79, Below Average = 60-69, Failure = Below 59
Back to Measure 2e
FYS 101: First Year Seminar
Writing Assessment Scoring Guide/Rubric
Spring 2011
Name: _______________________________________________________ Date: _______________
Target Audience: ____________________________________ Writing Topic _________________
SLO 1c: Write effectively for a variety of target audiences using conventions associated with standard English

Rate each item 1 = Fails to Meet Criterion, 2 = Meets Criterion, 3 = Exceeds Criterion.

I. Used conventions associated with standard English with no more than 3 errors per page (SLO 1c); CHECK all major problem areas:
   A. Appropriate grammar
   B. Appropriate punctuation
   C. Appropriate spelling
   D. Appropriate wording

II. Organized writing (SLO 1c)
   A. Logical organization of information
   B. Effective opening
   C. Effective transitions
   D. Effective conclusion
   E. Writing is purposeful and focused

III. Applied critical elements of writing (SLO 1c)
   A. Engaged the target audience
   B. Used a consistent writing style/voice
   C. Cited sources appropriately
   D. Used quoted work judiciously
   E. Examples or details enriched the writing

Content:

IV. Perceived and articulated ethical consequences of decisions and actions (SLO 2d)
   A. Recognized an ethical problem related to the topic
   B. Clearly explained ethical principles related to the problem

V. Applied knowledge and skills to new settings and/or complex problems (SLO 2e)
   A. Applied knowledge to identify information needed to address the situation or problem
   B. Identified concepts & principles relevant to the topic
   C. Synthesized key ideas
   D. Drew conclusions related to the problem

Total Score (51 possible)
Back to Measure 2e
Discussion Board Grading Rubric (10 points)*

Communication: Original Post (4 points possible)

Communication Quality (4 points):
 Initial post is posted on or before 10:00pm on Monday of the week of the scheduled assignment.
 Original post
 Complete answers/thoughts
 Evidence of critical thinking
 Correct grammar
 Netiquette used
 Minimum of 1 paragraph

Communication Quality (2-3 points):
 Initial post submitted late, on Monday after 10:00pm or on Tuesday before 10:00pm.
 Post relies on responses of others before initiating their thought process.
 Some ability to think critically; minimal critical thinking.
 Answer not substantial, less than 1 paragraph.
 Minimal answers, 3-4 sentences.

Communication Quality (0-1 point):
 Initial post submitted on Wednesday or Thursday before 10:00pm.
 Very minimal ability to think critically.
 Minimal answers, 1-2 sentences.
 Does not make an original post.

Total points = ___

Interaction: 2nd Post (3 points possible)

Interaction Quality (3 points):
 Second post is posted between Monday at 10:00pm and Tuesday at 10:00pm of the week of the scheduled assignment.
 Replies to a specific person, addressing them by name.
 Initiates discussion with others & encourages replies.
 Substantial post, minimum of 1 paragraph.

Interaction Quality (2 points):
 Second post submitted late, after 10:00pm on Tuesday.
 Does not reply to a specific person addressing them by name.
 Some discussion with others; does not encourage replies.
 Minimal ability to think critically.
 Responds minimally to textbook materials.
 Minimal post, less than one paragraph.

Interaction Quality (0-1 point):
 Posts after 10:00pm on Thursday will be given a zero and not be graded.
 Very minimal discussion with others.
 Responds to textbook materials only.
 Does not reply to any posts.

Total points = ___

Interaction: 3rd Post (3 points possible)

Interaction Quality (3 points):
 Third post is posted between Wednesday at 8:00am and Thursday at 10:00pm of the week of the scheduled assignment.
 Replies to a specific person, addressing them by name.
 Initiates discussion with others & encourages replies.
 Substantial post, minimum of 1 paragraph.

Interaction Quality (2 points):
 Posts after 10:00pm on Thursday will be given a zero and not be graded.
 Mostly responds to textbook information, not classmates.
 Responds minimally to textbook materials.
 Does not initiate discussion with others.
 Minimal post, less than one paragraph.
 Minimal discussion with others.

Interaction Quality (0-1 point):
 Posts after 10:00pm on Thursday will be given a zero and not be graded.
 Responds to textbook materials only.
 Minimal discussion with others.
 Does not reply to any posts.

Total points = ___

* If the DB is worth 20 points, the points possible for each section are doubled.
LND 6/10
Back to Outcome 2c Measures
Capstone Presentation Assessment Rubric
Spring 2011
Presenter: __________________________________________________ Date: ________________
Topic of Presentation: __________________________ Target Audience: _____________________
Directions:
1) Mark assessed level.
2) Sum the total for each section (I, II, ...) in the Score column.
3) Add all scores to get Total Score.

SLO 1a: Listen and speak effectively in conversational, small group, public and intercultural contexts

Rate each item 1 = Fails to Meet Criterion, 2 = Meets Criterion, 3 = Exceeds Criterion.

I. Speaking (SLO 1a)
   A. Speaks clearly and distinctly throughout
   B. Uses appropriate language for the discipline
   C. Vocalized pauses (um, uh, er, etc.) are not distracting
   D. Speaks with confidence; neither too quickly, nor too slowly

II. Organization/Preparation (SLO 1a)
   A. Introduction effectively communicated presentation goals
   B. Topic was well focused & appropriate
   C. Clear evidence of planning, obviously rehearsed
   D. Conclusion summarized ideas well

III. Delivery (SLO 1a)
   A. Maintains eye contact with the audience
   B. Facial expression and body language convey strong enthusiasm & interest
   C. Delivery medium was appropriate
   D. Listens effectively to adequately address questions

Content:

IV. Apply knowledge and skills to new settings and complex problems (SLO 2e)
   A. Summarized most meaningful ideas or findings
   B. Discussed possible applications

Total Score (42 possible)
Back to Measures 2e
Capstone Project Rubric4

Name: ___________________________________________  Date: _______________
Course: _____________________

Each criterion is rated on an adequacy scale: Lacking / Insufficient / Adequate / Ample / Substantial

Statement of the Purpose
 1c  Clearly articulates an achievable purpose
 2e  Justifies the need through relevant, current literature
 1c  Summarizes rationale for the project based on the review of literature
 2e  Specifies key research objectives to accomplish
 2e  Defines the scope and limitations to focus the project
 2e  Defines unique terminology

Review of Resources
 2e  Chooses sources (literature, records, interviews, focus groups) relevant to support the need
 1c  Organizes the review around a logical progression of key ideas
 2e  Summarizes significant ideas derived from the reviewed references/resources
 2e  Synthesizes concepts to show relationships and patterns of knowledge
 1c  Restates the purpose of the project in the introductory paragraph(s) summarizing key elements

Methods
 1c  Provides rationale for project design
 2e  Identifies objectives to accomplish the project's purpose
 2e  Develops an implementation plan as a guide to project completion
 2e  Presents an evaluation plan to determine level of project accomplishment
     Describes the data/findings collection methods
     Describes the data analysis (quantitative and qualitative) methods utilized

Results
 1c  Summarizes the project in the introductory paragraph(s)
 1c  Describes the sources of data or findings, such as the number of participants or respondents
 1c  Presents findings that address fulfilling the purpose of the project
 2f  Identifies project strengths: characteristics that positively impact its efficiency or effectiveness
 2f  Identifies project weaknesses: characteristics that negatively impact its efficiency or effectiveness
     Summarizes the results
Capstone Project Rubric6

Name: ___________________________________________  Date: _______________
Course: _____________________

Each criterion is rated on an adequacy scale: Lacking / Insufficient / Adequate / Ample / Substantial

Conclusions & Recommendations
 1c  Succinctly summarizes the project and findings
 2e  Accurately synthesizes the information generated from the project to draw conclusions and recommendations
 2e  Draws conclusions about the effectiveness of the project to accomplish the purpose
 2e  Recommends at least one application of the results
 2f  Identifies unresolved issues or areas for further research

Style and Format
 1c  Used conventions associated with standard English; noted difficulties with: grammar, punctuation, spelling, wording
 1c  Engages the target audience
 1c  Uses scholarly writing techniques to create a seamless flow of ideas within and between sections
 1c  Creates clear transitions between related ideas and paragraphs
 1c  Uses discipline-specific writing style & formatting throughout
 1c  Incorporates appropriate in-text citations
 1c  Lists references applying appropriate techniques and formats

Overall Project (components assessed on adequacy scale)
 1c  Complete
 1c  Coherent
 1c  Organized

Back to Measures 2e
Back to Measures 2f

6 Developed by Dr. Paula D. Serra, Ph.D., Morehead State University, and Dr. Jennifer Cochran, Ph.D., Central Michigan University. Adapted and approved by the General Education Council Feb. 18, 2011.
Student Learning Outcome 3a: Analyze situations and/or problems using arithmetic,
geometric, algebraic, or statistical methods
Math 131 Fall: Sampling Theory Group Project Rubric (Section 006)

Each task scored 0-4:
 Clearly define the population of interest, and the variable to be measured.
 Describe in great detail the method you used to collect a representative random sample (a sample that truly captures the truth about the population variable of interest).
 Detailed description of how the variable of interest was measured.
 Graphical representations of this variable with detailed discussion of the distribution's qualities.
 Numerical descriptions with detailed discussion.
 Detailed explanation of why you think your sample was adequate to properly represent the intended population.

Math 131 Spring: Laboratory Report Common Rubric (Section 005)

Each factor scored 0-3:
 Student appears to understand the purpose of the project: comparing how representative of a known population two different types of random samples are for a given sample size.
 Student carries out the required steps on Minitab 15.
 Student publishes the required steps from Minitab.
 Student keeps all the information produced well organized.
 Student arrives at the correct conclusion that stratified random samples yield better results than SRSs when sample sizes are the same.
Return to Measures 3a
Authentic Group Project: Buying a House Rubric (Math 135-3a_c / Math 131-3c)

Organization
 3: The project is well organized and good grammar/spelling is used. Questions are answered in paragraph form. The report is typed in Word.
 2: The project is neat and typed in Word. Some mistakes in grammar and spelling. Paragraphs generally used.
 1: Organization is poor. Grammar is poor. The project is handwritten and messy.

Complete
 3: All parts of the project are included and well answered. Organization is good.
 2: Nearly all parts well answered.
 1: Some of the project is well answered.

Knowledge of some of the expenses associated with home ownership
 3: Reasonable tax and insurance rates are provided along with the source of these rates. The extra costs associated with a mortgage are described accurately.
 2: Reasonable rates are provided. Some discussion of extra costs.
 1: Some evidence of findings in this area.

Knowledge of the mathematics involved with mortgages
 3: The questions are answered completely and all steps are provided clearly.
 2: The questions are mostly correct and most steps are shown.
 1: Most questions correct. Few steps shown.

Amortization Table
 3: Students independently research what an amortization table is and how to create one in Excel, and create an accurate table.
 2: Students get the required help to aid them in creating the amortization table and create an accurate table.
 1: Students create an incorrect table.
Return to Measures 3a
Return to Measures 3c
Exam Question Rubric 1 – Math 131, SLO 3a & 3b -- The student is graded on:
 choosing the proper procedure from among many, and
 carrying it out correctly and appropriately, and
 reporting the answer with explanation.
Math 131, SLO 3a – Possible Final Exam Question:
The annual household incomes, measured in thousands of dollars, of a sample of six families in a
neighborhood were: 34.5, 44.2, 27.9, 38.1, 50.0, and 39.1.
a. Compute the (sample) variance.
b. Compute the (sample) standard deviation.
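For reference, a worked solution to this sample question (hypothetical, not part of the instrument) can be sketched with Python's statistics module, which applies the sample (n-1) denominator the question asks for:

```python
# Worked solution to the sample-variance exam question above.
import statistics

incomes = [34.5, 44.2, 27.9, 38.1, 50.0, 39.1]  # thousands of dollars

var = statistics.variance(incomes)  # sample variance (n-1 denominator)
sd = statistics.stdev(incomes)      # sample standard deviation

print(round(var, 2), round(sd, 2))  # 58.46 7.65
```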
Return to Measures 3a
Return to Measures 3b
Exam Question Rubric 2 – Math 135, SLO 3a. The student is graded on:
 fully answering the questions in sentence form,
 showing all steps to solution in an organized manner,
 correctly checking and rechecking intermediate answers for correctness, and
 checking final answers for reasonableness and correctness.
Math 135, SLO 3a Exam Question, such as:
“In a March 2001 Gallup poll of 554 adults, 63% said that they would be willing to pay $500
more each year in higher prices so that the industry could reduce air pollution. If the margin
of error is 5%, what is the confidence level? How large a sample would be needed to have a
confidence level of 90%? How large a sample would be needed to decrease the margin of
error to 3% while maintaining the original confidence level?"
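A worked solution to this question (illustrative only, not part of the instrument) follows from the normal approximation for a proportion, where the margin of error is z * sqrt(p(1-p)/n):

```python
# Hypothetical worked solution to the Gallup-poll exam question above.
import math
from statistics import NormalDist

p, n, me = 0.63, 554, 0.05

# z implied by the 5% margin of error, and the corresponding confidence level
z = me / math.sqrt(p * (1 - p) / n)
conf_level = 2 * NormalDist().cdf(z) - 1          # about 98.5%

# sample size needed for a 90% confidence level at the same 5% margin
z90 = NormalDist().inv_cdf(0.95)
n_90 = math.ceil((z90 / me) ** 2 * p * (1 - p))   # about 253

# sample size needed for a 3% margin at the original confidence level
n_3pct = math.ceil((z / 0.03) ** 2 * p * (1 - p)) # about 1539

print(round(conf_level, 3), n_90, n_3pct)
```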
Return to Measures 3a
Exam Question Rubric 3 – Math 135, SLO 3b & 3c. The student is graded on:
 answers complete,
 organized,
 work shown, and
 correct.
Math 135, SLO 3b & c Exam Question 1, such as (excerpt from Exam 1, Nursing and Veterinary Dosage Calculations):
1. Order: Septra suspension 7.5 mL of trimethoprim 40 mg per 5 mL p.o. q.12h.
This is for a child weighing 15 kg with a urinary tract infection. The recommended dosage of Septra (trimethoprim and sulfamethoxazole) is based on trimethoprim at 8 mg/kg/day in 2 equal doses.
What is the recommended single dose for this particular child (15 kg weight)?
Is the order safe?
Math 135, SLO 3b & c Exam Question 2, such as:
"You have decided to make monthly deposits into an account earning 6.3% interest, compounded monthly. If you want to have $1,000,000 in 20 years, how much should you deposit each month?"
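Worked solutions to these two questions (illustrative only, not part of the instrument) can be sketched as:

```python
# Question 1: pediatric Septra dose check
# (trimethoprim 8 mg/kg/day in 2 equal doses).
weight_kg = 15
recommended_daily_mg = 8 * weight_kg                     # 120 mg/day
recommended_single_dose_mg = recommended_daily_mg / 2    # 60 mg per dose

ordered_ml = 7.5
concentration_mg_per_ml = 40 / 5                         # 8 mg/mL
ordered_dose_mg = ordered_ml * concentration_mg_per_ml   # 60 mg, so the order is safe

# Question 2: monthly deposit needed to reach $1,000,000 in 20 years at 6.3%
# compounded monthly (ordinary-annuity future-value formula).
r = 0.063 / 12        # monthly rate
n = 20 * 12           # number of deposits
fv = 1_000_000
pmt = fv * r / ((1 + r) ** n - 1)   # roughly $2,088 per month

print(recommended_single_dose_mg, ordered_dose_mg)
print(round(pmt, 2))
```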
Return to Measures 3b
Return to Measures 3c
Math 131 In-Class Activity Rubric

(a) Develop an order requirement table and Weighted Order Requirement Digraph (WORD) for the tasks involved in the problem.
 3: ORD makes sense and is correct. ORD will be useful in correctly determining an appropriate schedule.
 2: ORD may have a minor flaw but will still be useful in determining an appropriate work schedule.
 0-1: ORD has enough flaws in it that its usefulness is compromised and will likely lead to an inappropriate schedule.

(b) Determine if a priority list other than the critical path priority list makes sense in this situation. If so, develop that priority list; if not, develop the critical path priority list.
 3: The choice of priority list is well-defended and correct.
 2: No explanation of why a priority list is selected, but the list of choice is correct.
 0-1: Priority list isn't well thought out in the context or is incorrect.

(c) Schedule the tasks between you and your significant other in the least time possible, using the priority list of choice.
 3: List Processing Algorithm (LPA) is correctly used, and leads to an appropriate schedule.
 2: Use of LPA is attempted with some minor flaws, but leads to a workable schedule.
 0-1: Little evidence of skill in using the LPA. Schedule is not much better than what would emerge with no scheduling techniques used.

(d) Determine if your schedule is optimal and explain your decision.
 3: Student makes a correct statement about the quality of the schedule.
 2: Student shows some error(s) in concepts required to provide a correct evaluation of quality.
 0-1: Student shows little understanding of how to judge the quality.
Return to Measures 3a
HUM 170 – Rubric for 4b
Rubric for 1c: Write effectively for a variety of target audiences using conventions associated with
standard English (used to assess the common paper and common essay questions)
Criteria (each scored 0-4):
 Has an identifiable thesis (essay) or defined topic sentence (essay question)
 Has a logical organizational pattern
 Offers applicable evidence and/or a clear definition
 Has little to no mechanical or syntactical problems
 Uses language appropriate to its target audience—film scholars (essay) or general readers (essay question)

Evaluation scale: 0=missing, 1=not much in evidence, incomplete and/or flawed, 2=evident and competently done, 3=evident and well done, 4=evident and superbly done

Scoring guide:
 0-3   No effective writing skills
 4-8   Poor writing skills
 9-12  Proficient writing skills
 13-16 Advanced writing skills
 17-20 Exceptional writing skills

Target goal: 70% of the students in each class will score 9 or higher
Back to Measures for 4b
Back to Measure 6a
Appendix B: Supporting Tables
Supporting Tables from the SACS Response Report
Table 1-11: Counts for ENG 100 Reading Comprehension quiz

Score      Count   Pct    Cum Pct
10         99      28.0   28.0
9          76      21.5   49.6
8          65      18.4   68.0
7          54      15.3   83.3
6          30      8.5    91.8
5          14      4.0    95.8
4          6       1.7    97.5
3          8       2.3    99.7
2          1       0.3    100.0
1          0       0.0    100.0
Assessed   353     100
Missing    203
Total      556
Back to Findings 2c
Table 1-7: A Paired Samples t-Test of F2010-S2011 FYS Pretest and Posttest scores

Paired Differences, F2010-S2011 FYS Change

                     Mean   Std. Dev.   Std. Error Mean   95% CI Lower   95% CI Upper   t        df    Sig. (2-tailed)
Posttest - Pretest   3.80   3.218       .121              3.560          4.036          31.378   706   .000
Back to Findings 2c
Part II: Class Discussion in ENG 200 (outcome 2c questions)

Question 15: How beneficial was class discussion to your learning experience?
  Not valuable at all: 5.491%
  Rarely valuable: 6.069%
  Somewhat valuable: 32.37%
  Very valuable: 56.069%
  Unanswered: 0%

Question 17: Which statement below best describes the connection between your writing in the class and class discussion?
  Class discussion helped me remain focused on completing projects: 14.162%
  Class discussion helped me understand class materials and clarify my own ideas in writing: 32.659%
  Both of the above: 38.728%
  None of the above: 5.491%
  I see no connection between discussion and my writing progress: 8.671%
  Unanswered: 0.289%

Question 20: From your experience this semester, which statement below best describes your perception of the value of class discussion in addressing multiple texts or points of view on a single cultural issue, such as war, social class, science, and/or education?
  Class discussion is not an effective way of understanding multiple points of view: 2.89%
  Class discussion is a somewhat effective way of understanding multiple points of view: 15.318%
  Class discussion is an effective way of understanding multiple points of view: 36.416%
  Class discussion is a highly effective way of understanding multiple points of view: 45.376%
  Unanswered: 0%

Question 22: After your experience in ENG 200 this semester, what advice would you give your instructor or the General Education Writing Program about class discussion?
  Class discussion should be highly encouraged as a form of instruction in a writing class: 96.821%
  Class discussion should be highly discouraged as a form of instruction in a writing class: 3.179%
  Unanswered: 0%
Back to Findings 2c
Appendix B: Page B 71
Figure: Normal Distribution (standard deviation diagram)*
*Wikipedia. (2011, Aug 2). Retrieved from
http://en.wikipedia.org/wiki/File:Standard_deviation_diagram.svg
Back to IMS 300 Findings
Appendix B: Page B 72
Support Table 6a-1: 2010-11 HUM 170 Posttest Frequency Distribution

Posttest Score | Count | Percent | Valid Percent | Cum Pct
50             |     1 |     2.0 |           2.3 |    2.3
48             |     3 |     5.9 |           7.0 |    9.3
46             |     8 |    15.7 |          18.6 |   27.9
44             |     5 |     9.8 |          11.6 |   39.5
42             |     1 |     2.0 |           2.3 |   41.9
40             |     6 |    11.8 |          14.0 |   55.8
38             |     4 |     7.8 |           9.3 |   65.1
36             |     4 |     7.8 |           9.3 |   74.4
34             |     4 |     7.8 |           9.3 |   83.7
30             |     4 |     7.8 |           9.3 |   93.0
26             |     1 |     2.0 |           2.3 |   95.3
24             |     1 |     2.0 |           2.3 |   97.7
22             |     1 |     2.0 |           2.3 |  100.0
Total          |    43 |    84.3 |         100.0 |
Missing        |     8 |    15.7 |               |
Total          |    51 |   100.0 |               |
Support Table 6a-2: 2010-11 HUM 170 Change Score Frequency Distribution

Change Score | Count | Percent | Valid Percent | Cum Pct
32           |     1 |     2.0 |           2.6 |    2.6
26           |     2 |     3.9 |           5.3 |    7.9
24           |     1 |     2.0 |           2.6 |   10.5
22           |     1 |     2.0 |           2.6 |   13.2
20           |     2 |     3.9 |           5.3 |   18.4
18           |     5 |     9.8 |          13.2 |   31.6
16           |     1 |     2.0 |           2.6 |   34.2
14           |     5 |     9.8 |          13.2 |   47.4
12           |     5 |     9.8 |          13.2 |   60.5
10           |    10 |    19.6 |          26.3 |   86.8
8            |     5 |     9.8 |          13.2 |  100.0
Total        |    38 |    74.5 |         100.0 |
Missing      |    13 |    25.5 |               |
Total        |    51 |   100.0 |               |
Back to Findings 6a
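The Percent, Valid Percent, and Cum Pct columns in Support Tables 6a-1 and 6a-2 follow mechanically from the raw counts: Percent uses all 51 students as the base, while Valid Percent and Cum Pct use only the students with valid scores. A minimal sketch using the Table 6a-2 change-score counts (illustrative only, not part of the original analysis):

```python
# Recompute the percentage columns of Support Table 6a-2 from the raw counts.
counts = {32: 1, 26: 2, 24: 1, 22: 1, 20: 2, 18: 5,
          16: 1, 14: 5, 12: 5, 10: 10, 8: 5}
total_students = 51          # all students, including 13 with missing scores
valid = sum(counts.values()) # 38 students with both pretest and posttest

cum = 0.0
for score, n in counts.items():
    pct = 100 * n / total_students  # "Percent" column (base: all 51)
    valid_pct = 100 * n / valid     # "Valid Percent" column (base: 38)
    cum += valid_pct                # "Cum Pct" column
    print(score, n, round(pct, 1), round(valid_pct, 1), round(cum, 1))
```

For example, the change score of 10 occurs 10 times, giving 19.6% of all 51 students but 26.3% of the 38 valid cases, matching the table.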
Appendix B: Page B 73
NSSE Table 1: 2009 First Year (FY) Class Baseline Data for Mapped SLOs*

Format: NSSE item mapped to SLO | MSU FY Mean | 2009 NSSE Mean | Sig | Effect Size

1. Communication Skills
11b) Acquiring job or work-related knowledge & skills | 2.95 | 2.82 | * | .13
1a. Listen and speak effectively in conversational, small group, public and intercultural contexts:
11d) Speaking clearly and effectively | 3.00 | 2.84 | ** | .17
1c. Write effectively for a variety of target audiences using conventions associated with standard English:
11c) Writing clearly and effectively | 2.97 | 3.02 | | -.06

2. Intellectual Skills
2b) Analyzing basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components | 3.13 | 3.14 | | -.01
11e) Thinking critically and analytically | 3.17 | 3.23 | | -.07
2a. Employ current technologies to locate, analyze, evaluate and use information in multiple contexts and for a variety of purposes:
1l) Used an electronic medium. . .to discuss or complete an assignment | 2.89 | 2.64 | *** | .24
11g) Using computing & information technology | 3.14 | 3.05 | | .11
2c. Thoughtfully analyze and evaluate diverse points of view:
1e) Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments | 2.72 | 2.80 | | -.10
2c) Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships | 2.83 | 2.93 | | -.12
2e. Apply knowledge and skills to new settings and complex problems:
2e) Applying theories or concepts to practical problems in new situations | 3.12 | 3.08 | | .05
11m) Solving complex real-world problems | 2.69 | 2.72 | | -.03

3. Quantitative Skills
3c. Verify answers to mathematical and scientific problems to determine reasonableness, identify alternative methods of solution, and select the most reliable results:
11f) Analyzing quantitative problems | 2.91 | 2.96 | | -.06
2d) Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions | 2.94 | 2.93 | | .01

4. Knowledge of Human Cultures
4a. Examine the history of the United States and explain the basic principles and operation of the United States government with a view to being a responsible citizen:
11l) Understanding people of other racial & ethnic backgrounds | 2.56 | 2.71 | * | -.15
11i) Voting in local, state, or national elections | 2.44 | 2.57 | * | -.12
4d. Comprehend the cycle of human growth necessary to provide sustained health and individual well-being:
6b) Exercised or participated in physical fitness activities | 2.71 | 2.82 | | -.11

5. Knowledge of the Natural World
5a. Comprehend and apply basic scientific, quantitative, and technological methods and knowledge of natural systems to the solution of scientific problems:
11m) Solving complex real-world problems (dup) | 2.69 | 2.72 | | -.03
11g) Using computing and information technology (dup) | 3.14 | 3.05 | | .11

6. Knowledge of Aesthetics
11a) Acquiring a broad general education | 3.20 | 3.16 | | .04

*Based in part on the table entitled '2009 NSSE Survey Items Mapped to SACS Criteria' from the NSSE Accreditation Toolkit.
Appendix B: Page B 74
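The Effect Size column in the NSSE tables is a standardized mean difference (Cohen's d): the gap between the MSU mean and the NSSE comparison-group mean divided by a pooled standard deviation. The item-level standard deviations are not reproduced in this appendix, so the sketch below uses a hypothetical SD of 1.0 purely for illustration:

```python
# Illustrative computation of a Cohen's d effect size as NSSE reports it:
# (institution mean - comparison-group mean) / pooled standard deviation.
def cohens_d(group_mean: float, comparison_mean: float, pooled_sd: float) -> float:
    """Standardized mean difference between two groups."""
    return (group_mean - comparison_mean) / pooled_sd

# Item 11b, FY class: MSU mean 2.95 vs. NSSE comparison mean 2.82.
# The pooled SD of 1.0 is hypothetical; actual item SDs are not shown here.
d = cohens_d(2.95, 2.82, pooled_sd=1.0)
print(round(d, 2))  # 0.13
```

With item SDs near 1 on NSSE's four-point response scales, d stays close to the raw mean difference, which is consistent with the Effect Size column roughly tracking the mean gaps in the tables above.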
NSSE Table 2: 2009 Senior Class Baseline Data for Mapped SLOs*

Format: NSSE item mapped to SLO | MSU SR Mean | 2009 NSSE Mean | Sig | Effect Size

1. Communication Skills
11b) Acquiring job or work-related knowledge & skills | 3.36 | 3.06 | *** | .32
1a. Listen and speak effectively in conversational, small group, public and intercultural contexts:
11d) Speaking clearly and effectively | 3.29 | 2.99 | *** | .34
1c. Write effectively for a variety of target audiences using conventions associated with standard English:
11c) Writing clearly and effectively | 3.31 | 3.11 | *** | .24

2. Intellectual Skills
2b) Analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components | 3.26 | 3.28 | | -.03
11e) Thinking critically and analytically | 3.46 | 3.36 | * | .13
2a. Employ current technologies to locate, analyze, evaluate and use information in multiple contexts and for a variety of purposes:
1l) Used an electronic medium. . .to discuss or complete an assignment | 3.21 | 2.86 | *** | .34
11g) Using computing & information technology | 3.39 | 3.21 | *** | .22
2c. Thoughtfully analyze and evaluate diverse points of view:
1e) Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments | 3.00 | 2.83 | *** | .18
2e. Apply knowledge and skills to new settings and complex problems:
2c) Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships | 3.13 | 3.08 | | .05
2e) Applying theories or concepts to practical problems in new situations | 3.31 | 3.24 | | .09
11m) Solving complex real-world problems | 2.91 | 2.80 | * | .11

3. Quantitative Skills
3c. Verify answers to mathematical and scientific problems to determine reasonableness, identify alternative methods of solution, and select the most reliable results:
11f) Analyzing quantitative problems | 3.25 | 3.08 | *** | .19
2d) Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions | 3.21 | 3.03 | *** | .21

4. Knowledge of Human Cultures
4a. Examine the history of the United States and explain the basic principles and operation of the United States government with a view to being a responsible citizen:
11l) Understanding people of other racial & ethnic backgrounds | 2.66 | 2.67 | | -.01
11i) Voting in local, state, or national elections | 2.45 | 2.33 | | .10
4d. Comprehend the cycle of human growth necessary to provide sustained health and individual well-being:
6b) Exercised or participated in physical fitness activities | 2.53 | 2.73 | *** | -.19

5. Knowledge of the Natural World
5a. Comprehend and apply basic scientific, quantitative, and technological methods and knowledge of natural systems to the solution of scientific problems:
11m) Solving complex real-world problems (dup) | 2.91 | 2.80 | * | .11
11g) Using computing and information technology (dup) | 3.39 | 3.21 | *** | .22

6. Knowledge of Aesthetics
11a) Acquiring a broad general education | 3.39 | 3.25 | ** | .18

*Based in part on the table entitled '2009 NSSE Survey Items Mapped to SACS Criteria' from the NSSE Accreditation Toolkit.
Back to: Findings 2a | Findings 2c | Findings 2e | Goal 3 | Goal 4 | Findings 4d | Goal 5 | Goal 6
Appendix B: Page B 75