Office of Institutional Planning, Assessment, and Research
East Carolina University

Support Unit Assessment Resource Manual

Last Modified November 15, 2013
Table of Contents

ECU Strategic Directions
Assessment at East Carolina University
Assessment Report Timeline
Assessment Report Required Components
Types of Outcomes
Means of Assessment
Criterion for Success
Results
Actions Taken: Use of Results
Follow-up to Actions Taken
Overview of the Assessment Report Review Process
2012-13 Support Unit Assessment Report Evaluation Rubric
Appendix A: Bloom’s Taxonomy of Action Verbs
East Carolina University Strategic Directions
Education for a New Century
• ECU will prepare our students to compete and succeed in the
global economy.
The Leadership University
• ECU will distinguish itself by the ability to train and prepare leaders
for our state and nation.
Economic Prosperity in the East
• ECU will create a strong, sustainable future for the East through
education, innovation, investment, and outreach.
Health, Health Care, and Medical Innovation
• ECU will save lives, cure diseases, and positively transform the
quality of health care for the region and state.
The Arts, Culture, and the Quality of Life
• ECU will provide world-class entertainment and powerful
inspiration as we work together to sustain and improve the
community’s quality of life.
Assessment at East Carolina University
In assuring a high quality education for our students, East
Carolina University is looking purposefully toward developing a
culture of evidence. Such a culture provides an evidence-based
framework for improving, revising, and introducing
comprehensive systems for the collection, dissemination, and
utilization of information on meaningful student learning
outcomes and programmatic improvements. Such information
can be used to develop new pedagogies, curricula, and
technologies to improve learning. Embracing such a culture of
innovation and quality improvement was specifically called for
in the report of the Commission on the Future of Higher
Education, commonly known as the Spellings Commission.
Institutional Assessment is dedicated to a concept of quality
improvement. The office will support the university in achieving
continuous improvement by engaging educational programs and
support units in a sound and meaningful process of outcome
assessment. Although evaluation of an institution’s educational
quality and its effectiveness in achieving its mission is a difficult
task requiring careful analysis and professional judgment, an
institution is expected to document the quality and effectiveness
of all its programs and services.
Institutional Assessment works with the Provost, Deans,
Associate Deans, Department Chairs and Faculty and Staff to
craft, document and review assessment reports. Feedback is
provided to each unit on its outcomes assessment activities
according to a pre-defined rubric (see ECU Assessment Review
Process). The major objective continues to be to provide
meaningful and consistent information to units in order to
nurture a culture of assessment and foster institution-wide
improvement in institutional effectiveness.
Assessment Report Timeline

Support units include: Administrative Support, Academic and Student Support Services, and Research and Public Service.

Annual assessment reports are due August 1 and include:

• Results – a summary of the findings from the data collected through your Means of Assessment
• Actions Taken (Use of Results) – a summary of what was done or decisions made (past tense) based on the Results, indicating how results were used for outcome/unit improvement
• Follow-Up to Actions Taken – a summary of whether or not the Actions Taken from 2011-12 worked
Assessment Report Required Components

Each year, units should provide thorough information in these seven areas:

• Mission – The unit mission statement. The mission should be that of the unit, not the college, school, or department.
• Outcomes – A minimum of 3 outcomes (units classified as both research and public service must have a minimum of 2 outcomes in each plan). Outcomes should reflect the key functions and services of the unit.
• Means of Assessment – Briefly describes how data is collected; methods must clearly measure the stated outcomes.
• Criterion for Success – Describes how well the unit is expected to perform. It should be set at a level appropriate to the unit (i.e., not set too low just to be attainable).
• Results – Succinctly summarize the data collected with the means of assessment and clearly state whether or not the criterion for success was met; results should be clearly tied to the outcomes.
• Actions Taken (Use of Results) – Clearly describe what was done (past tense) based on the data collected and indicate how results were used for outcome/unit improvement.
• Follow-up to Actions Taken – Briefly describes whether the actions taken in the previous year worked; this is reported for the previous year after sufficient time has been allotted to determine the effectiveness of the action.
Types of Outcomes
Each support unit must be assessing at least three outcomes. The only exception is if a
unit is classified as two types (for example, classified as both research and public
service). Then the unit should have at least two outcomes in each assessment report.
Outcomes should reflect the key functions and/or services of the unit. Units are strongly
encouraged to use the following template when writing their outcomes:
The office of X will _____________ (relates to the unit’s mission) by _____________
(what the unit is assessing).
Support units can classify their outcomes in TracDat as one of four types:

• Administrative Outcome
• Student Learning Outcome
• Service Outcome
• Strategic Planning Outcome
Examples of Outcomes
The Continuing Professional Education office will respond to
the educational needs of professionals and adults in our
service region by expanding online and face-to-face offerings.
The student development office will strive to achieve the vision
of leadership and service instilled by the university by tracking
student-athletes' participation in the community outreach
program.
The Office for Equity and Diversity fosters and advances an
environment that is equitable, diverse, inclusive and
community-connected by offering educational programs and
professional development.
The Pediatric and Healthy Weight Research and Treatment
Center will help reduce childhood obesity through research by
increasing grant submissions and awards.
The Office of Undergraduate Research will broaden the
dissemination of undergraduate research accomplishments by
increasing the number of student research presentations at
local, state, regional, and national meetings.
The following hypothetical example demonstrates how the required components of the
assessment report work together and how units should write each component.
Example
Outcome: The student development office will strive to achieve the
vision of leadership and service instilled by the university by
tracking student-athletes' participation in the community outreach
program.
Means of Assessment
The Means of Assessment should briefly describe how data is collected and must clearly
measure the stated outcomes. Rubrics, surveys, etc. should be attached when appropriate.
Because it is good assessment practice to use multiple means of assessment, at least one
outcome should have more than one means of assessment.
Examples of Means of Assessment
Use item 18 on the AAMC Graduation Survey to measure
satisfaction with financial aid administrative services.
Students respond using a 5-point Likert scale (1=Not at
all satisfied, 5= Very Satisfied).
Sedona output and the annual report will be reviewed at
the end of the academic year to assess the total value of
grants, contracts, and other awards secured through the
office.
Utilize student data collected during the application and
selection process to determine the average SAT score.
Example
Outcome: The student development office will strive to achieve the
vision of leadership and service instilled by the university by
tracking student-athletes' participation in the community outreach
program.
Means of Assessment: Participation rates are calculated using the
student-athlete roster and Community Outreach Forms that are
completed by student-athletes after finishing their community
outreach. Participation rates are calculated from August 1-July 31
each year.
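For units that track participation in a spreadsheet or database, the rate calculation described above is a simple count. The sketch below is purely illustrative; the names and record structure are hypothetical, not part of any ECU system.

```python
# Hypothetical sketch of the participation-rate calculation described above.
# The roster and the set of submitted Community Outreach Forms are illustrative.
roster = {"A. Smith", "B. Jones", "C. Lee", "D. Kim"}

# Student-athletes who submitted a Community Outreach Form
# during the August 1 - July 31 reporting window
forms_submitted = {"A. Smith", "B. Jones", "C. Lee"}

# Count only form submitters who appear on the roster
participants = forms_submitted & roster
participation_rate = 100 * len(participants) / len(roster)

print(f"Participation rate: {participation_rate:.1f}%")  # 3 of 4 -> 75.0%
```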
Criterion for Success
Criterion for Success describes how well the unit is expected to perform. It should be
stated in quantifiable terms so that it can easily be determined whether it was met.
The criterion should be set at a level appropriate to the unit (i.e., not set too low just to be
attainable).
Examples of Criterion for Success
90% of students will report they were satisfied or very
satisfied with the service they received at the office.
Increase the number of grant submissions by 2
each year.
Decrease the number of late payments by 25%
annually.
Achieve a satisfactory rating on the Annual Fire and
Safety Inspection.
Maintain a clean audit opinion from the state auditor
by receiving zero findings annually.
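Because a good criterion is quantifiable, determining whether it was met is a simple threshold comparison. The values below are hypothetical, chosen only to illustrate the check.

```python
# Hypothetical threshold check for a quantifiable Criterion for Success,
# e.g. "90% of students will report they were satisfied or very satisfied."
criterion = 90.0   # required percentage (the stated criterion)
observed = 75.0    # percentage actually observed in the results

status = "met" if observed >= criterion else "not met"
print(f"Criterion for success was {status}.")  # 75.0 < 90.0 -> "not met"
```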
Example
Outcome: The student development office will strive to achieve the vision of
leadership and service instilled by the university by tracking student-athletes'
participation in the community outreach program.
Means of Assessment: Participation rates are calculated using the student-athlete
roster and Community Outreach Forms that are completed by student-athletes after
finishing their community outreach. Participation rates are calculated from August 1-July 31 each year.
Criterion for Success: 90% of student athletes will participate in the community
outreach program each year.
Results
Assessment results should succinctly summarize the data collected with the means of
assessment and clearly state whether or not the Criterion for Success was met.
Examples of Results
During 2012-13, 100 students received services from
our office and completed the survey. 75% of the
students indicated they were either satisfied or very
satisfied with the services they received. We did not
meet our goal of 90% student satisfaction.
Faculty and staff in the center submitted 10 articles
for publication during the academic year, exceeding
our goal of 8. Five of these submissions have been
accepted for publication.
During 2012-13, we offered 5 educational sessions
for the community to attend. While this fell short of
our goal of 8 educational sessions, the five sessions
were well attended.
During 2012-13, 50 students participated in research
activities, compared to only 40 in 2011-12. This
represents a 25% increase. Our criterion was met.
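It is worth double-checking the arithmetic in results statements before submission. The last example's figure (40 participants rising to 50, a 25% increase) can be verified with a one-line calculation; the numbers are taken from the example above.

```python
# Verify the percent-increase claim from the example above:
# 50 participants in 2012-13 versus 40 in 2011-12.
prior, current = 40, 50
percent_increase = 100 * (current - prior) / prior
print(f"Increase: {percent_increase:.0f}%")  # (50 - 40) / 40 = 25%
```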
Example
Outcome: The student development office will strive to achieve the vision of
leadership and service instilled by the university by tracking student-athletes'
participation in the community outreach program.
Means of Assessment: Participation rates are calculated using the student-athlete
roster and Community Outreach Forms that are completed by student-athletes after
finishing their community outreach. Participation rates are calculated from August 1-July 31 each year.
Criterion for Success: 90% of student athletes will participate in the
community outreach program each year.
Results: Between August 1, 2012 and July 31, 2013, 92.9% of student-athletes participated in the community outreach program. The criterion for
success was met. Please see the attached document for sport-specific
participation rates.
Actions Taken (Use of Results)
Actions taken must directly relate to assessment results and be clear, logical, and
feasible. The actions should be those that have already taken place, capturing the
interpretation of the collected data along with staff discussions. The emphasis is on
what has been done based on the results to improve the unit, not what will be done
in the future. Actions Taken must be written in the past tense and clearly show
how results are being used for improvement.
Examples of Actions Taken (Use of Results)
After reviewing results from the satisfaction survey, new
training materials were developed and posted on our website.
The new training manual and presentation addressed several
issues identified in the survey, including how to enter a follow
up statement in Tracdat and how to complete an assessment
rubric.
Although the center met the criterion for publications, faculty
decided to continue the scholarly activity review group that
was formed last year. This group meets monthly to review any
publications or grants prior to submission.
Because mapping showed that more computers were needed
to meet demand during the evenings and the period before
and during exams, an additional 20 laptops were
purchased and added to the laptop loan program.
In order to increase attendance at the event, staff had planned
to send postcard announcements and reminders. However,
when funding for mailings was cut, staff focused on
communication through Facebook and other electronic
means.
Example
Outcome: The student development office will strive to achieve the vision of
leadership and service instilled by the university by tracking student-athletes'
participation in the community outreach program.
Means of Assessment: Participation rates are calculated using the student-athlete
roster and Community Outreach Forms that are completed by student-athletes after
finishing their community outreach. Participation rates are calculated from August 1-July 31 each year.
Criterion for Success: 90% of student athletes will participate in the community
outreach program each year.
Results: Between August 1, 2012 and July 31, 2013, 92.9% of student-athletes
participated in the community outreach program. The criterion for success was met.
Please see the attached document for sport-specific participation rates.
Actions Taken: One month prior to the July 31 deadline for calculating participation
rates, a status report was provided to each head coach. Coaches were encouraged to
contact students who had no record of community service hours and remind them to
submit any unreported hours or to participate in a service activity. Involving coaches
allowed them to take more ownership over their teams' participation.
Follow-Up to Actions Taken
Follow-Up to Actions Taken briefly describes whether the actions reported in the previous
year worked to improve the unit; this is reported for the previous year after sufficient
time has been allotted to determine the effectiveness of the action. Follow-up
summaries should be included as evidence that the action steps have been completed,
or that progress has been made. For example, if a unit is reporting 2012-13 results and
actions, follow-up summaries should be added to the 2011-12 actions taken. These
summaries “close the loop” of the assessment cycle.
Examples of Follow-Up to Actions Taken (Use of Results)
Based on the 2012-13 results that showed we met
our criterion for success, the 2011-12 action to
purchase additional laptops worked to provide
computer access to students.
The action implemented in the previous academic
year was not successful. Based on the collection of
results from this year, it has been determined by staff
that additional training in customer service is needed.
Example
Outcome: The student development office will strive to achieve the vision of
leadership and service instilled by the university by tracking student-athletes'
participation in the community outreach program.
Means of Assessment: Participation rates are calculated using the student-athlete
roster and Community Outreach Forms that are completed by student-athletes after
finishing their community outreach. Participation rates are calculated from August 1-July 31 each year.
Criterion for Success: 90% of student athletes will participate in the community
outreach program each year.
Results: Between August 1, 2012 and July 31, 2013, 92.9% of student-athletes
participated in the community outreach program. The criterion for success was met.
Please see the attached document for sport-specific participation rates.
Actions Taken: One month prior to the July 31 deadline for calculating participation
rates, a status report was provided to each head coach. Coaches were encouraged
to contact students who had no record of community service hours and remind them
to submit any unreported hours or to participate in a service activity. Involving
coaches allowed them to take more ownership over their teams' participation.
Follow-Up to Actions Taken (in 2012-13): Based on the results collected in AY 13-14,
which showed that 98% of student-athletes participated in a service activity, the 2012-13
action of asking coaches to talk with athletes who had no service activity hours
worked to improve participation rates.
Overview of the “Institutional Effectiveness” Assessment Process
East Carolina University is committed to documenting the quality and effectiveness of all its
programs and services. As such, all academic and administrative/support units on campus have
developed an assessment plan and annually report results, actions taken, and follow up to
previous actions taken. Components of the assessment report are as follows:
1. Follow-Up to Actions Taken summarizes whether the improvement initiative worked or
not. The follow-up summary closes the assessment loop for the previous reporting year.
2. Results of the current means of assessment summarize the data collection process and
clearly provide evidence that the Criterion for Success was met, partially met, or not met.
3. Actions Taken (Use of Results) clearly describe what was done and how the data
collected was used for improvement.
4. Assessment Plan revised, if appropriate.
ECU has adopted TracDat as the institutional tracking system that provides the venue to house
the assessment reports.
The Office of Institutional Assessment (OIA), the Institutional Assessment Advisory Council
(IAAC), the Assessment Review Committees (ARC) and the Unit Assessment Coordinators
(UACs) work together to:
1. Provide resources, guidance, training and support for faculty and staff,
2. Promote timely submission of follow up actions, results, actions taken, and revisions to
plans, if appropriate; and
3. Ensure that an effective and timely review process is completed
The OIA provides oversight to the assessment process in support of institutional effectiveness
and works with the IAAC to manage the ARCs that review the assessment reports.
Institutional Assessment Advisory Council
The IAAC was established in September 2012 and evolved naturally from the former SACS
Institutional Effectiveness Committee. The primary purpose of the IAAC is to advise the OIA on
matters relating to assessment and to manage the ARCs that review the assessment plans and
reports.
Each academic college/school and major administrative/support unit is represented on the
IAAC. Members are selected and appointed by the appropriate college dean, vice chancellor or
Provost and serve for at least one year, with the normal length of service expected to be three
years. A member can be re-appointed. The Chair is elected from the membership and serves for
a three-year term.
The IAAC charge includes, but is not limited to, the following duties:
1. Providing advice on guidelines and processes for comprehensive assessment in each
area where assessment is required, including determining the assessment cycles
appropriate to different university offices.
2. Providing advice on procedures for ensuring the skilled application of assessment tools
and appropriate use of results by faculty, staff and administrators (the implementation of
training workshops, for example).
3. Providing advice on the purpose, method and use of assessment in undergraduate and
graduate program review and of the forms of academic and support unit review.
4. Providing advice on the purpose, method and use of assessment in meeting UNC
General Administration and ECU productivity goals.
5. Providing advice on quality standards for assessment tools and their application.
6. Providing advice on procedures for reviewing assessment plans, activities, their products
and the use of these products.
7. Providing advice on guidelines and procedures that support campus engagement in
collaborative, integrated planning and assessment that supports institutional
effectiveness.
IAAC responsibilities in managing Assessment Review Committees (ARCs):
1. Chair an ARC for a designated division/area in the university and organize the ARC’s
reviews of its assessment reports
2. Provide a current list of ARC members and UACs in designated units to the OIA each fall
or whenever changes are made
3. Communicate all institutional instructions and deadlines regarding assessment to ARCs,
UACs and others as appropriate
4. Establish internal instructions and deadlines for submission of unit assessment reports
5. Work with the OIA to provide assistance with TracDat and to coordinate training for
assessment personnel and other appropriate groups
6. Inform deans, directors and vice chancellors of issues related to the continuous quality
improvement process impacting institutional effectiveness
7. Report a summary of the ARC’s reviews/assessment plans to the IAAC each
spring.
Assessment Review Committee
On March 28, 2013, the Academic Council approved the Assessment Review Committee (ARC)
structure in order to facilitate the institutional effectiveness assessment process at East Carolina
University.
Each major academic or administrative area/division determines the membership and length of
service for its ARC. The duties of the Assessment Review Committee include:
1. Guiding faculty and staff in adhering to university and internal review criteria and
deadlines
2. Reviewing and evaluating the quality of assessment reports using the approved
university’s rubric
3. Working with assigned IAAC chair to prepare the summary report due in the spring (see
copy of UCF’s report)
As of spring 2013, there are a total of 18 Assessment Review Committees needed to represent
each major academic or administrative unit/division currently identified at East Carolina
University:
Academic Units/Divisions

1. Brody School of Medicine (22 units) – Dean’s Representative
2. College of Education (45 units) – Dean’s Representative
3. College of Allied Health Sciences (29 units) – Dean’s Representative
4. College of Nursing (8 units) – Dean’s Representative
5. College of Fine Arts and Communication (30 units) – Dean’s Representative
6. Harriot College of Arts and Sciences (114 units) – Dean’s Representative
   a. Undergraduate (46 units) – Dean’s Representative
   b. Graduate (34 units) – Dean’s Representative
   c. Foundations/Support Units (34 units) – Dean’s Representative
7. College of Human Ecology (32 units) – Dean’s Representative
8. School of Dental Medicine (4 units) – Dean’s Representative
9. College of Health and Human Performance (33 units) – Dean’s Representative
10. College of Technology and Computer Science (21 units) – Dean’s Representative
11. College of Business (18 units) – Dean’s Representative
Administrative Units/Divisions

12. Academic Affairs (currently 29 units) – Provost’s Representative
13. Administration and Finance (7 units) – Vice-Chancellor’s Representative
14. Chancellor’s Division (14 units) – Chancellor’s Representative
15. Division of Health Sciences (7 units) – Vice-Chancellor’s Representative
16. Division of Research and Graduate Studies (21 units) – Vice-Chancellor’s Representative
17. Division of Student Affairs (38 units) – Vice-Chancellor’s Representative
18. Division of University Advancement (1 unit) – Vice-Chancellor’s Representative
Unit Assessment Coordinator (UAC)
Unit Assessment Coordinators (UACs) have been identified for every assessment unit on
campus and are responsible for entering the information into TracDat. For academic programs,
the UACs should be familiar with the program; for administrative/support units, the UACs
should be familiar with the unit’s mission and function.
After faculty and staff develop the outcomes, means of assessment, and criterion for success,
the UACs coordinate the collection of results and actions taken for the current year as well as
follow up actions on the previous year by leading the faculty and staff in a discussion of the
assessment report. The UACs share the assessment reports with program chairs, directors and
deans/directors/vice chancellors prior to the final submission.
Assessment Review Cycle
Annual assessment reports are due in TracDat no later than August 1 for administrative units
and October 15 for academic units. Internal unit deadlines may be earlier. A complete
assessment report includes:
• Follow-up actions on the previous reporting year to “close the loop”
• Results for the current reporting year
• Actions taken for the current reporting year
Reviews of the assessment reports are to be completed using the university approved rubric
(currently in InfoPath) no later than December 15 as specified below. Internal unit deadlines may
be earlier:
• All assessment reports are reviewed in academic years ending in an even number (e.g.,
2013-2014).
• Only the units identified as beginning or developing are reviewed in academic years
ending in an odd number (e.g., 2014-2015).
Deans/Directors/Vice Chancellors receive an aggregate report of assessment components and
overall unit rating no later than January 31 (run by IPAR).
A summary of the assessment reviews compiled by the ARCs will be presented to the IAAC
during spring semester.
The official Institutional Effectiveness Assessment Report (format to be developed) is compiled
by the OIA in collaboration with the chair of the IAAC and submitted to the Institutional
Effectiveness Council* no later than August 15 as specified below:
• A full report of all assessment reports is submitted in academic years ending in even
numbers (e.g., 2013-2014).
• An interim report of only the units identified as beginning or developing is submitted in
academic years ending in odd numbers (e.g., 2014-2015).
The responsibilities of the OIA regarding the annual Institutional Effectiveness Assessment
report include, but are not limited to, the following:
1. Communicating in a timely manner with units not responding to assessment
requirements, with copies sent to all IAAC members.
2. Summarizing each spring, in a memorandum sent to deans and vice chancellors, the
status of assessment reports in the respective college/school or division, with copies sent
to IAAC members.
3. Sending a final communication on the status of all non-responding units to the
appropriate dean, director or vice chancellor at the end of the second summer session
before the final submission of the Institutional Effectiveness Assessment Annual report,
with copies sent to all IAAC members.
4. Submitting the official Institutional Effectiveness Assessment Annual report in mid-August
to the Institutional Effectiveness Council*
The Office of Institutional Assessment provides support for the process through:
• Training
• Surveys
• Institutional-level assessments
• Integration with campus strategic planning efforts
*Institutional Effectiveness Council
• Chaired by the Associate Provost for IPAR
• Suggested members include: representatives from academic deans, directors, the Provost,
and vice chancellors, as well as associate deans
ECU Assessment Report Review Model
Faculty and Staff → Unit Assessment Coordinator → Assessment Review Committees → Institutional Assessment Advisory Council → Institutional Effectiveness Council
2012-13 Support Unit Assessment Report Evaluation Rubric
Type the “short name” of the outcome you are reviewing. The “short name” can be found in the
Outcomes column to the far left of the four-column assessment report, beneath the word “Outcome”.
Outcome Statement – Statements that describe the key functions and services within the support
unit.
Developing
• Is not stated clearly.
• Is not feasible in terms of collecting accurate and reliable data.

Acceptable
• Represents key functions and/or services of the unit.
• Is stated in terms that can be determined but needs clarification.

Proficient
• Is stated clearly.
• Aligns clearly to the unit’s mission.
• Is feasible to collect accurate and reliable data.

Please provide comments that support any “Developing” or “Acceptable” ratings. The comments will help
the unit see how they can improve their report to become “Proficient.”

Comments _______________________________________________________________________
Means of Assessment – Methods should be briefly described in terms of direct and/or indirect data
collection. The methods should clearly measure the stated outcomes.
Developing
• Assessment instruments and/or methods have not been developed and/or implemented.
• Assessment instruments and/or methods are vaguely described.
• Assessment instruments and/or methods do not measure the stated outcome.

Acceptable
• Assessment instruments and/or methods are understandably described.
• Assessment instruments and/or methods measure the stated outcome.

Proficient
• Multiple means of assessment for the outcome.
• Assessment instruments attached, where appropriate.
• Assessment instruments reflect sound methodology (validated and reliable).

Please provide comments that support any “Developing” or “Acceptable” ratings. The comments will help
the unit see how they can improve their report to become “Proficient.”

Comments _________________________________________________________________________
Criterion for Success – Criterion should describe specifically how well the unit is expected to perform on
the outcome. It should be set at a level appropriate to the unit (i.e. not set too low just to be attainable).

Developing
• Criterion for success (desired level of achievement) was not included, too general, or inappropriate.
• Criterion for success does not match the means of assessment.

Acceptable
• Criterion for success (desired level of achievement) was described for all means of assessment but too much information is provided for clarity.
• Criterion for Success matches all means of assessment.

Proficient
• Criterion for success (desired level of achievement) was clearly and succinctly described for each means of assessment.
• Criterion for Success is appropriate for all means of assessment.

Please provide comments that support any “Developing” or “Acceptable” ratings. The comments will help
the unit see how they can improve their report to become “Proficient.”

Comments ___________________________________________________________________________
Results – Results should include a concise summary of the data collected with the means of assessment and need to be clearly tied to the outcomes.

Developing
• No results reported for the current year.
• Results are too general and do not show whether the criterion for success was met, partially met, or not met.
• Results do not match the means of assessment.

Acceptable
• Some results are reported; however, language is vague or needs revision.
• Results match the means of assessment.
• Results provide evidence that the criterion for success was met, partially met, or not met.

Proficient
• Results are a clear, complete, and well-organized summary of data for all means of assessment used.
• Results include supporting documentation (rubrics, surveys, tables, charts, etc., as appropriate).
Please provide comments that support any “Developing” or “Acceptable” ratings. The comments will help the unit see how it can improve its report to become “Proficient.”
Comments __________________________________________________________________________
Actions Taken – Actions taken should clearly describe what was done (past tense) based on the data collected and indicate how results were used for outcome/unit improvement.

Developing
• Actions taken are missing for the current reporting year.
• Actions taken are future-oriented.
• Actions taken indicate that no changes were needed.
• Actions taken do not describe how results were used for improvement of the outcome/unit.

Acceptable
• Actions taken are described in past tense.
• Actions taken are based on the data collected/results entered.
• Actions taken indicate how results were used for improvement of the outcome/unit, but language is vague or needs revision.

Proficient
• Actions taken are specific and succinct.
• Actions taken clearly describe how results were used for improvement of the outcome/unit.
• Actions taken that require further explanation are described within an attached document.
Please provide comments that support any “Developing” or “Acceptable” ratings. The comments will help the unit see how it can improve its report to become “Proficient.”
Comments _________________________________________________________________________
Follow Up to Actions Taken – Follow up to actions taken should summarize how the actions in the previous reporting year impacted the outcome and whether or not the actions worked.

Developing
• Follow up to the previous reporting year’s actions taken is missing.
• Follow up does not summarize whether or not the actions taken worked.

Acceptable
• Summarizes whether or not the actions taken worked based on the current reporting year’s assessment results, but language is vague or needs revision.

Proficient
• Clearly summarizes whether or not the actions taken worked based on the current reporting year’s assessment results.
Please provide comments that support any “Developing” or “Acceptable” ratings. The comments will help the unit see how it can improve its report to become “Proficient.”
Comments ___________________________________________________________________________
Appendix A – Bloom’s Taxonomy Action Verbs
Knowledge – Remember previously learned information.
Verbs: Arrange, Define, Describe, Duplicate, Identify, Label, List, Match, Memorize, Name, Order, Outline, Recognize, Relate, Recall, Repeat, Reproduce, Select, State

Comprehension – Demonstrate an understanding of the facts.
Verbs: Classify, Convert, Defend, Describe, Discuss, Distinguish, Estimate, Explain, Express, Extend, Generalize, Give example(s), Identify, Indicate, Infer, Locate, Paraphrase, Predict, Recognize, Rewrite, Review, Select, Summarize, Translate

Application – Apply knowledge to actual situations.
Verbs: Apply, Change, Choose, Compute, Demonstrate, Discover, Dramatize, Employ, Illustrate, Interpret, Manipulate, Modify, Operate, Practice, Predict, Prepare, Produce, Relate, Schedule, Show, Sketch, Solve, Use, Write

Analysis – Break down objects or ideas into simpler parts and find evidence to support generalizations.
Verbs: Analyze, Appraise, Break down, Calculate, Categorize, Compare, Contrast, Criticize, Diagram, Differentiate, Discriminate, Distinguish, Examine, Experiment, Identify, Illustrate, Infer, Model, Outline, Point out, Question, Relate, Select, Separate, Subdivide, Test

Synthesis – Compile component ideas into a new whole or propose alternative solutions.
Verbs: Arrange, Assemble, Categorize, Collect, Combine, Comply, Compose, Construct, Create, Design, Develop, Devise, Explain, Formulate, Generate, Plan, Prepare, Rearrange, Reconstruct, Relate, Reorganize, Revise, Rewrite, Set up, Summarize, Synthesize, Tell, Write

Evaluation – Make and defend judgments based on internal evidence or external criteria.
Verbs: Appraise, Argue, Assess, Attach, Choose, Compare, Conclude, Contrast, Defend, Describe, Discriminate, Estimate, Evaluate, Explain, Judge, Justify, Interpret, Relate, Predict, Rate, Select, Summarize, Support, Value