Assessing Student Learning: Questions to ask about your Plan
Hamline University
February 2008
Susan Hatfield
Winona State University
[email protected]
Assessing Student Learning

Review your Student Learning Outcomes
• 1. Has the program identified program-level student learning outcomes?
Review your Student Learning Outcomes
• 1A: Reasonable number?
• 1B: Tied to mission / goals?
• 1C: Appropriate format?
• 1D: Cognitively appropriate?
• 1E: Do faculty agree on the definition of the outcome? (components)
• 1F: Is there a common language for describing student performance? (characteristics)
A Hamline graduate will be able to:
-- serve, collaborate, and lead in a community
-- solve problems in innovative, integrative, analytical, and ethical ways
-- work and create understanding across cultural differences locally, nationally, and internationally
-- use information and technology competently and responsibly
-- communicate effectively in writing and in speaking
-- apply the theories and methods of a field of expertise
-- engage independently and reflectively in lifelong learning
Student Learning Outcomes
• FORMAT: Students should be able to <<action verb>> <<something>>
Student Learning Outcomes
• Learner centered
• Specific
• Action oriented
1D: Cognitively appropriate? Action verbs by cognitive level:

KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline

COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate

APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write

ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test

SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write

EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
Lower division course outcomes: KNOWLEDGE, COMPREHENSION, APPLICATION

Upper division course / program outcomes: ANALYSIS, SYNTHESIS, EVALUATION
Components
• Define student learning outcomes
• Provide a common language for describing student learning
• Must be outcome specific
• Must be shared across faculty
Example #1
Gather factual information and apply it to a given problem in a manner that is relevant, clear, comprehensive, and conscious of possible bias in the information selected.

BETTER: Students will be able to apply factual information to a problem.
COMPONENTS:
Relevance
Clarity
Comprehensiveness
Aware of bias
Example #2
Imagine and seek out a variety of possible goals, assumptions, interpretations, or perspectives which can give alternative meanings or solutions to given situations or problems.

BETTER: Students will be able to provide alternative solutions to situations or problems.
COMPONENTS:
Variety of assumptions, perspectives, interpretations
Analysis of comparative advantage
Example #3
Formulate and test hypotheses by performing laboratory, simulation, or field experiments in at least two of the natural science disciplines (one of these experimental components should develop, in greater depth, students' laboratory experience in the collection of data, its statistical and graphical analysis, and an appreciation of its sources of error and uncertainty).

BETTER: Students will be able to test hypotheses.
COMPONENTS:
Data collection
Statistical analysis
Graphical analysis
Identification of sources of error
Performance Rubric
• Scale or description for assessing each of the components / traits
• Two- to five-point scales for each component / trait
Performance Rubric
Level or degree:
• Accurate, Correct
• Depth, Detail
• Coherence, Flow
• Complete, Thorough
• Integration
• Creative, Inventive
• Evidence based, Supported
• Engaging, Enhancing
Performance Characteristics: Description Anchors
• Missing - Included
• Inappropriate - Appropriate
• Incomplete - Complete
• Incorrect - Partially Correct - Correct
• Vague - Emergent - Clear
• Marginal - Acceptable - Exemplary
• Distracting - Neutral - Enhancing
• Usual - Unexpected - Imaginative
• Ordinary - Interesting - Challenging
Performance Characteristics: Description Anchors
• Simple - More fully developed - Complex
• Reports - Interprets - Analyzes
• Basic - Expected - Advanced
• Few - Some - Several - Many
• Isolated - Related - Connected - Integrated
• Less than satisfactory - Satisfactory - More than satisfactory - Outstanding
• Never - Infrequently - Usually - Always
Performance Rubric
OUTCOME at the top; components as rows; performance anchors as columns (Does not meet Expectations, Meets Expectations, Exceeds Expectations); descriptions of performance in each cell.
Review your Student Learning Outcomes
• 2. Are the outcomes supported by core courses in the curriculum?
• Orphan outcomes?
[Curriculum map: program learning outcomes (rows) by Courses 1-5 (columns), with X marking where each course supports an outcome; an outcome with no X in any course is an orphan outcome.]
• Empty requirements?
[Curriculum map: program learning outcomes by Courses 1-5; a required course with no X for any outcome is an empty requirement. A quick way to check a map for both problems is sketched below.]
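The check illustrated by the two maps above can also be run mechanically. Below is a minimal sketch, assuming the curriculum map is kept as a plain Python dictionary of courses to the outcomes each supports; the course and outcome names are hypothetical placeholders, not Hamline's.

```python
# Minimal, illustrative check of a curriculum map for orphan outcomes and
# empty requirements. All course and outcome names are hypothetical.

PROGRAM_OUTCOMES = {"Outcome 1", "Outcome 2", "Outcome 3", "Outcome 4"}

# Which program outcomes each core course supports (the map above, as data).
curriculum_map = {
    "Course 1": {"Outcome 1", "Outcome 2"},
    "Course 2": {"Outcome 2", "Outcome 3"},
    "Course 3": set(),                      # supports no outcome
    "Course 4": {"Outcome 1", "Outcome 3"},
    "Course 5": {"Outcome 2"},
}

# Orphan outcomes: declared at the program level but covered by no course.
covered = set().union(*curriculum_map.values())
orphan_outcomes = sorted(PROGRAM_OUTCOMES - covered)

# Empty requirements: required courses that support no program outcome.
empty_requirements = [course for course, outs in curriculum_map.items() if not outs]

print("Orphan outcomes:", orphan_outcomes)        # -> ['Outcome 4']
print("Empty requirements:", empty_requirements)  # -> ['Course 3']
```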
Revisit your Assessment Methods
• 3. Does the assessment plan measure student learning, or program effectiveness?
• 4. Does the plan rely on direct measures of student learning?
• 5. Do the learning objects match the outcomes?
• 6. Is there a systematic approach to implementing the plan?
Revisit your Assessment Methods
Program effectiveness: what the program will do or achieve
• Curriculum
• Retention
• Graduation
• Placement
• Satisfaction (graduate and employer)
Revisit your Assessment Methods
Student learning: what students will do or achieve
• Examines actual student work
Direct Measures of Student Learning
• Capstone experiences
• Standardized tests
• Performance on national licensure, certification, or professional exams
• Portfolios
• Essays or term papers
• Exhibitions, performances
• Evaluation of internships based upon program learning outcomes
Learning Objects
• Standardized exam, abstract, advertisement, annotated bibliography, biography, briefing, brochure, budget, care plan, case analysis, chart, cognitive map, court brief, debate, definition, description, diagram, dialogue, diary, essay, executive summary, exam, flow chart, group discussion, instruction manual, inventory, lab notes, letter to the editor, matching test, mathematical problem, memo, micro theme, multiple choice test, narrative, news story, notes, oral report, outline, performance review, plan, precis, presentation, process analysis, proposal, regulation, research proposal, review of literature, taxonomy, technical report, term paper, thesis, word problem, work of art. (Walvoord & Anderson, 1998)
Matching Outcomes to Objects
• Logical decision
• Demonstrate components
• ? Individual vs. group assignment
• ? Amount of feedback provided in preparation of the assignment
• ? Timing / program / semester
Revisit your Assessment Methods
• Not every outcome in every course by every faculty member every semester
• Identified assessment points
[Curriculum map: program-level student learning outcomes (rows) by Courses 1-5 (columns), with x marking the identified assessment points.]
Reconsider your Implementation Plan
• 7. Is the assessment plan being phased in?
• 8. What is the method for collecting and organizing data?
[Curriculum map, Phase 4: program-level student learning outcomes by Courses 1-5, with assessment points marked K, A, or S.]
Reconsider your Implementation Plan
• 8. What is the method for collecting and organizing data?
  • Collecting
  • Scoring
  • Sampling
Reconsider your Implementation Plan
• 9. How were faculty trained to use the tools?
  • Decision rules
  • Anchor papers
  • Norming sessions
Reaffirm your Data Interpretation Strategy
• IMPORTANT!! The first data collection is not about the data per se, but instead about testing the methods and tools.
Reaffirm your Data Interpretation Strategy
• WARNING WARNING WARNING!!! Do not be in a rush to Close the Loop. At the same time, be careful of letting the process become gaseous.
Reaffirm your Data Interpretation Strategy
• 10. Did the rubrics distinguish between levels of achievement?
Goal: Communicate Effectively
Outcome: Students will be able to deliver an oral presentation

Component            Does not meet   Meets   Exceeds
Organization         11%             65%     24%
Evidence             23%             75%     2%
Transitions          14%             81%     4%
Verbal Delivery      35%             54%     13%
Nonverbal Delivery   19%             71%     10%
[Table: test questions mapped to outcome components 1-3, with the percentage of students meeting or exceeding expectations on each question: 56%, 77%, and 43%.]
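To make the arithmetic behind tables like these concrete, here is a minimal sketch of how such percentages could be tallied from raw rubric ratings. The component names and ratings below are invented for illustration; only the three anchor labels come from the rubric above.

```python
from collections import Counter

# Hypothetical rubric ratings: one anchor label per student, per component.
LEVELS = ["Does not meet", "Meets", "Exceeds"]
ratings = {
    "Organization": ["Meets", "Exceeds", "Meets", "Does not meet", "Meets"],
    "Evidence":     ["Meets", "Meets", "Does not meet", "Meets", "Exceeds"],
    "Transitions":  ["Meets", "Meets", "Exceeds", "Meets", "Does not meet"],
}

for component, scores in ratings.items():
    counts = Counter(scores)
    total = len(scores)
    # Percentage of students at each performance level for this component.
    percents = {level: 100 * counts[level] / total for level in LEVELS}
    meets_or_exceeds = percents["Meets"] + percents["Exceeds"]
    cells = "  ".join(f"{level}: {percents[level]:.0f}%" for level in LEVELS)
    print(f"{component:<13} {cells}  | meets or exceeds: {meets_or_exceeds:.0f}%")
```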
Reaffirm your Data Interpretation Strategy
• 11. What patterns emerged in the data?
• 12. Who was involved in the discussion of the data?
• 13. What plans were made to address issues of concern?
Consistency
• Examines the same practice of an individual or group over time
• Key question:
  » Has this person or group acted, felt, or performed this way in the past / over time?
Consistency
How well are students performing on the departmental learning outcome measures?
[Chart: performance level (low to high) by year, 2003-2008.]
Consensus
• Comparison to or among groups of students
• Variation between disciplines, gender, other demographic variables
• Key questions:
  • What is the general feeling, outcome, attitude, behavior?
  • Do other groups of people act, perform, or feel this way?
Consensus
How well are students performing on the departmental learning outcome measure?
[Chart: performance level (low to high) by group: Females, Males, Transfers, OTA.]
Distinctiveness
• Examines individual or cohort perspectives across different outcomes
• Key question:
  • Does a person or group perform equally well on different outcomes?
Distinctiveness
How well are our students performing on the learning outcomes?
[Chart: performance level (low to high) by outcome: Analysis, Research, Writing, Thinking, Ethics, Speaking.]
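Consistency, consensus, and distinctiveness amount to grouping the same scores three different ways: by year, by student group, and by outcome. A minimal sketch under that assumption (every record below is invented) might look like this:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical scored records: (year, student group, outcome, rubric score on a 1-4 scale).
records = [
    (2007, "Female",   "Writing",  3.2), (2007, "Male",     "Writing",  2.8),
    (2008, "Female",   "Writing",  3.4), (2008, "Male",     "Speaking", 2.6),
    (2008, "Transfer", "Analysis", 2.9), (2007, "Transfer", "Speaking", 3.1),
]

def average_by(records, field):
    """Average rubric score for each value of the chosen field (0=year, 1=group, 2=outcome)."""
    buckets = defaultdict(list)
    for record in records:
        buckets[record[field]].append(record[3])
    return {key: round(mean(scores), 2) for key, scores in buckets.items()}

print("Consistency (by year):       ", average_by(records, 0))
print("Consensus (by group):        ", average_by(records, 1))
print("Distinctiveness (by outcome):", average_by(records, 2))
```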
Discussion Questions
• Based upon your experience, does the data surprise you?
• Does our students' performance on the outcome meet our expectations?
• How can this data be validated? What else can we do to find out if this is accurate?
Closing the Loop
• Development
  • Faculty, Staff, Student
• Infrastructure
  • Policy, Process, Planning
• Curriculum
• Learning Opportunities
The Seven Principles for Good Practice in Undergraduate Education
• 1. Student-Faculty Contact
• 2. Cooperative Learning
• 3. Active Learning
• 4. (Prompt) Feedback
• 5. Time on Task
• 6. High Expectations
• 7. Respect for Diverse Talents and Ways of Learning
Assessing Student Learning: Questions to ask about your Plan
Hamline University
February 2008
Susan Hatfield
Winona State University
[email protected]