Choosing the Right Assessment Techniques

Session 301
Margaret Martinez, Ph.D., Instructional Psychologist
CEO, The Training Place, Inc.
The eLearning Guild, DevLearn 2007
November 5-8, 2007, San Jose, CA
www.eLearningGuild.com

Engaging learners in the excitement of learning!
Passionate experiences inspire passionate learners!
Highlights
• Why assess, and which techniques align with business or academic objectives
• Core principles, strategies, and assessment types that fit a specific objective
• The pros and cons of different assessment strategies
• Selecting assessment tools
• Do’s
Your Challenge
• In fresh approaches to teaching and learning, deciding what students need to know and should be able to do is critical.
• Assessments that demonstrate increased accountability, and the manner and degree to which learning takes place, are increasingly critical.
http://connect.educause.edu/library/abstract/TheNewEconomyTechnol/44829?time=1191977581
Purpose
Assessment is the process of using criteria to measure and provide feedback on outcomes such as achievement of instructional goals, skills, knowledge, competencies, attitudes, or beliefs.
Measurement
• Measurement is the estimation of the magnitude or range of some attribute in order to achieve some objective.
• Examples: surveys, quizzes, inventories.
Types
Assessment can be:
• Formative (pre-), periodic, and summative (post-); practice; self-assessment
• Objective and subjective
• Referencing (criterion-referenced, norm-referenced, and ipsative)
• Informal and formal
• Portfolio, projects, observations, or artifacts
Goals?
Reasons for assessment:
• Make you look good
• Memory retention
• Long-term improvements
• Compliance
• Accountability
• Competence
• Stakes (high, medium, low)
Test Objectives?
• What users should be able to do, know, or believe that would demonstrate they have accomplished your intended objectives.
http://edtech.tennessee.edu/~bobannon/writing_objectives.html
http://www2.gsu.edu/~mstmbs/CrsTools/Magerobj.html
Activity
1. Think about an assessment that you gave that really worked well.
   a) What was your objective?
   b) What attributes worked well?
2. Share this experience with the partner beside you and see if you have any common elements.
Assessment Cycle
[Diagram: the assessment cycle, with elements including Mission/Purposes, Objectives, Standards, Outcomes, Methodology, Implementation, Results, Assessment Plan, Test Design, Test Development, Item Writing, Item Review, Item Bank, Analysis, Test Delivery, Test Analysis, Evidence/Interpretation, and Revisions]
http://www.skidmore.edu/administration/assessment/H_Sample_Assessment_Plan.htm
http://en.wikipedia.org/wiki/Test_Plan
Kirkpatrick
• Level 1 measures learner reaction and perceptions. (Did they like it? Was it relevant?)
• Level 2 measures whether learners achieved the learning objectives (e.g., a test of phone skills).
• Level 3 measures whether learners are able to apply their new knowledge and skills. (Has their behavior changed?)
• Level 4 measures whether the learning was successful in business terms (e.g., increased sales, reduced costs).
Jack and Patti Phillips
• Level 5 measures return on investment: the financial benefit to the company compared with the training investment (e.g., for larger training programs). A rough sketch of the calculation appears below.
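
A minimal sketch of the Level 5 calculation, assuming the program's monetized benefits and fully loaded costs have already been isolated; the function name and figures below are illustrative, not from the session:

def phillips_roi(program_benefits: float, program_costs: float) -> float:
    """ROI as a percentage: net program benefits divided by program costs, times 100."""
    net_benefits = program_benefits - program_costs
    return (net_benefits / program_costs) * 100

# Illustrative numbers only: a program costing $80,000 that yields $200,000
# in monetized benefits returns 150% on the training investment.
print(phillips_roi(200_000, 80_000))  # 150.0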
Bloom et al.
Three domains of educational activities:
• Cognitive: mental skills (Knowledge)
• Affective: growth in feelings or emotional areas (Attitude)
• Psychomotor: manual or physical skills (Skills)
Knowledge
Recall data or information.
Ex: Recite a policy.
Keywords: define, describe, recall, state, list, identify, match, point to, summarize
Comprehension
Understand meaning.
Ex: Explain a problem in one's own words.
Keywords: annotate, explain, give examples, predict, infer, interpret, calculate, convert
Application
Apply what was learned to a new situation.
Ex: Use statistics to evaluate the reliability of a written test.
Keywords: apply, change, compute, construct, demonstrate, discover, manipulate, modify, operate, predict, prepare, produce, relate, show, solve
Analysis
Explore concepts so that their parts and structure may be understood; distinguish between facts and inferences.
Ex: Troubleshoot a piece of equipment by using logical deduction.
Keywords: analyze, break down, compare, contrast, diagram, deconstruct, differentiate, discriminate, distinguish, identify, illustrate, outline, relate, select, separate
Synthesis
Gather elements to form a new model, with an emphasis on creating a new meaning, application, or understanding.
Ex: Write an SOP to improve operations.
Keywords: categorize, combine, compile, compose, create, devise, design, explain, generate, modify, organize, plan, rearrange, reconstruct, relate, reorganize, rewrite
Evaluation
Make judgments about the value of ideas or materials.
Ex: Justify a new budget.
Keywords: appraise, compare, conclude, contrast, criticize, critique, defend, discriminate, evaluate, justify, support
Framework
Item Types
• Authentic Assessment
Activity
• You are designing an assessment for your audience.
  – Objective:
  – Choose an item type and write an item for Knowledge.
  – Choose an item type and write an item for Comprehension.
  – Choose an item type and write an item for Analysis.
  – Choose an item type and write an item for Application.
Score Interpretation
• People want to know what their scores mean:
  – Criterion-referenced: compared against a fixed standard or cut score
  – Norm-referenced: compared against the performance of other test takers
A rough sketch contrasting the two follows.
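
As a rough illustration (not part of the original slides), the same raw score can be reported both ways; the cut score and comparison cohort below are made up:

def criterion_referenced(score: float, cut_score: float) -> str:
    # Compare the score against a fixed standard (mastery vs. non-mastery).
    return "pass" if score >= cut_score else "fail"

def norm_referenced(score: float, group_scores: list[float]) -> float:
    # Report the score as a percentile rank within a comparison group.
    below = sum(s < score for s in group_scores)
    return 100 * below / len(group_scores)

cohort = [55, 60, 62, 68, 70, 72, 75, 80, 85, 90]   # hypothetical cohort
print(criterion_referenced(72, cut_score=75))        # "fail": below the standard
print(norm_referenced(72, cohort))                   # 50.0: better than half the cohort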
Validity and Reliability
A valid assessment:
• Measures what it is intended to measure
• Supports the interpretations made from its scores
A reliable assessment:
• Produces consistent results among similar groups of test takers (a rough computational sketch follows)
http://www.questionmark.com/newsletter/newsus4463.htm
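
One common way to quantify that consistency is an internal-consistency coefficient such as Cronbach's alpha. The sketch below uses made-up right/wrong item scores and is a generic illustration, not a method from the session:

def cronbach_alpha(item_scores: list[list[float]]) -> float:
    """Internal-consistency reliability; rows are examinees, columns are items."""
    k = len(item_scores[0])  # number of items

    def variance(values):    # population variance
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in item_scores]) for i in range(k)]
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 1/0 (right/wrong) responses: 4 examinees x 3 items.
responses = [[1, 1, 1], [1, 0, 1], [0, 0, 1], [0, 0, 0]]
print(round(cronbach_alpha(responses), 2))  # 0.75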
Controversy
• Success should not be judged by a single measure
• Some people experience anxiety in test-taking situations
• High-stakes results may be used punitively
• Conformity: to whose standards?
• A post-assessment is not necessarily a reliable predictor of future performance
• Bias
Assessment Tool Selection
• What is my budget?
• What are my assessment capabilities and analysis needs?
• How easily can I fit the tool's methods to my needs?
• Who needs reports to make decisions with this data?
• Will this kind of evidence help me make decisions?
• How will I document the evidence and the decisions made?
Do’s
• Diagnostic assessments can help us identify and support more suitable learning experiences.
• Formative assessments can provide cues or strategies that aid retrieval of learned information.
• Summative assessments can help maximize future retrievability of learned information:
  – The assessment context should mirror or simulate the future retrieval context.
  – Some methods or items prove better than others at producing future context-generated retrieval.
Glossaries
• http://wiki.literacytent.org/index.php/ALEGlossary
• http://www.newhorizons.org/strategies/assess/terminology.htm
• http://www.sabes.org/assessment/glossary.htm
• http://www.questionmark.com/us/glossary.aspx
Measuring e-Learning Success
• The eLearning Guild announces the release of a new report about assessment, available at the Guild Research-based Management Symposiums (DevLearn 2007).
Summary
In your assessment, remember to:
– identify the assessment objectives first
– design the assessment up front
– use emotions to trigger improved performance and retention
– use audience analysis to identify potential bias
Assessment Links
• http://www.brookes.ac.uk/services/ocsd/2_learntch/methods.html
• http://www.ncrel.org/sdrs/areas/as0cont.htm
• http://www2.acs.ncsu.edu/UPA/assmt/
• http://jonathan.mueller.faculty.noctrl.edu/toolbox/
• http://www.aacu.org/resources/assessment/index.cfm
• http://www.google.com/search?hl=en&rlz=1T4GGIH_enUS211US211&q=assessment+wiki
• http://en.wikipedia.org/wiki/IEEE_829
To Your Learning & Assessment Success
• Maggie Martinez
[email protected]
(520) 877-3991
743 W Bougainvillea Drive
Oro Valley, AZ 85737