Classroom Assessment Strategies
Chapter Fifteen
Educational Psychology: Developing Learners, 6th edition
Jeanne Ellis Ormrod
Assessments as Tools
 Assessment is the process of observing a sample of a student’s behavior and drawing inferences about the student’s knowledge and abilities.
   When we assess students, we typically observe only a sample of their classroom behavior.
Assessments as Tools
 Assessment instruments do not dictate the decisions to be made.
 Teachers, administrators, government officials, parents, and even students interpret assessment results and make decisions based on them.
 Assessments are tools.
   They allow us to make informed decisions about how best to help our students learn and achieve.
 Assessment interpretation can be abused.
Assessment: Contrasting Forms
 Informal assessment vs. formal assessment
 Paper-pencil assessment vs. performance assessment
 Traditional assessment vs. authentic assessment
 Standardized tests vs. teacher-developed assessments
Using Assessment for Different Purposes
 Two basic types of assessment:
   Some assessments are formative and assess students’ knowledge before or during instruction.
     Examples: homework assignments, in-class assignments, quizzes
   Some assessments are summative and assess students’ achievement after instruction.
     Example: exams
Other Purposes of Assessment
 To promote learning
   For assessment to promote students’ learning and achievement, it should:
     Provide specific and concrete feedback
     Act as a learning experience, letting students know what they have and have not mastered
     Act as a motivator: students should know what to study and when
     Act as a review mechanism
     Influence cognitive processing
Other Purposes of Assessment
 To guide instructional decision making
 To assist in the diagnosis of learning and
performance problems
 To promote self-regulation
 To determine what students have learned
Important Qualities of Good Assessment
 Remember RSVP:
 Reliability
   The results of an assessment should be consistent no matter when it is given.
 Standardization
   The assessment should have a similar format, content, and procedure for all students.
Important Qualities of Good Assessment
 Validity
   The assessment should measure what it is intended to measure.
 Practicality
   The assessment and its procedures should be fairly simple to use and take only a small amount of time to administer and score.
Reliability
 There may be slight variation from time to time (see the sketch after this list for one way to estimate consistency).
   Students change from day to day.
   The physical environment may change.
   Teachers are sometimes clearer in their instructions than at other times.
   There is always subjectivity in scoring.
     Subjectivity is more likely when responses are scored on the basis of vague, imprecise criteria.
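A reliability coefficient puts a number on this consistency. The sketch below is a minimal, hypothetical illustration (the scores and helper function are invented, not from the text): it computes a test-retest reliability coefficient as the Pearson correlation between two administrations of the same assessment to the same students.

    # Test-retest reliability: correlate students' scores from two administrations
    # of the same assessment. All scores below are hypothetical.
    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two lists of scores."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var_x = sum((x - mean_x) ** 2 for x in xs)
        var_y = sum((y - mean_y) ** 2 for y in ys)
        return cov / (var_x ** 0.5 * var_y ** 0.5)

    first_administration = [78, 85, 92, 64, 70, 88]
    second_administration = [75, 88, 90, 66, 72, 85]
    print(pearson_r(first_administration, second_administration))  # about 0.97: highly consistent

Coefficients near 1.0 indicate consistent results; noticeably lower values point back to the sources of variation listed above.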
Enhancing Reliability
 Include several tasks in each instrument and look for consistency in students’ performance
 Define each task clearly so students know exactly what they are being asked to do
 Identify specific, concrete criteria for evaluation
 Try not to let expectations for students’ performance influence judgments
 Avoid assessing students when they are obviously tired, ill, etc.
 Administer assessments in similar ways and under similar conditions for all students
Validity
 Content Validity
   The extent to which an assessment includes a representative sample of tasks within the domain being assessed
   It ensures that what we are testing truly represents what we have taught (the instructional objectives).
     High content validity is essential in summative evaluations.
   Teachers can use a table of specifications to enhance content validity (an illustrative example follows below).
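A table of specifications is a two-way grid crossing the topics taught with the ways students should be able to think about or use them; checking item counts or weights against each cell helps the test mirror the instructional objectives. The topics and percentages below are purely hypothetical, for illustration only:

    Topic         | Knowledge | Application | Analysis | Total
    Fractions     |    10%    |     15%     |    5%    |   30%
    Decimals      |    10%    |     15%     |    5%    |   30%
    Percentages   |    15%    |     20%     |    5%    |   40%
    Total         |    35%    |     50%     |   15%    |  100%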
Validity
 Predictive Validity
   The extent to which the results of an assessment predict future performance
     Such assessments often take the form of aptitude tests.
 Construct Validity
   The extent to which an assessment accurately measures general, abstract characteristics
     E.g., motivation, self-esteem, or intelligence
Informal Assessment
 Informal assessment occurs in our day-to-day interactions with students.
 Advantages:
   It provides continuing feedback about the effectiveness of instructional tasks and activities.
   It helps determine the appropriateness and success of our formal assessments.
   It is easily adjusted.
   It provides valuable clues about social, emotional, and motivational factors affecting classroom performance.
 Disadvantages:
   It is not very reliable or valid.
   We sometimes see the halo effect.
Paper-Pencil Assessment
 Paper-pencil assessment is often the first choice for formal assessment because of its practicality.
 It may use recognition or recall tasks.
   Recognition: multiple-choice, true-false, matching
   Recall: short-answer, essay, word problems
 It often measures only lower-level skills.
   Paper-pencil items can be used to measure higher-level skills, but such questions take more time to write.
   Essays are more often used to measure higher-level skills.
Constructing Paper-Pencil Assessments
 Alternative-Response Items
   Rephrase ideas presented in class or the textbook
   Make statements that clearly reflect one alternative or the other
   Avoid excessive use of negatives
Constructing Paper-Pencil Assessments
 Matching Items
   Keep the items in each column homogeneous
   Have more items in one column than the other
Constructing Paper-Pencil Assessments
 Multiple-Choice Items
   Present distractors that are clearly wrong to students who know the material but plausible to students who haven’t mastered it
   Avoid putting negatives in both the stem and the alternatives
   Use “all of the above” or “none of the above” seldom, if at all
   Avoid giving logical clues about the correct answer
Constructing Paper-Pencil Assessments
 Short-Answer and Completion Items
   Indicate the type of response required
   For completion items, include only one or two blanks per item
 Problems and Interpretive Exercises
   Use new examples and situations
   Include some irrelevant information, so students must identify what is relevant
Constructing Paper-Pencil Assessments
 Essay Tasks
   Ask for several essays requiring short responses rather than one essay requiring a lengthy response
   Give students a structure for responding
   Ask questions that can clearly be scored as correct or incorrect
General Guidelines for Constructing
Paper-Pencil Assessments
 Define tasks clearly and unambiguously
 Decide whether students should have access
to reference materials
 Specify scoring criteria in advance
 Place easier and shorter items at the
beginning of the instrument
 Set parameters for students’ responses
Administering the Assessment
 Provide a quiet and comfortable environment
 Encourage students to ask questions when
tasks are not clear
 Take steps to discourage cheating
Strategies for Scoring Students’ Responses
 Specify scoring criteria in concrete terms
 Unless specifically assessing grammar skills, score grammar and spelling separately from the content of students’ responses
 Skim a sample of students’ responses ahead of time
 Score item by item rather than paper by paper
 Try not to let prior expectations of students’ performance influence judgments of their actual performance
 Keep students’ scores confidential
Performance Assessment
 Performance assessment can be used for measuring mastery of skills such as:
   Playing a musical instrument
   Performing a workplace routine
   Engaging in a debate
 It is ideal for the assessment of complex achievements.
Choosing Appropriate Performance Tasks
 Four distinctions help in choosing the tasks most appropriate for the purpose:
 Decide whether to look at the products, the processes, or both
   Is what you are assessing tangible (a product) or a behavior (a process)?
 Determine whether you need an individual or a group performance
   This depends on what you are assessing.
Choosing Appropriate Performance Tasks
 Restricted vs. extended performance
   E.g., is the student playing a few notes or an entire piano piece?
 Should you use static or dynamic assessment?
   Dynamic assessment applies the Vygotskian concept of the zone of proximal development.
Planning and Administering Performance Assessments
 Consider incorporating the assessment into normal instructional activities
 Provide an appropriate amount of structure
 Plan classroom management strategies for the assessment activity
   Be continually aware of what the students are doing and make sure all students are busy and engaged
Strategies for Scoring Student Performance
 Consider using checklists, rating scales, or both in your rubric
 Decide whether analytic or holistic scoring better serves your purpose(s)
   Analytic: scoring a student’s performance by evaluating various aspects of it separately
   Holistic: summarizing a student’s performance with a single score
 Limit the criteria to the most important aspects of the desired response
 Describe the criteria as explicitly and concretely as possible
 Make note of other significant aspects of a student’s performance that the rubric doesn’t address
Including Students in the Assessment Process
 Including students in the process encourages them to self-assess.
 Teachers should:
   Provide examples of “good” and “poor” products
   Make evaluation criteria explicit
   Allow students to compare self-ratings with teacher ratings
   Encourage self-reflection via daily journal entries
Evaluating Assessment Tools
 An item analysis can be done to determine whether particular items measure the knowledge or skill we intended to measure (a brief sketch of both statistics follows below):
   Item difficulty measurements
   Item discrimination measurements
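As a minimal, hypothetical sketch (the student data and helper functions below are invented for illustration, not taken from the text), item difficulty is typically the proportion of students who answer an item correctly, and item discrimination compares that proportion between the highest- and lowest-scoring students on the test as a whole.

    # Item analysis for a single test item, using hypothetical data for eight students.
    def item_difficulty(responses):
        """Proportion of correct responses (1 = correct, 0 = incorrect)."""
        return sum(responses) / len(responses)

    def item_discrimination(responses, total_scores):
        """Difficulty among the top half of scorers minus difficulty among the bottom half."""
        ranked = [r for _, r in sorted(zip(total_scores, responses), reverse=True)]
        half = len(ranked) // 2
        return item_difficulty(ranked[:half]) - item_difficulty(ranked[-half:])

    item3_correct = [1, 1, 0, 1, 0, 0, 1, 0]          # did each student answer item 3 correctly?
    total_scores  = [95, 88, 52, 90, 60, 47, 83, 55]  # each student's total test score
    print(item_difficulty(item3_correct))                    # 0.5 -> moderately difficult
    print(item_discrimination(item3_correct, total_scores))  # 1.0 -> discriminates well

An item with a discrimination value near zero (or negative) deserves a second look: students who did well on the test overall did no better on it than students who did poorly.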
Taking Student Diversity into Account
 Some things to keep in mind:
   Students often suffer from test anxiety.
   Gender and ethnic differences may affect students’ assessment performance independently of their actual learning and achievement.
   Assessment instruments must comply with federal mandates regarding students with special needs.
The “Big Picture” of Assessment
 Our assessments will indirectly affect students’ learning and achievement.
 Our instruments and practices should match our instructional goals and objectives.
 Remember RSVP.
 Our scoring criteria should be as explicit as possible.
 Students’ errors provide valuable information about where their difficulties lie.
 We should continually evaluate our instruments.