Engaging Students in Higher Level Thinking with Multiple Choice Questions
When seeking efficient and reliable measures of student learning, faculty might find multiple
choice tests appealing. After all, multiple choice assessments tend to be easier to grade and
more objective than their constructed response counterparts. But multiple choice assessments
are more than a convenience.
Despite the belief that multiple choice tests emphasize lower level skills such as recall and
comprehension, the multiple choice format, by its very nature, requires students to engage in
one of the highest levels of Bloom’s taxonomy, evaluation. As students weigh one option
against another to determine the “best” response, they are practicing the skills of comparing,
making judgments, and, in some cases, reflecting to justify their final answer. Even if the
student isn’t sure of the correct answer, the process of eliminating incorrect answers requires
these same higher level skills.
By incorporating Bloom’s higher level verbs, rewording open response or lower level questions,
and adding explanation components to multiple choice questions, faculty can design multiple
choice tests that encourage evaluation and other higher level thinking skills. The examples
below illustrate each approach.
Incorporate Verbs from Higher Levels of Bloom's Taxonomy into Question Stems
• Evaluate the following options, then select the one that is the most … for …
• Which of the following best distinguishes … from …?
• If applying … to …, which of the following is a possible outcome?
• Which of the following judgments could you make about … based on …?
• Which evidence justifies …?
• Which of the following would disprove …?

Reword Existing Questions
• Reword open-ended questions by changing the key verb to a noun (ex. change "Describe …" to "Which is the best description of …?") (Dickinson, 2011)
• Change simple questions into multi-logic questions that require students to combine knowledge from more than one area to solve a problem, draw a conclusion, etc. (ex. interpret results from a graph, then select the principle that best explains the result) (Brame, 2015)

Mix Multiple Choice and Constructed Response
• Have students elaborate on their final answer choice and/or explain why the remaining choices are not the best
• Offer more than one possible correct answer, then ask students to choose one (or more) and justify their choice(s)
• Give students the chance to challenge a test question in writing, explaining why the question (or answer choices) might not be valid (Kerkman & Johnson, 2014)
Another approach, micro-questioning, involves creating a series of multiple-choice items for
each learning objective that helps students “hit the target” from multiple angles (Kuddus,
2016). The questions for each objective range from those that test lower levels of Bloom's
taxonomy to those that involve practical application of the objective. Questions can be recycled
and used for multiple learning tasks, including quizzes, online practice, in-class group activities,
and exam review.
Creating good multiple choice questions can be challenging, even when testing lower level
skills. When constructing questions that target higher level thinking, be aware of the pitfalls
described below.
Pitfalls to Assessing Higher Level Thinking with Multiple Choice Questions

Pitfall 1: Including obvious or "silly" distractors. Including answer choices that are obviously wrong or "silly" increases the probability of students choosing the correct answer through guessing because they have fewer legitimate answer choices to eliminate. To avoid this pitfall, make all answer choices plausible; plausible distractors are often common misconceptions (Brame, 2015).

Pitfall 2: Using examples and wording directly from the text or class. Using the same wording and examples from the course text or class discussions emphasizes recognition and recall. To avoid this pitfall, present new examples and contexts (Dickinson, 2011), and paraphrase any ideas taken directly from the text.

Pitfall 3: Testing for minor details that students can merely memorize. Asking questions about minor or trivial details puts the focus on recall rather than on analysis, application, and evaluation. To avoid this pitfall, focus multiple choice questions on concepts or processes.
For additional tips on constructing multiple choice questions, see the resources available from
the eLearning Coach.
References
Brame, C. J. (2015). Writing good multiple choice test questions. Vanderbilt University Center for Teaching. Retrieved from http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/
Brigham Young University Faculty Center. (2001). 14 rules for writing multiple choice questions. Retrieved from https://testing.byu.edu/handbooks/14%20Rules%20for%20Writing%20Multiple-Choice%20Questions.pdf
Dickinson, M. (2011). Writing multiple choice questions for higher-level thinking. Learning Solutions Magazine. Retrieved from http://www.learningsolutionsmag.com/articles/804/writing-multiple-choice-questions-for-higher-level-thinking
Kerkman, D. D., & Johnson, A. T. (2014). Challenging multiple-choice questions to engage critical thinking. Insight: A Journal of Scholarly Teaching, 9, 92-97. Retrieved from http://www.insightjournal.net/Volume9/8ChallengingMultipleChoiceQuestionsEngageCriticalThinking.pdf
Kuddus, R. (2016). The micro-questioning approach for content transmission. Presentation at the Lilly International Conference, Bethesda, MD.