
Engaging Students in Higher Level Thinking with Multiple Choice Questions

When seeking efficient and reliable measures of student learning, faculty might find multiple
choice tests appealing. After all, multiple choice assessments tend to be easier to grade and
more objective than their constructed response counterparts. But multiple choice assessments
are more than a convenience.

Despite the belief that multiple choice tests emphasize lower level skills such as recall and
comprehension, the multiple choice format, by its very nature, requires students to engage in
one of the highest levels of Bloom’s taxonomy, evaluation. As students weigh one option
against another to determine the “best” response, they are practicing the skills of comparing,
making judgments, and, in some cases, reflecting to justify their final answer. Even if the
student isn’t sure of the correct answer, the process of eliminating incorrect answers requires
these same higher level skills.

By incorporating Bloom’s higher level verbs, rewording open-ended or lower level questions,
and adding explanation components to multiple choice questions, faculty can design multiple
choice tests that encourage evaluation and other higher level thinking skills. See the examples
below.

Incorporate Verbs from Higher Levels of Bloom’s into Question Stems

• Evaluate the following options, then select the one that is the most … for …
• Which of the following best distinguishes … from …?
• If applying … to …, which of the following is a possible outcome?
• Which of the following judgments could you make about … based on …?
• Which evidence justifies …?
• Which of the following would disprove …?

Reword Existing Questions

• Reword open-ended questions by changing the key verb to a noun (ex. change “Describe …” to “Which is the best description of …?”) (Dickinson, 2011)
• Change simple questions into multi-logic questions that require students to combine knowledge from more than one area to solve a problem, draw a conclusion, etc. (ex. interpret results from a graph, then select the principle that best explains the result) (Brame, 2015)

Mix Multiple Choice and Constructed Response

• Have students elaborate on their final answer choice and/or explain why the remaining choices are not the best
• Offer more than one possible correct answer, then ask students to choose one (or more) and justify their choice(s)
• Give students the chance to challenge a test question in writing, explaining why the question (or answer choices) might not be valid (Kerkman & Johnson, 2014)
Another approach, micro-questioning, involves creating a series of multiple choice items for
each learning objective that helps students “hit the target” from multiple angles (Kuddus,
2016). The questions for each objective range from those that test lower levels of Bloom’s
taxonomy to those that involve practical application of the objective. Questions can be recycled
and used for multiple learning tasks, including quizzes, online practice, in-class group activities,
and exam review.

Creating good multiple choice questions can be challenging, even when testing lower level
skills. When constructing questions for higher level thinking, be aware of the pitfalls described
below, which can hinder the higher level thinking you intend to assess.

Pitfalls to Assessing Higher Level Thinking with Multiple Choice Questions


Pitfall 1: Including obvious or “silly” distractors. Including answer choices that are obviously
wrong or “silly” increases the probability of students choosing the correct answer through
guessing because they have fewer legitimate answer choices. To avoid this pitfall, make all
answer choices plausible. Plausible answers are often common misconceptions (Brame, 2015).

Pitfall 2: Using examples and wording directly from the text or class. Using the same wording
and examples from the course text or class discussions emphasizes recognition and recall. To
avoid this pitfall, present new examples and contexts (Dickinson, 2011), and paraphrase any
ideas taken directly from the text.

Pitfall 3: Testing for minor details that students can merely memorize. Asking questions about
minor or trivial details puts the focus on recall rather than analysis, application, and evaluation.
To avoid this pitfall, focus multiple choice questions on concepts or processes.

For additional tips on constructing multiple choice questions, see the resources available from
the eLearning Coach.

References

Brame, C. J. (2015). Writing good multiple choice test questions. Vanderbilt University Center
for Teaching. Retrieved from http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/
Brigham Young University Faculty Center. (2001). 14 rules for writing multiple choice questions.
Retrieved from https://testing.byu.edu/handbooks/14%20Rules%20for%20Writing%20Multiple-Choice%20Questions.pdf

Dickinson, M. (2011). Writing multiple-choice questions for higher-level thinking. Learning
Solutions Magazine. Retrieved from http://www.learningsolutionsmag.com/articles/804/writing-multiple-choice-questions-for-higher-level-thinking

Kerkman, D. D., & Johnson, A. T. (2014). Challenging multiple-choice questions to engage critical
thinking. Insight: A Journal of Scholarly Teaching, 9, 92-97. Retrieved from
http://www.insightjournal.net/Volume9/8ChallengingMultiple-ChoiceQuestionsEngageCriticalThinking.pdf

Kuddus, R. (2016). The micro-questioning approach for content transmission. Presentation at
the Lilly International Conference, Bethesda, MD.
