Multiple Choice Questions - Final
1 For further information on learning outcomes and their alignment to the curriculum, please see the MCSHE resource ‘Writing Learning Outcomes: A Practical Guide for Academics’. Available at: https://melbourne-cshe.unimelb.edu.au/resources/categories/teaching-and-learning/curriculum-design
MELBOURNE CENTRE FOR THE STUDY OF HIGHER EDUCATION, MAY 2020
MELBOURNE CSHE TEACHING AND LEARNING SHORT GUIDE SERIES
A great deal of systematic and anecdotal evidence supports the assumption that MCQs tend to assess low-level knowledge, and there are several reasons why this is so. Among them is that the design of items to test high-level knowledge and understanding is difficult and time-consuming. The design of items that are valid and reliable indicators of high-level knowledge and understanding, with well-designed distractors, requires expertise and experience. It is comparatively much easier to design items to test low-level knowledge and understanding. Indeed, it can be argued that the efficiency in marking items measuring high-level knowledge and understanding may not compensate for the time needed to develop valid and reliable measures of such knowledge and understanding. We will address ways of mitigating these risks later in our list of Do’s and Don’ts.

The effect on students’ approaches to study also needs consideration.2,3 Rote and reproductive approaches to learning are common. The evidence is that students perceive MCQs to test low-level knowledge and understanding; the very structure of MCQs tends to evoke such a perception. Consequently, students tend to adopt ‘surface’ approaches to studying for MCQ tests, perceiving that if they can remember ‘enough stuff’ then they can do well. This perception is difficult to change, and when well-designed items measuring high-level knowledge and understanding are used, students often do poorly. Suggested ways of mitigating this risk are:

A. Incorporating MCQs, including those testing higher-level learning, into lecture, tutorial and online programs and clearly articulating expected learning outcomes. This will help students understand what is being assessed.

B. Using MCQs only to test low-level knowledge, whilst using short open-ended questions to test high-level knowledge.

C. Using MCQs to test both low-level and high-level knowledge within formative assessments, with expected learning outcomes clearly articulated and directed feedback provided.

Suggestion A gives students a mix of low-level and high-level MCQs: asking them to answer the items, then asking them to classify the items as testing low-level or high-level knowledge, and finally discussing their responses contributes to both assessment of, and for, learning. Using this strategy, students experience the items and are assisted to reflect on them, building their understanding of the expected levels of achievement in the subject. Incorporating these interactive sessions …

Suggestion B addresses both the difficulty of designing high-level items and students’ perception that MCQs test low-level knowledge. In this approach, relatively easily designed MCQs that are used to test low-level knowledge are supplemented with short open-ended questions to test high-level knowledge and understanding. This approach mitigates the risk of poorly designed high-level items and is consistent with students’ perceptions of MCQs and open-ended questions.
2 Scouller, K. M., & Prosser, M. (1994). Students’ experiences in studying for multiple choice question examinations. Studies in Higher Education, 19(3): 267–279.
3 Scouller, K. (1998). The influence of assessment method on students’ learning approaches: Multiple choice question examination versus assignment
essay. Higher Education, 35: 453–472.
PART II: Writing MCQs

An MCQ comprises the stem, a question or problem that leads into the list of possible solutions, or alternatives, which include both the key (the correct answer) and several incorrect distractors.

The stem may take the form of a question or statement and may also refer to external or supplementary material, such as a figure or diagram, which students are asked to interpret in selecting the best alternative. Similarly, alternatives can take the form of single terms, statements, figures or diagrams.

Several key considerations need to be applied when writing MCQ items. Taken together, these aim to ensure that MCQs are reliable (consistently measuring a learning outcome), valid (testing student learning at the appropriate level) and focused, with students spending most of their time considering their response to the question rather than making sense of extraneous or irrelevant material. So, when writing an MCQ, one should:

• Make sure all alternatives are plausible and relate in some way to the topic or subject matter. This will increase the question’s reliability, increasing the chances that students who choose the correct answer meet the learning objective.

• Write alternatives that are as similar as possible in grammar, length, language and form. This will reduce the chances that students detect hidden clues as to which alternative is correct, helping to increase the question’s validity.

• Ensure that the stem and alternatives make sense on their own, without reference to additional materials, and use terms and symbols that are familiar to students. This will focus students’ attention on answering the MCQ rather than interpreting its construction, reducing their cognitive load. Similarly, use simple sentence structures, avoiding non-standard lingo and jargon, complex grammar and lengthy unnecessary wording.

• Avoid using negative language within the stem, including double negatives within both the stem and alternatives, unless the learning objectives call for such language. Negative language can both increase a student’s cognitive load, by requiring them to examine the question more carefully to detect it, and decrease the question’s validity, since students who fail to detect it will be less likely to answer correctly. If using negative language, ensure it is flagged or highlighted, for example in bold text or capitalisation. Brame (2013) and Chiavaroli (2017) discuss these aspects in more detail.

In the following section, we present several examples of MCQ items, of different structural forms, that aim to assess both low-level learning outcomes such as recall and comprehension, and higher-level learning outcomes such as analysis, evaluation and synthesis.
…low nutrient content and high fibre content. Which of the following combinations of dentition and digestive tracts would be most suitable for processing this diet?

✓ Most alternative answers are plausible, but Option D could be easily excluded by a student with little knowledge of the subject. Try to ensure each alternative is at a similar degree of difficulty.
4 Green, K. Sample multiple choice questions that test higher order thinking and application. Washington State University Office of Assessment of Teaching and Learning, in McGill University, Workshop: Designing effective multiple choice questions. Available at: https://www.mcgill.ca/skillsets/files/skillsets/mcq_handout3.pdf
5 Burton, S. J., Sudweeks, R. R., Merrill, P. F. & Wood, B. (1991). How to prepare better multiple-choice test items: Guidelines for university faculty. Brigham Young University Testing Centre. http://testing.byu.edu/info/handbookd/betteritems.pdf
Further Reading
Brame, C. (2013). Writing good multiple choice test questions. Retrieved May 18, 2020, from https://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/.

Centre for Teaching Excellence (2020). Designing Multiple-Choice Questions. Retrieved May 18, 2020, from https://uwaterloo.ca/centre-for-teaching-excellence/teaching-resources/teaching-tips/developing-assignments/assignment-design/designing-multiple-choice-questions.

Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2): 3–13.

Tarrant, M., Ware, J. and Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in multiple-choice questions: a descriptive analysis. BMC Medical Education, 9: 40.
The ‘choose the best answer’

• What it is: The most common and familiar form of MCQ. These present the question stem, several incorrect answers to choose from (distractors), along with the single most correct answer (the key). Asking students to select the ‘best answer’ rather than the ‘correct answer’ mitigates against some distractors having correct elements.

• Tips: Use familiar language commonly used during term. Avoid using verbal association clues from the stem in the key. Ensure the distractors are plausible and effectively test students’ knowledge and understanding of the concepts conveyed in the question. Plausible distractor terms could be, for example, common misconceptions on the topic, or numerical answers that are within a plausible range but are not quite correct.

• Benefits: A simple question form that is quick and easy to read for students and easy to grade for teaching staff.

• Challenges: Writing plausible distractors can be difficult – these should be created by content experts. The validity of newly written MCQs can also be evaluated in low-stakes or formative quizzes by analysing student responses. Avoid simply asking students to identify a familiar term from their lectures or textbook.

• Considerations: Choosing how many distractors to include is important. Fewer distractors (2 or 3) can be effective, but only if they are plausible and represent a genuine choice for students (Tarrant et al. 2009; Rodriguez 2005). Too many distractors (more than 3 or 4) may result in students spending substantial time reading, instead of considering the question and choosing a possible answer.
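The advice above to evaluate newly written items by analysing student responses in low-stakes quizzes can be sketched as a minimal distractor analysis. The function and the response data below are hypothetical, included only to illustrate the idea of tallying how often each option, correct or not, is actually chosen:

```python
from collections import Counter

def distractor_analysis(responses, key):
    """Summarise responses to one MCQ item.

    responses: list of single-letter answers from the class.
    key: the correct option.
    Returns (difficulty, counts): the proportion answering
    correctly, and how many students chose each option.
    """
    counts = Counter(responses)
    difficulty = counts[key] / len(responses)
    return difficulty, dict(counts)

# Hypothetical class of ten students answering one item (key = "B").
answers = ["B", "B", "A", "B", "C", "B", "B", "D", "B", "A"]
p, chosen = distractor_analysis(answers, "B")
# A distractor that almost nobody selects is non-functioning
# (Tarrant et al. 2009) and is a candidate for revision or removal.
```

Here six of ten students answer correctly (difficulty 0.6), and options C and D attract only one response each, flagging them for review before the item is reused in higher-stakes assessment.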
The ‘complete the statement’

• What it is: Students choose from a list of statements that address the question or complete the sentence in the stem. Asking students to consider full statements can also help them integrate their conceptual knowledge, whereas picking a single simple answer favours general recall.

• Tips: Ensure that only one statement is truly correct, but that others may also be plausible. Alternative statements should be of similar length and complexity.

• Benefits: Choosing the correct statement can help students recognise and address any misconceptions they have with the material.

• Challenges: Be careful that the statements are not simply copied from the lecture notes or textbook, as this tests students’ recall rather than their conceptual understanding. It can be difficult to create plausible alternative statements without veering into ‘trick’ questions that seem correct but have slight differences in wording that trip up students. Be mindful of students with English as a second language when writing questions and try not to include tricky grammar or non-standard lingo.

• Considerations: Reading through question stems and possible answer statements will take longer than assessing a short list of distractors. Think carefully about the best use of the students’ time, and increase the difficulty of the question and the number of assigned marks if needed.
The ‘figure interpretation’

• What it is: The stem asks students to refer to and analyse supplementary information to answer the question.

• Tips: Ask students to view a figure, diagram or concept map and show their understanding by filling in gaps in the figure or map, choosing from a list of interpretative statements, or making calculations based on the material presented.

• Benefits: Using figures of concepts from lectures, tutorials or other learning activities can help reinforce students’ understanding of these concepts. Higher-level learning processes such as evaluation, interpretation and application of knowledge can be assessed in an easy-to-grade format.

• Challenges: These questions take longer to write, read and complete, so they usually need to be assessed at a higher level and assigned more marks than a simple distractor. Given their long read-time, they may be more suited to take-home or non-timed quizzes and exams.

• Considerations: Using these types of questions in an online MCQ can be beneficial if the diagram, concept map or figure isn’t readily available on the web. Try to create a new or modified figure for the MCQ.
The ‘word match’

• What it is: Students match terms to their descriptions, or match concepts. Functions in a similar way to the ‘most correct statement’ but poses a single question per statement.

• Tips: A word match question can help to clarify where misconceptions exist in the material. Can be modified to test higher-level conceptual knowledge and applications.

• Benefits: Allows students to demonstrate their knowledge of terms and their descriptions. All material is relevant to the question, so plausible alternatives or distractors are not required.

• Challenges: Choosing the correct matches may take some time, particularly if the list is long. Each word match is testing a different piece of knowledge and should be allotted a full mark or grade.

• Considerations: Consider increasing the allotted reading time to suit. May be more beneficial in take-home, non-timed or formative quizzes or exams.
The ‘fill in the blank’

• What it is: Students choose words from a list to fill in gaps within a written statement. Students choose the best term to complete the statement, or the best combination of multiple terms in order.

• Tips: The stem can be a description of a concept, with the blanks representing key terms, directions of responses or concepts.

• Benefits: Tests students’ general knowledge of terminology, which can be an advantage in content-heavy subjects and subject themes. Allows students to show understanding of linked concepts within long statements. Versatile – can test knowledge levels from recall to applications.

• Challenges: Presenting multiple blank spaces for students to fill in may confuse students, so that their answers are not a true reflection of their knowledge. Be careful not to overcomplicate the statement, and make sure the associations are unambiguous. Avoid using in high-stakes or high-pressure situations.

• Considerations: Several guides (e.g. Brame 2013) advise against using this question form, as it requires students to hold the stem statement in their working memory while evaluating each alternative answer. These questions should be used sparingly and their purpose clearly communicated to the students.
melbourne-cshe.unimelb.edu.au