
Nurse Education Today 133 (2024) 106052


Review

Changing test answers: A scoping review


Jean S. Coffey a,*, Annette T. Maruca a, E. Carol Polifroni b, Marianne Snyder c

a 231 Glenbrook Road, Storrs, CT 06268, United States of America
b UConn Office of Clinical Placement Coordination, 231 Glenbrook Road, Storrs, CT 06268, United States of America
c University of Connecticut, 231 Glenbrook Road, Storrs, CT 06268, United States of America

Keywords: Backtracking; Changing answers; Test wiseness; Scoping review

Abstract

Objective: Various disciplines at all education levels worldwide are steeped in contradictions regarding the value of answer changing on exams. Therefore, the purpose of this scoping review was to explore the current evidence related to answer changing at the graduate and undergraduate levels, with a focus on nursing education.
Design: The scoping review was guided by Arksey and O'Malley's scoping review framework. The team used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews guideline to report the findings.
Data sources: Using identified keywords such as answer changing, backtracking, and test wiseness, researchers located forty-nine relevant studies from multiple databases. After applying the predetermined inclusion criteria, which included college students from all disciplines at the undergraduate or graduate level, ten studies remained for review.
Review method: The a priori assumption was that changing answers on tests will lower scores. Studies were randomly assigned to each researcher for a first and second review using an author-designed abstraction tool. Verbal reconciliation was the third and final review of each article.
Results: The scoping review revealed that answer changes produced more wrong-to-right than right-to-wrong answers. In addition, total score improvements with answer changes were as high as 45.15 %.
Conclusion: The synthesis of findings from 10 articles refuted assumptions that answer changing would negatively impact the number of correct responses and potentially lower the overall exam score. The findings from this scoping review support permitting answer changes. However, the varied research approaches in the studies reviewed also suggest the need for further research on the topic.

1. Background

When a group of nursing faculty were approached by nursing students seeking guidance about the practice of changing answers during an exam, the faculty searched the literature to inform an evidence-based response. A comprehensive literature review revealed a dearth of current studies on the topic; many were written in the previous several decades, and very few focused on nursing (Blakeman and Laskowski, 2020).
What became evident in exploring the scant available literature and discussing the topic at a faculty meeting was that faculty adhered to how they were taught: "go with your first answer and do not change it". Some faculty and students, however, were found to endorse changing answers (Blakeman and Laskowski, 2020; Bultas et al., 2021; Merry et al., 2021; Yonemoto et al., 2023). Despite the substantial body of evidence to the contrary in disciplines other than nursing, it appeared many educators still held tight to the belief, without evidence, that students should go with their "first hunch" (Ouyang et al., 2019).
These findings prompted the question of whether changing answers to test questions is beneficial or detrimental to a student's test grade. This topic has been debated for the past several decades in the education literature (Merry et al., 2021), albeit not recently. Waddell and Blankenship (1994) noted that "more than 50% of faculty and students" believed that changing answers would lower test scores (p. 155). Of significance, when study variables such as gender, personality traits, academic ability, item difficulty, and item position were correlated with answer changing, the only statistically significant variable was total test score, which correlated positively with answer changing (p. 157). Still, the belief and practice that changing answers would be detrimental to student test scores persists (Nieswiadomy et al., 2001). Advice given

* Corresponding author.
E-mail addresses: jean.coffey@uconn.edu (J.S. Coffey), annette.maruca@uconn.edu (A.T. Maruca), carol.polifroni@uconn.edu (E.C. Polifroni), marianne.snyder@uconn.edu (M. Snyder).

https://doi.org/10.1016/j.nedt.2023.106052
Received 6 June 2023; Received in revised form 13 November 2023; Accepted 18 November 2023
Available online 23 November 2023
0260-6917/© 2023 Published by Elsevier Ltd.

by educators on test-taking strategies surrounding changing answers varies and may not be informed by best practices (Merry et al., 2021). Answer changing as a test strategy is frequently misunderstood by both test takers and educators. In addition to the variables mentioned previously, other variables, such as how confident examinees are in changing answers and what reasons they identify for changing an answer, were examined by some for the interrelationship between monitoring and control, or more specifically metacognition, and the role this has during test taking (Bultas et al., 2021; Stylianou-Georgiou and Papanastasiou, 2017; Papanastasiou and Stylianou-Georgiou, 2022).
Both the question of the impact of answer changing and its answer are not simple, which reinforces and prolongs the varied practices among educators and students. This is further impacted when considering the diversity of nursing students across the globe. "Evolving cultural, economic, political, social, and technological forces in the world community affect local and global nursing … ." (Kulbok et al., 2012, p. 1). For example, the health workforce in the U.S. is slightly more diverse than the overall population (Snyder et al., 2018). Acknowledging the multiple forces impacting nursing globally and the increase in diversity in nursing, it is important to gain an understanding of the best guidance to provide students on test taking strategies and answer changing. With the combination of paper and pencil tests and the increased move to computerized examination platforms, the choice to allow test answer changing requires a thoughtful decision by faculty to endorse backtracking and changing test answers (Bultas et al., 2021). The impact of the evolution to computerized testing for nursing has been described in Canada (Hobbins and Bradley, 2013), Ghana (Christmals and Gross, 2019), and the US (Gloekler and Lucas, 2021), including the school of nursing at the authors' university. The switch to a computerized platform for administering examinations, moving away from the paper and pencil format, provides one question at a time. If backtracking is offered as an option, the student can flag a question(s), go forward or backward, and change answers during the examination.
With more computerized testing being used, it became clear that the decision to allow backtracking with answer changing was based on the individual faculty member's belief about the benefit or harm to the student of allowing backtracking; as a result, backtracking was inconsistently used by faculty across the curriculum. In some courses backtracking was permitted by faculty, while in others the test settings were chosen to prevent backtracking. Despite existing literature suggesting it is beneficial for students to change answers on multiple choice tests (Bultas et al., 2021; Gaskins et al., 1996; Merry et al., 2021; Waddell and Blankenship, 1994), the long-held belief by faculty and nursing students continued that it is better to stay with the first choice or "first impression" and not to change the test answer (Nieswiadomy et al., 2001, p. 142). The attitudes and behaviors regarding answer changing are steeped in contradictions worldwide, across disciplines, and at all levels of education. Nursing is no exception to the conundrum regarding backtracking on exams and answer changing. Consequently, more information is needed on this topic, given the scant available evidence.
The aim of this scoping review is to explore the available body of evidence on changing test answers and provide an overview of this topic. Answer changes were identified as: wrong to wrong (W-W), wrong to right (W-R), right to right (R-R), and right to wrong (R-W). The a priori assumption is that changing test answers will lower students' scores and that backtracking is therefore not beneficial. This scoping review may help inform college or school of nursing protocols and has the potential to guide future studies.

2. Method

There are numerous formats to consider when conducting and writing a systematic review. Randles and Finnegan (2023) compare the varied methodologies to consider in a comprehensive review. Munn et al. (2018) identified six reasons to consider a scoping review: identifying the types of extant evidence, clarifying definitions and/or concepts, examining methods used in a specific field, characterizing concepts, serving as a precursor to a systematic review, and analyzing gaps in the literature. Thus, we believe a scoping review was the method of choice for this study, as the body of evidence surrounding changing answers had not been discussed nor used when faculty made decisions about test taking strategies. A scoping review would aid us in knowing the evidence and be a precursor to a systematic review if appropriate. There is little in the nursing literature discussing whether answers should be changed or not; thus, a scoping review within the discipline of nursing was needed. It is important to know and map the available evidence on changing answers, and it is also important to note that a limitation of the scoping review approach is that the quality of the evidence is not addressed. We chose to address this limitation by including only published research studies; in other words, we did not use reports or case studies.
Arksey and O'Malley's (2005) scoping review framework guided this study. Their steps include defining the research question, searching the databases for relevant material, carefully selecting the studies using predetermined inclusion criteria, recording the abstracted data, summarizing the information, and sharing the findings with stakeholders for validation and to inform them of the findings. The use of this framework is described in the subsequent sections.
Beginning with the challenge from students as to why they could not backtrack on exams, we asked the question "what does the evidence say about changing answers?" With the assistance of the nursing liaison from the university's library, Author #1, using the keywords backtracking, changing answers, and test wiseness, identified relevant studies from CINAHL, PubMed, PsycINFO, and ERIC prior to the actual start of the scoping review process.
On examination of the initial search, it was noted that many of the articles were dated in the 1990s, focused on changing answers on paper tests, and included high school and elementary school students. Thus, for this 2023 scoping review, the team decided on the inclusion criteria of 1) articles written in English, 2) original research studies (meta-analyses excluded), 3) peer reviewed publications, 4) no discipline excluded, 5) studies addressing college, graduate, or postgraduate students, and 6) articles published after 2000.

3. Study selection and abstraction tool

Author #3 had previously conducted scoping reviews and led the team in creating the abstraction tool. Data elements deemed important were the inclusion criteria, as well as the geographic location of the study, type of study design, medium of the test (computer or paper), number of participants, discipline of participants, phase of education (college, graduate, or lifelong learning), examination of right to right, right to wrong, and wrong to wrong answer changes, and overall sources of evidence (abstraction tool available upon request). Every article was reviewed independently by two members of the team and then reconciled by the same team members together. If there was a disagreement, the plan was to involve a third member of the team, but this was not needed. The reconciled abstraction was used for this report.
The team used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews guideline to report the findings (Tricco et al., 2018). This manuscript is the final stage of the Arksey and O'Malley (2005) process, wherein the results are shared with stakeholder groups.

4. Summary of evidence

Ten articles remained at the end of the three-step reconciliation process (see Fig. 1). Six of the 10 papers were from the USA; of the other four, two were from Germany and two were multi-national. The education level reported in the studies was primarily undergraduate, with a few graduate level exams. Demographic information on the participants, such as age or gender, was not consistently provided across


Fig. 1. Process of article abstraction.

all studies. Students who participated as test takers were from a range of disciplines including biology, education, physics, medicine, dentistry, and nursing. The number of student participants across the ten studies ranged from 36 in a study of Graduate Medical Examinations in Germany (Fischer et al., 2005) to a high of 27,830 taking the USMLE Step 2 Clinical Knowledge examination in the United States (Ouyang et al., 2019). Table 1 provides additional details on the ten studies.
In the included studies, 9/10 exams were described as high stakes. The mode of exam delivery was primarily via computer (Fischer et al., 2005; Liu et al., 2015; Ouyang et al., 2019; Pagni et al., 2017). The remaining exams, excluding two studies that did not specify delivery method, were paper and pencil. Stylianou-Georgiou and Papanastasiou (2017) administered a paper test and asked the students to use a pen to circle the answers. They felt this would clearly identify the initial answer

Table 1
Summary of articles used to inform the scoping review.

Author(s), year | Location | Discipline | Education phase | Participants | # tests | Test format (a) | High stakes testing | % W-R (wrong to right) | % R-W (right to wrong)
Bauer, Kopp & Fischer, 2007 | Germany | Medicine | College | 79 | 1 | NC | Y | 48.2 % W-R; 30.2 % W-W; 2.5 % grade increase | 21.6 % R-W
Fischer, Herrmann & Kopp, 2005 | Germany | Medicine | GME | 36 | 1 | C | Y | 55 % | 25 %
George, Muller & Bartz, 2016 | USA | Nursing | College | 135 | 472 exams | P | — | 55.6 % | 27.5 %
Liu, Bridgeman, Gu, Xu & Kong, 2015 | Global, 12 countries | Multiple | Certification | 17,638 | 2 (quantitative and verbal sections, rGRE) | C | Y | NC | NC
Merry, Elenchin & Surma, 2021 | USA | Biology | College | 79 | 2 | NC | Y | 2.8 | 1.0
Nieswiadomy, Arnold & Garza, 2001 | USA | Nursing | College | 122 | 7 | P | Y | 86 % gained | 6.7 % lost
Ouyang, Harik, Clauser & Paniagua, 2019 | USA | Medicine | Graduate | 27,830 | 8 | C | Y | 60 % | 40 %
Pagni, Bak, Eisen, Murphy, Finkelman & Kugel, 2017 | USA | Dental students | Graduate | 160 | 7 | C | Y | 64.4 % | Not addressed
Stylianou-Georgiou & Papanastasiou, 2017 | Europe | Education | College | 120 | 1 | P | Y | Average W-R changes: 1.62 (SD = 1.62) | Average R-W changes: 1.26 (SD = 1.5)
Wainscott, 2016 | USA | Physics | College | 985 | 1 | P | Y | More | Less

(a) NC = not clear; P = paper; C = computer.
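As an illustrative aside (not an analysis performed in any of the reviewed studies), the W-R and R-W rates in Table 1 imply a net expected score effect per answer change. A minimal Python sketch, assuming one point per item and that W-W and R-R changes are score-neutral:

```python
# Illustrative sketch: net expected score effect of a single answer change,
# given the fraction of changes that go wrong-to-right (W-R) and
# right-to-wrong (R-W). Assumes one point per item; W-W and R-R changes
# are treated as score-neutral.

def net_gain_per_change(p_wrong_to_right: float, p_right_to_wrong: float) -> float:
    """Expected points gained per answer change."""
    return p_wrong_to_right - p_right_to_wrong

# Rates reported for the USMLE Step 2 CK study (Ouyang et al., 2019).
print(f"Ouyang et al. (2019): {net_gain_per_change(0.60, 0.40):+.2f}")   # +0.20
# Rates reported for the baccalaureate nursing study (Nieswiadomy et al., 2001).
print(f"Nieswiadomy et al. (2001): {net_gain_per_change(0.86, 0.067):+.3f}")  # +0.793
```

Under these assumptions, a change is beneficial on average whenever the W-R rate exceeds the R-W rate, which held in every study in Table 1 that reported both rates.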


and any subsequent changes, which required crossing out a pen mark rather than erasing a pencil mark. However, some students chose, counter to the instructions, to make their original mark in pencil, erase it, and enter their final choice in pen. The authors considered both pen and erased-pencil answer changes to determine their findings. This example highlights the variation in determining answer changes in the paper and pencil tests.
The majority of the ten reviewed studies were nonrandomized, with a convenience sample of individuals required by their course of study to take the exam/test at a prescribed time. Many of the results were from the administration of one test in a discrete time period. However, in one setting, Ouyang et al. (2019) examined results from two test blocks to determine if the time of day impacted test results. They reported there was no significant difference in results from answer changing between the two blocks of time during which the tests were administered.
The results of answer changing on individual questions were reported as Wrong to Right (W-R) or Right to Wrong (R-W). Consistently, the percentage of correct W-R changes, when the student chose to make a change, was in the 48 %–60 % range in the studies reviewed (Fischer et al., 2005; Bauer et al., 2007; George et al., 2016; Ouyang et al., 2019; Pagni et al., 2017). The exception to this range was in one of the nursing studies, where the percentage of W-R changes was 86 % (Nieswiadomy et al., 2001). Overall, answer changes resulted in fewer R-W changes. These ranged from a low of 6.7 % in the nursing study (Nieswiadomy et al., 2001) to a high of 40 % in the graduate medical exam (Ouyang et al., 2019).
The change in the final exam scores was not always reported. However, in those studies that did report it, there was a gain in the total score (Fischer et al., 2005; George et al., 2016; Ouyang et al., 2019; Pagni et al., 2017). Total score increases with answer changes were as high as 45.15 % in an exam administered to 27,830 medical students using a battery of 8 test blocks (Ouyang et al., 2019). Answer changing prompted an increased score of 14.59 points in an exam given to 160 dental students in the United States (Pagni et al., 2017).
The participants, test formats, and test delivery (computer vs paper and pencil) varied across all the studies. In addition, the method of determining a changed answer was unique in each paper reviewed. Some of the authors also added information regarding the student or faculty perspective on answer changing. George et al. (2016) published a mixed methods study reviewing the answer-changing process and the students' lived experience related to their anxiety impacting test taking and answer changing. As for student perception of the value of answer changing, in the studies where this was reported there did seem to be a positive correlation between the belief that answer changing was beneficial and improved scores (Bauer et al., 2007). This belief may have been impacted by faculty guidance regarding answer changing, as reported in some studies reviewed (Bauer et al., 2007; Merry et al., 2021). The impact of faculty guidance and student attitudes, while variables to be considered, was not specifically addressed in most of the papers included in this scoping review. However, Wainscott (2016), who was initially convinced that answer changing was detrimental to student test outcomes, reported results consistent with the other studies reviewed. After examining 985 exams taken by physics students, he stated the positive results of answer changing were a surprise to him. This author goes on to say that he has shared the information on answer changing among his colleagues.
Across the 10 studies reviewed, answer changing resulted in more wrong to right answer changes than right to wrong. In the studies that reported overall test scores, answer changing positively impacted those scores. The results are derived from varying research approaches and determinations of what constitutes an answer change. In addition, only a few of the studies examined the impact of advice provided to students about answer changing.

5. Discussion

This scoping review explored the evidence on changing answers on examinations. The synthesis of findings from 10 articles about changing answers on objective exams refuted long-held faculty and student assumptions that changing answers would lower the overall exam score. We have learned that, although there was variability in testing across these studies conducted in several multi-national locations as well as the United States, allowing students the option to change their test answers will more often result in a W-R change. In fact, the studies in this review supported that changing an answer would result in a correct answer 48 %–60 % of the time (Fischer et al., 2005; Bauer et al., 2007; George et al., 2016; Ouyang et al., 2019; Pagni et al., 2017). Interestingly, among these studies, the only statistically significant variable that positively associated with total test score was changing answers.
Findings also supported that most examinees would change answers on an exam if given the opportunity, so educators are encouraged to allow backtracking on computerized exams whenever possible. Two studies reported that faculty guidance on changing answers on an exam (Bauer et al., 2007; Merry et al., 2021) may have been associated with positive examinee experiences when deciding to change an answer on an exam item. Faculty should encourage examinees before testing to thoughtfully review questions they have answered but are doubtful of before deciding to change their answers, further explaining that if an examinee decides to change an answer, it will often lead to a correct response. Nursing programs and the faculty who teach in them should consider the evidence, which has existed for over 50 years and was recommended by Morgan and Deese (1969): changing answers, with careful thought, will more often result in a higher score and positively affect the final score. Many nursing programs now administer tests using various laptop-based testing programs, within an existing learning management system or through standalone programs. Faculty administering laptop-based exams are encouraged to enable exam settings that allow examinees to backtrack while testing to review previously answered questions, and to encourage them to review questions where they doubted their answer selection and consider changing it.
As diversity increases and program populations include students from various backgrounds and geographic locations, it is important to examine exposure to multiple choice tests, education on test taking strategies, and test taking metacognition (Papanastasiou and Stylianou-Georgiou, 2022) for best outcomes. Evidence must be used for decisions, which means more research is needed. This paper can be a precursor to a potential systematic review and more research on the topic, specifically in nursing.

6. Limitations

The literature on this topic is limited for nursing. The studies used for the scoping review varied in method, exam process, and reporting of the outcomes. While the primary indicators of answer changing, W-R and R-W, were reported in most of the studies, many other variables were inconsistent, and research methods were wide ranging. In addition, the participants, disciplines, and demographics were heterogeneous, and the majority of studies were nonrandomized.

7. Implications for nursing education

The findings from this scoping review have implications for nursing education globally to reconsider the misconception that changing answers on a multiple-choice test will lower the student's overall test score. Students change answers, and as this review shows, doing so tends to benefit the student. Review of nursing protocols should consider allowing backtracking with the option to change answers as part of test taking practice. With encouragement from faculty to reread the question


again and carefully reconsider the answers, the likelihood is greater that students will go from the wrong to the right answer. This can require faculty members to change previously held beliefs regarding changing answers and to be comfortable supporting this approach. Future studies can explore the impact of variables such as faculty guidance, student attitudes towards changing answers, reasons given for changing or not changing answers, and patterns of changing answers. The findings from this scoping review support the need to conduct further research on the efficacy of test taking strategies, as well as faculty beliefs about them, including targeted studies with nursing students worldwide.
Given that many universities now use exam software, the potential decreased variation in the evaluation of answer changes and clear grade outcomes would enhance the findings of any future studies. Variables to be considered in future studies could include clarity on exam format, identification of when the exam is high stakes, randomization with controls, and any pre-test advice given to the students.

CRediT authorship contribution statement

Jean Coffey: Conceptualization, formal analysis, investigation, data curation, writing. Annette Maruca: formal analysis, investigation, data curation, writing. Carol Polifroni: methodology, formal analysis, investigation, data curation, writing. Marianne Snyder: formal analysis, investigation, data curation, writing.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgement

Thank you to Jillian Levesque, RN, Class of 2022, for her advocacy regarding permitting backtracking and answer changing, which prompted this scoping review.

References

Arksey, H., O'Malley, L., 2005. Scoping studies: towards a methodological framework. Int. J. Soc. Res. Methodol. 8 (1), 19–32. https://doi.org/10.1080/1364557032000119616.
Bauer, D., Kopp, V., Fischer, M.R., 2007. Answer changing in multiple choice assessment: change that answer when in doubt – and spread the word! BMC Med. Educ. 7 (28). https://doi.org/10.1186/1472-6920-7-28.
Blakeman, J., Laskowski, P., 2020. Beliefs and experiences of nurse educators regarding changing answers on examinations. Nurs. Educ. Perspect. 41 (2), 97–102. https://doi.org/10.1097/01.NEP.0000000000000497.
Bultas, M.W., Schmuke, A.D., Rubbelke, C., Jackson, J., Moran, V., 2021. Assessment of answer changing behaviors of nursing students and faculty recommendations. J. Nurs. Educ. 60 (6), 324–328. https://doi.org/10.3928/01484834-20210520-04.
Christmals, C.C., Gross, J.J., 2019. An analysis of the introduction of digital nursing licensing examination in Ghana. Int. J. Caring Sci. 12 (3), 1892.
Fischer, M.R., Herrmann, S., Kopp, V., 2005. Answering multiple-choice questions in high-stakes medical examinations. Med. Educ. 39 (9), 890–894. https://doi.org/10.1111/j.1365-2929.2005.02243.x.
Gaskins, S., Dunn, L., Forte, L., Wood, F., Riley, P., 1996. Student perceptions of changing answers on multiple choice examinations. J. Nurs. Educ. 35 (2), 88–90. https://doi.org/10.3928/0148-4834-19960201-09.
George, T.P., Muller, M.A., Bartz, J.D., 2016. A mixed-methods study of prelicensure nursing students changing answers on multiple choice examinations. J. Nurs. Educ. 55 (4), 220–223. https://doi.org/10.3928/01484834-20160316-07.
Gloekler, L.A., Lucas, D., 2021. Nursing students' preferences in test-taking, e-books, and learning styles: a longitudinal study. Int. J. Nurs. Educ. 13 (1), 152–159.
Hobbins, M.S., Bradley, P., 2013. Developing a prelicensure exam for Canada: an international collaboration. J. Prof. Nurs. 29 (2), S48–S52.
Kulbok, P., Mitchell, M., Glick, D., Greiner, D., 2012. International experiences in nursing education: a review of the literature. Int. J. Nurs. Educ. Scholarsh. 9, 1.
Liu, O.L., Bridgeman, B., Gu, L., Xu, J., Kong, N., 2015. Investigation of response changes in the GRE revised general test. Educ. Psychol. Meas. 75 (6), 1002–1020. https://doi.org/10.1177/0013164415573988.
Merry, J.W., Elenchin, M.K., Surma, R.N., 2021. Should students change their answers on multiple choice questions? Adv. Physiol. Educ. 45 (1), 182–190. https://doi.org/10.1152/advan.00090.2020.
Morgan, C.T., Deese, J., 1969. How to Study. McGraw Hill.
Munn, Z., Peters, M.D.J., Stern, C., Tufanaru, C., McArthur, A., Aromataris, E., 2018. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med. Res. Methodol. 18 (143), 1–7. https://doi.org/10.1186/s12874-018-0611-x.
Nieswiadomy, R.M., Arnold, W.K., Garza, C., 2001. Changing answers on multiple-choice examinations taken by baccalaureate nursing students. J. Nurs. Educ. 40 (3), 142–144. https://doi.org/10.3928/0148-4834-20010301-11.
Ouyang, W., Harik, P., Clauser, B.E., Paniagua, M.A., 2019. Investigation of answer changes on the USMLE® Step 2 Clinical Knowledge examination. BMC Med. Educ. 19 (389), 1–7. https://doi.org/10.1186/s12909-019-1816-3.
Pagni, S.E., Bak, A.G., Eisen, S.E., Murphy, J.L., Finkelman, M.D., Kugel, G., 2017. The benefit of a switch: answer-changing on multiple-choice exams by first-year dental students. J. Dent. Educ. 81 (1), 110–115. https://doi.org/10.1002/j.0022-0337.2017.81.1.tb06253.x.
Papanastasiou, E.C., Stylianou-Georgiou, A., 2022. Should they change their answers or not? Modeling achievement through a metacognitive lens. Assess. Educ.: Princ. Policy Pract. 29 (1), 77–94. https://doi.org/10.1080/0969594X.2022.2053945.
Randles, R., Finnegan, A., 2023. Guidelines for writing a systematic review. Nurse Educ. Today 125, 105803. https://doi.org/10.1016/j.nedt.2023.105803.
Snyder, C.R., Frogner, B.K., Skillman, S.M., 2018. Facilitating racial and ethnic diversity in the health workforce. J. Allied Health 47 (1), 58–65.
Stylianou-Georgiou, A., Papanastasiou, E.C., 2017. Answer changing in testing situations: the role of metacognition in deciding which answers to review. Educ. Res. Eval. 23 (3–4), 102–118. https://doi.org/10.1080/13803611.2017.1390479.
Tricco, A.C., Lillie, E., Zarin, W., O'Brien, K.K., Colquhoun, H., Levac, D., Moher, D., Peters, M., Horsley, T., Weeks, L., Hempel, S., Akl, E., Chang, C., McGowan, J., Stewart, L., Hartling, L., Aldcroft, A., Wilson, M., Garritty, C., et al., 2018. PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann. Intern. Med. 169 (7), 467–473. https://doi.org/10.7326/M18-0850.
Waddell, D.L., Blankenship, J.C., 1994. Answer changing: a meta-analysis of the prevalence and patterns. J. Contin. Educ. Nurs. 25 (4), 155–158.
Wainscott, H., 2016. Multiple-choice answers: to change or not to change? Perhaps not such a simple question. Phys. Teach. 54 (8), 469–471. https://doi.org/10.1119/1.4965266.
Yonemoto, G., Kashani, M., Benoit, E.B., Weiss, J.J., Barbosa, P., 2023. Changing answers in multiple-choice exam questions: patterns of top-tier versus bottom-tier students in podiatric medical school. J. Med. Educ. Curric. Dev. 10, 1–7. https://doi.org/10.1177/23821205231179312.
