Medical Physics and Informatics • Review

Lee et al.
Diagnostic Errors in Radiology

Downloaded from www.ajronline.org by 110.139.255.195 on 11/25/21 from IP address 110.139.255.195. Copyright ARRS. For personal use only; all rights reserved

Cognitive and System Factors Contributing to Diagnostic Errors in Radiology

Cindy S. Lee1, Paul G. Nagy, Sallie J. Weaver, David E. Newman-Toker
Lee CS, Nagy PG, Weaver SJ, Newman-Toker DE

OBJECTIVE. In this article, we describe some of the cognitive and system-based sources of detection and interpretation errors in diagnostic radiology and discuss potential approaches to help reduce misdiagnoses.

CONCLUSION. Every radiologist worries about missing a diagnosis or giving a false-positive reading. The retrospective error rate among radiologic examinations is approximately 30%, with real-time errors in daily radiology practice averaging 3–5%. Nearly 75% of all medical malpractice claims against radiologists are related to diagnostic errors. As medical reimbursement trends downward, radiologists attempt to compensate by undertaking additional responsibilities to increase productivity. The increased workload, rising quality expectations, cognitive biases, and poor system factors all contribute to diagnostic errors in radiology. Diagnostic errors are underrecognized and underappreciated in radiology practice. This is due to the inability to obtain reliable national estimates of the impact, the difficulty in evaluating effectiveness of potential interventions, and the poor response to systemwide solutions. Most of our clinical work is executed through type 1 processes to minimize cost, anxiety, and delay; however, type 1 processes are also vulnerable to errors. Instead of trying to completely eliminate cognitive shortcuts that serve us well most of the time, becoming aware of common biases and using metacognitive strategies to mitigate the effects have the potential to create sustainable improvement in diagnostic errors.

Keywords: cognitive biases, diagnostic errors, fatigue, medical errors, misdiagnosis
DOI: 10.2214/AJR.12.10375
Received November 17, 2012; accepted without revision November 29, 2012.
1 All authors: The Russell H. Morgan Department of Radiology, Johns Hopkins University School of Medicine, 22 S Greene St, Baltimore, MD 21201. Address correspondence to P. G. Nagy (pnagy2@jhmi.edu).
CME/SAM: This article is available for CME/SAM credit.
AJR 2013; 201:611–617 0361–803X/13/2013–611 © American Roentgen Ray Society

Diagnostic errors are estimated to account for 40,000–80,000 deaths annually in U.S. hospitals alone [1]. These figures only partially account for patients whose ambulatory misdiagnoses lead to death, and they do not include nonlethal disability, which may be just as common as death [2]. Tort claims for negligent diagnostic errors result in billions of dollars in payouts annually [2]. Nearly 75% of all medical malpractice claims against radiologists are related to diagnostic errors [3]. Every radiologist worries about missing a diagnosis or erring too heavily on the side of caution and giving a false-positive reading [4].

Definition, Prevalence, and Impact of Diagnostic Errors
Diagnostic error has been defined as a diagnosis that is missed, wrong, or delayed as detected by some subsequent definitive test or finding [5]. Here we use the terms "diagnostic error" and "misdiagnosis" interchangeably and do not distinguish between them. In radiology, the most common problems leading to medical malpractice lawsuits are due to failure to diagnose [6]. This means oversight of abnormalities or misinterpretation of radiologic images [3, 7].

Errors in diagnostic radiology have long been recognized, beginning with the pioneering revelation of Garland [8] in 1949. Multiple studies have identified suboptimal radiology processes as contributors to the overwhelming number of medical errors and escalating economic costs, which are estimated at more than $38 billion annually [9, 10]. Overall, approximately 30% of abnormal radiographic studies are missed. Approximately 4% of radiologic interpretations rendered by radiologists in daily practice contain errors [11]. Quekel et al. [12] found that 19% of lung cancers presenting as a nodule with a median diameter of 16 mm on chest radiographs were missed, and even higher rates between 25% and 90% have been reported in the literature [13–15].

Mammography has been the standard of care for the detection of breast carcinoma. However, a misdiagnosis of breast cancer occurs in 4–30% of screening mammography studies according to multiple randomized controlled trials [16, 17].

AJR:201, September 2013 611
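The prevalence figures above are ratios that a practice can compute from its own audit counts. A minimal sketch of that arithmetic follows; the counts and the `ScreeningAudit` helper are hypothetical, and the metric definitions follow common screening-audit usage rather than anything specified in this article:

```python
from dataclasses import dataclass

@dataclass
class ScreeningAudit:
    screens: int           # total screening examinations interpreted
    recalls: int           # examinations given a positive (recall) assessment
    cancers_detected: int  # recalled cases proven malignant at pathology
    cancers_missed: int    # cancers surfacing later despite a negative read

    def recall_rate(self) -> float:
        """Fraction of screens called back for additional workup."""
        return self.recalls / self.screens

    def cancer_detection_rate(self) -> float:
        """Cancers detected per 1000 screening examinations."""
        return 1000 * self.cancers_detected / self.screens

    def ppv(self) -> float:
        """Positive predictive value of a recall."""
        return self.cancers_detected / self.recalls

    def miss_rate(self) -> float:
        """Fraction of all cancers not detected at screening."""
        return self.cancers_missed / (self.cancers_detected + self.cancers_missed)

# Illustrative counts only, not data from the cited studies.
audit = ScreeningAudit(screens=10_000, recalls=950,
                       cancers_detected=45, cancers_missed=15)
print(f"recall rate          : {audit.recall_rate():.1%}")          # 9.5%
print(f"detection rate / 1000: {audit.cancer_detection_rate():.1f}") # 4.5
print(f"PPV of recall        : {audit.ppv():.1%}")                  # 4.7%
print(f"miss rate            : {audit.miss_rate():.1%}")            # 25.0%
```

Tracked over time, the same counts support the kind of peer benchmarking against local, regional, and national rates that the article recommends for screening mammography.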



Given that 38,294,403 mammography studies were performed annually in the United States as of 2013, it is evident why radiologic misdiagnosis is an important public health issue [18]. Moreover, screening mammography also results in overdiagnosis in 1–54% of cases, which represents the false-positive findings that would not have become symptomatic during a woman's lifetime if no screening had taken place [19].

With radiologic diagnostic testing, as in laboratory medicine [20], diagnostic errors may result from failures related to test ordering before a radiologist is ever involved or in the ordering clinician's use of the results after the radiologist's work is complete. Diagnostic errors attributed to radiologists have been grouped as related to failures in detection, interpretation, communication of results, or suggesting an appropriate follow-up test [6].

Despite the high prevalence and serious consequences of diagnostic errors, until recently they have received relatively little attention. For example, a text search of the 1999 Institute of Medicine (IOM) report To Err Is Human [21], which focused on the importance of medical error, found the term "diagnostic errors" mentioned only twice compared with 70 times for "medication errors" [21].

The Cause of Error in Radiology: System-Related Causal Factors and Cognitive-Perceptual Causal Factors
Diagnostic error in internal medicine is commonly multifactorial in origin, typically secondary to a mix of cognitive and system factors [22]. In radiology, cognitive errors (e.g., a missed lung nodule when interpreting a chest radiograph) are usually linked to problems of visual perception (scanning, recognition, interpretation). System errors (e.g., failure to communicate the presence of a nodule to the ordering physician) are usually linked to problems with the health system or context of care delivery. As with general medical diagnosis, errors often result from a combination or interaction between the two (e.g., night-staffed preliminary reports by resident radiologists that are altered in a final report but not fully communicated to caregivers) [23]. As described later in this article, certain system factors (e.g., lighting conditions, shift length, pace of reading required) have a profound effect on the likelihood of cognitive diagnostic errors in radiology.

Cognitive Errors in Radiology
The dual-process theory of reasoning has emerged as the dominant theoretic model for cognitive processing during human decision making in real-world settings [24]. This model proposes two general classes of cognitive operations and suggests causal explanations of where and how diagnostic errors occur in clinical reasoning [25]. Early in diagnosis, radiologists must assess the features of an imaging finding for pattern recognition. If the condition is recognized, so-called "type 1" (automatic) processes will rapidly and effortlessly make the diagnosis, and nothing further may be required. If it is not, then linear, analytical, deliberate, and effortful "type 2" processes are engaged instead. Dynamic oscillation may occur between the two systems throughout the decision-making process. Certain types of errors are prone to occur when type 1 processes are used because the mental shortcuts (heuristics) used in this type of cognitive processing are particularly susceptible to human biases [25], which will be further described later. Errors occurring during type 2 processes are believed to be less frequent in everyday practice but may be no less consequential [25]. These cognitive processes are also impacted by internal (e.g., fatigue, stress) and external (e.g., lighting) factors.

Graber et al. [22] showed that cognitive factors contribute to the diagnostic error in 74% of cases. Cognitive errors include faulty perception, failed heuristics, and biases. We rely on these shortcuts in reasoning to minimize delay, cost, and anxiety in our clinical decision making. Over the past three decades, the cognitive revolution in psychology has given rise to an extensive literature on cognitive bias in decision making. Cognitive bias is best defined as a replicable pattern of perceptual distortion, inaccurate judgment, and illogical interpretation [26]. Cognitive biases are the result of psychologic distortions in the human mind that persistently lead to the same pattern of poor judgment, often triggered by a particular situation. Some authors suggest that metacognition (thinking about thinking) may enable us to avoid being trapped by these cognitive biases using deliberate type 2 cognitive forcing strategies [27, 28]. Rather than eliminating these cognitive shortcuts, which serve us well most of the time, we might be better served by recognizing the potential diagnostic dangers that arise from specific shortcuts and overriding them when appropriate.

Dozens of cognitive biases have been described [29]. Some of these biases likely play only a small role in radiology diagnostic error (e.g., certain emotional biases associated with direct patient interaction) [30]. On the basis of a review of recent literature, we identified five cognitive biases particularly likely to lead to diagnostic errors in radiology (anchoring, framing, availability, search satisficing, and premature closure) and potential metacognitive strategies to reduce them [27, 31].

Anchoring Bias
Anchoring is relying on an initial impression and failing to adjust that impression in light of subsequent information [32]. For example, in a patient with multiple sclerosis who develops a new enhancing brain lesion seen on MRI, the most appropriate diagnosis might be another demyelinating plaque. A repeat image a week later showing additional enhancing lesions might be dismissed as more demyelinating plaques during a multiple sclerosis exacerbation, without the closer inspection that might identify features suggestive of CNS lymphoma. Anchoring is particularly dangerous when combined with confirmation bias, in which the radiologist seeks confirming evidence to support the hypothesis rather than contradictory evidence to refute it [29].

Corrective strategy: Avoid early guesses; seek to disprove the initial diagnosis rather than just confirm it (or seek disconfirming information rather than confirmatory information); when findings are worsening, reconsider the diagnosis or get a second opinion.

Framing Bias or Effect
Framing is being strongly influenced by the subtle ways in which a problem is worded or framed [32]. For example, a radiologist detects multiple foci of abnormal activity in bilateral ribs on a bone scan in a frail elderly patient. If a truncated clinical indication states "history of weight loss, chest pain," this finding might be erroneously interpreted as strongly suggesting metastatic lesions. If the full indication, "history of weight loss, chest pain after recent fall down stairs," were provided, a diagnosis of multiple rib fractures would be made. Because radiologists rely on abridged clinical details, framing may be a major contributor to diagnostic error in radiology.

Corrective strategy: Perform a masked read before reviewing the clinical indication; seek more clinical information from treating physicians when image interpretation is tightly coupled with clinical context or abnormal findings are likely to alter management.

Availability Bias
Availability is the tendency to consider diagnoses more likely if they readily come to mind. For example, if a radiologist missed lung cancer on a chest radiograph, he or she is more likely to overcall suspected lung nodules on subsequent chest radiographs despite the low likelihood. Radiologists should be aware of the tendency to overestimate the frequency of previously missed, unusual, or otherwise memorable cases.

Corrective strategy: Obtain and use objective information to estimate the true base rate of a diagnosis; benchmark diagnostic performance against peers (e.g., for screening mammography, radiologists should enroll in the American College of Radiology National Mammography Data Registry to compare their recall rate, cancer detection rate, and positive predictive value with the established local, regional, and national benchmarks) [33, 34].

Search Satisficing (Satisfaction of Search)
Search satisficing is the tendency to stop a search for abnormality once one diagnosis that is evaluated as likely is found. For example, when a brain mass is identified on CT in a patient with headache, a radiologist might miss ethmoid or sphenoid sinus consolidation (especially if the radiologist does not know that the brain tumor diagnosis is old or that the patient has a fever).

Corrective strategy: Use a checklist or algorithmic approach to ensure a systematic search, particularly for "do-not-miss" diagnoses [35]; always commence a secondary search after the first search has been completed; be mindful of known combinations (e.g., multiple foreign bodies, multiple fractures or contusions, or infarction or vascular occlusion).

Premature Closure
Premature closure is the tendency to accept a diagnosis before full verification. For example, in a patient with myasthenia gravis and a homogeneous mediastinal mass seen on chest CT, a diagnosis of thymoma might be made, even though thymic hyperplasia, lymphoma, and germ cell tumors remain on the differential diagnosis. A general limitation of imaging diagnoses is that pathologic diagnoses are inferred and not confirmed until tissue pathology is obtained.

Corrective strategy: Always generate a differential diagnosis (use checklists for common lesion differentials); never convert a working diagnosis to a final diagnosis before full (pathologic) verification.

System-Related Error in Radiology
In internal medicine, system-related factors contribute to diagnostic error in 65% of cases [22]. The vast majority of system-related error in these cases relates to problems with policies and procedures, inefficient processes, teamwork, communication, and technical and equipment failures [22]. Factors such as equipment failures and the methods of communicating dangerous radiographic findings to treating clinicians influence the likelihood of radiographic diagnostic error from a patient perspective. However, our focus is primarily on those factors that affect the likelihood of diagnostic error by the radiology provider. System issues such as lighting conditions, shift length and timing, task repetitiveness, pace of reading images, and environmental distractions all may impact the psychophysical process of visual diagnosis. Many of these issues ultimately exert their effects through visual and mental fatigue for radiologists [36].

Fatigue is a subcategory of system-related error in radiology because health care providers are constantly required to deliver quality patient care while under the stress of disrupted circadian rhythms. Although many other system issues coexist and contribute to misdiagnosis, we choose fatigue as the primary example for this discussion because it is a well-studied field. As medical reimbursement continues to trend downward, radiologists attempt to compensate by undertaking additional responsibilities and increasing organizational productivity. The increased workload and rising quality expectations, poor communication, cognitive biases, and imperfect information systems serve as major sources of fatigue, often leading to diagnostic errors [37]. Despite continuously evolving technology refinement and development, the current medical imaging system has developed as a one-size-fits-all model with relative inflexibility, which can impede workflow and productivity as well as cause end-user fatigue [36]. As imaging volume and complexity continue to grow over time, the impact of visual fatigue on diagnostic accuracy is becoming increasingly important [38].

Visual Fatigue
Krupinski et al. [39] studied the direct impact of fatigue using fractures on skeletal radiographs as the detection task. There was a significant reduction in diagnostic accuracy after a day of clinical work (p < 0.05), with associated increasing myopia. As expected, subjective ratings of physical discomfort, eye strain, and lack of motivation also increased by the end of the workday. Interestingly, residents suffered greater effects of fatigue on all measures compared with attending radiologists [39]. The effects of visual fatigue seen with static radiographs also seem to apply to cross-sectional imaging examinations, which are displayed dynamically [40]. After a work shift, radiologists have increased variability in their ocular convergence capabilities, indicating increased oculomotor strain and visual fatigue. Detection accuracy for pulmonary nodules was reduced on dynamically displayed CT images in the resident group only, with no significant effect on the attending physicians. Interestingly, this difference between residents and attending physicians was also previously shown in the fracture detection study. Accommodative relaxation (shifting the focal point from near to far or vice versa) is effective in reducing visual fatigue. In fact, a radiologist can even become more resistant to visual fatigue by undergoing automated accommodative training [36, 41, 42].

Decision (Mental) Fatigue
Radiologists also experience decision fatigue as a consequence of continuous and prolonged decision making [43]. Decision fatigue is thought to increase later in the day or after the work shift, when cognitive processes respond to mental strain by taking shortcuts, leading to poor judgment and diagnostic errors [37]. Those working prolonged shifts, off hours, and with high-volume or high-complexity tasks are at the greatest risk [43]. In particular, one of the most vulnerable populations is radiology residents, who provide preliminary interpretations independently during off hours [43].

Potential Solutions
The ultimate goal in reducing diagnostic errors is to first describe, analyze, and research cognitive biases in the context of medical decision making and then to find effective ways of cognitively ridding ourselves and our peers of bias. Rather than attempting to completely eliminate cognitive shortcuts that often serve us well, becoming aware of the common biases will lead to a more sustained improvement in patient care. Moreover, there is no one simple solution to diagnostic errors. Improving diagnostic accuracy will require a multidimensional approach that includes renewed emphasis on traditional teaching of clinical skills, exploration of new methods for diagnostic education (e.g., simulation or gaming), improvements in health information technology systems, and investment in the basic science of clinical diagnosis [44].

Feedback System: Radiology-Pathology Correlation
Radiologic-pathologic correlation of many clinical diagnoses has been described in the literature, but its adoption as a quality measure to assess radiologists' diagnostic accuracy is a relatively new concept. For diagnoses with pathologic correlation, we can track data on positive predictive value, disease detection rates, and abnormal interpretation rates to determine the interpretive accuracy of individual radiologists [45]. A Cornell Medical Center study showed that radiologic-pathologic correlation in suspected acute appendicitis is a feasible and effective measure of the interpretive accuracy of radiologists [46]. This study provided documentation of departmental accuracy of diagnosis. Further research with larger, multiinstitutional studies may enable the development of national benchmarks for radiologic-pathologic concordance in acute appendicitis and other conditions. Each radiologist will be required to interpret a sufficient number of cases to draw statistically significant conclusions about individual accuracy.

Peer Review
Multiple regulatory organizations require the ongoing practice-based evaluation of physician performance. In radiology, the single most important measure of performance is diagnostic accuracy of interpretation because errors can directly result in patient harm.

Peer review is continuous, systematic, and critical reflection on and evaluation of physician performance using structured procedures. Peer review acts as an essential tool to assess radiologists' performance and to improve diagnostic accuracy. Setting up a successful peer review program requires a committed team and a positive culture [47] that is conscientious regarding consumption of radiologists' valuable time and disruption of workflow.

Education
The problem of misdiagnosis cannot be solved without education, but it also cannot be solved with education alone. Five evidence-based educational recommendations should be considered: first, teach from cases that are numerous, varied, and unknown; second, focus learners on real-world diagnostic decisions; third, force integration of analytic and intuitive thinking; fourth, make meta-awareness part of the curriculum; and fifth, take a multidimensional approach to evaluation (Newman-Toker DE, presented at 2012 Grand Rounds of the Johns Hopkins Armstrong Institute). Training programs for medical students, residents, and fellows should include structured practice in diagnostic reasoning with model patients and simulations that include opportunities for self-reflection on reasoning processes and formative feedback [1]. Trainees should be taught not to miss certain key diagnoses; board certification organizations also need to emphasize key elements of diagnostic accuracy as part of robust evaluation methods. These key competencies include the knowledge to make correct diagnoses, the ability to use electronic resources effectively to find information, awareness of common cognitive biases and metacognitive strategies to mitigate them, mature clinical judgment, and the ability to engage in eliminating cognitive bias [21].

Empower Information Technology Tools to Improve Training
A critical step to reduce diagnostic errors is the process of defining radiology quality metrics and developing the information technology tools to quantify and track them. More specifically, the IOM cited a lack of adequate resident supervision and excessive fatigue as significant contributors to diagnostic errors, which resulted in the recent implementation of the Accreditation Council for Graduate Medical Education restriction of resident work hours to 80 per week [48]. To evaluate trainee performance while on call, the University of Pennsylvania radiology department developed a software application (Orion) to facilitate the identification and monitoring of major discrepancies in preliminary reports issued on call [49]. The study included 19,200 on-call studies interpreted by residents and 13,953 studies interpreted by fellows. Standard macros were used to classify these reports as "agreement," "minor discrepancy," or "major discrepancy" on the basis of the potential to impact patient outcome or management. This new software enables the residency director to use the major discrepancy rate to identify outliers and knowledge gaps in specific subspecialty areas within the training program [50]. The program can also be used to evaluate the rate of diagnostic errors by length of shift and volume of studies [51]. Through a powerful information technology tool such as this, we can better understand the contributory factors in misdiagnosis and design solutions to improve physician training and reduce errors.

Structured Reporting Systems
Structured reporting has gained attention in the radiology community for improving communication between referring physicians and radiologists [52]. The written report is the most tangible product of radiologists, and structured reporting is intended to improve the organization, content, readability, and usefulness of the radiology report as well as advance the efficiency and effectiveness of the reporting process [52]. Despite the importance of the radiology report, it has historically been created with free-style conventional dictation, leading to nonstandardized, error-prone, vague, incomplete, or untimely delivery of findings with significant interobserver variability. Both referring clinicians and radiologists have found that structured reports have better content and greater clarity than conventional reports for body CT [53], although structured reports did not significantly improve report accuracy or completeness according to a cohort study [54]. According to a survey study, more than 80% of clinicians prefer to receive standardized reports that consist of templates with separate headings for each organ system [55]. However, most radiology residency programs in the United States do not provide residents with more than 1 hour of reporting instruction a year [56]. According to 92% of clinicians and 95% of radiologists, structured reporting should be an obligatory part of residency training [55]. This is a good area for further prospective studies to see whether a structured reporting system can improve diagnostic accuracy, particularly given fears of "copy and paste" errors [57]. Structured reporting also serves the important role of a checklist (i.e., a cognitive job aid), a metacognitive tool that can help circumvent some cognitive biases and prompt reflection on the cognitive shortcuts that often lead to diagnostic errors. Mindfully using checklists encourages the user to decrease reliance on memory; step back to examine the thinking process (metacognition); develop strategies to avoid predictable biases (cognitive forcing); and recognize altered emotional states caused by fatigue, sleep deprivation, or other stressful conditions [58]. Diagnostic checklists have been shown to be effective in reducing errors in other fields of medicine, such as emergency medicine and anesthesiology [58–62]. Using structured reporting as a diagnostic checklist can help users consider common and particularly serious ("do-not-miss") diagnoses in a systematic manner [35].

Computer-Aided Detection
In this age of digital information, new clinical decision support tools empower physicians in many areas, such as constructing a differential diagnosis list and ordering appropriate diagnostic testing. Within radiology, clinical decision support primarily takes the form of computer-aided detection.


Computer-aided detection has gained clinical acceptance for assisting imaging diagnosis. Using mammography as an example, studies have shown that computer-aided detection can improve the sensitivity of a single reader, with an incremental cancer detection rate ranging between 1% and 19% [63]. However, computer-aided detection will also substantially decrease specificity and cause unnecessary further testing in approximately 6–35% of women [63]. Evidence indicates that computer-aided detection does not perform as well as double human reading in the context of breast screening mammography, leaving room for refinement of computer-aided detection algorithms to address this issue [63]. Computer-aided detection has also been used in CT for the detection of pulmonary nodules. Similar to its impact on mammography, computer-aided detection substantially increases the sensitivity of lung nodule detection with a concomitant decrease in specificity [64]. The role of computer-aided detection in diagnostic imaging is emerging and necessitates well-designed prospective studies.

Design Workload to Align With Productivity Benchmarks
There is a wealth of literature on the negative impact of excessive workload, long work hours, and fatigue on patient safety and medical errors [65–67]. The clinical productivity of radiologists is most commonly measured with the resource-based relative value scale, with a relative value unit (RVU) assigned to each radiology examination [68]. By design, the RVU scale does not account for important administrative, leadership, or academic efforts. Moreover, the RVU does not assess the quality of services or the professionalism of the radiologist [69]. For this reason, academic and private radiology practices have different benchmarks of productivity. Lu et al. [70] reported a mean clinical workload of 9671 annual examinations or 7136 RVUs per full-time aca- […] the workload will be difficult to implement secondary to a loss in productivity and profitability, but there are alternative, less costly strategies that can help improve diagnostic performance. Examples include instituting double reads, limiting the length of work shifts, establishing structured breaks, and switching between modalities during the workday. It is commonly believed that looking away at a distant object at least twice an hour during computer usage is sufficient for preventing visual fatigue [73]. In addition, proper lighting, better workstation ergonomics, and eyeglass correction have also been suggested as effective solutions [36].

Conclusion
Diagnostic errors are underrecognized and underappreciated in radiology practice because of the inability to obtain reliable national estimates of the impact, the difficulty in evaluating the effectiveness of potential interventions, and the poor response to systemwide solutions. Most clinical work is executed through type 1 processes to minimize cost, anxiety, and delay; however, type 1 processes are also vulnerable to errors. Instead of trying to completely eliminate cognitive shortcuts that serve us well most of the time, becoming aware of common biases and using metacognitive strategies to mitigate their effects have the potential to create sustainable improvement in diagnostic errors.

For diagnostic errors to receive the resources and attention they deserve in the field of patient safety, multiple approaches are required. First, we need the methodology to accurately measure diagnostic errors, thereby evaluating the effectiveness of potential interventions. Second, we need to encourage research in the basic science of diagnostic errors to better understand why we make mistakes and how we can prevent them. Third, we need to maximize and refine available information technology tools, such as computer-aided detection. Finally, […]

References
2. Saber-Tehrani AS, Lee HW, Matthews SC, et al. 20-year summary of US malpractice claims for diagnostic errors from 1985–2005 (abstr). Proceedings of the Fourth Annual Diagnostic Error in Medicine Conference. Chicago, IL: Johns Hopkins University School of Medicine, 2011
3. Berlin L, Berlin JW. Malpractice and radiologists in Cook County, IL: trends in 20 years of litigation. AJR 1995; 165:781–788
4. Berlin L. Accuracy of diagnostic procedures: has it improved over the past 5 decades? AJR 2007; 188:1173–1178
5. Graber M. Diagnostic errors in medicine: a case of neglect. Jt Comm J Qual Patient Saf 2005; 31:106–113
6. Pinto A, Brunese L. Spectrum of diagnostic errors in radiology. World J Radiol 2010; 2:377–383
7. Berlin L. Malpractice and radiologists, update 1986: an 11.5-year perspective. AJR 1986; 147:1291–1298
8. Garland LH. On the scientific evaluation of diagnostic procedures. Radiology 1949; 52:309–328
9. Johnson CD, Krecke KN, Miranda R, et al. Quality initiatives: developing a radiology quality and safety program - a primer. RadioGraphics 2009; 29:951–959
10. Kruskal JB, Anderson S, Yam CS, et al. Strategies for establishing a comprehensive quality and performance improvement program in a radiology department. RadioGraphics 2009; 29:315–329
11. Borgstede JP, Lewis RS, Bhargavan M, Sunshine JH. RADPEER quality assurance program: a multifacility study of interpretive disagreement rates. J Am Coll Radiol 2004; 1:59–65
12. Quekel LG, Kessels AG, Goei R, van Engelshoven JM. Miss rate of lung cancer on the chest radiograph in clinical practice. Chest 1999; 115:720–724
13. Heelan RT, Flehinger BJ, Melamed MR, et al. Non-small-cell lung cancer: results of the New York screening program. Radiology 1984; 151:289–293
14. Muhm JR, Miller WE, Fontana RS, et al. Lung cancer detected during a screening program us-
demic radiologist. Not surprisingly, this re- training programs should include diagnos- ing 4-month chest radiographs. Radiology 1983;
flects increases of 15% and 22% from their tic reasoning; board certification organiza- 148:609–615
previous survey in 2003 [71]. tions also need to emphasize key elements 15. Stitik FP, Tockman MS. Radiographic screening
In comparison, a full-time private prac- of diagnostic accuracy in the licensing pro- in the early detection of lung cancer. Radiol Clin
tice radiologist on average interprets 12,669 cess. In summary, health information tech- North Am 1978; 16:347–366
examinations or 7429 RVUs per year [72]. nology, improved education, and increasing 16. Giess CS, Frost EP, Birdwell RL. Difficulties and
Armed with these published benchmarks of acknowledgment of diagnostic errors hold errors in diagnosis of breast neoplasms. Semin
radiologist productivity in both academic promise in error reduction. Ultrasound CT MR 2012; 33:288–299
and private practice environments, we have 17. Humphrey LL, Helfand M, Chan BK. Breast can-
a more realistic perspective of what consti- References cer screening: a summary of the evidence for the
tutes an excessive workload, which can neg- 1. Newman-Toker DE, Pronovost PJ. Diagnostic er- U.S. Preventive Services Task Force. Ann Intern
atively impact radiologists’ performance rors: the next frontier for patient safety. JAMA Med 2002; 137:347–360
and diagnostic accuracy. Solutions that limit 2009; 301:1060–1062 18. U.S. Food and Drug Administration website.

AJR:201, September 2013 615



Mammography Quality Standards Program. www.fda.gov/Radiation-EmittingProducts/MammographyQualityStandardsActandProgram/facilityScorecard/ucm113858.htm. Accessed July 1, 2013
19. de Gelder R, Heijnsdijk EA, van Ravesteyn NT, Fracheboud J, Draisma G, de Koning HJ. Interpreting overdiagnosis estimates in population-based mammography screening. Epidemiol Rev 2011; 33:111–121
20. Plebani M. Errors in clinical laboratories or errors in laboratory medicine? Clin Chem Lab Med 2006; 44:750–759
21. Wachter RM. Why diagnostic errors don't get any respect: and what can be done about them. Health Aff 2010; 29:1605–1610
22. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005; 165:1493–1499
23. McCreadie G, Oliver TB. Eight CT lessons that we learned the hard way: an analysis of current patterns of radiological error and discrepancy with particular emphasis on CT. Clin Radiol 2009; 64:491–499
24. Weaver SJ, Newman-Toker DE, Rosen MA. Cognitive skill decay and diagnostic error: best practices for continuing education in healthcare. J Contin Educ Health Prof 2010; 30:208–220
25. Croskerry P. Clinical cognition and diagnostic errors: applications of a dual process model of reasoning. Adv Health Sci Educ 2009; 14:27–35
26. Ariely D. Predictably irrational: the hidden forces that shape our decisions. New York, NY: HarperCollins, 2000
27. Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med 2003; 41:110–120
28. Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med 2000; 7:1223–1231
29. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002; 9:1184–1204
30. Croskerry P, Abbass A, Wu AW. Emotional influences in patient safety. J Patient Saf 2010; 6:199–205
31. Redelmeier DA. The cognitive psychology of missed diagnoses. Ann Intern Med 2005; 142:115–120
32. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003; 78:775–780
33. Burnside ES, Sickles EA, Bassett LW, et al. The ACR BI-RADS experience: learning from history. J Am Coll Radiol 2009; 6:851–860
34. American College of Radiology website. National Mammography Database. www.acr.org/Quality-Safety/National-Radiology-Data-Registry/National-Mammography-DB. Accessed October 10, 2012
35. Graber ML, Wachter RM, Cassel CK. Bringing diagnosis into the quality and safety equations. JAMA 2012; 308:1211–1212
36. Blehm C, Vishnu S, Khattak A, Mitra S, Yee RW. Computer vision syndrome: a review. Surv Ophthalmol 2005; 50:253–262
37. Reiner BI, Krupinski E. The insidious problem of fatigue in medical imaging practice. J Digit Imaging 2012; 25:3–6
38. Vertinsky T, Foster B. Prevalence of eye strain among radiologists: influence of viewing variables on symptoms. AJR 2005; 184:681–686
39. Krupinski EA, Berbaum KS, Caldwell RT, Schartz KM, Kim J. Long radiology workdays reduce detection and accommodation accuracy. J Am Coll Radiol 2010; 7:698–704
40. Krupinski EA, Berbaum KS, Caldwell RT, Schartz KM, Madsen MT, Kramer DJ. Do long radiology workdays affect nodule detection in dynamic CT interpretation? J Am Coll Radiol 2012; 9:191–198
41. Iwasaki T, Tawara A, Miyake N. Reduction of asthenopia related to accommodative relaxation by means of far point stimuli. Acta Ophthalmol Scand 2005; 83:81–88
42. Cooper J, Feldman J, Selenow A, et al. Reduction of asthenopia after accommodative facility training. Am J Optom Physiol Opt 1987; 64:430–436
43. Gaba DM, Howard SK. Patient safety: fatigue among clinicians and the safety of patients. N Engl J Med 2002; 347:1249–1255
44. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ 2005; 39:98–106
45. Lee JK. Quality: a radiology imperative—interpretation accuracy and pertinence. J Am Coll Radiol 2007; 4:162–165
46. Gurian MS, Kovanlikaya A, Beneck D, Baron KT, John M, Brill PW. Radiologic-pathologic correlation in acute appendicitis: can we use it as a quality measure to assess interpretive accuracy of radiologists. Clin Imaging 2011; 35:421–423
47. Kaewlai R, Abujudeh H. Peer review in clinical radiology practice. AJR 2012; 199:[web]W158–W162
48. Accreditation Council for Graduate Medical Education (ACGME) website. Common program requirements: duty hours—ACGME standards. Effective July 1, 2011. www.acgme.org/acgmeweb/Portals/0/PDFs/commonguide/CompleteGuide_v2%20.pdf. Accessed August 5, 2012
49. Itri JN, Kim W, Scanlon MH. Orion: a web-based application designed to monitor resident and fellow performance on-call. J Digit Imaging 2011; 24:897–907
50. Ruutiainen AT, Scanlon MH, Itri JN. Identifying benchmarks for discrepancy rates in preliminary interpretations provided by radiology trainees at an academic institution. J Am Coll Radiol 2011; 8:644–648
51. Itri JN, Redfern RO, Scanlon MH. Using a web-based application to enhance resident training and improve performance on-call. Acad Radiol 2010; 17:917–920
52. Kahn CE, Langlotz CP, Burnside ES, et al. Toward best practices in radiology reporting. Radiology 2009; 252:852–856
53. Schwartz LH, Panicek DM, Berk AR, Li Y, Hricak H. Improving communication of diagnostic radiology findings through structured reporting. Radiology 2011; 260:174–181
54. Johnson AJ, Chen MY, Swan JS, Applegate KE, Littenberg B. Cohort study of structured reporting compared with conventional dictation. Radiology 2009; 253:74–80
55. Bosmans JM, Weyler JJ, De Schepper AM, Parizel PM. The radiology report as seen by radiologists and referring clinicians: results of the COVER and ROVER surveys. Radiology 2011; 259:184–195
56. Sistrom C, Lanier L, Mancuso A. Reporting instruction for radiology residents. Acad Radiol 2004; 11:76–84
57. Hirschtick R. A piece of my mind: copy-and-paste. JAMA 2006; 295:2335–2336
58. Ely JW, Graber ML, Croskerry P. Checklists to reduce diagnostic errors. Acad Med 2011; 86:307–313
59. Hales BM, Pronovost PJ. The checklist: a tool for error management and performance improvement. J Crit Care 2006; 21:231–235
60. Gawande A. The checklist: if something so simple can transform intensive care, what else can it do? New Yorker 2007; 86–101
61. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med 2006; 355:2725–2732
62. Hart EM, Owen H. Errors and omissions in anesthesia: a pilot study using a pilot's checklist. Anesth Analg 2005; 101:246–250
63. Houssami N, Given-Wilson R, Ciatto S. Early detection of breast cancer: overview of the evidence on computer-aided detection in mammography screening. J Med Imaging Radiat Oncol 2009; 53:171–176
64. Saba L, Caddeo G, Mallarini G. Computer-aided detection of pulmonary nodules in computed tomography: analysis and review of the literature. J Comput Assist Tomogr 2007; 31:611–619
65. Beckmann U, Baldwin I, Durie M, et al. Problems associated with nursing staff shortage: an analysis of the first 3600 incident reports submitted to the Australian Incident Monitoring Study (AIMS-ICU). Anaesth Intensive Care 1998; 26:396–400
66. Tarnow-Mordi WO, Hau C, Warden A, et al. Hospital mortality in relation to staff workload: a 4-year study in an adult intensive care unit. Lancet 2000; 356:185–189
67. Pronovost PJ, Jenckes MW, Dorman T, et al. Organizational characteristics of intensive care units related to outcomes of abdominal aortic surgery. JAMA 1999; 281:1310–1317
68. Hsiao WC, Braun P, Becker ER, Thomas SR. The resource-based relative value scale: toward the development of an alternative physician payment system. JAMA 1987; 258:799–802
69. Duszak R, Muroff LR. Measuring and managing radiologist productivity. Part 1. Clinical metrics and benchmarks. J Am Coll Radiol 2010; 7:452–458
70. Lu Y, Zhao S, Chu PW, Arenson RL. An update survey of academic radiologists' clinical productivity. J Am Coll Radiol 2008; 5:817–826
71. Lu Y, Arenson RL. The academic radiologist's clinical productivity: an update. Acad Radiol 2005; 12:1211–1223
72. Monaghan DA, Kassak KM, Ghomrawi HM. Determinants of radiologists' productivity in private group practices in California. J Am Coll Radiol 2006; 3:108–114
73. Cheu RA. Good vision at work. Occup Health Saf 1998; 67:20–24

FOR YOUR INFORMATION
This article is available for CME/SAM credit. To access the examination for this article, follow the prompts associated with the online version of the article.

