
A Framework for Improving Radiology Reporting

Chris L. Sistrom, MD, MPH(a), Curtis P. Langlotz, MD, PhD(b)

The interpretative reports rendered by radiologists are the only tangible manifestation of their expertise,
training, and experience. These documents are very often the primary means by which radiologists provide
patient care. Radiology reports are extremely variable in form, content, and quality. The authors propose a
framework for conceptualizing the reporting process and how it might be improved. This consists of standard
language, a structured format, and consistent content. These attributes will be realized by modifying the clinical
reporting process, including the creation, storage, transmission, and review of interpretative documents. The
authors also point out that changes in training and evaluation must be a part of the process, because they are
complementary to purely technical solutions.
Key Words: Reporting and communication, radiology reporting, speech recognition, structured reporting,
radiology lexicon, RadLex
J Am Coll Radiol 2005;2:159-167. Copyright © 2005 American College of Radiology

(a) University of Florida, Department of Radiology, Gainesville, Florida. (b) Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania.
Corresponding author and reprints: Chris L. Sistrom, University of Florida, Department of Radiology, P.O. Box 100374, Gainesville, FL 32610; e-mail: sistrc@radiology.ufl.edu.

INTRODUCTION

Signed reports have long served as the primary means of communication between radiologists and referring physicians. Their findings and conclusions often are the basis for vital medical treatment decisions. Not surprisingly, today's health care professionals need timely online access to well-organized, accurate image reports. Yet the process of medical image interpretation, dictation, and transcription has been fundamentally unchanged for decades. In most hospitals and clinics, radiology reports are still prepared by radiologist dictation followed by human transcription. The report documents are stored as simple free text, either electronically or printed on paper. Referring physicians review radiology reports by reading the written documents.

There are several important limitations to these conventional reporting methods, including slow report turnaround, suboptimal report quality and accuracy, and the unsuitability of report information for quality improvement and research. New technologies, such as speech recognition and so-called structured reporting systems, have been developed to address these shortcomings. Although these new technologies provide opportunities for radiologists to improve the services they provide to their referring colleagues, they also challenge the way we think about reporting.

Numerous commercial vendors specifically target the radiology market with computer software to create interpretative documents for general imaging, including large diversified medical imaging companies and smaller, early-stage companies. The presumption is that these automated radiology reporting systems are superior to the status quo and will lead to improved interpretation quality, better patient outcomes, and systematic cost savings. Yet several questions arise regarding these new reporting methods. Who are the intended beneficiaries of improved reporting? How might these benefits be realized? How can improvements related to new reporting methods be measured? What are the possible unintended consequences of replacing traditional methods of dictating while viewing images?

In this paper, we provide a framework for answering these questions and for improving radiology reporting, in part by defining and justifying a hierarchy of relevant reporting terms. We also discuss the potential unintended consequences of the trend toward automated reporting and suggest some approaches to minimizing the detrimental effects of automation. Our framework articulates three specific aspects of reports and how they may be improved: standard language, a structured format, and consistent content. These improvements may be achieved by transforming the processes of ordering, production, storage, distribution, and review of radiology tests and associated reports. Innovative software for creating, archiving, transmitting, and displaying reports is only one of several ways that radiology communication might be improved. Two other important methods are the targeted education of radiology trainees and practitioners and the adoption of widespread standards for radiology report content, language, and style.

It is easy to focus solely on how changes in the interpretative task affect radiologists, but this temptation must be resisted. Studying the entire process of reporting gives radiologists a unique opportunity to examine and refine their role in medical care. In particular, it is useful to focus on the consumers of radiologists' work products, referring clinicians. It is through clinicians that reports have their effect on patients' outcomes. Reports take on added importance as methods for electronic delivery become more widespread and radiologists cede exclusive "ownership" of diagnostic images. Because radiologists must deliver clearly perceived value via the interpretations that they render, report structure and content ultimately must be tailored to suit the needs of clinicians. The unalloyed desires of referring physicians must, however, be tempered by what is practical for radiologists as well as by cost constraints, patients' concerns, and ethical considerations.

BACKGROUND

In 1922, Hickey [1] made a plea for the standardization of radiology reports. Now, more than 80 years later, the study of the variability and quality of interpretative reports remains an active area of clinical research, with little resulting change in clinical practice. Several studies have shown that referring clinicians frequently express the need for improvements in report quality at their institutions [2-6]. Some specialists, most notably orthopedists, have questioned the practice of the formal interpretation by radiologists of routine x-rays of their patients [7-12]. Frequently, these arguments are predicated on the fact that radiologists' reports often reach their intended recipients days (or even weeks) after the images have been viewed and acted on. Newer automated reporting systems, such as computer speech recognition and structured reporting systems, reduce median report turnaround times to minutes or hours rather than days. However, if clinicians perceive the substantive content of radiology reports as being irrelevant to patient care, it makes no difference how rapidly they are available.

There are two terms that must be carefully defined within the construct of radiology reporting. Radiology report shall be taken to mean the document containing the official interpretation of a single radiology examination or procedure. This document generally contains a variable amount of text and, in a large and growing number of settings, is stored and viewed electronically. The term document has already begun to acquire a more flexible definition within hospital information systems. Rather than being simply a long string of words and punctuation, documents may now contain self-defining formatting information, images, links to other documents, and embedded computer language code. A perfect example of a single type of document that may take on many diverse forms and functions is a Web page found on the Internet. The corollary term radiology reporting is defined as the aggregate of all clinical processes directly related to the creation, storage, distribution, and display of interpretive documents. We intend for radiology reporting to represent the entire chain of activities, beginning with the time a radiologist makes a determination of findings and their clinical impact through the receipt of that information by the clinician taking care of the patient.

STRUCTURED REPORTING IS NOT ENOUGH

The term structured reporting recently has been used in the medical literature to represent newer automated reporting methods, often involving point-and-click interfaces for data capture. We propose that the term structured reporting represents simply one set of computer tools aimed at reducing variability and enhancing the clinical utility of formal radiology interpretations. The term structured reporting is suboptimal because, as we describe below, the structure of reports is only one of several attributes targeted for efforts at improvement. A dialogue about improved radiology reporting expands the scope of discussion beyond a narrow focus on the form and content of a document itself into the realms of training, professionalism, interpersonal interaction, and clinical decision making. Furthermore, radiology reports are already structured, albeit in a rather loose and imperfectly standardized manner. Virtually all of them have an implicit format that includes patient-identifying information, clinical indications, comparison studies, current examination details, findings, and an impression.

KEY ATTRIBUTES OF RADIOLOGY REPORTS

The goal of improved radiology reporting may be articulated as a series of relatively independent subgoals. The most immediately relevant of these are decreasing variation and reducing error in clinical reports. Other subgoals include enabling research, quality assessment, and process improvement. We explicitly identify these goals because they have motivated many past attempts to study and improve reporting. But if radiology researchers hope to achieve these goals by studying clinical processes, measuring their effects, and implementing changes to improve them, these same researchers also must have an explicit framework to guide their analyses. Any such framework must articulate the attributes of interpretative reports that should serve as specific targets for study and modification. The following sections identify and discuss three distinct attributes of radiology reports: format, language, and content. For each of these, we include an adjective to further specify the type of improvement being contemplated.
Structured Report Format

The concept of structure applies most naturally to the format of reports and should be considered from the point of view of their readers. In other words, report format is what referring clinicians will actually see on paper or computer screens when reading the reports of studies that have been ordered. This notion has received little formal attention, though it may be central to "customer satisfaction" and the efficiency of information transfer. One format option is to have a radiology report laid out as an ordered list of headings (often anatomic or physiologic functional units), with findings listed under each one. This format is a natural extension of the ad hoc organization currently used by most radiologists for reports (i.e., indication, comparison, examination, findings, and impression). But such a subdivision of the report only achieves its objectives when the subparts are explicitly and consistently labeled. Additionally, the findings (and possibly the impression) would be further divided and labeled as appropriate to the examination and indication.

Many automated reporting systems provide report templates that can assist radiologists in producing organized and formatted interpretative documents. A radiology report template is analogous to the checklists that airlines use to ensure safe flight operations. Therefore, the term format refers to the actual layout of the content of a report as seen by both its producer (a radiologist) and its reader (a referring physician). Table 1 shows an example of a report of an abdominal ultrasound examination to illustrate these principles. Simple report structure is a powerful idea that has great potential for improving outcomes in patients having imaging studies, even when radiologists have no restrictions on the text that can be placed under each heading.

Table 1. Example of abdominal ultrasound findings in a structured format

LIVER: Demonstrates diffuse increased echogenicity, likely due to fatty infiltration. There are no focal lesions.
GALLBLADDER: Normally distended with no gallstones. There is no pericholecystic fluid, wall thickening, or sonographic Murphy's sign.
BILIARY: No intrahepatic ductal dilatation is identified. The common duct measures 6 mm at the porta hepatis.
PANCREAS: Limited visualization due to gas in the stomach and colon.
SPLEEN: Measures 9.9 cm in length and is normal.
KIDNEYS: The right kidney measures 11.9 cm. There is an echogenic structure within the inferior pole of the right kidney with posterior shadowing, likely a renal stone. It measures 8 mm. There is no right hydronephrosis or hydroureter. The left kidney measures 12.3 cm and is normal.
VASCULAR: The abdominal aorta is nonaneurysmal.
OTHER FINDINGS: The bladder was empty and not evaluated.
IMPRESSION: No gallstones and no evidence of cholecystitis. There is an 8 mm stone within the inferior pole of the right kidney without evidence of hydronephrosis.
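
The idea of a template as an ordered list of labeled subparts can be made concrete with a short sketch. The Python fragment below is purely illustrative; the class names, section list, and rendering style are our assumptions rather than any vendor's design. It enforces heading order while leaving the text under each heading unrestricted:

from dataclasses import dataclass, field

@dataclass
class ReportTemplate:
    """Ordered, labeled subparts for one examination type."""
    exam_type: str
    sections: list

@dataclass
class Report:
    template: ReportTemplate
    findings: dict = field(default_factory=dict)  # heading -> free text

    def render(self) -> str:
        """Emit every heading in template order; the text itself is unrestricted."""
        lines = []
        for heading in self.template.sections:
            lines.append(f"{heading}: {self.findings.get(heading, 'Not evaluated.')}")
        return "\n".join(lines)

abdominal_us = ReportTemplate("Abdominal ultrasound",
    ["LIVER", "GALLBLADDER", "BILIARY", "PANCREAS", "SPLEEN",
     "KIDNEYS", "VASCULAR", "OTHER FINDINGS", "IMPRESSION"])

report = Report(abdominal_us)
report.findings["LIVER"] = "Diffuse increased echogenicity; no focal lesions."
print(report.render())

Because the structure lives in the template rather than in any individual radiologist's habits, every report of that examination type emerges with the same labeled layout.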
Standard Report Language

Radiology reports are generally written using the radiologists' "dialect" of the allopathic physicians' language. Unfortunately, this dialect is quite variable; a single radiologist reporting two examinations with similar findings can use very different words and phrases [13,14].

In our framework, standard language refers to the extent to which the terminology in a radiology report is consistent with respect to clinical indications, anatomy, imaging findings, diagnoses, and uncertainty. In theory, radiologists and their clinical colleagues receive training that produces a shared conceptual model of human pathophysiology using mutually understood language. Ideally, this training prepares these physicians to communicate effectively, both orally and by creating and reading narrative text documents. Of course, we do not live in that ideal world; the problems of miscommunication are well documented [14].

Perhaps the most well-known, widely used, and successful effort to standardize report language is the ACR's Breast Imaging Reporting and Data System (BI-RADS), which has been in use for almost a decade [15,16]. Several efforts in other specialty areas have defined standard terminology for interpretative reports, including lower limb veins [17], brain arteriovenous malformations [18], lumbar disk pathology [19,20], and chest disease [21-23].

Recently, the Radiological Society of North America initiated the RadLex project, a comprehensive approach to developing a unified imaging terminology resource for clinical reports, teaching files, and research data. This effort, which is supported by the ACR and subspecialty organizations, will seek to build and promulgate a consensus terminology for radiology reports, teaching files, and research data [24]. A preliminary version of the RadLex thoracic lexicon, developed in collaboration with the Fleischner Society and the Society of Thoracic Radiology, is available on the RadLex Web site (http://mirc.rsna.org/radlex/service/).
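
One way to picture how a lexicon such as RadLex might be applied during report creation is as a lookup from free-text variants to preferred terms. The sketch below is ours; the synonym entries are invented for illustration and are not drawn from the actual RadLex vocabulary:

# Illustrative synonym table only; not taken from the real RadLex lexicon.
PREFERRED_TERMS = {
    "fatty infiltration": "hepatic steatosis",
    "fatty liver": "hepatic steatosis",
    "gallstone": "cholelithiasis",
    "gallstones": "cholelithiasis",
}

def normalize(phrase: str) -> str:
    """Map a free-text phrase to its preferred term, if one is defined."""
    return PREFERRED_TERMS.get(phrase.lower().strip(), phrase)

assert normalize("Fatty Liver") == "hepatic steatosis"
assert normalize("pericholecystic fluid") == "pericholecystic fluid"  # unchanged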
Consistent Report Content

The consistent content of medical imaging reports represents the goal radiologists should be trying to achieve in their communications. The medical content in radiology reports can vary widely [13,14,25-28]. This variation exists among institutions, among radiologists, and even among reports created by the same radiologist over time. Consistent content means that for a given examination and clinical context, reports have the same elements mentioned in the same order, regardless of the author. Thus, the concept of consistent content is distinct from the notion of structured format. A radiologist could certainly create reports with a structured format such as the one shown in Table 1 yet vary the ordering and the choice of report headings. Thus, the content might not be consistent, even though the format was structured.

The brevity of the above description of consistent content should not be construed to represent the importance we ascribe to this third attribute. To the contrary, the consistency of report content is a critical issue for both radiologists and clinicians. Even though it is simple to articulate, the idea that imaging interpretations should have common contents presented in a consistent order is both powerful and potentially controversial. Physicians are rewarded for being autonomous and authoritative, traits that are sometimes at odds with externally imposed practice proscriptions. Radiologists are no exception, and any such program perceived as restricting their independence concerning the interpretation process, the essence of what radiologists do, will be resisted.
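
Because consistent content is defined operationally (the same elements, in the same order, for a given examination and clinical context), conformance can be checked mechanically. A minimal sketch, with an assumed canonical heading list for abdominal ultrasound:

# Assumed canonical element list for one examination type.
CANONICAL_ORDER = ["LIVER", "GALLBLADDER", "BILIARY", "PANCREAS", "SPLEEN",
                   "KIDNEYS", "VASCULAR", "OTHER FINDINGS", "IMPRESSION"]

def is_consistent(report_headings: list) -> bool:
    """True only when the canonical elements appear, in canonical order."""
    return report_headings == CANONICAL_ORDER

# Structured but not consistent: the same headings in a different order.
shuffled = ["GALLBLADDER", "LIVER"] + CANONICAL_ORDER[2:]
assert not is_consistent(shuffled)
assert is_consistent(list(CANONICAL_ORDER))

The check captures exactly the distinction drawn above: a report can be fully structured yet inconsistent simply because its elements appear in a different order.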
IMPROVING THE RADIOLOGY REPORTING PROCESS

Donabedian's [29,30] schema of structure, process, and outcomes for studying the quality of medical care is useful for thinking about how to improve radiology reporting. Structure refers to the characteristics of settings in which medical care is delivered. Process describes all of the activities undertaken by members of the health care system to render care to patients. For example, to say that a hospital has a picture archiving and communication system tightly coupled with a speech recognition system (SRS) is to speak about structure. If we go on to describe a common set of examination-specific radiology reporting templates, agreed on and used by all the radiologists in the hospital, we are describing process. Modifications of structure and process such as those just described likely will lead to measurable improvements in outcomes.

The clinical benefits of improved radiology reporting will be mediated by modifying the structure and process of the ordering, preparation, storage, distribution, and review of radiology report documents. The benefits will be realized during ordering of radiology examinations by clinicians, the preparation of reports by radiologists, and the review of the reports by referring physicians. Considerations about the storage and distribution of report documents are mostly invisible to practitioners but are integral to technical innovations in reporting. Many radiologists and clinicians have already encountered some of these concepts through experience with BI-RADS for mammographic interpretation. The system, as originally conceived, represented a national effort to standardize the entire process from ordering through review. Although several studies have shown that there is still measurable variation among radiologists using BI-RADS [15,31,32], other studies have shown significant improvement in interobserver agreement after specific training about the BI-RADS lexicon and assessment categories [33,34].

One of the greatest benefits of the entire BI-RADS initiative arises from the mandated forced choice between clinically meaningful diagnostic categories. The conclusions of all mammogram reports now contain widely understood terms (standard language and consistent content). This clarity not only improves mammogram reports but also makes the interpretative process itself easier to teach and perform. The system also facilitates clinicians' review of radiology reports. For example, the literature provides data on the probability of malignancy for each BI-RADS category. Such information about the positive predictive value of a mammographic result can be essential to clinical decision making. Although it is entirely possible to produce fully compliant BI-RADS reports using traditional dictation and transcription methods, there are now several automated mammography reporting and tracking systems that make compliance easier. Inherent in the design of these products is a database from which reports and profiles can be extracted to comply with the legal requirements for generating periodic practice audits.
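
The forced choice at the heart of BI-RADS is straightforward to enforce in software. The sketch below uses the familiar final assessment category numbers in short form; the wording and the validation logic are ours, and the published BI-RADS atlas remains the authoritative source:

# Familiar BI-RADS final assessment categories (abbreviated wording;
# consult the BI-RADS atlas for the authoritative definitions).
BIRADS = {
    0: "Incomplete: need additional imaging evaluation",
    1: "Negative",
    2: "Benign finding",
    3: "Probably benign finding",
    4: "Suspicious abnormality",
    5: "Highly suggestive of malignancy",
}

def conclude(category: int) -> str:
    """Force the report conclusion to be one of the defined categories."""
    if category not in BIRADS:
        raise ValueError(f"{category} is not a recognized BI-RADS category")
    return f"BI-RADS {category}: {BIRADS[category]}"

print(conclude(2))  # "BI-RADS 2: Benign finding"
# A free-text conclusion such as "probably nothing" simply cannot be signed.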
Report Storage

The issue of report storage has received a tremendous amount of attention over the past several years. Most notably, the Structured Reporting Working Group of the Digital Imaging and Communications in Medicine (DICOM) Standards Committee (Working Group 8) has rigorously defined a complex schema for storing structured reports [35-37]. This schema is now an approved part of the DICOM standard for medical image representation, usually referred to as DICOM-SR [38,39]. In its basic form, DICOM-SR describes a series of interrelated and hierarchical container objects for imaging findings, which can in turn be associated with specific locations on the images themselves. The imaging findings may be explicitly coded, either using a widely available or vendor-defined terminology, or as free-text items. Because DICOM-SR defines only a storage schema, it does not itself directly affect either the production or review of reports. However, software that adheres to the new standard can provide interoperability between systems for radiologists and clinicians to use in report generation and review. Additionally, several DICOM working groups are developing reporting templates for specific imaging modalities, including cardiology, vascular ultrasound, obstetric ultrasound, thoracic imaging, and breast imaging. If these content-specific report formats are widely adopted, the DICOM-SR standard will have a direct effect on future report language, format, and content.
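
The hierarchical container model can be pictured without the full machinery of the standard. In the simplified sketch below, a content item is a coded node with optional children and an optional image reference; the class and field names are our own abstraction of the DICOM-SR idea, not the normative encoding. The to_text method mirrors the common practice of flattening a structured report into plain text for systems that cannot store the tree:

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ContentItem:
    """Simplified stand-in for a DICOM-SR content item (not the real encoding)."""
    concept: str                     # coded concept name, e.g., "Finding"
    value: str = ""                  # coded or free-text value
    image_ref: Optional[str] = None  # link to a location on a stored image
    children: list = field(default_factory=list)

    def to_text(self, depth: int = 0) -> str:
        """Flatten the tree to plain text for systems that cannot store it."""
        line = ("  " * depth + f"{self.concept}: {self.value}").rstrip(": ")
        if self.image_ref:
            line += f" [see {self.image_ref}]"
        return "\n".join([line] + [c.to_text(depth + 1) for c in self.children])

root = ContentItem("Abdominal ultrasound report")
stone = ContentItem("Finding", "renal calculus", image_ref="series 2, image 14")
stone.children.append(ContentItem("Diameter", "8 mm"))
root.children.append(stone)
print(root.to_text())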
Report Distribution

The classic method for distributing reports of inpatients' radiology examinations consists of printing paper copies and placing them in patients' charts. Reports of outpatients' examinations are routinely mailed and/or faxed to referring physicians' offices. These methods for distributing reports have traditionally been supplemented by individual radiologists' efforts to personally contact relevant physicians or nurses with urgent or unexpected findings. With the advent of picture archiving and communication systems and radiology management and information systems, reports and images are no longer stored as paper documents and hard-copy films. The archival repositories for both items are now primarily digital. Many health care institutions also have the capability for the electronic distribution and display of radiology reports (and often images) throughout the enterprise.

Computerized methods for handling reports, though more rapid and consistent, are still essentially passive in that those caring for patients must retrieve the documents to read the contents. Radiologists still are obliged to notify referring physicians about findings that are deemed critical to patients' care. This direct method for the distribution of interpretative information must not be discounted or neglected. In a significant fraction of cases, timely personal contact between radiologists and clinicians is critical to positive patient outcomes; report documents serve only as passive static records.

There is growing interest in "push technology" for report distribution. This refers to methods for alerting clinicians of imaging test results at the time they become available. In settings with electronic medical records, computerized alerts can be triggered to notify clinicians of important developments in their patients' courses. New findings or events might "pop up" when clinicians sign on to their accounts or during existing sessions. There is considerable interest in distributing small wireless computers to clinicians. These can enhance access to electronic patient records as well as support order entry and clinical documentation. Through these devices, alerts about significant clinical events can reach the relevant physician almost instantly. Another popular new distribution method creates and sends alphanumeric pages with summaries of critical events or findings [40]. This solution has the advantage of using preexisting infrastructure, because almost all physicians and ancillary personnel already carry pagers. Laboratory, pathology, and pharmacy departments share this need to actively distribute information to clinicians.
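
Push distribution amounts to an event handler tied to report finalization. A minimal sketch, in which the critical-findings list and the paging gateway are placeholders we invented (a production system would route through an interface engine or a message service, as in reference [40]):

# Illustrative critical-findings list; a real one would be set by policy.
CRITICAL_FINDINGS = {"pneumothorax", "free air", "aortic dissection"}

def send_page(recipient: str, message: str) -> None:
    """Stand-in for an alphanumeric pager or messaging gateway."""
    print(f"PAGE -> {recipient}: {message}")

def on_report_signed(exam_id: str, findings: list, ordering_md: str) -> None:
    """Called when a radiologist electronically signs a report."""
    urgent = [f for f in findings if f.lower() in CRITICAL_FINDINGS]
    if urgent:
        # Push the result to the responsible clinician immediately;
        # routine findings simply wait to be pulled from the archive.
        send_page(ordering_md, f"Exam {exam_id}: {', '.join(urgent)}")

on_report_signed("CXR-0042", ["Pneumothorax"], "Dr. Jones")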

Report Preparation

Recent efforts to improve on traditional dictation and human transcription for the preparation of reports rely on computerized speech recognition and/or some form of screen-mouse-keyboard interaction. Each alternative offers significant improvement in report turnaround time by eliminating the transcription portion of the report production cycle. Both SRSs, when used in self-correction mode, and menu-driven interfaces upload reports to information systems immediately on electronic signature by radiologists. However, problems with speech recognition accuracy cause radiologists to spend considerable extra time on reporting.

Discrete SRSs became available in the mid-1980s. Radiology has presented an attractive area for the application of this technology. Robbins et al. [41,42] described preliminary experience with such systems having 1000-word and then 5000-word dictionaries. In these early experiments, the use of key words or phrases to "trigger" text expansion was anticipated. In fact, Dershaw [43] found this to be the only way to use such a system, because free-text recognition was problematic. Hansen et al. [44] were even more pessimistic and articulated the problem of having radiologists spend too much cognitive effort and time with the mechanics of producing reports with discrete SRSs. About 7 years ago, Kurzweil and others finally achieved the "holy grail" of speech recognition in real time on personal computers with no special hardware. This software allows the user to speak at a natural (continuous) pace and achieve accuracy of well over 90% for most individuals. Continuous SRSs with large radiology-specific vocabularies and context-sensitive language modeling represent a significant improvement over discrete SRS predecessors, and commercial versions have been brought to market and adopted at an increasing rate [45-47].

Initial experience with the full-scale implementation of SRSs in various settings leads to several conclusions. There is nearly universal agreement that report turnaround times are significantly reduced almost immediately, and transcription costs decrease steadily within a year of implementation [47-51]. The benefits of SRSs come at the expense of increased effort by radiologists to edit and verify reports during clinical interpretation, and these tasks may increase reporting times by 50% to 100%, even after considerable experience with the systems [48,49,51-53]. A further disadvantage of SRSs has to do with the loss of the other functions sometimes performed by transcriptionists. Many radiologists are in the habit of making verbal requests directed to transcriptionists at the end of dictations. For example, a radiologist might say, "Please fax this report to Dr. Jones" and be certain that it would be done. An experienced and motivated transcriptionist can often greatly assist a radiologist and improve the quality of care by pointing out potential errors or inconsistencies in reports. Such a productive working relationship between a radiologist and a dedicated departmental transcriptionist is much more likely to develop than between other physicians and workers in a large pool of transcriptionists employed by a hospital or a large commercial transcription vendor.

Newer modes of interaction have been termed menu-driven or "point-and-click" interfaces because they typically require mouse navigation through lists or nested menus to select relevant findings and diagnoses. Before the advent of continuous SRSs, several groups at academic centers experimented with computer-generated radiology reports. Workers at Johns Hopkins University [54-57], the University of California [58-60], Harvard University [61-63], and others [64,65] have developed and described such systems. Often, the scope of reports generated by these systems is limited to a specific practice area, such as musculoskeletal imaging, mammography, or obstetric ultrasound. Different methods for user interaction were used, including mark-sense forms, keyboard input, mouse navigation, touch-screen input, and voice activation. Using barcodes to quickly input patients' demographics or to implement a menu-driven reporting scheme has been described as well [66-68].

Computer-mediated (menu-driven) report generation remains a viable option for several reasons in addition to presenting an alternative to the frustrations of SRS inaccuracies. In current SRS implementations, no attempt is made to codify the findings or standardize a radiologist's choice of language. By design, SRSs produce simple text documents that are easily stored and displayed in a wide variety of information system configurations. Rudimentary report shells (templates) are supported, but their creation is left up to individual users [69]. On the other hand, menu-driven reporting systems are built around a semantic model or knowledge base with explicitly standardized language. These reporting systems use enhanced storage concepts and keep a dedicated database of reports, often stored in a highly formalized and codified format. Because most hospital information systems are not prepared to accept these complex documents, reports are usually rendered into simple text before being uploaded to the official electronic record. However, the structured elements remain in the reporting system's database and are available for quality assessment and research activities.

Speech recognition and menu-driven interfaces are likely to converge into a single comprehensive reporting product. Some vendors of menu-driven reporting interfaces have already added speech recognition capabilities to allow voice activation and menu navigation in their products. Likewise, makers of SRSs likely will expand the scope and sophistication of their template and macro features to incorporate some menu-driven features. In both cases, free-text dictation will still be allowed for sections of the report for which no predefined menu (or template) choices exist.

Another vision for the future of report creation interfaces might be called the "talking template." This would enable radiologists to interact with the reporting interface almost entirely through a microphone or headset, using speech recognition for user input and speech generation for feedback from the system. Instead of requiring users to look away from images to navigate through a report structure, a talking template would deliver voice prompts to indicate reporting progress. Spoken key phrases or buttons on the microphone would cause movement through report elements. A brief example may clarify this concept. Consider that the report outline shown in Table 1 for general abdominal ultrasound is appropriate for the study being interpreted. On activating the system and selecting the examination for interpretation, in addition to seeing the images come up on the picture archiving and communication system workstation, a radiologist would hear "liver" from the microphone (or a headphone). He or she would then dictate an appropriate description in free text and press the "next" button on the microphone. At this time, the system would say "gallbladder" and await dictation about the gallbladder. Again, the "next" button triggers movement to the next section (biliary). In addition to a "next" button, a "back" button would give the user complete freedom to move around in the report. Within a named section (liver, gallbladder, biliary, etc.), standard "rewind," "playback," and "dictate" buttons would be active. Alternatively, voice commands could serve the same functions as the microphone buttons if desired. Talking templates free radiologists to view and manipulate the images (if using soft copy) during the entire dictation.
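
Viewed as software, the talking template is a small state machine over the section list: voice prompts are its output, and "next," "back," and dictation are its input events. A sketch under those assumptions (the event names and prompt wording are ours):

class TalkingTemplate:
    """Walks an examination outline, prompting for each section by voice."""

    def __init__(self, sections):
        self.sections = sections
        self.position = 0
        self.dictation = {s: "" for s in sections}

    def prompt(self) -> str:
        # A real system would speak this through the headset, not print it.
        return self.sections[self.position].lower()

    def handle(self, event: str, text: str = "") -> None:
        """Process a microphone button press or spoken command."""
        current = self.sections[self.position]
        if event == "dictate":
            self.dictation[current] += text
        elif event == "next" and self.position < len(self.sections) - 1:
            self.position += 1
        elif event == "back" and self.position > 0:
            self.position -= 1

t = TalkingTemplate(["LIVER", "GALLBLADDER", "BILIARY"])
print(t.prompt())                                   # radiologist hears "liver"
t.handle("dictate", "Normal echogenicity; no focal lesions.")
t.handle("next")
print(t.prompt())                                   # radiologist hears "gallbladder"

Because the prompts and navigation are entirely auditory, nothing in this loop requires the radiologist's eyes to leave the images.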
Report Review

Clinicians still primarily read their radiology reports in printed form. Even in situations in which clinicians view their radiology reports on computer screens, the reports are almost always simple blocks of text. The clinician review phase of the reporting cycle has received very little attention from researchers or vendors. The DICOM-SR standard is designed to support sophisticated report review functions that display key images dynamically linked to report content. Such capabilities are not yet widely available.

In many implementations, reading a radiology report on a computer screen is often rather more difficult than from a printed page because only part of the text is displayed at once. Little consideration has been given to electronic report display methods that enhance readability and improve the efficiency of information transfer. For example, should a radiology interpretation look like a laboratory report, with organs or structures listed in standard order, perhaps with indicators of normal and abnormal? Would it be more efficient for readers if detailed findings for individual organ systems were nested under their respective headings and visible only when activated? Should key terms, such as right and left, always be capitalized or color coded to reduce error?

Examination Ordering

As for improving the process of ordering radiology examinations, we briefly mention two relevant activities. The first is the ACR's Appropriateness Criteria. This document is the result of a large-scale consensus development effort begun by the ACR in 1993. It consists of over 140 individual guidelines that deal with more than 820 specific clinical conditions. Under each of these, a number of candidate imaging procedures are listed, each annotated with an appropriateness score [70,71]. One of us has undertaken to convert the Appropriateness Criteria document into a relational database for use in hospital information systems to provide the underlying medical content for radiology test ordering decision support and compliance logic [72].

The second is that the concept of physician order entry (POE) by computer has become quite popular, especially with respect to pharmacy. In fact, the Leapfrog Group has designated POE system implementation as one of several critical quality indicators for hospitals [73]. Radiology POE promises to improve the ordering process by ensuring that appropriate studies are scheduled for the clinical problem at hand [74,75]. The same features of radiology POE that support compliance and appropriateness will also serve to provide more consistent and complete clinical indication data. This information can be fed directly through and included in the report document itself. This richer ordering process will serve to better inform interpreting radiologists about the task at hand. Additionally, the medical record will more accurately reflect the context in which a radiology interpretation was rendered. Eventually, the complete integration of information systems will allow computerized POE choices to trigger a complex chain of events [76]. These will include the automatic setting of procedural protocols at the examination site and the triggering of predefined report templates for radiologists. These protocols and templates can be specific to the clinical problem, anatomic area examined, and imaging modality employed.

A complaint commonly articulated by radiologists is that of incomplete, inadequate, or inappropriate clinical information from physicians requesting imaging studies [77,78]. At the same time, clinicians express frustration that interpretations are often not relevant to the clinical questions they seek to answer with imaging studies. In keeping with the tenets of the quality improvement movement, we believe that this is a classic systems problem rather than personal failings on the part of clinicians and/or radiologists. Often, both parties are doing their best in good faith to ask and answer appropriate clinical questions but are stymied by a layer of intermediary people and systems that distort the communication. Ideally, a clinician should be permitted to ask one or more specific questions about his or her patient that would be answered directly by the interpreting radiologist as part of the reporting process. During his keynote address at the 2004 meeting of the Association of University Radiologists, Thrall used the phrase "the electronic round trip" to refer to the ideal situation, whereby a clinician is guided by computerized online order entry, and the examination and its report shell are automatically activated.
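
The electronic round trip can be pictured as a lookup keyed by the coded examination and indication, in which a single order-entry choice selects the acquisition protocol, the report shell, and the clinical question forwarded to the radiologist. The codes and mappings below are hypothetical:

# Hypothetical mapping from a coded order to downstream artifacts.
ROUND_TRIP = {
    ("CT abdomen/pelvis", "suspected renal colic"): {
        "protocol": "noncontrast stone protocol",
        "report_shell": "renal_colic_ct_template",
        "clinical_question": "Is there an obstructing ureteral calculus?",
    },
}

def on_order_entered(examination: str, indication: str) -> dict:
    """Triggered by computerized order entry to configure the study."""
    default = {"protocol": "default", "report_shell": "free_text",
               "clinical_question": ""}
    return ROUND_TRIP.get((examination, indication), default)

plan = on_order_entered("CT abdomen/pelvis", "suspected renal colic")
print(plan["clinical_question"])  # travels with the order into the report shell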
CONCLUDING REMARKS

The attributes and methods described above are by no means mutually exclusive; in many implementations of improved radiology reporting, they will be quite complementary. Our framework for improvement does not preordain the medium for the ordering, production, storage, distribution, and review of reports. For example, a well-trained radiologist can look at images and then have a verbal conversation with an experienced clinical colleague in structured format, using standard language and having consistent content. Complex, unusual, or critical situations often require just such personal and interactive collaboration between radiologists and clinicians for optimal outcomes. This highlights the transcendent importance of training, experience, and acquired skills to effective communication between medical providers. Technological gimmickry will never replace these essentially human qualities and can only complement and extend them. Nonetheless, the methods that we have described thus far for improving radiology reporting have been mostly technical in nature. A healthy degree of skepticism about solutions relying entirely on information technology is warranted.

The complex cognitive task of radiology reporting is mostly learned during a resident's education and training. Specific didactic instruction, supervised practice, and the rigorous evaluation of reporting skills are vital components of any comprehensive program to improve radiology reporting. With this in mind, consider that a recently completed national survey of accredited radiology residency program directors showed that 86% of training programs devote 1 hour or less per year to formal instruction in radiology reporting [79]. Likewise, 82% of programs evaluate fewer than 1% of their residents' clinical reports. There clearly is room for improvement in education about reporting at both the residency and postgraduate levels.

We have described a framework for conceptualizing the process of radiology reporting with the aim of improving the quality, consistency, and outcomes of care rendered by radiologists through their primary work product: the interpretative report document. These laudable goals have been loosely associated with the term structured reporting and are gaining increased attention in academic and commercial venues. The framework defines standard language, structured format, and consistent content as three key attributes of radiology reports and shows how these attributes may be improved by transforming the ordering, production, storage, distribution, and review of radiology examinations and reports.

REFERENCES

1. Hickey PM. Standardization of roentgen-ray reports. Am J Roentgenol 1922;9:442-5.
2. Clinger NJ, Hunter TB, Hillman BJ. Radiology reporting: attitudes of referring physicians. Radiology 1988;169(3):825-6.
3. Lafortune M, Breton G, Baudouin JL. The radiological report: what is useful for the referring physician? Can Assoc Radiol J 1988;39(2):140-3.
4. McLoughlin RF, So CB, Gray RR, Brandt R. Radiology reports: how much descriptive detail is enough? Am J Roentgenol 1995;165(4):803-6.
5. Gagliardi RA. The evolution of the x-ray report. Am J Roentgenol 1995;164(2):501-2.
6. Naik SS, Hanbidge A, Wilson SR. Radiology reports: examining radiologist and clinician preferences regarding style and content. Am J Roentgenol 2001;176(3):591-8.
7. Bosse MJ, Brumback RJ, Hash C. Medical cost containment: analysis of dual orthopedic/radiology interpretation of X-rays in the trauma patient. J Trauma 1995;38(2):220-2.
8. Turen CH, Mark JB, Bozman R. Comparative analysis of radiographic interpretation of orthopedic films: is there redundancy? J Trauma 1995;39(4):720-1.
9. Clark R, Anderson MB, Johnson BH, Moore DE, Herbert FD. Clinical value of radiologists' interpretations of perioperative radiographs of orthopedic patients. Orthopedics 1996;19(12):1003-7.
10. Anglen J, Marberry K, Gehrke J. The clinical utility of duplicate readings for musculoskeletal radiographs. Orthopedics 1997;20(11):1015-9.
11. Zohman GL, Watts HG. Is a routine radiological consultation cost-effective for pediatric orthopedic radiographs? J Pediatr Orthop 1998;18(4):549-51.
12. Crockett HC, Wright JM, Burke S, Boachie-Adjei O. Idiopathic scoliosis. The clinical value of radiologists' interpretation of pre- and postoperative radiographs with interobserver and interdisciplinary variability. Spine 1999;24(19):2007-9.
13. Robinson PJ. Radiology's Achilles' heel: error and variation in the interpretation of the Roentgen image. Br J Radiol 1997;70(839):1085-98.
14. Sobel JL, Pearson ML, Gross K, et al. Information content and clarity of radiologists' reports for chest radiography. Acad Radiol 1996;3(9):709-17.
15. Baker JA, Kornguth PJ, Floyd CE Jr. Breast imaging reporting and data system standardized mammography lexicon: observer variability in lesion description. Am J Roentgenol 1996;166(4):773-8.
16. Starren J, Johnson SM. Expressiveness of the Breast Imaging Reporting and Database System (BI-RADS). Proc AMIA Annu Fall Symp 1997;655-9.
17. Caggiati A, Bergan JJ, Gloviczki P, Jantet G, Wendell-Smith CP, Partsch H. Nomenclature of the veins of the lower limbs: an international interdisciplinary consensus statement. J Vasc Surg 2002;36(2):416-22.
18. Reporting terminology for brain arteriovenous malformation clinical and radiographic features for use in clinical trials. Stroke 2001;32(6):1430-42.
19. Appel B. Nomenclature and classification of lumbar disc pathology. Neuroradiology 2001;43(12):1124-5.
20. Fardon DF, Milette PC. Nomenclature and classification of lumbar disc pathology. Recommendations of the Combined Task Forces of the North American Spine Society, American Society of Spine Radiology, and American Society of Neuroradiology. Spine 2001;26(5):E93-113.
21. Austin J, Simon M, Trapnell D, Fraser RG. The Fleischner Society glossary: critique and revisions. Am J Roentgenol 1985;145(5):1096-8.
22. Austin JH, Muller NL, Friedman PJ, et al. Glossary of terms for CT of the lungs: recommendations of the Nomenclature Committee of the Fleischner Society. Radiology 1996;200(2):327-31.
23. Tuddenham WJ. Glossary of terms for thoracic radiology: recommendations of the Nomenclature Committee of the Fleischner Society. Am J Roentgenol 1984;143(3):509-17.
24. Radiological Society of North America. RadLex: overview of lexicon organization. November 19, 2003.
25. Siegle RL, Baram EM, Reuter SR, Clarke EA, Lancaster JL, McMahan CA. Rates of disagreement in imaging interpretation in a group of community hospitals. Acad Radiol 1998;5(3):148-54.
26. Robinson PJ, Wilson D, Coral A, Murphy A, Verow P. Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol 1999;72(856):323-30.
27. Lev MH, Rhea JT, Bramson RT. Avoidance of variability and error in radiology. Lancet 1999;354(9175):272.
28. Goddard P, Leslie A, Jones A, Wakeley C, Kabala J. Error in radiology. Br J Radiol 2001;74(886):949-51.
29. Donabedian A. The quality of care. How can it be assessed? JAMA 1988;260(12):1743-8.
30. Donabedian A. Quality assurance. Structure, process and outcome. Nurs Stand 1992;7(11 suppl QA):4-5.
31. Berg WA, Campassi C, Langenberg P, Sexton MJ. Breast Imaging Reporting and Data System: inter- and intraobserver variability in feature analysis and final assessment. Am J Roentgenol 2000;174(6):1769-77.
32. Kerlikowske K, Grady D, Barclay J, et al. Variability and accuracy in mammographic interpretation using the American College of Radiology Breast Imaging Reporting and Data System. J Natl Cancer Inst 1998;90(23):1801-9.

33. Lehman CD, Miller L, Rutter CM, Tsu V. Effect of training with the American College of Radiology Breast Imaging Reporting and Data System lexicon on mammographic interpretation skills in developing countries. Acad Radiol 2001;8(7):647-50.
34. Berg WA, D'Orsi CJ, Jackson VP, et al. Does training in the Breast Imaging Reporting and Data System (BI-RADS) improve biopsy recommendations or feature analysis agreement with experienced breast imagers at mammography? Radiology 2002;224(3):871-80.
35. Bidgood WD Jr. Documenting the information content of images. Proc AMIA Annu Fall Symp 1997;424-8.
36. Mori AR, Galeazzi E, Consorti F, Bidgood WD Jr. Conceptual schemata for terminology: a continuum from headings to values in patient records and messages. Proc AMIA Annu Fall Symp 1997;650-4.
37. Lee KP, Hu J. XML Schema representation of DICOM structured reporting. J Am Med Inform Assoc 2003;10(2):213-23.
38. The DICOM Standards Committee. DICOM part 16: content mapping resource. Rosslyn (VA): National Electrical Manufacturers Association; 2003.
39. Clunie DA. DICOM structured reporting. Bangor (PA): PixelMed Publishing; 2000.
40. Tellis WM. Using the Java message service to enable delivery of urgent radiological exam results at the point of care. Proc Soc Comp Appl Radiology 2004;63-5.
41. Robbins AH, Horowitz DM, Srinivasan MK, et al. Speech-controlled generation of radiology reports. Radiology 1987;164(2):569-73.
42. Robbins AH, Vincent ME, Shaffer K, Maietta R, Srinivasan MK. Radiology reports: assessment of a 5,000-word speech recognizer. Radiology 1988;167(3):853-5.
43. Dershaw DD. Voice-activated radiology reports [letter]. Radiology 1994;167(1):284.
44. Hansen GC, Falkenbach KH, Yaghmai I. Voice recognition system [letter]. Radiology 1988;169(2):580.
45. Herman SJ. Accuracy of a voice-to-text personal dictation system in the generation of radiology reports. Am J Roentgenol 1995;165(1):177-80.
46. Schwartz LH, Kijewski P, Hertogen H, Roossin PS, Castellino RA. Voice recognition in radiology reporting. Am J Roentgenol 1997;169(1):27-9.
47. Mehta A, Dreyer KJ, Schweitzer A, Couris J, Rosenthal D. Voice recognition—an emerging necessity within radiology: experiences of the Massachusetts General Hospital. J Digit Imaging 1998;11(4 suppl 2):20-3.
48. Rosenthal DI, Chew FS, Dupuy DE, et al. Computer-based speech recognition as a replacement for medical transcription. Am J Roentgenol 1998;170(1):23-5.
49. Ramaswamy MR, Chaljub G, Esch O, Fanning DD, vanSonnenberg E. Continuous speech recognition in MR imaging reporting: advantages, disadvantages, and impact. Am J Roentgenol 2000;174(3):617-22.
50. Lemme PJ, Morin RL. The implementation of speech recognition in an electronic radiology practice. J Digit Imaging 2000;13(2 suppl 1):153-4.
51. Houston JD, Rupp FW. Experience with implementation of a radiology speech recognition system. J Digit Imaging 2000;13(3):124-8.
52. Heilman RS. Voice recognition transcription: surely the future but is it ready? [editorial]. Radiographics 1999;19(1):2.
53. Gale B, Safriel Y, Lukban A, Kalowitz J, Fleischer J, Gordon D. Radiology report production times: voice recognition vs. transcription. Radiol Manage 2001;23(2):18-22.
54. Wheeler PS, Simborg DW, Gitlin JN. The Johns Hopkins radiology reporting system. Radiology 1976;119(2):315-9.
55. Simborg DW, Krajci EJ, Wheeler PS, Gitlin JN, Goldstein KS. Computer-assisted radiology reporting: quality of reports. Radiology 1977;125(3):587-9.
56. Wheeler PS, Raymond S. The computer-based cumulative report: improvement in quality and efficiency. Radiology 1992;182(2):355-7.
57. Bluemke DA, Eng J. An automated radiology reporting system that uses HyperCard. Am J Roentgenol 1993;160(1):185-7.
58. Mani RL, Jones MD. MSF: a computer-assisted radiologic reporting system. I. Conceptual framework. Radiology 1973;108(3):587-96.
59. Mani RL. RAPORT radiology system: results of clinical trials. Am J Roentgenol 1976;127(5):811-6.
60. Seltzer RA, Reimer GW, Cooperman LR, Rossiter SB. Computerized radiographic reporting in a community hospital: a consumer's report. Am J Roentgenol 1977;128(5):825-9.
61. Simon M, Leeming BW, Bleich HL, et al. Computerized radiology reporting using coded language. Radiology 1974;113(2):343-9.
62. Leeming BW, Simon M, Jackson JD, Horowitz GL, Bleich HL. Advances in radiologic reporting with computerized language information processing (CLIP). Radiology 1979;133(2):349-53.
63. Leeming BW, Porter D, Jackson JD, Bleich HL, Simon M. Computerized radiologic reporting with voice data-entry. Radiology 1981;138(3):585-8.
64. Bramble JM, Chang CH, Martin NL. A report-coding system for integration into a digital radiology department. Am J Roentgenol 1989;152(5):1109-12.
65. Frank MS, Green DW, Sasewich JA, Johnson JA. Integration of a personal computer workstation and radiology information system for obstetric sonography. Am J Roentgenol 1992;159(6):1329-33.
66. Choplin RH, Boehme JM, Cowan RJ, et al. A computer-assisted radiologic reporting system. Radiology 1984;150(2):345-8.
67. Gillespy T III. Advanced applications of personal computers in the radiologist's office. Radiographics 1993;13(1):163-8.
68. Adams HG, Campbell AF. Automated radiographic report generation using barcode technology. Am J Roentgenol 1985;145(1):177-80.
69. Sistrom CL, Honeyman JC, Mancuso A, Quisling RG. Managing predefined templates and macros for a departmental speech recognition system using common software. J Digit Imaging 2001;14(3):131-41.
70. American College of Radiology. American College of Radiology Appropriateness Criteria 2000. Radiology 2000;215(suppl):1-1511.
71. Cascade PN. The American College of Radiology. ACR Appropriateness Criteria project. Radiology 2000;214(suppl):3-46.
72. Sistrom CL, Honeyman JC. Relational data model for the American College of Radiology Appropriateness Criteria. J Digit Imaging 2002;15(4):216-25.
73. Doolan DF, Bates DW. Computerized physician order entry systems in hospitals: mandates and incentives. Health Aff (Millwood) 2002;21(4):180-8.
74. Khorasani R. Computerized physician order entry and decision support: improving the quality of care. Radiographics 2001;21(4):1015-8.
75. Harpole LH, Khorasani R, Fiskio J, Kuperman GJ, Bates DW. Automated evidence-based critiquing of orders for abdominal radiographs: impact on utilization and appropriateness. J Am Med Inform Assoc 1997;4(6):511-21.
76. Honeyman JC. Information systems integration in radiology. J Digit Imaging 1999;12(2 suppl 1):218-22.
77. Dacher JN, Lechevallier J. The exam request seen by the radiologist, the report seen by the clinician. J Radiol 1999;80(8):855-8.
78. Gunderman RB, Phillips MD, Cohen MD. Improving clinical histories on radiology requisitions. Acad Radiol 2001;8(4):299-303.
79. Sistrom C, Lanier L, Mancuso A. Reporting instruction for radiology residents. Acad Radiol 2004;11(1):76-84.
