
Title: Is Reject Analysis Necessary after Converting to Computed Radiography?


Primary Author: Rosemary Honea, A.R.R.T., A.R.D.M.S.
Secondary Author: Maria Elissa Blado

Secondary Author: Yinlin Ma

Affiliation: Edward L. Singleton Diagnostic Imaging Services

Texas Children's Hospital

Address: Edward L. Singleton Diagnostic Imaging Services

Texas Children's Hospital

6621 Fannin MC 2-2521

Houston, Texas 77030-2399

Phone: (832) 824-5563 voice

(832) 825-5370 facsimile

Internet Addresses: rhonea@TexasChildrensHospital.org (Rosemary Honea)

meblado@texaschildrenshospital.org (Maria Elissa Blado)

yxma@texaschildrenshospital.org (Yinlin Ma)

Topic 1. Modality Image Acquisition (Detectors, Imaging Physics, Quality Assurance)

ABSTRACT

Reject Analysis is an accepted standard of practice for Quality Assurance (QA) in conventional
radiography. The need for Reject Analysis has been challenged by the introduction of Computed
Radiography (CR) because of low reported reject rates and because criteria for improperly exposed
images were lacking. Most CR systems include Quality Control (QC) workstations that are capable
of modifying the appearance of images before release, and also of deleting bad images before
they can be analyzed. Texas Children's Hospital has been practicing Computed Radiography since
October 1995, and now conducts essentially filmless imaging operations using a large-scale
Picture Archiving and Communications System (PACS) with fourteen CR units. In our hospital, the QC
workstation is a key element in our CR QC operation; however, the extensive software tools of the
workstation are of limited use in avoiding repeated examinations. Neither the QC workstation
nor the PACS itself is designed to support Reject Analysis, so our task was to design a system
that accommodates identification, isolation, and archiving of repeated examinations using
our electronic imaging systems.

We had already developed transcription codes for our radiologists' examination critiques, so we
adopted these as codes for rejected images. The technologist at the QC workstation appends the
critique code to the patient demographic information, modifies other fields to indicate that the
image is a reject, and archives it as usual. Modified routing tables prevent the release of rejected
images but ensure that they are available for review. Our frequency of and reasons for repeated
examinations are comparable to other reports of Reject Analysis in the literature. The most
frequent cause of a repeated examination is mis-positioning. Developing the method for capturing
repeats, collecting the data, and analyzing them is only one half of the battle. In order to achieve
an improvement in services, it is necessary to feed back the results to management and staff and
to implement training as indicated. It is our intent to share our results with PACS and CR vendors
in the hope that they will incorporate mechanisms for Reject Analysis into the design of their
systems.

INTRODUCTION

Reject Analysis (RA), also known as Repeat Analysis, is an accepted standard practice for Quality
Assurance (QA) in conventional radiography. Analysis of rejected images yields information
about the efficiency of the department and is the basis for Quality Control (QC) and the education
of the individual technologist [1]. While no one would question the value of performing Reject
Analysis in a conventional radiology department, the advent of Computed Radiography (CR) has
prompted some to challenge its relevance to electronic radiology operations. This skepticism
developed partly because of reports by early adopters of extremely low reject rates using CR. The
usual appearance of the CR image differs somewhat from a conventional image, and its contrast
and density are automatically adjusted to improve appearance, so criteria for improperly exposed
images were slow to be recognized by practitioners. CR systems almost universally include QC
workstations that have the capability of modifying images before release, as well as deleting
unacceptable images. Even after more than a decade of widespread clinical practice of CR, systems
to support Reject Analysis are absent from standalone CR systems as well as those incorporated
into large-scale Picture Archiving and Communications Systems (PACS). Our challenge was to
utilize our electronic image acquisition and distribution systems to develop a system for
identifying, capturing, isolating, and archiving rejected images.

MATERIALS AND METHODS

Texas Children's Hospital operates a large-scale Agfa (Agfa Medical Systems, Ridgefield Park,
NJ) IMPAX Version 3.5 PACS. This system includes fourteen CR units with all Agfa Diagnostic
Center (ADC) production versions represented, namely the ADC70, ADC Compact, and ADC
Solo. Patient demographics and examination information are retrieved from the IDXRad Version
9.7 Radiology Information System (RIS) and supplied to the ADC by Identification Stations
augmented by bar code scanners, as described previously [2]. DICOM Modality Worklist
Management has been tested but is currently not practical for clinical operations. The Diagnostic
Imaging Service performed 141,321 examinations in calendar year 2001 (IDXRad), of which
93,386 were CR (Oracle version 7.0), averaging 1.72 images per examination. Virtually all
primary interpretation is conducted using softcopy review stations: routine printing and archiving
of hardcopy images ceased on October 31, 2000.

The Processing Station. The ADC image is transmitted to one of eleven Processing Stations (PS,
a.k.a. VIPS) before being released for distribution in the PACS. This QC workstation is a key
component in our imaging operation: a technologist inspects each image at the PS and determines
whether it was appropriately acquired and properly identified. If all views are complete and
acceptable, they are transmitted to the PACS, the examination is completed in the RIS, and the
patient is released. In the event of errors, the PS has some sophisticated features for modifying
the image. As shown in Table I, some of these features are useful in recovering images that would
otherwise be rejected, such as annotating the image when the Left/Right marker is obscured,
correcting incorrect demographic information, or reprocessing an image with the appropriate
examination menu selection. Technologists are discouraged from making drastic adjustments to
the image at the workstation [3]. The PS display is not designed for primary interpretation: the
image is down-sampled for display and the monitor luminance is not strictly controlled.
Technologists are trained to recognize anatomy, not pathology, and are as likely to obscure
important clinical features as to enhance them, especially when the display does not match the
appearance on diagnostic workstations. The operator of the PS also has the ability to make bad
CR images disappear without a trace.

Reasons for Repeated Examinations. Table II shows reasons why an examination might need to
be repeated in a conventional department [4]. Each of these can occur with CR, although the
categories of under-exposure, over-exposure, and lost film require further elaboration. Even
though the CR system acts to adjust the density of the image to compensate for inappropriate
radiographic technique, a CR image that is under-exposed will appear grainy, and a CR image
that is over-exposed subjects the patient to more radiation dose than necessary for the
examination [5]. In conventional radiography, the exposure level is evident from the optical
density (OD) of the film. Because the density of a CR image is adjustable, the exposure level is
instead revealed by numerical analysis of pixel values in the digital image. An acceptable range
of values was established for this exposure indicator, called lgM, the logarithm of the median of
the grayscale histogram. The range of acceptable values allows a factor of two under- or
over-exposure around the target value. Contrary to claims by some PACS proponents, electronic
images can also be lost, either by operator deletion or by equipment malfunctions resulting in
corrupt image data files that cannot be transferred.
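A minimal sketch of this exposure check follows, assuming lgM is approximated as the base-10 logarithm of the histogram median and that pixel values are proportional to detected exposure; the actual Agfa computation and the target value LGM_TARGET used here are placeholders, since targets differ by examination type.

```python
import numpy as np

# Illustrative only: the Agfa reader computes lgM itself; LGM_TARGET is a
# hypothetical target for one exam type, not a vendor value.
LGM_TARGET = 1.96
LGM_TOLERANCE = np.log10(2.0)  # a factor of two in exposure is ~0.30 in lgM

def lgm_from_pixels(pixels: np.ndarray) -> float:
    """Approximate an lgM-style indicator as log10 of the median pixel value."""
    median = np.median(pixels[pixels > 0])
    return float(np.log10(median))

def exposure_acceptable(lgm: float) -> bool:
    """Flag images falling outside the +/- factor-of-two exposure window."""
    return abs(lgm - LGM_TARGET) <= LGM_TOLERANCE
```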

Reject Codes. We previously reported a system of dictation codes for documenting radiologist
examination critiques [6]. Technologist team leaders used the dictation codes, shown in Table III,
to classify the reason for repeating an image. The appropriate critique code, a delimiter (/), and
the responsible technologist's identification number are inserted before the contents of the Patient
Name field at the PS.
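The convention can be illustrated with a small helper; this is a sketch of the string format described above, and the space separating the technologist number from the original name is an assumption, as is the example name.

```python
def tag_rejected_name(patient_name: str, critique_code: int, tech_id: str) -> str:
    """Prefix the critique code, a '/' delimiter, and the technologist ID
    to the Patient Name field, per the convention described in the text."""
    return f"{critique_code}/{tech_id} {patient_name}"

# Example: code 22 (mis-positioned) entered by hypothetical technologist 101
print(tag_rejected_name("DOE^JANE", 22, "101"))  # -> "22/101 DOE^JANE"
```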

Segregating Rejected Images. A rejected image sent to PACS from the PS normally would join
the diagnostic images in the patient's examination folder. A procedure was developed to modify
specific fields to indicate that the examination is a reject. The text string NONE is inserted in
front of the contents of the Patient Name, Medical Record Number, and Accession Number fields.
When these modified images are sent to PACS, they fail validation by the RIS interface and are
sequestered from the view of PACS users.
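In practice this editing is done at the PS console, but the effect on the DICOM header can be sketched as below, assuming the pydicom library; the three standard attributes used (PatientName, PatientID for the medical record number, AccessionNumber) correspond to the fields named above.

```python
import pydicom  # assumes pydicom is available; shown only to illustrate the field edits

REJECT_PREFIX = "NONE"

def mark_as_reject(ds: pydicom.Dataset) -> pydicom.Dataset:
    """Prefix the fields the RIS interface validates, so the image fails
    validation and is sequestered, per the procedure described above."""
    ds.PatientName = REJECT_PREFIX + str(ds.PatientName)
    ds.PatientID = REJECT_PREFIX + str(ds.PatientID)          # Medical Record Number
    ds.AccessionNumber = REJECT_PREFIX + str(ds.AccessionNumber)
    return ds
```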

Releasing the Rejected Images for Archiving and Review. A PACS Analyst with administrative
privileges retrieves the sequestered rejected images, modifies the Examination Description field
by inserting the text string NONE-, and forces the image into the public area. (It is unfortunate
that the PS does not allow the user to edit the Examination Description field. If this field were
editable, the technologist would be able to add the text NONE to its contents; once the image was
archived, there would be no need for the PACS Analyst to modify this field manually, and the
image would route automatically to its destinations.)

At this point the rejected image is disseminated according to rules established in the IMPAX
Routing Tables (Table IV). To avoid widespread dissemination of rejected images to clients
throughout the PACS network, the Routing Tables were extensively modified to send rejects only
to the Archive Servers, where they are automatically recorded on Magneto-Optical Disk (MOD)
and tape media [7]. Modification of Routing Tables that were appropriate for clinical imaging
operations was a major effort and warrants further explanation in the discussion that follows.

If we did not have reject images sent from the acquisition station, the Specialty field in the
Routing Pattern table would indicate "Don't Care" (it would not matter from which specialty an
image from that station comes), as shown in Table IV. A new specialty called NONE was created,
and the routing tables assign any examination procedure containing the text NONE to this
specialty. It is unfortunate that the routing pattern of our PACS does not allow exclusive routing
by specialty. It will allow a specific specialty to route to a certain destination; for example, route
ONLY FLUORO cases to a specific review station. But the design of our routing tables does not
allow all specialties EXCEPT a specific specialty (in our case, the NONE specialty) to route to a
destination unless we create an entry for EACH specialty.

As an example, in Table IV the CR modality has 14 different specialties. An entry for each
specialty was created to route to the NICU review station (patient location NEO). The NONE
specialty was not included, so that the NICU physician cannot view the rejected images. If there
were more than one patient location for an area, e.g., ER and EMC for the Emergency Room, then
14 entries for EACH patient location would be created in the routing pattern, in this case a total of
28 entries for just one destination, as demonstrated in Table V.

The growth of the routing tables required to accommodate the rejected images in our archive
servers was rapid, in proportion to the product of the number of specialties, referring physicians,
and patient locations. The more specialties there are, the more entries must be created in the
routing table for each referring physician and/or patient location.
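The combinatorics can be made concrete with a short sketch; the specialty names and locations below are placeholders, chosen only to reproduce the 14-specialty, two-location case described above.

```python
from itertools import product

# Hypothetical names illustrating why the tables grow so quickly.
specialties = [f"SPEC{i:02d}" for i in range(1, 15)]   # 14 CR specialties, NONE excluded
locations = ["ER", "EMC"]                               # two patient locations for one area

# One routing entry is needed per (specialty, location) pair for a single destination.
entries = list(product(specialties, locations))
print(len(entries))  # 28 entries for just one destination, as in Table V
```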

Analysis of Rejected Images. Once a system was in place for documenting and preserving
rejected images, tools were needed to interrogate the image database to collect meaningful data for
Performance Improvement of imaging operations.

A script using Structured Query Language (SQL) queries was written to query the IMPAX image
database and to generate a report of all the archived NONE files in a month. This monthly report
is then imported into an Access database, which includes the following fields:
date of examination
time of examination
modality
technologist number
accession number
examination procedure
number of images
and reject code.
All the data for these fields are retrieved from the IMPAX image database.
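A minimal sketch of such a monthly query is shown below. The IMPAX schema is proprietary, so the table name (archived_images) and column names are hypothetical, and sqlite3 stands in for the Oracle connection used in practice.

```python
import sqlite3  # stand-in for the production database connection

# Hypothetical table and column names; the real IMPAX schema differs.
MONTHLY_NONE_QUERY = """
SELECT exam_date, exam_time, modality, technologist_number,
       accession_number, exam_procedure, num_images, reject_code
FROM   archived_images
WHERE  patient_name LIKE 'NONE%'
  AND  exam_date BETWEEN :start AND :end
"""

def monthly_none_report(conn: sqlite3.Connection, start: str, end: str):
    """Return one month's NONE (rejected) image records, ready to be
    imported into the Access reject database."""
    return conn.execute(MONTHLY_NONE_QUERY, {"start": start, "end": end}).fetchall()
```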

RESULTS AND DISCUSSION

Various statistical reports were generated using the information from the NONE Access database.
These include:
Number of rejected CR images for the year 2001 and their percentage of the total number of
images in the archive (Table VI)
Number of rejected CR images broken down by reject code (Table VII)
Number of rejected CR images per shift and their percentage of the total number of images in
the archive per shift (Table VIII)
Number of rejected CR images by technologist number (Table IX)
Number of rejected CR images by examination description (Table X)
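The style of these reports can be sketched with pandas, assuming the NONE records have been loaded into a DataFrame with the fields listed earlier (exam_date, reject_code, etc.) and that a separate series of monthly archive totals is available; all column and variable names here are assumptions.

```python
import pandas as pd

def reject_rate_by_month(rejects: pd.DataFrame, archived: pd.Series) -> pd.DataFrame:
    """Table VI style report: rejected images and reject rate by month.
    'archived' is assumed to be indexed by the same monthly periods."""
    monthly = rejects.groupby(rejects["exam_date"].dt.to_period("M")).size()
    return pd.DataFrame({
        "rejected_images": monthly,
        "archived_images": archived,
        "reject_rate_pct": (100 * monthly / archived).round(2),
    })

def rejects_by_code(rejects: pd.DataFrame) -> pd.Series:
    """Table VII style report: image counts per reject code, most frequent first."""
    return rejects.groupby("reject_code").size().sort_values(ascending=False)
```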

Table VI shows the number of rejected CR images for the year 2001 and their percentage of the
total number of images archived in each month. The data yield an overall CR reject rate of 4.07%
for 2001. The average change in the monthly rate is 0.53%, with a maximum of 1.13% between
August and September. Figure 1 is the chart representation of Table VI.

Table VII reports the number of rejected CR images broken down by reject reason code for the
year 2001. The codes listed are based on Table III, the Radiologist Examination Critique List.
Codes for OTHER (code 45) and NOT INDICATED (code 46) were added to this list. According
to these data, the most common reason for rejecting an image is mis-positioning, at 62% of the
total number of rejects. Inadequate inspiration comes in second at 8.73% and not enough contrast
at 6.74%, while 5.7% of the rejects were not labeled with a reject code. Two separate studies
conducted at other institutions within the past 5 years also reported positioning as the top reason
for their repeated examinations, one at 57.19% [4] and the other at 46.9% [6].

Table VIII is a sample report of the number of rejects by shift. Figure 2 is the chart representation
of the percentages in Table VIII. These reports show that the weekend shift consistently had the
highest number of rejects throughout the year.

Table IX shows a portion of the report listing the technologists' numbers and the number of
rejected images they archived over a period of time. This information is further broken down by
reject code. The report consistently shows MISPOSITIONED as the most common reason for
rejecting an image among the technologists. The number of images rejected by a technologist will
be compared with the number of images in the examinations that technologist actually performed
during the same period; the latter piece of information still needs to be entered in the NONE
database in order to calculate this ratio.

Under-reporting of rejected images, toleration of unacceptable images, and inconsistent adherence
to the procedure for sending rejected images to the archive could all contribute to inaccuracy in
the statistics. On the processing station, the technologist may send an image directly to the
trashcan; once that trashcan is emptied, the image is deleted for good. This is one example of
under-reporting of rejected images.

Table X shows that CHEST examinations have the highest number of repeats at 51.66%, while
ABDOMEN procedures account for 9.97%; 9.69% of the rejects were not labeled with an
examination description by the technologists (NOT INDICATED).

The list in Table III has been modified by the area supervisors to meet the technologists' need to
be more specific and descriptive in their reasons for rejecting images. A number of the reasons
were broken down in more detail to cover the various scenarios or possibilities for doing repeats.
This modified reject code list will also help the technologists be consistent. Other reasons for
repeats that have been discovered are:
double exposure (as opposed to duplicate images),
wrong marker (as opposed to no marker),
patient mis-positioned and cassette mis-positioned,
equipment malfunction,
high / low lgM (instead of too much contrast or not enough contrast),
breaking down artifacts by patient, cassette, or equipment,
radiologist request to reject,
and test images.
The categories of Availability, Identification, Appropriateness of Exam, and Diagnostic
have been modified by omitting reasons that are not used for rejecting an image. Table XI is the
modified reject code list that will be used beginning this year.


There is more work to be done. Two pieces of information may be added to the Access database
of NONE files. The first is the area where the examination was performed (e.g., portable, main
radiology/fluoroscopy, outpatient), which will be useful for the supervisors of these areas. The
second is the lgM value of every rejected CR image, which can be extracted from its DICOM
header; from this, the distribution of lgM values can be reported and analyzed. Automation of
data transfer from the IMPAX database to the Access database of NONE files will also be
explored. The PACS team will continue to extract the data into the NONE Access database, from
which the management of Diagnostic Imaging may generate statistical reports for their own use.
The procedure for labeling rejected images with NONE will also be clarified for consistency and
to eliminate the possibility of any more records being recorded as NOT INDICATED. Changes are
expected as we evolve through this electronic process of accommodating rejected images.
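Extracting the lgM value from the header could look roughly like the sketch below, assuming pydicom. The value lives in a vendor-specific element of the Agfa CR header whose exact (group, element) must be taken from the vendor's DICOM conformance statement, so the tag used here is only a placeholder.

```python
import pydicom

# Placeholder private tag: NOT the real location of the Agfa lgM element.
LGM_TAG = (0x0019, 0x1000)

def read_lgm(path: str):
    """Return the lgM exposure indicator from a rejected image, if present."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    element = ds.get(LGM_TAG)
    return float(element.value) if element is not None else None
```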

These preliminary statistical reports were presented to Diagnostic Imaging's management and staff,
and a number of them have viewed the rejected images. These reports, being a source for Quality
Improvement (QI), will be analyzed to investigate the causes of rejects and find ways to eliminate
them. Retraining of staff and other corrective actions may have to be implemented and documented.

The next focus of our department's QI efforts in Reject Analysis is expanding this process to the
areas or modalities that involve some form of radiation, such as Computed Tomography (CT),
Nuclear Medicine (NM), and scout images of fluoroscopy examinations (not all fluoroscopy
images are archived). How the technologists in these areas will send their reject images will
have to be reviewed and documented. This includes finding out where in their imaging chain we
accommodate the reject analyses. Their data will be placed in the same database as the reject
data for CR. Reports will also be generated and shared with the supervisors of these areas.

Of course, all these accommodations for reject analysis have been based on the current versions of
the software of the acquisition stations, processing stations, and the PACS archive. They may not
work on the next version of IMPAX or of the processing stations. As we upgrade each system, we
will have to review the new versions to assure that we can continue the procedure we have
developed for reject analysis.

We also intend to share our results with our vendors. They may be able to assist us in
accommodating reject images electronically, hopefully through a user-friendly mechanism, if not
in the current versions then perhaps in future versions of their systems. Vendors play a vital role
in the world of reject analysis [8]. A few suggestions: allow specific fields to be edited to include
NONE- labels, allow automatic routing of rejected images, and do not allow deletion of such
images on the processing or quality control station.

The Diagnostic Imaging service formed a team to lower the reject rate. This team consists of the
Team Leaders from Nuclear Medicine, Ultrasound, Computed Tomography, Portable X-ray,
Outpatient, Magnetic Resonance (MR), and Main Radiology, and the PACS team. The lists of
rejects (NONE files) included discarded images from each modality. Team leaders were surprised
to find that some MR technologists routinely acquire additional series rather than following the
appropriate clinical protocol. When these technologists determined which radiologist was to
interpret the examination, they were having the PACS Analyst split away the images that were not
part of that radiologist's protocol. The Team Leader commented that this practice contributed to
extended patient scan times for a service already suffering from a substantial backlog, as well as
wasting PACS Analysts' time and PACS archive space. This finding reinforced the idea that
reject analysis is valuable even for modalities that do not involve ionizing radiation.

CONCLUSIONS

Reject Analysis must be conducted routinely regardless of whether conventional film/screen or CR
is used. Attention to the sources and frequency of rejects can dramatically improve routine image
quality and provide a basis for in-service training of the individual technologist, resulting in
better patient care. The results of our reject analysis led our department to modify the entry
training program for new technologists to emphasize averting the most common mis-positioning
errors. This is also included in evaluating job competency at 90 days post employment. The
efforts of the PACS Analysts in compiling reject reports from the image database are wasted
unless administrators are willing to implement methods of addressing the causes of rejects. Team
leaders are also key to this method: they are the ones who assure that rejects are properly reported,
and they are the ones who determine whether an individual technologist or a group of
technologists needs additional training.

The purpose, methodology, and importance of reject analysis must be emphasized with PACS
vendors so they can incorporate it in their software.


REFERENCES
1. Peer S, Peer R, Walcher M, Pohl M, Jaschke W: Comparative reject analysis in conventional
film-screen and digital storage phosphor radiography. Eur Radiol 9:1693-1696, 1999.
2. Shook KA, O'Neall D, Honea R: Challenges in the integration of PACS and RIS databases.
Journal of Digital Imaging 11(3 Suppl 1):75-79, 1998.
3. Willis CE, Parker BR, Orand M, Wagner ML: Challenges for pediatric radiology using
computed radiography. Journal of Digital Imaging 11(3 Suppl 1):156-158, 1998.
4. Willis CE, Mercier J, Patel M: Modification of conventional quality assurance procedures to
accommodate computed radiography. 13th Conference on Computer Applications in Radiology,
Denver, Colorado, June 7, 1996, pp 275-281.
5. Willis CE: Computed radiographic imaging and artifacts. Chapter 7 in Filmless Radiology.
New York: Springer-Verlag, 1999, pp 137-154.
6. Willis CE: Computed Radiography: QA/QC. In Practical Digital Imaging and PACS, Medical
Physics Monograph No. 28. Madison: Medical Physics Publishing, 1999, pp 157-175.
7. Willis CE, McCluggage CW, Orand MR, Parker BR: Puncture proof picture archiving and
communications systems. Journal of Digital Imaging 14(2 Suppl 1):66-71, 2001.
8. Barnes E: In digital radiology, QA means never having to say you're sorry. AuntMinnie.com,
September 19, 2000, http://www.auntminnie.com/index.asp?sec=sea&sub=res


Table I. Agfa Processing Station Features

Demographic editing
Image modifications:
  Reorientation
  Annotation
  Window and level
  Collimation
  Exposure field mask or removal
  Measuring distance
  Invert
  Orientation change
Examination menu selection
Sensitometry curve selection
Image processing (MUSICA - MultiScale Image Contrast Amplification)
Table II. Reasons for Repeated Examinations in a Conventional Department
Artifacts
Mis-positioning
Over-collimation
Patient motion
Double exposure
Inadequate inspiration
Overexposed - too dark
Underexposed - too light
Marker missing or wrong
Wrong examination
Wrong patient
Film lost in processor


Table III: Radiologist Examination Critique List
(Media: F = Film, S = Soft Copy)

Comment Category / Fault (Dictation Code)
Availability
  Current Exam - Not Local (1)
  Current Exam - Not on System or Cache (2)
  Prior Exam - Not Local (3)
  Prior Exam - Not on System (4)
Number of Images
  Missing Images (5)
  Duplicate Images (6)
Image Sequence
  Wrong Sequence (7)
  Combined Exam (8)
Patient Identification
  Wrong MRN (9)
  Wrong Patient (10)
  Wrong Name (11)
  Wrong DOB (12)
Exam Information
  Wrong Accession Number (13)
  Wrong Exam Procedure (14)
Annotation
  Incorrect Orientation (15)
  Improper Placement of Marker (16)
  No Marker (17)
Appropriateness of Exam
  Wrong Diagnosis (18)
  Wrong Exam Performed (19)
  Inappropriate Exam (20)
  No History Provided (21)
Technical
  Mis-positioned (22)
  Inadequate Inspiration (23)
  Motion (24)
  Collimation - Not Enough (25)
  Collimation - Too Much (26)
  Collimation - No Collimation (27)
  Shielding - Image Artifacts (Holding) (28)
  Shielding - Inappropriate Shielding (29)
  Shielding - No Shielding (30)
Quality
  Density - Too Dark (31)
  Density - Too Light (32)
  Blurred - Monitor (33)
  Blurred - Image (34)
  Contrast - Too Much Contrast (35)
  Contrast - Not Enough Contrast (36)
  Noisy (37)
  Image Size - Magnified (38)
  Image Size - Minified (39)
  Artifact (Note: Please Specify) (40)
Diagnostic
  Repeat: Non-Diagnostic (41)
  Save for Teaching File (42)
  Save for Green Dot (43)


Table IV: Routing Pattern, allowing routing of a rejected image only to the archive servers, not
to users.

Table V: An entry for each specialty, with each patient location or referring physician, for each
destination is created in the routing table, producing rapid growth of the table.


Table VI: Number of Rejected Images Reported for the Year 2001, Computed Radiography (CR)

Month                          Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct    Nov    Dec    TOTAL   AVERAGE
# of NONE images               559    617    541    573    456    493    446    458    622    611    542    613    6531    544
# of total images in archive   13028  12931  13388  12966  13336  12074  12520  12842  13235  15206  14428  14667  160621  13385
% rejected                     4.29   4.77   4.04   4.42   3.42   4.08   3.56   3.57   4.70   4.02   3.76   4.18   4.07    4.07

Figure 1: Monthly CR rejects for the year 2001 versus the total number of images archived
(Monthly Distribution of Rejects: percentage of rejected images to total archived images, by
month; values as in Table VI).


Table VII: Number of Images per Reject Code for 2001 (Sum of Reject Reason, CR, by Image)

Reject Code  Reject Description                          Number of Images  % of Total Rejects
22           MISPOSITIONED                               4035              61.78
23           INADEQUATE INSPIRATION                      570               8.73
36           CONTRAST--NOT ENOUGH CONTRAST               440               6.74
46           NOT INDICATED                               372               5.70
40           ARTIFACT                                    286               4.38
35           CONTRAST--TOO MUCH CONTRAST                 139               2.13
24           MOTION                                      131               2.01
41           REPEAT: DIAGNOSTIC                          83                1.27
6            # OF IMAGES--DUPLICATE IMAGE                74                1.13
32           DENSITY--TOO LIGHT                          69                1.06
16           ANNOTATION--IMPROPER PLACEMENT              75                1.15
17           ANNOTATION--NO MARKER                       54                0.83
34           BLURRED--IMAGE                              28                0.43
26           COLLIMATION--TOO MUCH                       23                0.35
19           WRONG EXAM PERFORMED                        24                0.37
28           SHIELDING--IMAGE ARTIFACTS                  21                0.32
31           DENSITY--TOO DARK                           13                0.20
15           ANNOTATION--INCORRECT ORIENTATION           13                0.20
33           BLURRED--MONITOR                            11                0.17
5            # OF IMAGES--MISSING IMAGES                 10                0.15
14           EXAM INFORMATION--WRONG EXAM PROCEDURE      8                 0.12
29           SHIELDING--INAPPROPRIATE SHIELDING          7                 0.11
10           PATIENT ID--WRONG PATIENT                   7                 0.11
43           DUPLICATE                                   7                 0.11
21           NO HISTORY PROVIDED                         5                 0.08
25           COLLIMATION--NOT ENOUGH                     5                 0.08
11           PATIENT ID--WRONG NAME                      4                 0.06
45           OTHER                                       3                 0.05
20           INAPPROPRIATE EXAM                          3                 0.05
2            CURRENT EXAM NOT ON SYSTEM OR CACHE         2                 0.03
7            IMAGE SEQUENCE--WRONG SEQUENCE              2                 0.03
13           EXAM INFORMATION--WRONG ACCESSION NUMBER    2                 0.03
38           IMAGE SIZE--MAGNIFIED                       1                 0.02
8            COMBINED EXAM                               1                 0.02
42           SAVE FOR TEACHING FILE                      1                 0.02
27           COLLIMATION--NO COLLIMATION                 1                 0.02
4            PRIOR EXAM NOT ON SYSTEM                    1                 0.02
             TOTAL                                       6531


Table VIII: CR Rejected Images Compared with Number of Archived Images by Shift for the Year 2001

                                 Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct    Nov    Dec    TOTAL   AVERAGE
Shift 1 (7 am - 3 pm)
  # of NONE images               227    274    175    195    199    205    161    177    224    240    233    269    2579    215
  Total # of images in archive   5254   4986   5174   5122   5425   5014   4908   5494   4833   6178   5662   5298   63348   5279
  % (NONE vs total images)       4.32   5.50   3.38   3.81   3.67   4.09   3.28   3.22   4.63   3.88   4.12   5.08   4.07    4.07
Shift 2 (3 - 11 pm)
  # of NONE images               142    143    142    180    102    107    96     137    188    165    136    151    1689    141
  Total # of images in archive   4021   4012   3974   3857   4004   3351   3724   3816   4144   4638   4180   4109   47830   3986
  % (NONE vs total images)       3.53   3.56   3.57   4.67   2.55   3.19   2.58   3.59   4.54   3.56   3.25   3.67   3.53    3.53
Shift 3 (11 pm - 7 am)
  # of NONE images               48     52     90     73     50     63     55     55     63     64     61     56     730     61
  Total # of images in archive   1732   1754   1809   1724   1824   1537   1608   1515   1607   1991   2039   1936   21076   1756
  % (NONE vs total images)       2.77   2.96   4.98   4.23   2.74   4.10   3.42   3.63   3.92   3.21   2.99   2.89   3.46    3.46
Weekends
  # of NONE images               142    148    134    125    105    118    134    89     147    142    112    137    1533    128
  Total # of images in archive   2021   2179   2431   2263   2083   2172   2280   2017   2651   2399   2539   3324   28359   2363
  % (NONE vs total images)       7.03   6.79   5.51   5.52   5.04   5.43   5.88   4.41   5.55   5.92   4.41   4.12   5.41    5.41
TOTAL
  # of NONE images               559    617    541    573    456    493    446    458    622    611    542    613    6531    544
  Total # of images in archive   13028  12931  13388  12966  13336  12074  12520  12842  13235  15206  14420  14667  160613  13384


Figure 2: CR rejected images by shift for 2001 (percentage of CR rejects over total archived
images by shift, plotted by month for Shift 1, Shift 2, Shift 3, and Weekend; values as in Table
VIII).


Table IX: A portion of the report showing the number of rejects broken down by technologist
number and reject code (columns: Technologist Number, Reject Code, Number of Studies,
Number of Images).

Table X: CR Rejects by Exam Description

Exam Description   Number of Rejected Images   % of Total Rejects
Chest              3374                        51.66
Abdomen            651                         9.97
NOT INDICATED      633                         9.69
Spine              496                         7.59
Upper_Ext          426                         6.52
Head               412                         6.31
Lower_Ext          397                         6.08
Pelvis             106                         1.62
Body               18                          0.28
Abdomen/KUB        7                           0.11
Renal              6                           0.09
Bone               3                           0.05
Neck               2                           0.03
TOTAL              6531


Table XI: Reject Code List

Comment Category / Fault (Dictation Code)
Images
  Number of Images - Missing Images (1)
  Number of Images - Duplicate Images (2)
  Image Sequence - Wrong Sequence (3)
  Image Sequence - Combined Exam (4)
Identification
  Patient Identification - Wrong Patient (5)
  Exam Information - Wrong Exam Procedure (6)
  Annotation - Incorrect Orientation (7)
  Annotation - Improper Placement of Marker (8)
  Annotation - No Marker (9)
  Annotation - Wrong Marker (10)
Appropriateness of Exam
  Wrong Exam Performed (11)
Technical
  Mis-positioned - Patient (12)
  Mis-positioned - Cassette (13)
  Inadequate Inspiration (14)
  Motion (15)
  Collimation - Not Enough (16)
  Collimation - Too Much (17)
  Collimation - No Collimation (18)
  Shielding - Image Artifacts (Holding) (19)
  Shielding - Inappropriate Shielding (20)
  Shielding - No Shielding (21)
  Double Exposure (22)
  Equipment Malfunction (23)
Quality
  Density - Too Dark (24)
  Density - Too Light (25)
  Blurred - Monitor (26)
  Blurred - Image (27)
  Contrast - High lgM (28)
  Contrast - Low lgM (29)
  Noisy (30)
  Image Size - Magnified (31)
  Image Size - Minified (32)
  Artifact - Patient (33)
  Artifact - Cassette (34)
  Artifact - Equipment (35)
Diagnostic
  Radiologist's Request to Reject (36)
  Save for Green Dot/Teaching File (37)
  Test (38)
  Other (39)

