Reject Analysis: pictures or it didn’t happen
November 05, 2019
By Ana Dolcet, Medical Physicist and Steve Nzitunga, Radiographer from Qaelum, published on Diagnostic Imaging
As a radiographer working in general radiology, you will be familiar with phrases like ‘let me take one more, just to be sure’ or ‘I don’t think this is lateral enough to visualize the pathology’. If you work in a radiology department, you will certainly have experienced some of these cases in your professional life:
1) The patient moves during the examination.
2) The quality of the image is insufficient due to the technique.
3) The field is more collimated than it should be and leaves out part of the region of interest.
4) An artifact appears on the image.
5) The wrong body part is examined.
6) The wrong patient is examined.
7) Etc.
What do you do in these cases? You repeat the examination. But what happens to the previously discarded image? If you send it to PACS along with the new examination, it will be included in the patient’s radiology history. Furthermore, if your hospital has a dose management system such as DOSE by Qaelum, the radiation dose delivered to the patient for the first image will be taken into account.
But what happens if you don’t send the image, or you delete it? Then it’s as if it never happened! Nevertheless, it did happen, and the patient received that radiation dose even though it wasn’t recorded. In fact, according to Rogers et al. [1], 14% of patient exposure is due to repeated images. Beyond dose, patient workflow and healthcare resources are also affected by image rejection.
So, what can we do?
Goal 1: Reduce image retakes.
Many studies have shown that the most common reason for rejection is ‘positioning and anatomical cut-off’, a reason that becomes more frequent for larger body parts. Regarding the projection, Lin et al. [2] showed that the “lateral projection view exhibited significantly higher retake rate than the anteroposterior view”, which can be explained by the greater positioning difficulty of the former.
As patient positioning, anatomical cut-off and the presence of artifacts are the most common causes of image rejection, Lin et al. [2] have proposed useful tips to reduce them:
● Remind the radiographer to check the grid status (yes/no) to prevent machine problems or improper exposure
● Use foot marks for positioning the right and left feet to prevent positioning errors
● Train radiographers to adjust the patient’s posture properly in lateral views
● Display a poster reminding the patient and radiographer to ensure that all artifacts have been removed
● Check that the patient’s chest fits the chosen detector and field during chest examinations to prevent anatomical cut-off
After setting these guidelines, they achieved a significant reduction in the rejection rate over a period of six months.
Goal 2: Re-think the need for rejection.
An image has diagnostic value when it contains an element that can help the ‘reporter’ make a diagnosis. This is true not only when it has been sent for review, but also when it has been ‘deleted’. In fact, a study by Rosenkrantz et al. [3] showed that 93.1% of rejected chest radiographs could have been adequate for diagnosis. The decision to reject an image is based on the ‘subjective judgment’ of the radiographer(s), as Waaler and Hofmann [4] affirmed. It is difficult to measure and to understand on what grounds this decision is made. The decision to reject an image should therefore be based more on protocols that state the criteria for a good radiographic image. This would place less responsibility on individual radiographers when deciding whether to reject an image.
That said, a rejection rate of zero is not the goal either. That figure would mean that every image is being sent for reporting regardless of its quality. Radiographers should be able to recognize and reject non-diagnostic images; in fact, the rejection rate recommended by AAPM TG151 [5] is 8%.
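As a concrete illustration, a department’s reject rate is simply the number of rejected images divided by the total number of images acquired, usually broken down by reason to show where training effort is needed. The counts below are hypothetical, invented for this sketch; only the 8% target comes from AAPM TG151 [5].

```python
# Minimal sketch of a monthly reject-rate calculation with hypothetical counts.
from collections import Counter

TARGET_RATE = 8.0  # percent, per AAPM TG151 [5]

# Hypothetical one-month tally of rejection reasons (counts are invented)
rejections = Counter({
    "positioning / anatomical cut-off": 52,
    "patient motion": 18,
    "exposure technique": 12,
    "artifact": 9,
})

total_exposures = 1100  # hypothetical total images acquired in the period

rejected = sum(rejections.values())
reject_rate = 100.0 * rejected / total_exposures

print(f"Reject rate: {reject_rate:.1f}% (TG151 target: {TARGET_RATE}%)")
for reason, count in rejections.most_common():
    print(f"  {reason}: {100.0 * count / rejected:.1f}% of rejects")
```

With these invented numbers the rate lands just above the target, and the per-reason breakdown mirrors the literature: positioning and anatomical cut-off dominate, which is where retraining would pay off first.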
The Tool: DOSE Reject Analysis module by Qaelum
A reject analysis program should be carried out in every radiology department to assess quality in clinical practice. The DOSE “Reject Analysis” module connects directly to the device, ‘looking’ for rejected or deleted images; sometimes these can be found in a bin or in a special folder. DOSE extracts these images automatically, so the user no longer needs to physically go to the machine with a USB stick to retrieve them [6]. DOSE then analyses the rejected images within the module, allowing the user to review and verify the reason for rejection. Furthermore, it shows trend analysis graphs where the cumulative and average “rejected” dose can be visualized for quality purposes. And of course, the “rejected” dose is added to the patient’s history/dose passport.
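To make the trend idea concrete, here is a minimal sketch of how per-image “rejected” dose records could be rolled up into monthly cumulative and average figures. The record layout, dates and dose values are illustrative assumptions for this sketch, not the actual DOSE data model.

```python
# Hedged sketch: roll up per-image rejected-dose records into the monthly
# cumulative and average trend a reject-analysis module might plot.
from collections import defaultdict
from datetime import date
from statistics import mean

# Hypothetical (study date, dose-area product in mGy*cm^2) per rejected image
rejected_doses = [
    (date(2019, 9, 3), 120.0),
    (date(2019, 9, 17), 95.5),
    (date(2019, 10, 2), 210.0),
    (date(2019, 10, 21), 80.0),
    (date(2019, 10, 28), 140.0),
]

# Group the records by (year, month)
by_month = defaultdict(list)
for study_date, dap in rejected_doses:
    by_month[(study_date.year, study_date.month)].append(dap)

# Cumulative and average rejected dose per month
for (year, month), doses in sorted(by_month.items()):
    print(f"{year}-{month:02d}: cumulative {sum(doses):.1f}, "
          f"average {mean(doses):.1f} mGy*cm^2 over {len(doses)} rejects")
```

A rising cumulative line or a jump in the monthly average would flag a device or a team that deserves a closer look.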
The module also shows the original reason for rejection, so the user can check whether it matches the actual cause and correct it if necessary. This can also be used to standardize the rejection reasons, a task that will become much easier with the future publication of TG305, Development of Standards for Vendor-Neutral Reject Analysis in Radiography.
Reject analysis from DOSE gives you all the information you need to understand where more training is required to improve image quality within your department and reduce the radiation dose to the patient. In doing so, you can lower the risk of stochastic effects from radiographic imaging and therefore improve patient safety.
For more information about DOSE, please visit us at RSNA 2019, booth 8303.
References
[1] Rogers KD, Matthews IP, Roberts CJ. Variation in repeat rates between 18 radiology departments. British Journal of Radiology 1987;60:463-468.
[2] Lin C-S, et al. Guidelines for reducing image retakes of general digital radiography. Advances in Mechanical Engineering 2016;8(4):1-6.
[3] Rosenkrantz AB, et al. Technologist-directed repeat musculoskeletal and chest radiographs: how often do they impact diagnosis? American Journal of Roentgenology 2017;209:1297-1301. doi:10.2214/AJR.17.18030
[4] Waaler D, Hofmann B. Image rejects/retakes: radiographic challenges. Radiation Protection Dosimetry 2010;139(1-3):375-379.
[5] Jones AK, et al. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151. Medical Physics 2015;42(11).
[6] Little KJ, et al. Unified database for rejected image analysis across multiple vendors in radiography. Journal of the American College of Radiology 2017;14(2):208-216.