Purpose: Over the last decade, radiation oncology departments have seen an increase in the clinical use of custom software to improve efficiency and to increase patient safety. While many published studies discuss software-specific details and/or clinical implementation experience, quality assurance (QA) programs for these clinically used tools are rarely discussed. In this study, we report our experience implementing a QA program for an in-house developed auto-contouring system. We hypothesized that daily QA tests would prevent interruptions (i.e., downtime) while this system is in clinical use.
Methods: Daily QA tests begin with an end-to-end test of the auto-contouring system. First, DICOM image files for an anonymized patient are sent via DICOM transfer from the treatment planning system (Eclipse, Varian Medical Systems) to the auto-contouring pipeline. This step replicates our clinical workflow and the typical use of this tool. Once the automatically generated contours are ready and imported back into Eclipse, the time difference between the send of the images and the receipt of the contours is verified to confirm system performance. For this study, we evaluated the efficacy of daily QA in detecting simulated errors.
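The timing verification at the end of this test reduces to a threshold comparison between send and receipt timestamps. A minimal sketch is shown below; the function name, 10-minute tolerance, and timestamps are illustrative assumptions, not details of the authors' implementation:

```python
from datetime import datetime, timedelta

# Assumed tolerance for the contour round trip; not from the source.
ROUND_TRIP_TOLERANCE = timedelta(minutes=10)

def qa_timing_passes(sent_at: datetime, received_at: datetime,
                     tolerance: timedelta = ROUND_TRIP_TOLERANCE) -> bool:
    """Return True if contours arrived within the allowed round-trip time."""
    elapsed = received_at - sent_at
    # Reject negative intervals (clock skew) as well as slow round trips.
    return timedelta(0) <= elapsed <= tolerance

# Hypothetical example: an 8-minute round trip passes, 25 minutes fails.
sent = datetime(2022, 1, 3, 7, 0, 0)
print(qa_timing_passes(sent, sent + timedelta(minutes=8)))   # True
print(qa_timing_passes(sent, sent + timedelta(minutes=25)))  # False
```

A check of this form can distinguish a healthy system from the slow-contouring failure mode (e.g., inference falling back to CPU) described in the Results.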
Results: The auto-contouring QA program was established at our institution in January 2022. Daily QA tests detected 100% of errors across all simulated scenarios, including DICOM files not received by the system, the system being shut off, and contours generated on the CPU (to simulate a "broken" GPU), which results in prolonged contouring times, among other tests. The daily QA test consistently took 8.01 ± 0.1 minutes.
Conclusion: We have successfully implemented a QA program to ensure the safe and efficient clinical use of an in-house developed auto-contouring system. Given the current lack of formal recommendations, and based on our clinical experience, we recommend that all physicists consider developing QA programs for clinically used custom software tools.