Purpose: Robotic radiosurgery allows marker-less lung tumor tracking by detecting pattern variations in tumor density in orthogonal 2D x-ray images. Whether a lung lesion can be detected and tracked depends on its size, density, and location, and must be evaluated on a case-by-case basis. The current method for identifying which patients can be successfully treated with marker-less lung tumor tracking is a time-consuming multi-step process involving CT acquisition, generation of a simulation plan, creation of the patient breathing model, and execution of the simulation plan on the treatment delivery platform. The aim of this study is to develop a tool, based on binary classification of trackable and non-trackable lung tumors, for automatic selection of the optimal tracking method for patients undergoing robotic radiosurgery.
Methods: We developed a deep learning classification model and tested five network architectures (AlexNet, DenseNet201, InceptionResNetV2, VGG19, and NASNetLarge) to classify lung cancer lesions on digitally reconstructed radiographs (DRRs) generated from planning CTs. This IRB-approved study included 138 patients: 109 patients with 115 trackable lesions and 20 patients with 29 non-trackable lesions, for a total of 271 images. We used 80% of the images for training, 10% for validation, and the remaining 10% for testing.
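The 80/10/10 data split described above can be sketched as follows (a minimal illustration only; the file names, labels, and random seed are hypothetical, not taken from the study):

```python
import random

def split_dataset(items, seed=42, train_frac=0.8, val_frac=0.1):
    """Shuffle and split a dataset into train/validation/test subsets.

    With 271 images and an 80/10/10 split, this yields
    216 training, 27 validation, and 28 test images.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible split
    shuffled = list(items)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(train_frac * n)
    n_val = int(val_frac * n)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]  # remainder goes to the test set
    return train, val, test

# Hypothetical dataset: one (image_id, label) pair per DRR image.
dataset = [(f"drr_{i:03d}.png", "trackable" if i < 230 else "non-trackable")
           for i in range(271)]
train, val, test = split_dataset(dataset)
```

Each subset can then be fed to the chosen network architecture for training, validation, and final testing, respectively.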
Results: Training time on a single 1.6 GHz Intel i5 processor with 16 GB of RAM was 18 minutes for AlexNet, 103 minutes for DenseNet201, 170 minutes for InceptionResNetV2, 203 minutes for VGG19, and 245 minutes for NASNetLarge. There were no false classifications: binary classification accuracy reached 100% after training, in both the validation and the testing phases, for all five network architectures.
Conclusion: For patients undergoing robotic lung radiosurgery, candidates for marker-less lung tumor tracking can be successfully identified with a deep learning model that classifies DRR images generated from simulation CT scans.