Session: Science Council Session: Innovative Technologies to Advance Diagnosis and Treatment

Biologically Guided Deep Learning for Post-Radiation PET Image Outcome Prediction: A Feasibility Study of Oropharyngeal Cancer Application

C Wang1*, H Ji2, A Bertozzi2, D Brizel1, Y Mowery1, F Yin1, K Lafata1, (1) Duke University Medical Center, Durham, NC, (2) University of California, Los Angeles, Los Angeles, CA

Presentations

TU-EF-TRACK 4-3 (Tuesday, 7/27/2021) 3:30 PM - 5:30 PM [Eastern Time (GMT-4)]

Purpose: To develop a method of biologically guided deep learning for post-radiation ¹⁸FDG-PET image outcome prediction based on pre-radiation images and radiotherapy dose information.

Methods: Based on the classic reaction-diffusion mechanism, a novel biological model was proposed using a partial differential equation that incorporates the spatial radiation dose distribution as a patient-specific treatment variable. A 7-layer encoder-decoder convolutional neural network (CNN) was designed and trained to learn the proposed biological model, such that the model could generate post-radiation ¹⁸FDG-PET image outcome predictions with a possible time-series transition from the pre-radiotherapy image state to post-radiotherapy states. The proposed method was developed using 64 oropharyngeal cancer patients with paired ¹⁸FDG-PET studies acquired before and after 20Gy delivery (2Gy/fx) by IMRT. In a two-branch deep learning execution, the CNN learns specific terms of the biological model from the paired ¹⁸FDG-PET images and the spatial dose distribution in one branch, and the biological model generates the post-20Gy ¹⁸FDG-PET image prediction in the other branch. In the 2D execution, 718/233/230 axial slices from 38/13/13 patients were used for training/validation/independent testing. Predicted images in the test cases were compared quantitatively with the ground-truth results.
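The abstract does not specify the exact form of the reaction-diffusion equation or its discretization; the following is a minimal illustrative sketch, assuming a logistic proliferation term and a linear dose-dependent kill term acting on a normalized uptake map. The parameter names (D, rho, alpha), the explicit finite-difference update, and the placeholder inputs are assumptions for illustration, not the authors' implementation.

import numpy as np

def step(u, dose, D=0.1, rho=0.05, alpha=0.01, dt=0.1, dx=1.0):
    """Advance a 2D normalized-uptake field u by one explicit time step of an
    assumed reaction-diffusion model:
        du/dt = D * laplacian(u) + rho * u * (1 - u) - alpha * dose * u
    where `dose` is the spatial radiation dose distribution (Gy per pixel)."""
    # 5-point stencil Laplacian with replicate (zero-flux) boundaries
    u_pad = np.pad(u, 1, mode="edge")
    lap = (u_pad[2:, 1:-1] + u_pad[:-2, 1:-1] +
           u_pad[1:-1, 2:] + u_pad[1:-1, :-2] - 4.0 * u) / dx**2
    reaction = rho * u * (1.0 - u)   # logistic proliferation (assumed form)
    kill = alpha * dose * u          # dose-driven cell loss (assumed form)
    return u + dt * (D * lap + reaction - kill)

# Example pseudo-trajectory from a pre-RT uptake map toward a post-20Gy state
u = np.random.rand(128, 128) * 0.2      # placeholder pre-RT normalized uptake
dose = np.full((128, 128), 2.0)         # placeholder 2 Gy/fx dose map
for _ in range(10):                     # 10 fractions -> 20 Gy pseudo time series
    u = step(u, dose)

In the abstract's two-branch scheme, the CNN would supply the model terms (analogous to fitting D, rho, and alpha spatially) rather than using fixed scalar constants as in this sketch.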

Results: The proposed method successfully generated post-20Gy ¹⁸FDG-PET image outcome predictions and the associated time-series evolutions, with breakdown illustrations of the biological model components. SUV mean values in ¹⁸FDG high-uptake regions of the predicted images (2.45±0.25) were similar to the ground-truth results (2.51±0.33). In 2D Gamma analysis, the median/mean Gamma Index (<1) passing rate of the test images was 96.5%/92.8% using a 5%/5mm criterion; this improved to 99.9%/99.6% when a 10%/10mm criterion was adopted.
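For readers unfamiliar with the metric, the passing rate can be estimated with a brute-force 2D global gamma computation such as the sketch below. The edge handling (np.roll wrap-around), global normalization to the reference maximum, and the absence of a low-intensity threshold are simplifying assumptions, not details taken from the study.

import numpy as np

def gamma_passing_rate(ref, evl, dd=0.05, dta_mm=5.0, pixel_mm=1.0):
    """Fraction of pixels with gamma < 1 for a 2D global gamma analysis.

    ref, evl : 2D arrays of the same shape (ground-truth and predicted maps).
    dd       : intensity-difference criterion as a fraction of the reference max.
    dta_mm   : distance-to-agreement criterion in mm.
    """
    r = int(np.ceil(2.0 * dta_mm / pixel_mm))    # search radius in pixels
    norm = dd * ref.max()                        # global normalization
    gamma2 = np.full(ref.shape, np.inf)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            dist2 = (dy * pixel_mm) ** 2 + (dx * pixel_mm) ** 2
            shifted = np.roll(np.roll(evl, dy, axis=0), dx, axis=1)
            g2 = (shifted - ref) ** 2 / norm ** 2 + dist2 / dta_mm ** 2
            gamma2 = np.minimum(gamma2, g2)
    return np.mean(np.sqrt(gamma2) < 1.0)

# Example with the 5%/5mm criterion reported above (arrays are hypothetical):
# rate = gamma_passing_rate(ground_truth_slice, predicted_slice, dd=0.05, dta_mm=5.0)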

Conclusion: The developed biologically guided deep learning method achieved post-20Gy ¹⁸FDG-PET image outcome predictions in good agreement with the ground-truth results. With the breakdown of biological modeling components, the predicted outcome images could support adaptive radiotherapy decision-making to optimize personalized plans for the best outcome in the future.


    Keywords

    PET, Image-guided Therapy, Modeling

    Taxonomy

    TH- Response Assessment: Modeling: Machine Learning
