
Session: Imaging: Nuclear Medicine, PET

Predicting Breast Cancer Hormone Receptor Status with PET/MR Images Using Dual-Input Deep Convolutional Neural Network (DCNN)

S Choi1*, H Cho2, C Lee2, E Kong3, (1) Electronics and Telecommunications Research Institute (ETRI), Yuseong-gu, Daejeon, KR, (2) Korea Research Institute of Standards and Science (KRISS), Daejeon, KR, (3) Yeungnam University Medical School And Hospital

Presentations

WE-IePD-TRACK 2-5 (Wednesday, 7/28/2021) 12:30 PM - 1:00 PM [Eastern Time (GMT-4)]

Purpose: The use of endocrine agents in breast cancer therapy depends on the tumor's response to estrogen via its hormone receptors. Tissue sampling followed by immunohistochemistry and surrogate genetic testing is the gold-standard method for determining hormone receptor (HR) status, which reflects the genetic expression of the cancer cells. However, this process is time-consuming and therefore sub-optimal for personalized treatment planning. We propose a simple deep convolutional neural network (DCNN) approach with dual-image (PET and MR) input to classify estrogen receptor (ER) status as positive or negative. The purpose of this study is to develop an automatic pipeline to predict prognostic factors.
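
The abstract does not specify the network architecture; the following is a minimal sketch of one plausible dual-input DCNN, assuming 64x64 single-channel patches, two small convolutional branches, and feature concatenation before a binary classifier. All layer sizes and the fusion strategy are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a dual-input DCNN for binary ER-status classification (PyTorch).
import torch
import torch.nn as nn

class ImageBranch(nn.Module):
    """Small convolutional feature extractor for one modality (PET or MR)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # (N, 32) feature vector per patch

class DualInputDCNN(nn.Module):
    """Concatenates PET and MR branch features, then classifies ER-negative vs ER-positive."""
    def __init__(self):
        super().__init__()
        self.pet_branch = ImageBranch()
        self.mr_branch = ImageBranch()
        self.classifier = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))

    def forward(self, pet_patch, mr_patch):
        fused = torch.cat([self.pet_branch(pet_patch), self.mr_branch(mr_patch)], dim=1)
        return self.classifier(fused)  # logits over the two ER classes

model = DualInputDCNN()
logits = model(torch.randn(4, 1, 64, 64), torch.randn(4, 1, 64, 64))  # dummy PET/MR patches
```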

Methods: Dynamic contrast-enhanced MR sequences and 18F-FDG PET imaging were acquired on a simultaneous PET/MR scanner. A total of 212 patients who had undergone immunohistochemistry (IHC) genetic testing after tissue sampling were retrospectively collected. We annotated the breast tumors by drawing binary masks on the PET images and then warped the masks onto the corresponding MR images for each patient. Image preprocessing included matching the MR and PET slice thickness and field of view. A total of 2,565 image patches were used to independently train three different neural networks to predict the binary classes of ER-positive and ER-negative status. We used a cross-validation scheme to avoid chance overlap between training and validation data when shuffling the dataset, as sketched below.
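
The abstract only states that cross-validation was used; one common way to keep patches from the same patient out of both training and validation folds at once is a patient-grouped split. The sketch below assumes a 5-fold GroupKFold and uses placeholder arrays in place of the real patches, labels, and patient IDs.

```python
# Hedged sketch of a patient-level cross-validation split (scikit-learn GroupKFold).
import numpy as np
from sklearn.model_selection import GroupKFold

n_patches = 2565
patient_ids = np.random.randint(0, 212, size=n_patches)   # placeholder patient assignment per patch
labels = np.random.randint(0, 2, size=n_patches)           # placeholder ER-negative/positive labels
patches = np.zeros((n_patches, 64, 64), dtype=np.float32)  # placeholder image patches

cv = GroupKFold(n_splits=5)  # assumed fold count; not stated in the abstract
for fold, (train_idx, val_idx) in enumerate(cv.split(patches, labels, groups=patient_ids)):
    # All patches of a given patient fall entirely in either the train or the validation fold.
    assert set(patient_ids[train_idx]).isdisjoint(patient_ids[val_idx])
    print(f"fold {fold}: {len(train_idx)} training patches, {len(val_idx)} validation patches")
```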

Results: The resulting accuracies of the three DCNNs, with single MR input, single PET input, and dual PET/MR input, were 54%, 63%, and 76%, respectively. This demonstrates that combining features from both PET and MR images predicts ER status better than training on either image type alone.

Conclusion: Image-based prediction of hormone receptor status without IHC genetic testing suggests that the PET and MR image features extracted by the DCNN carry prognostic information, indicating their potential as a radiogenomic imaging biomarker.

Keywords

PET, MR, Cell Kinetics

Taxonomy

IM/TH- Image Analysis (Single Modality or Multi-Modality): Computer-aided decision support systems (detection, diagnosis, risk prediction, staging, treatment response assessment/monitoring, prognosis prediction)
