Session: Multi-Disciplinary General ePoster Viewing

Multimodality Image Registration for Application in Brachytherapy Based On Automated Organ Segmentation

K Qing1*, X Feng2,3, S Glaser1, W Watkins1, D Du1, Y Chen1, C Han1, J Liang1, J Liu1, B Liu3, Q Chen3,4, A Liu1, (1) City of Hope National Medical Center, Duarte, CA, (2) University of Virginia, Charlottesville, VA, (3) Carina Medical, Lexington, KY, (4) Rutgers Cancer Institute of New Jersey, New Brunswick, NJ, (5) University of Kentucky, Lexington, KY

Presentations

PO-GePV-M-139 (Sunday, 7/25/2021) [Eastern Time (GMT-4)]

Purpose: Treatment planning for modern brachytherapy requires imaging acquisitions such as computed tomography (CT) and magnetic resonance imaging (MRI). CT remains the first choice for treatment planning, while MRI provides detailed information for treatment target delineation and soft-tissue anatomy. Multimodality image registration is therefore usually needed, but it often requires manual intervention. In this study, a novel automated multimodality image registration method based on automated organ segmentation is proposed and validated, with the goal of improving the efficiency of the brachytherapy treatment planning workflow.

Methods: CT and MR images acquired on the same day from fourteen patients with locally advanced cervical cancer (stages IB3-III) treated with brachytherapy were used. A U-Net model was first trained to automatically segment the bladder in both CT and MRI. Data from eight patients were used for training, and data from the remaining six patients were used for validation. Dice scores were used to evaluate segmentation accuracy. Rigid image registration was then performed by automatically aligning the segmented contours. The average L2 difference between the translation vectors from the automatic registration and those from manual (human) registration was calculated to evaluate registration quality. The proposed method was compared with a conventional registration method based on mutual information, as provided by the Advanced Normalization Tools (ANTs).
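
The abstract does not give implementation details for the contour-based alignment or the evaluation metrics. The sketch below is a minimal illustration, assuming the U-Net outputs binary bladder masks and that the rigid registration is reduced to a translation estimated from mask centroids; the function names (dice, centroid_mm, rigid_translation, l2_error) are illustrative, and the actual method used in the study may differ.

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def centroid_mm(mask: np.ndarray, spacing_mm: np.ndarray) -> np.ndarray:
    """Center of mass of a binary mask, converted to millimetres."""
    idx = np.argwhere(mask > 0)            # voxel indices of the segmented organ
    return idx.mean(axis=0) * spacing_mm   # scale each axis by its voxel spacing

def rigid_translation(ct_mask, mr_mask, ct_spacing, mr_spacing) -> np.ndarray:
    """Translation (mm) that moves the MR bladder centroid onto the CT one.

    Assumes both volumes share the same patient orientation, so that a pure
    translation is a reasonable rigid alignment of the contours.
    """
    return centroid_mm(ct_mask, ct_spacing) - centroid_mm(mr_mask, mr_spacing)

def l2_error(t_auto: np.ndarray, t_manual: np.ndarray) -> float:
    """L2 distance between automatic and manually determined translations."""
    return float(np.linalg.norm(t_auto - t_manual))
```

In this reading, the Dice score grades the U-Net segmentations against manual contours, while the L2 error grades the derived translation vector against the translation obtained from human registration.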

Results: The Dice metrics for bladder segmentation in the MRI and CT datasets were 0.89±0.04 and 0.80±0.06, respectively. Compared with ANTs applied directly to the MRI and CT images (23.5±22.2 mm, range 3.5-52.3 mm), the U-Net-based registration significantly reduced the L2 distance error against the ground truth (5.9±4.0 mm, range 2.0-10.9 mm) in the six testing patients.
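
For context, the intensity-based baseline could be reproduced approximately with the ANTsPy wrapper around ANTs; the exact parameters used in the study are not reported, so the call below uses default settings (Mattes mutual information drives the rigid stage) and hypothetical file names, and should be read as a plausible baseline rather than the study's exact configuration.

```python
import ants

# Conventional intensity-based rigid registration; ANTsPy's rigid/affine
# stages use Mattes mutual information as the default similarity metric.
ct = ants.image_read("ct.nii.gz")   # hypothetical file names
mr = ants.image_read("mr.nii.gz")

reg = ants.registration(fixed=ct, moving=mr, type_of_transform="Rigid")

# Resample the MR image onto the CT grid using the estimated transform.
mr_on_ct = ants.apply_transforms(fixed=ct, moving=mr,
                                 transformlist=reg["fwdtransforms"])
```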

Conclusion: In this pilot study, a new multimodality image registration method based on automated organ segmentation was proposed and tested. Initial results showed relatively high accuracy and robustness, with potential for future clinical application.

Funding Support, Disclosures, and Conflict of Interest: Xue Feng and Quan Chen receive salaries from Carina Medical, Lexington, KY 40513. This research is supported by grant R44CA254844 funded by the NCI.

    Keywords

    Registration, Segmentation, Brachytherapy

    Taxonomy

    IM/TH- Image Registration Techniques: Machine Learning
