Session: Multi-Disciplinary General ePoster Viewing

Automated Multimodality Image Registration for Application in Brachytherapy Using U-Net Based Organ Segmentation

K Qing1*, X Feng2, 3, S Glaser1, W Watkins1, D Du1, Y Chen1, C Han1, J Liang1, J Liu1, B Liu1, Q Chen1, A Liu1, (1) City of Hope National Medical Center, Duarte, CA, (2) University of Virginia, Charlottesville, VA, (3) Carina Medical LLC, Lexington, KY

Presentations

PO-GePV-M-182 (Sunday, 7/10/2022)   [Eastern Time (GMT-4)]

ePoster Forums

Purpose: Multimodality image registration is frequently used in radiation oncology but typically requires manual intervention. An automated multimodality image registration method based on automated organ segmentation was previously proposed for application in brachytherapy and showed better accuracy than conventional automated registration tools. However, it relies primarily on automated contouring of the bladder, whose shape and filling can change over time. This work presents a new approach that adds bony structures as an additional reference organ.

Methods: Data from 14 brachytherapy patients with locally advanced cervical cancer (stages IA-III) were retrospectively collected. Data from 8 patients were used to train a U-Net segmentation model, and data from the remaining 6 patients were used for validation. Ground-truth pelvic bones were contoured in the axial view, extending 1 cm beyond the clinical target volume (CTV) in the superior-inferior direction. Rigid image registration was performed by automatically aligning the pelvic bone and bladder contours together. Registration quality was evaluated as the average L2 difference between the translation vectors of the registration of interest and the ground truth. A conventional registration method based on mutual information, provided by Advanced Normalization Tools (ANTs), was used for comparison. Segmentation accuracy was evaluated using the Dice coefficient.
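The two evaluation metrics above (the Dice coefficient between binary segmentation masks, and the L2 norm of the difference between translation vectors) can be sketched as follows. This is a minimal illustration of the standard formulas, not the authors' implementation; the mask shapes and vectors are made-up examples.

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * intersection / denom if denom else 1.0

def translation_error(t_test, t_truth):
    """L2 norm of the difference between two 3D translation vectors (mm)."""
    return float(np.linalg.norm(np.asarray(t_test) - np.asarray(t_truth)))

# Toy example: two overlapping 2D masks and two translation vectors
pred = np.zeros((4, 4), dtype=bool); pred[1:3, 1:3] = True  # 4 voxels
gt   = np.zeros((4, 4), dtype=bool); gt[1:3, 1:4] = True    # 6 voxels
print(dice_coefficient(pred, gt))                # → 0.8
print(translation_error([1, 2, 2], [1, 2, 0]))   # → 2.0
```

In the abstract, the reported per-dataset registration errors would be these per-patient `translation_error` values averaged over the six validation patients.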

Results: Compared with the automated registration provided by ANTs, the U-Net based registration showed consistent accuracy across all six testing datasets. The average L2 difference between the proposed method and the ground truth was 4.9 ± 2.9 mm, lower than that of automated registration using the bladder alone (7.5 ± 5.1 mm) and much lower than that of the conventional ANTs method (22.9 ± 22.2 mm). The Dice coefficients for the pelvic bone and bladder segmentations were 0.62 ± 0.14 and 0.76 ± 0.11 on the CT datasets, and 0.69 ± 0.08 and 0.86 ± 0.05 on the MRI datasets.

Conclusion: Automated multimodality image registration using U-Net based multi-organ segmentation is an accurate and robust method with great potential for future clinical use.

Funding Support, Disclosures, and Conflict of Interest: Xue Feng is an employee of Carina Medical LLC. Quan Chen is a shareholder of Carina Medical LLC.

Keywords

Image Fusion, Brachytherapy, MR

Taxonomy

IM/TH- Image Registration: Multi-modality registration
