Frustrations in Ultrasound-guided Radiotherapy: Automated Solutions
Dr. Emma Harris
The Joint Department of Physics, Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London, UK
Ultrasound can image soft tissues with high resolution, in real-time, and also offers volumetric imaging. These attributes make it ideal for guiding radiation therapy. However, ultrasound is also recognised as a user-dependent modality: the quality of the acquired images, and the eventual interpretation of those images, depends heavily on the skill and experience of the user. Clinical staff in the radiation therapy department often have little or no experience with ultrasound. This has posed a significant barrier to the implementation of ultrasound guidance in radiotherapy. The additional training and time required to use ultrasound add significant pressure to an already highly pressured clinical environment and to often under-resourced departments. On a practical level, ultrasound does not compare well with tools such as cone-beam CT, which is now effectively a single push-button acquisition complemented by a suite of image-processing tools that automate image registration. It is eminently possible to improve the "user friendliness" of ultrasound. Alternative hardware better suited to the radiation therapy clinic can be employed to speed up set-up and provide more seamless integration with the radiotherapy workflow. In the diagnostic ultrasound setting, new software tools are being developed to allow unskilled users to acquire images, and automatic segmentation of tissues is being addressed. This talk will review these new innovations and explore how a new generation of ultrasound-guided radiation therapy can be successfully translated into the clinic.
1. Understand the practical barriers to clinical implementation of ultrasound guided radiotherapy.
2. Gain knowledge of hardware innovations that could improve the integration of ultrasound with the radiotherapy workflow.
3. Gain knowledge of state-of-the-art machine learning approaches being applied to ultrasound to improve users' ability to acquire good-quality images and to interpret them.
Ultrasound guidance during radioablation treatment of cardiac arrhythmias
Dr. Saskia Camps
External Beam Ablation Medical Devices (EBAMed) SA, Geneva, Switzerland
Heart arrhythmias are disruptions of the normal heartbeat, which can result in a heartbeat that is too fast, too slow or irregular. As the occurrence of these arrhythmias is strongly linked to heart attacks and strokes, they can be a life-threatening condition. Catheter ablation is currently the main curative treatment option for atrial fibrillation and ventricular tachycardia. These procedures require hospitalization of the patient, can be lengthy and risky, and have shown varying outcomes depending on, for example, the arrhythmia type and the skills of the treating physician. For these reasons, there has been growing interest in treating cardiac arrhythmia patients non-invasively, and potentially more effectively, by means of radiation.
To perform radioablation of the heart safely and effectively, several motion components that can affect the shape and position of the heart, and therefore the shape and position of the treatment target, should be taken into account. These include motion resulting from the patient's heartbeat, as well as inter-fraction motion and intra-fraction drift of the heart. This talk will focus on how ultrasound imaging can be used to monitor these motion components in order to provide clinicians with real-time information on the motion of the heart during radioablation treatment of cardiac arrhythmias.
1. To learn more about the use of radiation for the treatment of cardiac arrhythmias.
2. To understand how ultrasound imaging can be used for motion monitoring during cardiac radioablation treatments.
Platform for ultrasound-guided autonomous minimally invasive surgery: application to the knee
Dr. Maria Antico
Queensland University of Technology, School of Mechanical, Medical & Process Engineering, Science and Engineering Faculty, Brisbane, Australia
Ultrasound imaging can be employed to raise the level of automation in minimally invasive surgery, and eventually to enable full automation, by providing medical robots with real-time situational awareness through intra-operative, volumetric mapping of the surgical site.
This talk describes the development of a prototype automated image-guidance system combining high-refresh-rate volumetric ultrasound (US) imaging (4D US) with advanced deep learning strategies for an autonomous robotic platform, currently being investigated for knee surgery. The feasibility of using 4D US for guidance in minimally invasive knee surgery was first demonstrated through cadaver and volunteer studies. The workflow enabling automatic interpretation of 4D US imaging for robotic guidance was then developed and implemented. The essential steps include automatic image quality assessment, tissue segmentation and tracking (combined with the assessment of uncertainties) and surgical tool tracking. The results show the potential of US imaging for quantitative autonomous tasks and for the creation of autonomous, intelligent robotic surgical systems that will hopefully soon make surgery more sustainable and improve people's quality of life.
1. US imaging combined with deep learning strategies can provide a solution for real-time volumetric guidance for autonomous systems.
2. The algorithms developed and implemented can be extended to US images of other anatomical regions.
3. The workflow described provides solutions to the issues currently limiting the application of US imaging to quantitative autonomous tasks.