S1605
Physics - Autosegmentation
ESTRO 2026
Conclusion: This study demonstrated that all eight DL-based commercial solutions provided satisfactory quality in OAR delineation for H&N cases. However, as geometric similarity correlated only weakly with dosimetric impact, automated segmentation still requires case-specific correction and approval by a radiation oncologist.
References: [1] Kosmin M, Ledsam J, Romera-Paredes B, Mendes R, Moinuddin S, de Souza D, Gunn L, Kelly C, Hughes CO, Karthikesalingam A, Nutting C, Sharma RA. Rapid advances in auto-segmentation of organs at risk and target volumes in head and neck cancer. Radiother Oncol. 2019 Jun;135:130-140. doi: 10.1016/j.radonc.2019.03.004. Epub 2019 Mar 22. PMID: 31015159.
Keywords: Head and Neck, deep learning, autocontouring

Digital Poster Highlight
4429
Integrating eye-tracking into abdominal CT segmentation: a pilot study
Leila Khaertdinova 1, Ivan Richter Vogelius 2,3, Ane L Appelt 2,4, Bulat Ibragimov 1
1 Department of Computer Science, University of Copenhagen, Copenhagen, Denmark. 2 Department of Oncology, Copenhagen University Hospital - Rigshospitalet, Copenhagen, Denmark. 3 Department of Clinical Medicine, Faculty of Health and Medical Sciences, University of Copenhagen, Copenhagen, Denmark. 4 Department of Health Technology, Technical University of Denmark, Copenhagen, Denmark
Purpose/Objective: Manual delineation of organs-at-risk (OARs) remains a critical yet time-consuming step in radiotherapy planning [1]. While automated segmentation models have shown promise, expert manual corrections are still needed; these corrections take additional time and do not integrate naturally with human visual evaluation and communication. In this work, we explore an approach that uses eye movements to communicate the human evaluation of auto-segmentations to the computer using artificial intelligence (AI).
Material/Methods: Our eye-tracking-based approach feeds real-time eye-tracking data points as prompts to MedSAM [2], a promptable model that automatically segments an object of interest (e.g., an organ) once the user indicates its location. In our setup, the eye-tracker monitors the expert's gaze positions, which are sent to MedSAM to generate segmentation masks in real time. The expert can see and correct the predicted segmentation immediately, as the prediction updates iteratively whenever new gaze points are tracked and sent to the model.
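A minimal sketch of such a gaze-to-prompt loop (our own illustration; the eye-tracker callback, the MedSAM wrapper, and all names below are assumptions, not the authors' software):

```python
# Hypothetical real-time loop: gaze fixations are buffered and periodically
# sent to a promptable segmentation model as point prompts.
from collections import deque
import numpy as np

GAZE_BUFFER_SIZE = 20  # matches the 20-point prompts used for training

def read_gaze_point():
    """Stand-in for a Tobii SDK callback returning one (row, col) gaze
    position mapped into image coordinates; random here for illustration."""
    return np.random.randint(0, 512, size=2)

def medsam_predict(image, point_coords, point_labels):
    """Stand-in for a MedSAM forward pass with point prompts.
    `point_labels` follows the SAM convention: 1 = foreground, 0 = background.
    Returns a binary mask with the spatial size of `image`."""
    return np.zeros(image.shape[:2], dtype=bool)  # placeholder output

image = np.zeros((512, 512), dtype=np.float32)    # one CT slice
gaze_buffer = deque(maxlen=GAZE_BUFFER_SIZE)      # rolling window of fixations

for _ in range(100):                              # stand-in for the live stream
    gaze_buffer.append(read_gaze_point())
    if len(gaze_buffer) == GAZE_BUFFER_SIZE:
        coords = np.stack(list(gaze_buffer))
        labels = np.ones(len(coords), dtype=int)  # gaze treated as foreground hints
        mask = medsam_predict(image, coords, labels)
        # display `mask` to the expert; the loop re-runs as new gaze arrives
```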
We adapted MedSAM and trained it on abdominal CT slices from the open WORD dataset [3] using synthetic 20-point prompts, with 80% of the points in each prompt located inside the target organ and 20% outside, to emulate natural gaze behaviour. For training a comparative click-based model, a single point was generated within the ground-truth organ mask. During the experiments, two medical observers performed segmentation using our approach and alternative interactive methods: manual segmentation, the original MedSAM with bounding-box selection, and a mouse-click-based MedSAM. All methods were evaluated on 16 abdominal organs across 160 CT slices from two cancer patients in the WORD test dataset, using our software connected to a Tobii eye-tracker mounted below the diagnostic monitor. Performance was assessed with the Dice similarity coefficient (DSC) and the annotation time per CT slice.
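A minimal sketch of how the 80%/20% synthetic prompts could be sampled from a ground-truth organ mask (our own illustration under the split described above; the labeling of outside points as background is an assumption):

```python
# Hypothetical sampler: 16 of 20 points inside the ground-truth organ mask
# and 4 of 20 outside, mirroring the 80%/20% split stated in the abstract.
import numpy as np

def sample_prompt(mask, n_points=20, inside_frac=0.8, rng=None):
    """Return (coords, labels): coords is (n_points, 2) in (row, col) order;
    labels are 1 for points inside the organ, 0 for points outside."""
    if rng is None:
        rng = np.random.default_rng()
    inside = np.argwhere(mask)          # all pixels inside the organ
    outside = np.argwhere(~mask)        # all pixels outside the organ
    n_in = int(round(n_points * inside_frac))
    n_out = n_points - n_in
    pts_in = inside[rng.choice(len(inside), n_in, replace=False)]
    pts_out = outside[rng.choice(len(outside), n_out, replace=False)]
    coords = np.concatenate([pts_in, pts_out])
    labels = np.concatenate([np.ones(n_in, int), np.zeros(n_out, int)])
    return coords, labels

# Example on a toy mask:
mask = np.zeros((512, 512), dtype=bool)
mask[200:300, 220:320] = True           # fake organ region
coords, labels = sample_prompt(mask)    # (20, 2) coords, 16 ones / 4 zeros
```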
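For reference, the DSC used in the evaluation is the standard overlap measure, DSC = 2|A ∩ B| / (|A| + |B|); a minimal implementation:

```python
# Dice similarity coefficient between a predicted and a reference binary mask.
import numpy as np

def dice(pred: np.ndarray, ref: np.ndarray) -> float:
    pred, ref = pred.astype(bool), ref.astype(bool)
    denom = pred.sum() + ref.sum()
    # Convention: two empty masks count as a perfect match.
    return 2.0 * np.logical_and(pred, ref).sum() / denom if denom else 1.0
```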