A Two-Point Association Tracking System Incorporated With YOLOv11 for Real-Time Visual Tracking of Laparoscopic Surgical Instruments

Bibliographic Details
Main Authors: Nyi Nyi Myo, Apiwat Boonkong, Kovit Khampitak, Daranee Hormdee
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10840191/
Description
Summary: The application of real-time visual tracking in laparoscopic surgery has gained significant attention in recent years, driven by the growing demand for precise and automated surgical assistance. Instrument tracking, in particular, is critical for enhancing the safety and efficacy of minimally invasive surgery, where direct visibility is often limited. Real-time tracking of surgical instruments allows for more accurate maneuvering, reduces the risk of accidental tissue damage, and enables the development of advanced computer-assisted surgical systems. In this context, advancements in deep learning, particularly detection models and modern tracking algorithms, have opened new avenues for addressing the challenges of real-time laparoscopic instrument tracking. However, preliminary results showed that existing combinations of detection models and tracking algorithms often failed to handle the remaining challenges, including fast instrument motion, occlusion, overlap, and close proximity of surgical instruments. This paper proposes a novel two-point association approach for surgical instrument tracking, combining YOLOv11 for object detection with a refined ByteTrack for tracking. The proposed system is evaluated on a comprehensive dataset of surgical videos. The experimental results demonstrate superior performance in segmentation accuracy (F1-score), tracking robustness (MOTA and HOTA), and real-time processing speed (FPS). To validate the effectiveness of this research, real-time surgical instrument tracking was performed on a live stream of laparoscopic gynecologic surgery on a donated soft-tissue cadaver. The results indicate that the proposed system significantly improves the segmentation and tracking of surgical instruments, offering a reliable tool for enhancing intraoperative navigation and reducing the risk of surgical errors. This work contributes to the advancement of intelligent surgical systems, providing a foundation for further integration of machine learning techniques in the operating room.
ISSN:2169-3536
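
Note: The abstract describes pairing a YOLOv11 detector with a refined ByteTrack tracker. As a point of reference only, the minimal Python sketch below shows the unmodified baseline combination using the Ultralytics API (YOLOv11 detection with the stock ByteTrack association). It does not implement the paper's two-point association refinement; the checkpoint name and video path are placeholders.

    # Baseline sketch: YOLOv11 detection + stock ByteTrack association (Ultralytics).
    # This is NOT the paper's refined two-point association method.
    from ultralytics import YOLO

    # Any YOLOv11 detection/segmentation checkpoint; "yolo11n-seg.pt" is a placeholder.
    model = YOLO("yolo11n-seg.pt")

    # stream=True yields per-frame results; tracker="bytetrack.yaml" selects ByteTrack;
    # persist=True keeps track IDs consistent across frames.
    for result in model.track(source="laparoscopy.mp4",
                              tracker="bytetrack.yaml",
                              persist=True,
                              stream=True):
        if result.boxes.id is None:          # no confirmed tracks in this frame
            continue
        ids = result.boxes.id.int().tolist()     # per-instrument track IDs
        boxes = result.boxes.xyxy.tolist()       # bounding boxes (x1, y1, x2, y2)
        for track_id, box in zip(ids, boxes):
            print(f"instrument {track_id}: {box}")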