Assessing YOLACT++ for real time and robust instance segmentation of medical instruments in endoscopic procedures
Academic Article in Scopus
Abstract
© 2021 IEEE. Image-based tracking of laparoscopic instruments plays a fundamental role in computer- and robot-assisted surgeries by aiding surgeons and increasing patient safety. Computer vision contests, such as the Robust Medical Instrument Segmentation (ROBUST-MIS) Challenge, seek to encourage the development of robust models for such purposes by providing large, diverse, and high-quality datasets. To date, most existing models for instance segmentation of medical instruments have been based on two-stage detectors, which provide robust results but are far from real-time, running at 5 frames per second (fps) at most. However, for a method to be clinically applicable, real-time capability is essential alongside high accuracy. In this paper, we propose adding attention mechanisms to the YOLACT architecture to enable real-time instance segmentation of instruments with improved accuracy on the ROBUST-MIS dataset. Our proposed approach achieves performance competitive with the winner of the 2019 ROBUST-MIS challenge in terms of robustness scores, obtaining 0.313 MI_DSC and 0.338 MI_NSD while reaching real-time performance at >45 fps.
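The abstract does not specify which attention mechanism is added to YOLACT. As a purely illustrative sketch, the PyTorch block below shows a CBAM-style module (channel attention followed by spatial attention) of the kind commonly inserted into a detector's feature pyramid; the class names (ChannelAttention, SpatialAttention, CBAM), the reduction ratio, and the 256-channel FPN feature size are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch: one plausible attention block that could refine YOLACT
# feature maps. The specific design (CBAM-style) is an assumption; the
# paper excerpt only says "attention mechanisms" are added.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Reweights channels using global average- and max-pooled descriptors."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale


class SpatialAttention(nn.Module):
    """Reweights spatial locations from pooled-channel statistics."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, applied sequentially."""

    def __init__(self, channels: int):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.sa(self.ca(x))


# Example: refine a 256-channel feature map (YOLACT's FPN levels use 256
# channels; the 69x69 spatial size is only an example input shape).
feat = torch.randn(1, 256, 69, 69)
refined = CBAM(256)(feat)
print(refined.shape)  # torch.Size([1, 256, 69, 69])
```

Because a block like this multiplies the input by learned attention weights rather than replacing it, it adds only a small compute overhead, which is consistent with the paper's emphasis on preserving real-time throughput (>45 fps).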