A weakly-supervised follicle segmentation method in ultrasound images

Bibliographic Details
Main Authors: Guanyu Liu, Weihong Huang, Yanping Li, Qiong Zhang, Jing Fu, Hongying Tang, Jia Huang, Zhongteng Zhang, Lei Zhang, Yu Wang, Jianzhong Hu
Format: Article
Language: English
Published: Nature Portfolio 2025-04-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-025-95957-0
Description
Summary: Accurate follicle segmentation in ultrasound images is crucial for monitoring follicle development, a key factor in fertility treatments. However, obtaining pixel-level annotations for fully supervised instance segmentation is often impractical due to time and workload constraints. This paper presents a weakly supervised instance segmentation method that uses bounding boxes as approximate annotations, aiming to provide clinicians with automated tools for monitoring follicle development. We propose Weakly Supervised Follicle Segmentation (WSFS), a novel one-stage weakly supervised segmentation model for follicle ultrasound images. WSFS incorporates a Convolutional Neural Network (CNN) backbone augmented with a Feature Pyramid Network (FPN) module for multi-scale feature representation, which is critical for capturing the diverse sizes and shapes of follicles. By formulating the learning process as Multiple Instance Learning (MIL), we developed an end-to-end trainable model that efficiently addresses the scarcity of annotations. Furthermore, WSFS can serve as a prompt proposer to enhance the performance of the Segment Anything Model (SAM), a well-known pre-trained segmentation model, through few-shot learning strategies. In addition, this study introduces the Follicle Ultrasound Image Dataset (FUID), addressing the scarcity of reproductive health data and supporting future research in computer-aided diagnosis. Experimental results on both the public USOVA3D dataset and the private FUID dataset show that our method performs competitively with fully supervised approaches, achieving an mAP of 0.957, an IoU of 0.714, and a Dice score of 0.83, despite operating with far less detailed annotations than pixel-level masks.
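The summary only names the ingredients (box-level supervision via MIL, and SAM prompting). One common way to realize box-supervised MIL, as in existing box-supervised segmentation work, is to treat each row and column crossing a box as a positive bag (it must contain at least one foreground pixel) and all pixels outside every box as negatives. The PyTorch sketch below illustrates that idea only; it is an assumed training objective for illustration, not the authors' WSFS code, and `mil_projection_loss` is a hypothetical helper.

```python
# Illustrative sketch of a box-supervised MIL projection loss (assumed, not the paper's code).
import torch
import torch.nn.functional as F


def mil_projection_loss(mask_logits, boxes):
    """mask_logits: (H, W) raw logits for one predicted follicle mask.
    boxes: (N, 4) tensor of [x1, y1, x2, y2] bounding-box annotations in pixels.
    """
    probs = torch.sigmoid(mask_logits)
    loss = probs.new_tensor(0.0)
    outside = torch.ones_like(probs, dtype=torch.bool)

    for x1, y1, x2, y2 in boxes.long():
        region = probs[y1:y2, x1:x2]
        if region.numel() == 0:
            continue
        # Positive bags: every row and column inside the box should contain
        # at least one foreground pixel, so its max probability should be 1.
        row_max = region.max(dim=1).values
        col_max = region.max(dim=0).values
        pos = torch.cat([row_max, col_max])
        loss = loss + F.binary_cross_entropy(pos, torch.ones_like(pos))
        outside[y1:y2, x1:x2] = False

    # Negative bag: pixels outside every box should be background (target 0).
    neg = probs[outside]
    if neg.numel() > 0:
        loss = loss + F.binary_cross_entropy(neg, torch.zeros_like(neg))
    return loss
```

Because the supervision is derived entirely from box geometry, such a loss can be attached to a CNN+FPN mask head and trained end to end without any pixel-level masks, which is the property the abstract highlights.

For the SAM-prompting step, a minimal sketch with Meta's `segment-anything` package might look as follows; the checkpoint path, the image array, and the box values are placeholders, and feeding WSFS proposals into `predict` is an assumption about how the coupling could be done.

```python
# Minimal sketch: using a follicle box proposal as a SAM prompt (placeholder inputs).
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")  # placeholder checkpoint path
predictor = SamPredictor(sam)

image_rgb = np.zeros((256, 256, 3), dtype=np.uint8)  # stand-in for an ultrasound frame (RGB, uint8)
box = np.array([60, 60, 180, 180])                   # stand-in follicle box proposal, XYXY format

predictor.set_image(image_rgb)
masks, scores, _ = predictor.predict(box=box, multimask_output=False)
follicle_mask = masks[0]                             # boolean HxW mask for this follicle
```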
ISSN:2045-2322