MIF-YOLO: An Enhanced YOLO with Multi-Source Image Fusion for Autonomous Dead Chicken Detection


Saved in:
Bibliographic Details
Main Authors: Jiapan Li, Yan Zhang, Yong Zhang, Hongwei Shi, Xianfang Song, Chao Peng
Format: Article
Language:English
Published: Elsevier 2025-12-01
Series:Smart Agricultural Technology
Subjects:
Online Access:http://www.sciencedirect.com/science/article/pii/S2772375525003375
Description
Summary:Addressing the lack of automated systems for detecting dead poultry in large-scale farming operations, where manual inspection is onerous and time-consuming, this study introduces an enhanced YOLO algorithm with multi-source image fusion (MIF-YOLO) for the autonomous identification of dead chickens. The proposed approach first applies progressive illumination-aware fusion (PIAFusion) to merge thermal infrared and visible-light imagery, accentuating the salient features of dead chickens and counteracting non-uniform illumination. To address feature extraction under significant occlusion, the model incorporates the Rep-DCNv3 module, which strengthens the backbone network's capacity to discern subtle characteristics of dead chickens. Additionally, an efficient multi-scale attention (EMA) mechanism is embedded in the neck of the YOLO architecture to improve target discrimination in low-light scenarios, enhancing both accuracy and adaptability. The loss function is refined with the Minimum Point Distance IoU (MPDIoU), enabling a more nuanced evaluation of bounding-box overlap. Validated on a dataset of caged white-feathered chickens collected from a farm in Suqian, Jiangsu Province, the model attains a precision of 99.2% and a mAP@0.5 of 98.9%, surpassing existing state-of-the-art methods. The proposed method delivers both rapid detection and a marked improvement in detection fidelity, meeting the demands of real-time monitoring in operational agricultural settings.
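The MPDIoU loss mentioned in the abstract augments standard IoU with corner-distance penalties. The sketch below follows the general MPDIoU formulation (IoU minus the squared distances between the predicted and ground-truth top-left and bottom-right corners, normalized by the squared image diagonal); the function names, box format, and normalization choice are assumptions for illustration, not code from the paper.

```python
def iou(box_a, box_b):
    """Intersection-over-union for boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def mpdiou_loss(pred, gt, img_w, img_h):
    """MPDIoU loss = 1 - (IoU - d1^2/D - d2^2/D), D = img_w^2 + img_h^2.

    d1, d2 are the distances between the predicted and ground-truth
    top-left and bottom-right corners, respectively.
    """
    d1_sq = (pred[0] - gt[0]) ** 2 + (pred[1] - gt[1]) ** 2
    d2_sq = (pred[2] - gt[2]) ** 2 + (pred[3] - gt[3]) ** 2
    norm = img_w ** 2 + img_h ** 2
    mpdiou = iou(pred, gt) - d1_sq / norm - d2_sq / norm
    return 1.0 - mpdiou
```

For perfectly overlapping boxes the loss is 0, and it grows as the predicted corners drift from the ground truth, which is why the abstract describes it as a more nuanced overlap measure than plain IoU.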
ISSN:2772-3755