Few-shot cow identification via meta-learning


Bibliographic Details
Main Authors: Xingshi Xu, Yunfei Wang, Yuying Shang, Guangyuan Yang, Zhixin Hua, Zheng Wang, Huaibo Song
Format: Article
Language:English
Published: Elsevier 2025-03-01
Series:Information Processing in Agriculture
Subjects:
Online Access:http://www.sciencedirect.com/science/article/pii/S2214317324000210
Description
Summary: Cow identification is a prerequisite for precision livestock farming. Biometric-based methods have made significant progress in cow identification, but substantial labelling costs and frequent changes in the identification task still hamper model application. In this work, a novel method called “MFCI” was proposed to achieve accurate cow identification under few-shot and task-changing conditions. The proposed method comprises two components: cow location and cow identification. First, an improved YOLOv5n with a Ghost module was adopted to quickly detect cow locations in images. Then, the Model-Agnostic Meta-Learning (MAML) framework was introduced for accurate identification under few-shot conditions and for fast adaptation to frequent changes in individual cows. Moreover, an autoencoder was adopted to allow the Base-Learner to learn more generalized features by combining supervised and unsupervised approaches. Experimental results showed that the proposed cow location model achieved a mAP of 99.5% and the proposed cow identification model attained an accuracy of 90.43% with only five samples per cow for 20 cows, outperforming other state-of-the-art methods. The results demonstrate the broad applicability and practical value of the proposed method.
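The core idea behind MAML — an inner loop that adapts to each new task from a few samples, and an outer loop that learns an initialization from which that adaptation is fast — can be illustrated with a minimal first-order sketch. Everything below is a toy example, not the paper's implementation: each "task" is a scalar linear regression standing in for a new set of cows, the model is a single weight, and the first-order MAML approximation (ignoring second derivatives) is used for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse(w, x, y):
    """Mean squared error of the toy linear model y_hat = w * x."""
    return np.mean((w * x - y) ** 2)

def loss_grad(w, x, y):
    """Analytic gradient of the MSE with respect to the scalar weight w."""
    return 2.0 * np.mean(x * (w * x - y))

def sample_task():
    """A 'task' is a fresh slope to identify — a stand-in for a new set of cows."""
    w_true = rng.uniform(1.5, 2.5)
    x = rng.uniform(-1.0, 1.0, size=10)
    return w_true, x, w_true * x

def maml_train(meta_steps=200, tasks_per_step=8, alpha=0.1, beta=0.05):
    """First-order MAML: learn an initialization theta that adapts in one step."""
    theta = 0.0
    for _ in range(meta_steps):
        meta_grad = 0.0
        for _ in range(tasks_per_step):
            w_true, x, y = sample_task()
            # Inner loop: one gradient step on the task's few-shot support set.
            theta_prime = theta - alpha * loss_grad(theta, x, y)
            # Outer loop (first-order approximation): evaluate the gradient
            # at the adapted parameters on a fresh query set for the same task.
            xq = rng.uniform(-1.0, 1.0, size=10)
            meta_grad += loss_grad(theta_prime, xq, w_true * xq)
        theta -= beta * meta_grad / tasks_per_step
    return theta
```

After meta-training, `theta` sits near the centre of the task distribution, so a single inner-loop step on five or ten samples from an unseen task already yields a low loss — the property the paper exploits when the set of cows to identify changes frequently.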
ISSN:2214-3173