Improve Fine-Grained Feature Learning in Fine-Grained DataSet GAI

Bibliographic Details
Main Authors: Hai Peng Wang, Zhi Qing Geng
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10810386/
Description
Summary: This article starts from the perspective of breaking the integrity of the feature matrix: the matrix is divided into a retained part and a sacrificed part, and the sacrificed part is used to strengthen the retained part. We propose the SpiltAtt and ShuSpilt modules, which sacrifice some features to enhance the backbone without introducing any parameters. To ensure that this enhancement is effective, we also propose the STloss function, based on a specific structure. Training requires only a slight increase in computation, and the SpiltAtt structure is deleted once training is complete. Experiments are conducted on three datasets from GAI, with the Macro-F1 score as the evaluation metric, and a series of comparative experiments demonstrates the effectiveness of the proposed methods. Because the method adds no parameters and its extra computational cost is negligible, it holds an advantage over other methods, and its construction ideas can serve as a reference for other tasks. This approach of sacrificing some features to enhance the retained ones sheds light on the essence of neural networks from a new perspective.
ISSN: 2169-3536
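
To make the summary above more concrete, here is a minimal, hypothetical PyTorch sketch of the split-and-enhance idea it describes. The module name SplitEnhance, the keep_ratio argument, and the sigmoid-of-mean attention map are illustrative assumptions, not the paper's actual SpiltAtt/ShuSpilt definitions; only the general scheme comes from the summary: split the channels, use the sacrificed part to re-weight the retained part, add no learnable parameters, and remove the structure after training.

import torch
import torch.nn as nn

class SplitEnhance(nn.Module):
    """Hypothetical sketch: split the channel dimension into retained and
    sacrificed parts and use the sacrificed part to re-weight the retained
    part. The module has no learnable parameters, and it acts as an identity
    in eval mode, mirroring the summary's claim that the auxiliary structure
    is deleted after training."""

    def __init__(self, keep_ratio: float = 0.5):  # keep_ratio is an assumption
        super().__init__()
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) feature matrix
        if not self.training:
            return x  # structure removed after training
        c = x.size(1)
        k = int(c * self.keep_ratio)
        retained, sacrificed = x[:, :k], x[:, k:]
        # Collapse the sacrificed channels into a parameter-free spatial
        # attention map and use it to strengthen the retained channels.
        attn = torch.sigmoid(sacrificed.mean(dim=1, keepdim=True))
        return torch.cat([retained * attn, sacrificed], dim=1)

Placed between backbone stages, e.g. features = SplitEnhance()(stage(x)), this sketch adds only a channel mean and an elementwise product during training, consistent with the summary's claim that the extra computational cost is negligible.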