Improve Fine-Grained Feature Learning in Fine-Grained DataSet GAI

This article starts from the idea of breaking the integrity of the feature matrix: dividing it into a retained part and a sacrificed part, and using the sacrificed part to strengthen the retained part. We propose the SpiltAtt and ShuSpilt modules, which sacrifice some features to enhance the backbone without introducing any parameters. To ensure that this enhancement is effective, we also propose the STloss function, built on a specific structure. Training requires only a slight increase in computation, and the SpiltAtt structure is removed once training is complete. We select three datasets from GAI for experiments, using the Macro-F1 score as the evaluation metric; a series of comparative experiments demonstrates the effectiveness of these methods. Because the proposed method adds no parameters, the extra computational cost is negligible, which gives it an advantage over other methods, and the design ideas behind these modules can serve as a reference for other tasks. This approach of sacrificing some features to enhance the retained ones sheds new light on the nature of neural networks.
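The abstract does not give the exact definitions of SpiltAtt or ShuSpilt, but the core idea it describes — splitting a feature matrix into retained and sacrificed channels, and letting the sacrificed half re-weight the retained half with no learnable parameters — can be sketched as follows. Everything here (the tensor shapes, the 50/50 channel split, and the mean-plus-sigmoid spatial attention) is an assumption for illustration, not the paper's actual implementation:

    import torch

    def split_and_enhance(feat: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
        # feat: (B, C, H, W) backbone feature map.
        c_keep = int(feat.size(1) * keep_ratio)
        retained = feat[:, :c_keep]     # part kept for the task head
        sacrificed = feat[:, c_keep:]   # part given up to guide the rest

        # Pool the sacrificed channels into a single spatial map and use it
        # as parameter-free spatial attention over the retained channels.
        attn = torch.sigmoid(sacrificed.mean(dim=1, keepdim=True))  # (B, 1, H, W)
        return retained * (1.0 + attn)

    # Usage: enhance during training; since the branch holds no parameters,
    # it can simply be skipped (deleted) at inference, as the paper does
    # with SpiltAtt after training.
    feat = torch.randn(2, 64, 7, 7)
    out = split_and_enhance(feat)   # shape (2, 32, 7, 7)

Note that because the attention is derived only by pooling and a fixed nonlinearity, the enhancement path introduces no weights, which is consistent with the abstract's claim that the parameter count is unchanged.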

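For reference, the Macro-F1 score used as the evaluation metric is the unweighted mean of the per-class F1 scores, so every class counts equally regardless of its frequency — a common choice for imbalanced fine-grained datasets. A minimal check with scikit-learn (the labels below are made up for illustration):

    from sklearn.metrics import f1_score

    y_true = [0, 0, 1, 1, 2, 2]
    y_pred = [0, 1, 1, 1, 2, 0]

    # Macro-F1: compute F1 independently per class, then average them
    # without weighting by class frequency.
    print(f1_score(y_true, y_pred, average="macro"))  # ~0.656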

Saved in:
Bibliographic Details
Main Authors: Hai Peng Wang (ORCID: 0009-0003-6560-5393), Zhi Qing Geng (ORCID: 0009-0007-6211-4352)
Affiliation: School of Computer Science and Technology, Hebei University of Engineering, Handan, China
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access, vol. 13, pp. 12777-12788
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3520503
Subjects: Fine-grained image; attention; loss function; feature enhancement
Online Access: https://ieeexplore.ieee.org/document/10810386/