Self-Attention Multilayer Feature Fusion Based on Long Connection

Feature fusion is an important part of building high-precision convolutional neural networks. In image classification it is widely used to process multiscale features within a layer and short connections within the same receptive field, but it is rarely applied to long connections across receptive fields. To fuse the high- and low-level features used in image classification, this work designs SCFF (selective cross-layer feature fusion), a feature fusion module for long connections. SCFF connects long-distance feature maps from different receptive fields in top-down order and applies a self-attention mechanism to fuse them pairwise; the final fusion result serves as the input of the classifier. To verify the effectiveness of the module, image classification experiments were conducted on several typical datasets. The results show that SCFF integrates well with existing convolutional neural networks and effectively improves their classification accuracy at only a small computational cost.
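The fusion scheme the abstract describes (connecting cross-layer feature maps top-down and fusing them two by two with attention weights) might be sketched roughly as follows. This is an illustrative NumPy toy, not the authors' implementation: the choice of global-average-pooled channel descriptors, nearest-neighbour upsampling, and a two-way softmax as the "selective" attention is an assumption, since the record does not give the paper's actual projections or attention form.

```python
import numpy as np

def softmax(x, axis=0):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def upsample2x(f):
    # Nearest-neighbour upsampling of a (C, H, W) feature map.
    return f.repeat(2, axis=1).repeat(2, axis=2)

def attention_fuse(low, high):
    """Fuse a low-level map with a (deeper, smaller) high-level map:
    upsample the high-level map, describe each input by global average
    pooling, and softmax the descriptors into per-channel mixing weights."""
    high_up = upsample2x(high)                       # match spatial size
    gap = np.stack([low.mean(axis=(1, 2)),           # (2, C) descriptors
                    high_up.mean(axis=(1, 2))])
    w = softmax(gap, axis=0)                         # selection weights, sum to 1
    return (w[0][:, None, None] * low +
            w[1][:, None, None] * high_up)

def scff(features):
    """Top-down pairwise fusion over a shallow-to-deep list of feature
    maps (each deeper map half the spatial size of the previous one);
    the final result would be fed to the classifier."""
    fused = features[-1]
    for low in reversed(features[:-1]):
        fused = attention_fuse(low, fused)
    return fused
```

Because the weights are a softmax, each fusion step is a convex combination of the two inputs, so the output stays within the range of the feature values being fused.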

Saved in:
Bibliographic Details
Main Authors: Chu Yuezhong, Wang Jiaqing, Liu Heng
Format: Article
Language: English
Published: Wiley 2022-01-01
Series: Advances in Multimedia
Online Access: http://dx.doi.org/10.1155/2022/9973814
author Chu Yuezhong
Wang Jiaqing
Liu Heng
collection DOAJ
description Feature fusion is an important part of building high-precision convolutional neural networks. In image classification it is widely used to process multiscale features within a layer and short connections within the same receptive field, but it is rarely applied to long connections across receptive fields. To fuse the high- and low-level features used in image classification, this work designs SCFF (selective cross-layer feature fusion), a feature fusion module for long connections. SCFF connects long-distance feature maps from different receptive fields in top-down order and applies a self-attention mechanism to fuse them pairwise; the final fusion result serves as the input of the classifier. To verify the effectiveness of the module, image classification experiments were conducted on several typical datasets. The results show that SCFF integrates well with existing convolutional neural networks and effectively improves their classification accuracy at only a small computational cost.
format Article
id doaj-art-863bf3f92c7f43b3834c1f381b0c9c52
institution Kabale University
issn 1687-5699
language English
publishDate 2022-01-01
publisher Wiley
record_format Article
series Advances in Multimedia
affiliations School of Computer Science and Technology (Chu Yuezhong); School of Computer Science and Technology (Wang Jiaqing); School of Computer Science and Technology (Liu Heng)
title Self-Attention Multilayer Feature Fusion Based on Long Connection
url http://dx.doi.org/10.1155/2022/9973814