Sa-SNN: spiking attention neural network for image classification

Spiking neural networks (SNNs) are known as third-generation neural networks for their energy efficiency and low power consumption, and they have received considerable attention for their biological plausibility: by transmitting information between neurons as discrete spikes, SNNs more closely mirror how biological neural systems work. Motivated by the great potential that attention mechanisms have shown in convolutional neural networks, we propose a Spiking Attention Neural Network (Sa-SNN). The network includes a novel Spiking-Efficient Channel Attention (SECA) module that adopts a local cross-channel interaction strategy without dimensionality reduction, implemented with a one-dimensional convolution that adds few model parameters yet yields a significant performance improvement. Modelling local inter-channel interactions with adaptively sized convolutional kernels, rather than global dependencies, lets the network focus on selecting important features, reduces the impact of redundant features, and improves its recognition and generalisation capabilities. To investigate the effect of this structure, we conducted a series of experiments. The results show that Sa-SNN performs image classification more accurately: it achieved 99.61%, 99.61%, 94.13%, and 99.63% on the MNIST, Fashion-MNIST, N-MNIST datasets, respectively, and compared favourably in accuracy with mainstream SNNs.


Bibliographic Details
Main Authors: Yongping Dan, Zhida Wang, Hengyi Li, Jintong Wei
Format: Article
Language:English
Published: PeerJ Inc. 2024-11-01
Series:PeerJ Computer Science
Subjects: Brain-like computing; Spiking neural networks; Image classification; Attention mechanism
Online Access:https://peerj.com/articles/cs-2549.pdf
author Yongping Dan
Zhida Wang
Hengyi Li
Jintong Wei
collection DOAJ
description Spiking neural networks (SNNs) are known as third-generation neural networks for their energy efficiency and low power consumption, and they have received considerable attention for their biological plausibility: by transmitting information between neurons as discrete spikes, SNNs more closely mirror how biological neural systems work. Motivated by the great potential that attention mechanisms have shown in convolutional neural networks, we propose a Spiking Attention Neural Network (Sa-SNN). The network includes a novel Spiking-Efficient Channel Attention (SECA) module that adopts a local cross-channel interaction strategy without dimensionality reduction, implemented with a one-dimensional convolution that adds few model parameters yet yields a significant performance improvement. Modelling local inter-channel interactions with adaptively sized convolutional kernels, rather than global dependencies, lets the network focus on selecting important features, reduces the impact of redundant features, and improves its recognition and generalisation capabilities. To investigate the effect of this structure, we conducted a series of experiments. The results show that Sa-SNN performs image classification more accurately: it achieved 99.61%, 99.61%, 94.13%, and 99.63% on the MNIST, Fashion-MNIST, N-MNIST datasets, respectively, and compared favourably in accuracy with mainstream SNNs.
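The SECA module described above builds on efficient-channel-attention-style gating: global average pooling produces a per-channel descriptor, a one-dimensional convolution with an adaptively sized kernel models local cross-channel interactions without dimensionality reduction, and a sigmoid gate rescales the channels. A minimal NumPy sketch of that mechanism, assuming the ECA-Net kernel-size heuristic (k = |log2(C)/γ + b/γ| rounded to the nearest odd number) and untrained placeholder weights; the paper's exact SECA formulation, its spiking integration, and its learned parameters are not reproduced here:

```python
import numpy as np

def adaptive_kernel_size(channels, gamma=2, b=1):
    # ECA-style heuristic: kernel size grows with log2 of the channel count,
    # rounded up to the nearest odd number so the 1D conv stays centred.
    t = int(abs((np.log2(channels) + b) / gamma))
    return t if t % 2 else t + 1

def eca_attention(x, gamma=2, b=1):
    """Channel attention via a 1D conv over the pooled channel descriptor.

    x: feature map of shape (C, H, W). Returns a rescaled map of the same shape.
    """
    C = x.shape[0]
    k = adaptive_kernel_size(C, gamma, b)
    # Global average pooling -> one descriptor per channel, no dimensionality reduction
    y = x.mean(axis=(1, 2))                       # shape (C,)
    # Local cross-channel interaction: shared 1D kernel slid over neighbouring channels
    kernel = np.full(k, 1.0 / k)                  # placeholder (untrained) weights
    pad = k // 2
    y_pad = np.pad(y, pad, mode='edge')
    z = np.convolve(y_pad, kernel, mode='valid')  # shape (C,)
    w = 1.0 / (1.0 + np.exp(-z))                  # sigmoid gate in (0, 1)
    return x * w[:, None, None]                   # rescale each channel
```

Each channel's weight depends only on its k nearest neighbours in the pooled descriptor, which is what keeps the parameter count small compared with fully connected squeeze-and-excitation attention.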
id doaj-art-2d0a4e8c7aed48ca8562641b2fb7e117
institution OA Journals
issn 2376-5992
doi 10.7717/peerj-cs.2549
citation PeerJ Computer Science 10:e2549, published 2024-11-01
affiliations Yongping Dan, Zhida Wang, Jintong Wei: School of Electronic and Information, Zhongyuan University of Technology, Zhengzhou, Henan, China
Hengyi Li: Research Organization of Science and Technology, Ritsumeikan University, Kusatsu, Japan
title Sa-SNN: spiking attention neural network for image classification
topic Brain-like computing
Spiking neural networks
Image classification
Attention mechanism
url https://peerj.com/articles/cs-2549.pdf