Fourier or Wavelet bases as counterpart self-attention in spikformer for efficient visual classification

The energy-efficient Spikformer has been proposed by integrating the biologically plausible spiking neural network (SNN) with the artificial Transformer, where spiking self-attention (SSA) is used to achieve both higher accuracy and lower computational cost. However, it seems that self-attention is not...
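
The idea named in the title, using fixed Fourier (or Wavelet) bases in place of self-attention for token mixing, can be illustrated with a brief, conventional (non-spiking) sketch. This is only a minimal illustration assuming an FNet-style Fourier mixing layer and standard PyTorch modules; it is not the authors' spiking implementation, and all class names and dimensions here are hypothetical.

```python
# Minimal sketch (not the paper's implementation): a parameter-free Fourier
# token-mixing layer used as a drop-in replacement for a self-attention block
# inside a standard Transformer encoder. Names and shapes are assumptions.
import torch
import torch.nn as nn


class FourierMixing(nn.Module):
    """Mix tokens with a 2D FFT over sequence and channel dims (real part)."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, channels)
        return torch.fft.fft2(x, dim=(-2, -1)).real


class EncoderBlock(nn.Module):
    def __init__(self, dim: int = 256, hidden: int = 1024):
        super().__init__()
        self.mixing = FourierMixing()          # replaces self-attention
        self.norm1 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim)
        )
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.norm1(x + self.mixing(x))     # token mixing + residual
        return self.norm2(x + self.mlp(x))     # feed-forward + residual


if __name__ == "__main__":
    patches = torch.randn(2, 64, 256)          # (batch, patches, embed dim)
    print(EncoderBlock()(patches).shape)       # torch.Size([2, 64, 256])
```

Because the mixing step has no learned parameters, it avoids the quadratic query-key product of self-attention; the paper's spiking variant would additionally binarize activations with spiking neurons, which is omitted here.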

Bibliographic Details
Main Authors: Qingyu Wang, Duzhen Zhang, Xinyuan Cai, Tielin Zhang, Bo Xu
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-01-01
Series: Frontiers in Neuroscience
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2024.1516868/full