An Efficient Neural Cell Architecture for Spiking Neural Networks

Neurons in a Spiking Neural Network (SNN) communicate using electrical pulses, or spikes. They fire conditionally, and learning is sensitive to the timing and duration of those firings. The Leaky Integrate and Fire (LIF) model is the most widely used SNN neuron model. Most existing LIF-based neurons use a fixed spike frequency, which prevents them from attaining near-optimal accuracy. A research challenge is to design energy- and area-efficient SNN neural cells that provide high learning accuracy and are scalable. Recently, the idea of tuning the spiking pulses in an SNN was proposed and found promising. This work builds on the pulse-tuning idea by proposing an area- and energy-efficient, stable, and reconfigurable SNN cell that generates spikes and reconfigures its pulse width to achieve near-optimal learning. It auto-adapts spike rate and duration to attain near-optimal accuracies for various SNN applications. The proposed cell is designed as a mixed-signal circuit, an approach known to benefit SNNs, is implemented in 45-nm technology, occupies an area of 27 μm², consumes 1.86 μW, and yields high learning accuracies of 99.12%, 96.37%, and 78.64% on the N-MNIST, MNIST, and N-Caltech101 datasets, respectively. The proposed cell attains higher accuracy and scalability and greater energy and area economy than state-of-the-art SNN neurons. Its energy efficiency and compact design make it highly suitable for sensor-network applications and embedded systems requiring real-time, low-power neuromorphic computing.
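For context, the sketch below illustrates the standard discrete-time Leaky Integrate and Fire (LIF) dynamics the abstract refers to, with a hypothetical pulse_width parameter standing in for the reconfigurable spike duration described above. It is a minimal Python illustration under those assumptions, not the paper's mixed-signal cell.

```python
# Minimal discrete-time Leaky Integrate-and-Fire (LIF) neuron sketch (illustrative only).
# The hypothetical pulse_width parameter stands in for the reconfigurable spike
# duration discussed in the abstract; the paper's actual cell is a mixed-signal circuit.

def simulate_lif(current, v_rest=0.0, v_thresh=1.0, tau=20.0, dt=1.0, pulse_width=1):
    """Simulate one LIF neuron over a list of input currents.

    Returns a list of 0/1 output samples; each threshold crossing emits a
    spike held high for pulse_width time steps, then the potential resets.
    """
    v = v_rest
    out = []
    hold = 0                                       # remaining steps of the current spike
    for i_in in current:
        if hold > 0:                               # spike output still being held high
            out.append(1)
            hold -= 1
            continue
        v += (-(v - v_rest) + i_in) * (dt / tau)   # leaky integration of the input
        if v >= v_thresh:                          # fire when the threshold is crossed
            out.append(1)
            hold = pulse_width - 1                 # stretch the pulse if width > 1
            v = v_rest                             # reset the membrane potential
        else:
            out.append(0)
    return out


# Example: a constant drive produces periodic spikes; widening the pulse
# lowers the effective spike rate for the same input.
narrow = simulate_lif([1.5] * 200, pulse_width=1)
wide = simulate_lif([1.5] * 200, pulse_width=4)
print("spikes (width 1):", sum(narrow), " high samples (width 4):", sum(wide))
```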

Bibliographic Details
Main Authors: Kasem Khalil, Ashok Kumar, Magdy Bayoumi
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Open Journal of the Computer Society
Subjects: Spiking neural cell; spiking neural network (SNN); spike generation with pulse width control; sensor network
Online Access: https://ieeexplore.ieee.org/document/10972324/
ISSN: 2644-1268
DOI: 10.1109/OJCS.2025.3563423
Published in: IEEE Open Journal of the Computer Society, vol. 6, pp. 599-612, 2025
Author details:
Kasem Khalil (ORCID: 0000-0002-9659-8566), Electrical and Computer Engineering Department, University of Mississippi, Oxford, MS, USA
Ashok Kumar (ORCID: 0000-0002-9740-1219), The Center for Advanced Computer Studies, University of Louisiana at Lafayette, Lafayette, LA, USA
Magdy Bayoumi (ORCID: 0000-0002-0630-5273), Department of Electrical and Computer Engineering, University of Louisiana at Lafayette, Lafayette, LA, USA