Loss shaping enhances exact gradient learning with Eventprop in spiking neural networks

Bibliographic Details
Main Authors: Thomas Nowotny, James P Turner, James C Knight
Format: Article
Language: English
Published: IOP Publishing 2025-01-01
Series: Neuromorphic Computing and Engineering
Online Access: https://doi.org/10.1088/2634-4386/ada852
Description
Summary: Event-based machine learning promises more energy-efficient AI on future neuromorphic hardware. Here, we investigate how the recently discovered Eventprop algorithm for gradient descent on exact gradients in spiking neural networks (SNNs) can be scaled up to challenging keyword recognition benchmarks. We implemented Eventprop in the GPU-enhanced neural networks framework (GeNN) and used it for training recurrent SNNs on the Spiking Heidelberg Digits (SHD) and Spiking Speech Commands (SSC) datasets. We found that learning depended strongly on the loss function and extended Eventprop to a wider class of loss functions to enable effective training. We then tested a large number of data augmentations and regularisations, explored different network structures, and investigated heterogeneous and trainable timescales. We found that, when combined with two specific augmentations, the right regularisation and a delay line input, Eventprop networks with one recurrent layer achieved state-of-the-art performance on SHD and good accuracy on SSC. In comparison to a leading surrogate-gradient-based SNN training method, our GeNN Eventprop implementation is 3× faster and uses 4× less memory. This work is a significant step towards a low-power neuromorphic alternative to current machine learning paradigms.
ISSN: 2634-4386
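
The summary highlights that learning depended strongly on the loss function applied to the output neurons' membrane voltages. A minimal sketch of one such voltage-based loss, a cross-entropy over time-averaged output voltages, is given below in plain NumPy; the exact temporal weighting used in the paper may differ, and this is not the GeNN Eventprop implementation itself.

import numpy as np

def avg_voltage_cross_entropy(v_out: np.ndarray, label: int) -> float:
    # v_out: membrane voltages of the output neurons, shape (T, n_classes).
    # Time-average each output neuron's voltage, then apply a softmax
    # cross-entropy against the true class. This is one member of the
    # family of voltage-based losses; the paper's "loss shaping" uses a
    # more specific temporal weighting (an assumption here, for illustration).
    v_bar = v_out.mean(axis=0)                      # shape (n_classes,)
    logits = v_bar - v_bar.max()                    # shift for numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum())
    return float(-log_softmax[label])

# Example with random voltages for a 20-class problem (SHD has 20 classes)
rng = np.random.default_rng(0)
loss = avg_voltage_cross_entropy(rng.normal(size=(1000, 20)), label=3)
print(loss)

The summary's point is that the precise temporal form of such a loss strongly affects how well Eventprop trains, which is what motivated extending the algorithm to this wider class of loss functions.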