Abstract
This paper proposes a methodology for training axonal delays in Spiking Neural Networks (SNNs), with the goal of efficiently solving machine learning tasks on data with rich temporal dependencies. We study the effects of axonal delays on model performance on the Adding task and on the Spiking Heidelberg Digits (SHD) dataset. Numerical results on SHD show that SNNs with axonal delays achieve state-of-the-art performance, with over 90% test accuracy, while requiring fewer than half the trainable synapses. Estimates of energy and memory requirements, based on the number of synaptic operations, indicate an approximate 90% reduction for digital hardware implementations.
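To make the core mechanism concrete, the sketch below illustrates the forward effect of axonal delays: each presynaptic spike train is shifted in time by a per-axon delay before reaching the postsynaptic neuron. This is a minimal NumPy illustration assuming integer per-axon delays; the function name `apply_axonal_delays` and the toy data are hypothetical, and the paper's actual gradient-based procedure for training the delays is not shown here.

```python
import numpy as np

def apply_axonal_delays(spikes, delays):
    """Shift each presynaptic spike train by its (integer) axonal delay.

    spikes : (T, N) array of spike trains, T timesteps, N presynaptic neurons
    delays : (N,) non-negative integer delays, one per axon
    Returns a (T, N) array in which channel i is delayed by delays[i] steps.
    """
    T, N = spikes.shape
    delayed = np.zeros_like(spikes)
    for i, d in enumerate(delays):
        if d < T:
            # Spikes emitted at time t arrive at time t + d.
            delayed[d:, i] = spikes[: T - d, i]
    return delayed

# Toy usage: three axons with delays of 0, 2, and 4 timesteps.
rng = np.random.default_rng(0)
spikes = (rng.random((10, 3)) < 0.3).astype(float)
delays = np.array([0, 2, 4])
print(apply_axonal_delays(spikes, delays))
```

Under this assumed formulation, a learned delay lets a single synapse align inputs from different points in the past, which is one intuition for why delay-trained SNNs can match recurrent models with far fewer trainable synapses.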