    Details
    Author(s)
        Instituto de Microelectronica de Sevilla, IMSE-CNM (CSIC/Universidad de Sevilla)
        Guangzhi Tang, imec
        Federico Corradi, Eindhoven University of Technology
        Instituto de Microelectronica de Sevilla (IMSE-CNM), CSIC and Universidad de Sevilla
        Manolis Sifalakis, imec
    Abstract

    This paper proposes a methodology for training axonal delays in Spiking Neural Networks (SNNs), with the aim of efficiently solving machine learning tasks on data with rich temporal dependencies. We study the effect of axonal delays on model performance for the Adding task and for the Spiking Heidelberg Digits (SHD) dataset. Numerical results on SHD show that SNNs with axonal delays achieve state-of-the-art performance, over 90% test accuracy, while requiring fewer than half the trainable synapses. Estimates of energy and memory requirements, based on the number of synaptic operations, indicate an approximate 90% reduction for digital hardware implementations.
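To illustrate the core mechanism the abstract refers to, here is a minimal NumPy sketch, not the paper's implementation, of how per-axon integer delays shift spike trains in time. The function name `apply_axonal_delays` and the array layout are illustrative assumptions; in the paper the delay values themselves are trained alongside the synaptic weights.

```python
import numpy as np

def apply_axonal_delays(spikes, delays):
    """Shift each axon's spike train forward in time by its axonal delay.

    spikes: binary array of shape (T, N), T time steps and N input axons.
    delays: integer array of shape (N,), per-axon delay in time steps.
    Returns an array of shape (T, N); spikes delayed past T are dropped.
    """
    T, N = spikes.shape
    delayed = np.zeros_like(spikes)
    for n in range(N):
        d = int(delays[n])
        if d < T:
            # A spike emitted at step t arrives at step t + d.
            delayed[d:, n] = spikes[: T - d, n]
    return delayed

# Toy example: two axons over five time steps.
spikes = np.array([
    [1, 1],   # both axons fire at t = 0
    [0, 0],
    [0, 1],   # axon 1 fires again at t = 2
    [0, 0],
    [0, 0],
])
delays = np.array([2, 1])
delayed = apply_axonal_delays(spikes, delays)
# Axon 0's spike moves from t = 0 to t = 2;
# axon 1's spikes move from t = 0, 2 to t = 1, 3.
```

The delayed spike trains then drive the downstream neurons, so a single axon can contribute evidence at a learned temporal offset, which is what gives delay-trained SNNs their short-term memory on temporally rich data.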