Video s3
    Details
    Presenter(s)
    Yulong Yan
    Affiliation
    Fudan University
    Abstract

    Dedicated hardware for spiking neural networks (SNNs) reduces energy consumption through spike-driven computing. This paper proposes a graph-based spatio-temporal backpropagation (G-STBP) method to train SNNs, aiming to enhance spike sparsity for energy efficiency while maintaining accuracy. A differentiable leaky integrate-and-fire (LIF) model is introduced to establish the backpropagation path. A sparsity regularization is proposed to reduce the spike firing rate with guaranteed accuracy. G-STBP enables training of arbitrary network topologies thanks to its graph representation. A recurrent network with spike-sparse rank order coding is demonstrated. Experimental results on rank-order-coded MNIST show that the recurrent SNN trained by G-STBP achieves 96.8% accuracy using 341 spikes per inference.
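    The abstract's key ingredients are a differentiable (surrogate-gradient) LIF neuron and a sparsity term added to the training loss. Below is a minimal, illustrative PyTorch-style sketch of that idea, not the authors' implementation: the rectangular surrogate, the soft reset, and the names/hyperparameters (decay, v_thresh, lam) are all assumptions chosen for clarity.

    import torch

    class SurrogateSpike(torch.autograd.Function):
        """Heaviside spike in the forward pass, smooth surrogate in the backward pass."""
        @staticmethod
        def forward(ctx, v_minus_thresh):
            ctx.save_for_backward(v_minus_thresh)
            return (v_minus_thresh > 0).float()

        @staticmethod
        def backward(ctx, grad_output):
            (v_minus_thresh,) = ctx.saved_tensors
            # Rectangular surrogate: pass gradient only near the firing threshold.
            surrogate = (v_minus_thresh.abs() < 0.5).float()
            return grad_output * surrogate

    def lif_step(v, x, spikes_prev, decay=0.9, v_thresh=1.0):
        """One LIF update: leaky integration with reset, then a differentiable spike."""
        v = decay * v * (1.0 - spikes_prev) + x        # leak, reset on previous spike, add input
        spikes = SurrogateSpike.apply(v - v_thresh)    # spike with surrogate gradient
        return v, spikes

    def loss_fn(logits, targets, all_spikes, lam=1e-3):
        """Task loss plus an (assumed) firing-rate penalty that encourages spike sparsity."""
        task = torch.nn.functional.cross_entropy(logits, targets)
        rate = torch.stack([s.mean() for s in all_spikes]).mean()
        return task + lam * rate

    In such a setup, the regularization weight lam trades off classification accuracy against the number of spikes per inference, which is the trade-off the abstract reports (96.8% accuracy at 341 spikes).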

    Slides
    • Graph-Based Spatio-Temporal Backpropagation for Training Spiking Neural Networks (PDF)