Video s3
    Details
    Presenter(s)
    Sayma Nowshin Chowdhury, University of Maryland, College Park
    Author(s)
    University of Maryland, College Park
    Shah Sahil, University of Maryland, College Park
    Abstract

    This work presents a hardware-aware modeling and learning rule for a mixed-signal Spiking Neural Network (SNN). The Python-based models are compared against transistor-level simulations in a 65 nm CMOS technology. The usual approach to training and inference with mixed-signal SNNs is to learn the weights offline and then deploy them on neuromorphic hardware. In analog or mixed-signal hardware, however, the circuits used for computation are non-linear and exhibit significant variability. This work is an initial proof of concept toward fully modeling this non-linearity and learning with these circuits. The study employs Floating-Gate (FG) transistor-based synapses to store the synaptic weights and an adaptive Leaky-Integrate-and-Fire circuit to model the neurons in the SNN. In addition, it develops a localized gradient-based algorithm that learns the FG voltages of the analog synapses, which allows the non-linearity of the analog synapses to be integrated into the learning framework. Using these models and the learning algorithm, the work demonstrates system-level classification with multiple layers, neurons, and time-steps.
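
    The Python sketch below illustrates, under assumed parameter values, the kind of model the abstract describes: a synaptic weight that is a non-linear (subthreshold-exponential) function of a stored floating-gate voltage, an adaptive leaky-integrate-and-fire neuron, and a local gradient step applied directly to the FG voltages. All function names, constants, and the update rule (fg_weight, run_adaptive_lif, local_fg_update, KAPPA, U_T, the rate-based error term) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    # Illustrative constants (assumed, not from the paper): subthreshold slope
    # factor, thermal voltage, and a current scale; units are arbitrary here.
    KAPPA, U_T, I0 = 0.7, 0.025, 1.0

    def fg_weight(v_fg):
        """Effective synaptic weight as an assumed subthreshold-exponential
        function of the stored floating-gate (FG) voltage."""
        return I0 * np.exp(-KAPPA * v_fg / U_T)

    def dweight_dvfg(v_fg):
        """Analytic derivative of the weight with respect to the FG voltage,
        used by the local gradient step below."""
        return -(KAPPA / U_T) * fg_weight(v_fg)

    def run_adaptive_lif(spikes_in, v_fg, tau_mem=0.9, tau_adapt=0.95,
                         theta0=1.0, beta=0.2):
        """Simulate one adaptive leaky-integrate-and-fire neuron driven through
        FG synapses. spikes_in is a (n_steps, n_syn) binary spike-train array."""
        w = fg_weight(v_fg)                     # weights fixed for this trial
        v, a = 0.0, 0.0                         # membrane potential, adaptation
        out = np.zeros(spikes_in.shape[0])
        for t in range(spikes_in.shape[0]):
            v = tau_mem * v + spikes_in[t] @ w  # leaky integration of input current
            if v > theta0 + beta * a:           # adaptive (activity-raised) threshold
                out[t] = 1.0
                a += 1.0                        # raise threshold after each spike
                v = 0.0                         # reset membrane potential
            a *= tau_adapt                      # adaptation decays between spikes
        return out

    def local_fg_update(v_fg, spikes_in, err, lr=1e-3):
        """Localized gradient-style step directly on the FG voltages, assuming
        dL/dv_fg ~ error * presynaptic activity * dw/dv_fg (chain rule through
        the synapse non-linearity)."""
        pre_rate = spikes_in.mean(axis=0)       # local presynaptic statistic
        grad = err * pre_rate * dweight_dvfg(v_fg)
        return v_fg - lr * grad

    # Toy usage: drive the neuron, compare its firing rate to a target rate,
    # and nudge the FG voltages accordingly.
    rng = np.random.default_rng(0)
    spikes = (rng.random((100, 8)) < 0.2).astype(float)
    v_fg = rng.uniform(0.0, 0.05, size=8)       # stored FG voltages (arbitrary units)
    out = run_adaptive_lif(spikes, v_fg)
    err = out.mean() - 0.1                      # deviation from a target firing rate
    v_fg = local_fg_update(v_fg, spikes, err)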

    Slides
    • Hardware Aware Modeling of Mixed-Signal Spiking Neural Network (PDF)