Video s3
    Abstract

    Neuromorphic spiking sensors, including the event-driven Dynamic Audio Sensor (DAS) and the Dynamic Vision Sensor (DVS) event camera, are inspired by the functionality of their biological equivalents, the cochlea and the retina. The asynchronous outputs of these event-driven sensors enable always-on sensing with potentially lower system-level response latency than conventional sampled sensors for Internet of Things (IoT) and Brain-Machine Interface (BMI) applications. Recent developments in deep networks, spiking networks, analog and in-memory computing, and novel non-volatile memory devices have led to very low-power neuromorphic systems that combine these sensors and networks for these application domains. Using supervised and unsupervised learning methods from the deep learning and neuroscience fields, we show configurations of these systems and their hardware equivalents that achieve high accuracy and low latency on benchmark tasks. The novel memory devices reduce the power spent on off-chip memory access and can additionally support learning algorithms useful for adaptive neuromorphic systems. Analog in-memory computing methods reduce the energy required to implement matrix multiplies, nonlinearities, and other signal-processing operations compared with digital systems.
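    To illustrate the event-driven processing style the abstract describes, the sketch below (an illustrative assumption, not code from the talk) processes an asynchronous address-event stream of the kind a DVS- or DAS-style sensor emits. Each event is a (timestamp, address, polarity) tuple, and computation happens only when an event arrives, in contrast to fixed-rate sampled processing; the event names, time constant, and update rule are all hypothetical choices for illustration.

    ```python
    import math

    def process_events(events, decay_us=10_000):
        """Leaky per-address accumulation; state updates only at event times.

        events: iterable of (timestamp_us, address, polarity) tuples, sorted
        by timestamp, as produced by an event-driven (DVS/DAS-style) sensor.
        Between events the accumulated value decays exponentially with time
        constant decay_us; no computation occurs when no events arrive.
        """
        state = {}    # address -> (last_event_time_us, accumulated_value)
        outputs = []
        for t, addr, pol in events:
            last_t, v = state.get(addr, (t, 0.0))
            v *= math.exp(-(t - last_t) / decay_us)  # leak since last event
            v += 1.0 if pol else -1.0                # integrate event polarity
            state[addr] = (t, v)
            outputs.append((t, addr, v))
        return outputs

    # Three events on one address: two ON events, then an OFF event.
    events = [(0, 3, True), (5_000, 3, True), (20_000, 3, False)]
    for t, addr, v in process_events(events):
        print(f"t={t}us addr={addr} value={v:.3f}")
    ```

    The key point of the sketch is that the update loop is driven entirely by the sensor's asynchronous output: idle addresses cost nothing, which is the source of the always-on, low-latency behavior the abstract attributes to these sensors.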