Video s3
    Details
    Presenter(s)
    Jonah Sengupta
    Affiliation
    Johns Hopkins University
    Abstract

    Neuromorphic cameras, which offer low latency and dynamic scene sensing, are emerging as a viable technology for energy-aware embedded perceptual systems. In this paper, we report on neuromorphic architecture and algorithm exploration for an event-based accelerator for neuromorphic cameras. The system includes a RISC-V CPU and associated peripherals that capture and process event-based visual data from a neuromorphic dynamic vision sensor. Mapped onto a reconfigurable computing platform (FPGA), the system demonstrates a set of event-based visual processing tasks, including noise filtering, corner detection, and object tracking.
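    To illustrate the kind of event-based processing the abstract mentions, the sketch below shows a common spatiotemporal background-activity noise filter for dynamic vision sensor events. This is a generic, well-known technique, not the paper's specific implementation; the function name, event tuple layout, and the `dt_us` threshold are assumptions for illustration.

    ```python
    import numpy as np

    def filter_noise(events, width, height, dt_us=10_000):
        """Hypothetical background-activity filter for DVS events.

        Keeps an event only if a pixel in its 3x3 neighborhood fired
        within the last dt_us microseconds; isolated events are
        treated as sensor noise and dropped.
        events: time-ordered iterable of (x, y, t_us, polarity).
        """
        # last_ts[y, x] holds the most recent event timestamp per pixel
        last_ts = np.full((height, width), -np.inf)
        kept = []
        for x, y, t, p in events:
            # clamp the 3x3 neighborhood to the sensor array bounds
            x0, x1 = max(0, x - 1), min(width, x + 2)
            y0, y1 = max(0, y - 1), min(height, y + 2)
            # smallest age among neighboring events; inf if none yet
            youngest_age = (t - last_ts[y0:y1, x0:x1]).min()
            if youngest_age <= dt_us:
                kept.append((x, y, t, p))
            # record this event after the check so it cannot
            # support itself with its own timestamp
            last_ts[y, x] = t
        return kept
    ```

    In a hardware mapping like the one described, the per-pixel timestamp memory would typically live in on-chip block RAM so each incoming event needs only a handful of neighborhood reads.
    
    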

    Slides
    • Architecture and Algorithm Co-Design Framework for Embedded Processors in Event-Based Cameras (application/pdf)