Video s3
    Details
    Presenter(s)
    Corey Lammie
    Affiliation
    James Cook University
    Country
    Australia
    Abstract

    Stochastic Computing (SC) is a computing paradigm that allows various arithmetic operations to be computed at low cost and low power using stochastic bit streams and digital logic. In contrast to conventional representation schemes used within the binary domain, the sequence of bits within a stochastic bit stream is inconsequential, and computation is usually non-deterministic. In this paper, we exploit the stochasticity during switching of probabilistic Conductive Bridging RAM (CBRAM) devices to efficiently generate stochastic bit streams for Deep Learning (DL) parameter optimization, reducing the size of Multiply and Accumulate (MAC) units by five orders of magnitude. We demonstrate that, using a 40-nm Complementary Metal Oxide Semiconductor (CMOS) process, our scalable architecture occupies 1.55 mm² and consumes approximately 167 µW when optimizing the parameters of a Convolutional Neural Network (CNN) while it is being trained for a character recognition task, and we observe no notable reduction in accuracy post-training.
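
    To illustrate the SC principle the abstract relies on, the sketch below (not taken from the paper; the stream length and unipolar encoding are assumptions) shows how values in [0, 1] encoded as random bit streams can be multiplied with a single bitwise AND, which is why MAC hardware in the stochastic domain can be so much smaller than a binary multiplier. Note that reordering the bits within a stream does not change the decoded value.

    import numpy as np

    rng = np.random.default_rng(0)

    def to_bitstream(p, length=1024):
        # Unipolar encoding: each bit is 1 with probability p.
        return rng.random(length) < p

    def from_bitstream(bits):
        # Decode by taking the fraction of 1s in the stream.
        return bits.mean()

    p, q = 0.8, 0.5
    product_stream = to_bitstream(p) & to_bitstream(q)  # one AND gate per bit acts as a multiplier
    print(from_bitstream(product_stream))                # approximately p * q = 0.4

    Longer streams reduce the variance of the estimate, which is the usual accuracy/latency trade-off in stochastic computing.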

    Slides
    • Memristive Stochastic Computing for Deep Learning Parameter Optimization (PDF)