Poster
Presenter(s): Vinay Joshi
Affiliation: IBM Research - Zurich
Abstract

    Stochastic computing is an efficient alternative to floating-point multiplication, provided that the operands lie in [0, 1]. We propose ESSOP, an efficient and scalable architecture that generalizes stochastic computing to weight-update computation in DNNs with unbounded activation functions, as required by many state-of-the-art networks. We show that the ResNet-32 network with 34 layers can be trained with ESSOP on the CIFAR-10 dataset to achieve accuracy comparable to the baseline. A hardware design of ESSOP at the 14 nm technology node shows that, compared to a highly pipelined FP16 multiplier design, ESSOP is 82.2% and 93.7% more efficient in energy and area respectively, for outer-product computation.
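
    To illustrate the principle the abstract builds on, the sketch below shows plain unipolar stochastic-computing multiplication: each operand in [0, 1] is encoded as a Bernoulli bit stream, and a bitwise AND of the streams has a mean equal to the product. This is a minimal illustration only; the function name, stream length, and encoding are assumptions for this example and do not reproduce the ESSOP architecture or its handling of unbounded activations.

    import numpy as np

    def stochastic_multiply(a, b, n_bits=4096, rng=None):
        """Approximate a * b for a, b in [0, 1] via unipolar stochastic computing.

        Each operand is encoded as a Bernoulli bit stream whose mean equals the
        operand; the bitwise AND of the two streams then has mean a * b.
        """
        rng = np.random.default_rng() if rng is None else rng
        stream_a = rng.random(n_bits) < a    # unipolar encoding of a
        stream_b = rng.random(n_bits) < b    # unipolar encoding of b
        return np.mean(stream_a & stream_b)  # decode: fraction of 1s ~ a * b

    # Example: the estimate approaches the exact product as n_bits grows.
    print(stochastic_multiply(0.6, 0.25))    # ~ 0.15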