Video: Dynamically Swappable Digit-Serial Multi-Precision Deep Neural Network Accelerator with Early Termination
    Details

    Presenter(s)
    Po-Chang Li (National Sun Yat-sen University)

    Author(s)
    Shen-Fu Hsiao (National Sun Yat-sen University)
    Hung-Ching Li (National Sun Yat-sen University)
    Yu-Che Yen (National Sun Yat-sen University)
    Po-Chang Li (National Sun Yat-sen University)
    Abstract

    We propose a digit-serial Deep Neural Network (DNN) hardware accelerator that uses flexible digit-serial multiplication and an early-termination mechanism to support multi-precision computation with different bit-widths in different DNN layers. The dynamically swappable digit-serial multiplier applies radix-4 Booth recoding to either the weights or the activations, whichever has the smaller quantized bit-width, and performs digit-serial sequential multiplication with one input in Booth-recoded digit-serial form and the other in bit-parallel form. Furthermore, we add an early-termination technique that stops the DNN computation whenever the partially accumulated sums fall below prescribed thresholds, giving additional speedup when the Rectified Linear Unit (ReLU) is used as the non-linear activation function. Implementation results show that the proposed design achieves more than 50% speedup compared with the baseline 16-bit fixed-point design.
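
    As a rough behavioral illustration of the two ideas in the abstract, the Python sketch below Booth-recodes the lower-precision operand into radix-4 digits, accumulates the digit-serial products against the bit-parallel operand, and terminates a ReLU neuron early once the running sum falls below a prescribed threshold. The function names, the placement of the threshold check after each input term, and the threshold value are illustrative assumptions, not the accelerator's exact hardware scheme.

    def booth4_digits(x, bits):
        """Radix-4 Booth recoding of a signed integer x of the given bit-width.
        Returns digits in {-2, -1, 0, 1, 2}, least significant first, so that
        x == sum(d * 4**i for i, d in enumerate(digits))."""
        digits, b_prev = [], 0                    # Booth bit b_{-1} = 0
        for i in range((bits + 1) // 2):          # one Booth digit per two bits
            b0 = (x >> (2 * i)) & 1               # Python ints sign-extend on >>
            b1 = (x >> (2 * i + 1)) & 1
            digits.append(b0 + b_prev - 2 * b1)
            b_prev = b1
        return digits

    def relu_neuron_digit_serial(weights, acts, w_bits, a_bits, threshold):
        """Dot product of one ReLU neuron. The operand with the smaller quantized
        bit-width is Booth-recoded and processed digit-serially; the other operand
        stays bit-parallel. If the running sum drops below the prescribed
        (negative) threshold, computation stops and the ReLU output is 0."""
        acc = 0
        serial_bits = min(w_bits, a_bits)
        for w, a in zip(weights, acts):
            serial, parallel = (w, a) if w_bits <= a_bits else (a, w)
            for i, d in enumerate(booth4_digits(serial, serial_bits)):
                acc += (d * parallel) << (2 * i)  # one Booth digit per cycle
            if acc < threshold:                   # early-termination check
                return 0                          # ReLU would output 0 anyway
        return max(acc, 0)                        # ReLU activation

    # Example: 8-bit activations, 4-bit weights, so the weights are Booth-recoded.
    # The first term already drives the sum below the threshold, so the remaining
    # three multiply-accumulates are skipped and 0 is returned.
    print(relu_neuron_digit_serial([-5, 3, -2, 1], [100, 10, 80, 90],
                                   w_bits=4, a_bits=8, threshold=-128))

    The per-term placement of the comparison above is only for readability; in the actual accelerator the check presumably sits inside the digit-serial accumulation datapath.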

    Slides
    • Dynamically Swappable Digit-Serial Multi-Precision Deep Neural Network Accelerator with Early Termination (PDF)