    Details
    Presenter(s)
    Van Thien Nguyen (CEA-Leti, France)
    Author(s)
    Van Thien Nguyen (CEA-Leti)
    William Guicquero (CEA-Leti)
    Gilles Sicard (CEA-Leti)
    Abstract

    Adjusting the quantization according to the data or to the model loss appears necessary to reach high accuracy with quantized neural networks. This work presents Histogram-Equalized Quantization (HEQ), a novel adaptive framework for linear and symmetric quantization. HEQ automatically adapts the quantization thresholds through a unique step-size optimization. We empirically show that HEQ achieves state-of-the-art performance on CIFAR-10. Moreover, experiments on the STL-10 dataset show that HEQ enables proper training of our proposed logic-gated (OR, MUX) residual networks, reaching higher accuracy at lower hardware complexity than previous work.
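    The abstract describes HEQ only at a high level. As a rough illustration of the idea behind histogram-equalized, linear symmetric quantization, the sketch below searches for the single step size whose quantized-level histogram is closest to uniform (i.e., "equalized"). This is an assumption-laden toy: the function names, the grid search, and the uniform-histogram objective are illustrative stand-ins, not the authors' actual optimization, which is detailed in the paper and slides.

    import numpy as np

    def heq_step_size(x, bits=4, num_steps=256):
        # Hypothetical sketch: pick the step size of a linear symmetric
        # quantizer by grid search, scoring each candidate by how far the
        # histogram of its quantized levels is from a uniform (equalized)
        # histogram. Not the paper's exact procedure.
        levels = 2 ** (bits - 1) - 1                  # symmetric signed range
        candidates = np.linspace(1e-3, np.abs(x).max() / levels, num_steps)
        best_s, best_score = candidates[0], np.inf
        for s in candidates:
            q = np.clip(np.round(x / s), -levels, levels)
            hist = np.bincount((q + levels).astype(int),
                               minlength=2 * levels + 1)
            p = hist / hist.sum()
            # L1 deviation from a perfectly equalized histogram
            score = np.abs(p - 1.0 / p.size).sum()
            if score < best_score:
                best_s, best_score = s, score
        return best_s

    def quantize(x, s, bits=4):
        # Linear symmetric quantization with step size s
        levels = 2 ** (bits - 1) - 1
        return s * np.clip(np.round(x / s), -levels, levels)

    # Usage: quantize simulated layer weights with the equalized step size
    w = np.random.randn(10_000).astype(np.float32)
    s = heq_step_size(w, bits=4)
    w_q = quantize(w, s, bits=4)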

    Slides
    • Histogram-Equalized Quantization for Logic-Gated Residual Neural Networks (PDF)