Edge Computation-in-Memory for In-Situ Class-Incremental Learning with Knowledge Distillation
    Details
    Presenter(s)
    Shinsei Yoshikiyo, University of Tokyo, Japan
    Author(s)
    Shinsei Yoshikiyo, University of Tokyo
    Naoko Misawa, University of Tokyo
    Chihiro Matsui, University of Tokyo
    Ken Takeuchi, University of Tokyo
    Abstract

    This paper proposes a Computation-in-Memory (CiM) architecture for in-situ class-incremental learning. The proposed CiM updates only the final fully connected (fc) layer, so it needs no backpropagation and keeps the number of rewrites to the CiM devices small. The proposed CiM realizes knowledge distillation through cooperation between a digital processor and the CiM, and can be retrained even when the old class data are no longer available. As a result, accuracy stays above 80% on CIFAR-10 when the bit precisions of the convolution and fc layers are 6 bits and 3 bits and their bit-error rates are below 0.001% and 1%, respectively.
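
    To make the training scheme in the abstract concrete, below is a minimal NumPy sketch, not the authors' implementation: a frozen backbone produces features (the digital-processor side), only the fc weights are updated with a one-step delta rule (no backpropagation through the network), and distillation targets come from a snapshot of the old fc layer so old-class data need not be stored. All names, sizes, and hyperparameters (fc_update, feat_dim, lr, T, alpha) are illustrative assumptions, and mixing hard and soft targets is a simplification of a standard distillation loss.

    # Sketch only: fc-layer-only class-incremental update with distillation
    # from the old fc weights; all constants below are assumed, not from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(z, T=1.0):
        z = z / T
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    # Hypothetical sizes: 64-d features from a frozen backbone, 8 old + 2 new classes.
    feat_dim, n_old, n_new = 64, 8, 2
    W_old = rng.normal(scale=0.1, size=(n_old, feat_dim))    # fc trained on old classes
    W = np.vstack([W_old, np.zeros((n_new, feat_dim))])      # fc grown for new classes

    def fc_update(W, feats, labels, W_teacher, lr=0.05, T=2.0, alpha=0.5):
        """One delta-rule step on the fc layer only (no backprop).

        feats:     (B, feat_dim) features from the frozen conv backbone.
        labels:    (B,) integer ids of the new-class samples.
        W_teacher: snapshot of the old fc weights; its softened outputs
                   over the old classes act as distillation targets.
        """
        logits = feats @ W.T                                 # CiM-side matrix-vector product
        n_classes = W.shape[0]
        hard = np.eye(n_classes)[labels]                     # one-hot targets for new data
        soft = np.zeros_like(hard)                           # teacher soft targets (old classes)
        soft[:, :W_teacher.shape[0]] = softmax(feats @ W_teacher.T, T=T)
        target = alpha * hard + (1 - alpha) * soft
        err = target - softmax(logits)                       # output-layer error only
        return W + lr * err.T @ feats / len(feats)           # single fc weight update

    # Toy incremental step: a batch of "new class" samples.
    feats = rng.normal(size=(32, feat_dim))
    labels = rng.integers(n_old, n_old + n_new, size=32)
    for _ in range(100):
        W = fc_update(W, feats, labels, W_teacher=W_old)

    Because the error term is computed directly at the output layer, each step touches only the fc weights, which is consistent with the abstract's claim of few device rewrites and no backpropagation.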

    Slides
    • Edge Computation-in-Memory for In-Situ Class-Incremental Learning with Knowledge Distillation (application/pdf)