Details
Presenter(s)
Display Name
Shinsei Yoshikiyo
Affiliation
University of Tokyo
Country
Japan
Abstract
This paper proposes a Computation-in-Memory (CiM) architecture for in-situ class-incremental learning. The proposed CiM updates only the final fully connected (fc) layer, so it requires no backpropagation and keeps the number of rewrites to CiM devices small. It realizes knowledge distillation through the cooperation of a digital processor and the CiM, and can be retrained even when the old-class data are not available. As a result, the accuracy remains above 80% on CIFAR-10 when the bit precisions of the convolution and fc layers are 6 bits and 3 bits, and their bit-error rates are below 0.001% and 1%, respectively.
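The abstract does not give the fc update rule itself, so the following is only a rough NumPy sketch of the general idea: retrain just the final fc layer without backpropagation, using the frozen old fc weights as a distillation signal when old-class data are unavailable. The function name `fc_update`, the closed-form ridge solution, and all hyperparameters are assumptions for illustration, not the paper's actual CiM method.

```python
import numpy as np

# Hypothetical sketch: class-incremental update of ONLY the final fc layer,
# with a distillation term supplied by the frozen old fc weights.
# The real in-memory update rule is device-specific; this closed-form
# least-squares step merely mirrors the idea of avoiding backpropagation
# and writing the new weights in a single pass (few device rewrites).

rng = np.random.default_rng(0)

def fc_update(features, labels, w_old, n_total_classes, alpha=0.5, lam=1e-2):
    """Solve for new fc weights in closed form (ridge regression).

    features : (N, D) backbone outputs for the NEW-class data only
    labels   : (N,)   integer labels over all classes seen so far
    w_old    : (D, C_old) frozen fc weights from the previous task
    alpha    : weight of the distillation term (match old logits)
    lam      : ridge regularization strength
    """
    n, d = features.shape
    # One-hot targets for the new data over the grown output space.
    y_hard = np.eye(n_total_classes)[labels]
    # Distillation targets: old weights' logits, zero-padded to new classes,
    # so the retrained layer stays close to the old model on old classes.
    y_soft = np.zeros((n, n_total_classes))
    y_soft[:, : w_old.shape[1]] = features @ w_old
    y = (1 - alpha) * y_hard + alpha * y_soft
    # Ridge solution: new fc weights computed in one shot, no backprop.
    return np.linalg.solve(features.T @ features + lam * np.eye(d),
                           features.T @ y)

# Toy usage: 64-dim features, 5 old classes, 2 new classes (7 total).
feats = rng.standard_normal((128, 64))
labs = rng.integers(5, 7, size=128)           # only new-class samples on hand
w_prev = rng.standard_normal((64, 5)) * 0.1   # frozen old fc weights
w_new = fc_update(feats, labs, w_prev, n_total_classes=7)
print(w_new.shape)  # (64, 7)
```

In the paper's setting, the hard/soft target mixing would be done by the digital processor while the CiM array holds the fc weights; the sketch above collapses both roles into one function purely for readability.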