Details
Poster
Presenter(s)
Cecilia Eugenia De la Parra
Affiliation: Robert Bosch GmbH
Abstract
Approximate computing is a promising paradigm for reducing the computational requirements of DNNs by exploiting their inherent error resilience. In particular, using approximate multipliers in DNN inference can significantly lower the power consumption of embedded DNN applications. This paper presents a methodology for efficient approximate multiplier selection and for full, uniform approximation of large DNNs through retraining and minimization of the approximation error. We evaluate our methodology using 422 approximate multipliers from the EvoApprox library on three residual network architectures trained on CIFAR-10, and achieve energy savings of up to 58% with an accuracy loss of only 0.73%.
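The multiplier-substitution idea described in the abstract can be sketched as follows. This is a toy, truncation-based approximate multiplier standing in for an EvoApprox-style design (the `drop_bits` truncation scheme is an illustrative assumption, not one of the library's actual circuits), applied inside a quantized dot product as it would appear in a DNN layer:

```python
import numpy as np

def approx_mul(a, b, drop_bits=2):
    # Hypothetical stand-in for an approximate 8x8-bit multiplier:
    # compute the exact product, then zero out the lowest `drop_bits`
    # bits, mimicking a circuit that omits low-order partial products.
    prod = int(a) * int(b)
    return (prod >> drop_bits) << drop_bits

def approx_dot(x, w, drop_bits=2):
    # Dot product of two uint8 vectors using the approximate multiplier,
    # as it would occur inside a quantized convolution or dense layer.
    return sum(approx_mul(a, b, drop_bits) for a, b in zip(x, w))

rng = np.random.default_rng(0)
x = rng.integers(0, 256, size=64, dtype=np.uint8)
w = rng.integers(0, 256, size=64, dtype=np.uint8)

exact = int(np.dot(x.astype(np.int64), w.astype(np.int64)))
approx = approx_dot(x, w)
rel_err = abs(exact - approx) / exact
```

Because each truncated product under-approximates the exact one by at most `2**drop_bits - 1`, the accumulated error stays small relative to the full dot product; retraining the network with the approximate multiplier in the loop, as the paper proposes, can then compensate for this systematic bias.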