Presenter(s)
- Name: Muhammad Awais Hussain
- Affiliation: National Central University
- Country: Taiwan
Abstract
Deep neural networks (DNNs) are widely used in computer vision applications due to their high performance. However, DNNs involve a large number of computations in both the training and inference phases. Among the layers of a DNN, the softmax layer performs some of the most complex computations, as it requires exponentiation and division operations, so a hardware-efficient implementation is needed to reduce on-chip resource usage. In this paper, we propose a new hardware-efficient and fast implementation of the softmax activation function. The proposed hardware implementation consumes fewer resources and operates at higher speed than state-of-the-art techniques.
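To make the cost concrete, here is a minimal reference sketch of the softmax function (not the paper's proposed hardware design): each input requires one exponentiation and each output one division, which is exactly what makes a naive hardware mapping expensive. The max-subtraction step is the standard numerical-stability trick.

```python
import math

def softmax(logits):
    # One exponentiation per input and one division per output --
    # the operations the abstract identifies as costly in hardware.
    # Subtracting the maximum first avoids overflow without changing
    # the result, since softmax is shift-invariant.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]
```

Hardware-efficient designs typically replace the `exp` and divide units with approximations (e.g., lookup tables, piecewise-linear fits, or base-2 arithmetic); the specific scheme used in this paper is not detailed in the abstract.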