Details
Poster
Presenter(s)
![Rupesh Raj Karn Headshot](https://confcats-catavault.s3.amazonaws.com/CATAVault/ieeecass/master/files/styles/cc_user_photo/s3/user-pictures/15481.jpg?h=04d92ac6&itok=3kGu-LBn)
Display Name
Rupesh Raj Karn
- Affiliation: Khalifa University
Abstract
Task progressive learning is often required when training data become available in batches over time. Artificial Neural Networks (ANNs) have a high capacity for progressive learning due to their large number of parameters. However, most progressive models use fully connected ANNs, whose large parameter counts lead to long training times, overfitting, and excessive resource usage. In this paper, an algorithm is presented that generates a partially connected, compact neural network by dynamically expanding and pruning the network according to the requirements imposed by new tasks during progressive learning.
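The expand-and-prune idea in the abstract can be illustrated with a minimal sketch. The code below is an assumption-heavy toy (NumPy only; the function names `expand` and `prune`, the magnitude threshold, and the unit counts are all illustrative, not the paper's actual algorithm): a layer's weight matrix grows by a few hidden units when a new task arrives, and connections whose magnitude falls below a threshold are masked out, leaving a partially connected layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune(weights, mask, threshold=0.1):
    """Mask out connections whose magnitude is below threshold
    (a simple stand-in for the paper's pruning criterion)."""
    mask = mask & (np.abs(weights) >= threshold)
    return weights * mask, mask

def expand(weights, mask, new_units=2):
    """Append hidden units (rows) to give a new task extra capacity."""
    extra_w = rng.normal(scale=0.5, size=(new_units, weights.shape[1]))
    extra_m = np.ones((new_units, weights.shape[1]), dtype=bool)
    return np.vstack([weights, extra_w]), np.vstack([mask, extra_m])

# Start with a dense 4x3 hidden layer.
w = rng.normal(scale=0.5, size=(4, 3))
m = np.ones_like(w, dtype=bool)

# A new task arrives: expand the layer, then prune weak connections,
# so the network stays compact and partially connected.
w, m = expand(w, m, new_units=2)
w, m = prune(w, m, threshold=0.1)

print(w.shape)  # layer grew from (4, 3) to (6, 3)
```

In a real progressive-learning loop, the expand step would be sized by the new task's demand and pruning would follow retraining, so only connections that stay unimportant after fitting the new batch are removed.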