    Poster

    Presenter: Rupesh Raj Karn
    Affiliation: Khalifa University

    Abstract

    Task-progressive learning is often required when training data become available in batches over time. Artificial Neural Networks (ANNs) are well suited to progressive learning because of their large number of trainable parameters. Most progressive models, however, use fully connected ANNs, whose large parameter counts lead to long training times, overfitting, and excessive resource usage. This paper presents an algorithm that generates a partially connected, compact neural network by dynamically expanding and pruning the network according to the requirements exerted by each new task during progressive learning.
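    The expand-and-prune idea in the abstract can be illustrated with a minimal sketch. The function names, the random initialization, and the magnitude threshold below are illustrative assumptions, not details from the paper: a layer's weight matrix is grown with a few new units when a new task arrives, and connections with small magnitude are then zeroed out, leaving a partially connected layer.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def expand(weights, new_units):
        # Hypothetical expansion step: append `new_units` output units
        # with small random weights to accommodate a new task.
        extra = 0.01 * rng.standard_normal((weights.shape[0], new_units))
        return np.concatenate([weights, extra], axis=1)

    def prune(weights, threshold):
        # Hypothetical pruning step: zero out connections whose magnitude
        # falls below `threshold`, yielding a partially connected layer.
        mask = np.abs(weights) >= threshold
        return weights * mask, mask

    W = rng.standard_normal((4, 3))    # initial layer: 4 inputs, 3 units
    W = expand(W, new_units=2)         # grow capacity for a new task
    W_sparse, mask = prune(W, threshold=0.5)

    print(W.shape)         # (4, 5) after expansion
    print(mask.mean())     # fraction of connections kept
    ```

    In a full progressive-learning loop, the expanded units would be trained on the new task before pruning, so that only connections the task actually needs survive.
    
    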