    Author(s)
    Alessio Canepa (Università di Genova)
    Edoardo Ragusa (University of Genoa)
    Paolo Gastaldo (University of Genoa)
    Rodolfo Zunino (University of Genoa)
    Abstract

    State-of-the-art solutions for image classification are nowadays mostly based on Convolutional Neural Networks (CNNs). Compared with running them for inference, training CNNs is a far more resource-demanding job. For this reason, embedded devices designed to execute neural networks are normally not suitable for training those same networks. Hand image classification is becoming increasingly popular in several application areas, empowering embedded systems with the ability to recognise hand gestures or grasp types. In this field, the ability to retrain the hand image classifier directly on the embedded device would allow easy customization of the classification task according to application-specific needs. In this paper we present a method based on transfer learning that solves this problem by first extracting hand keypoints with a dedicated neural network, and then applying to the resulting data a lightweight classifier that can easily be retrained directly on the target embedded system. The proposal has been validated on a new, freely distributed dataset, outperforming state-of-the-art solutions on the specific task.
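
    The two-stage pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a fixed keypoint extractor producing 21 two-dimensional hand landmarks (42 features per image, a common convention for hand-keypoint networks), replaced here by synthetic data, and uses a nearest-centroid classifier as a stand-in for the lightweight, on-device retrainable head.

    ```python
    import numpy as np

    # Hypothetical setup: 3 gesture/grasp classes, 20 samples each,
    # 42 features per sample (21 keypoints x (x, y)). In the real pipeline
    # these feature vectors would come from the frozen keypoint network.
    rng = np.random.default_rng(0)
    N_CLASSES, N_PER_CLASS, N_FEATURES = 3, 20, 42

    # Synthetic "keypoint" features: one cluster per class.
    centers = rng.normal(size=(N_CLASSES, N_FEATURES))
    X = np.concatenate([c + 0.1 * rng.normal(size=(N_PER_CLASS, N_FEATURES))
                        for c in centers])
    y = np.repeat(np.arange(N_CLASSES), N_PER_CLASS)

    def fit_centroids(X, y):
        """'Training' = averaging per-class keypoint vectors.

        This kind of head is cheap enough to retrain directly on an
        embedded target, which is the point of the proposed approach.
        """
        return np.stack([X[y == k].mean(axis=0) for k in np.unique(y)])

    def predict(centroids, X):
        # Assign each sample to the nearest class centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        return dists.argmin(axis=1)

    centroids = fit_centroids(X, y)
    acc = (predict(centroids, X) == y).mean()
    print(f"training accuracy: {acc:.2f}")
    ```

    Retraining for a new, application-specific gesture set then amounts to recomputing the centroids over freshly labelled keypoint vectors, with no backpropagation through the CNN.
    
    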