    Presenter
    Cheng-Jie Yang
    Affiliation
    National Chiao Tung University
    Abstract

    In this paper, we present an affective computing engine that applies a Long-term Recurrent Convolutional Network (LRCN) to electroencephalogram (EEG) physiological signals from 8 emotion-related channels. LRCN was chosen as the emotion classifier because, unlike a traditional CNN, it integrates memory units that allow the network to discard or update previous hidden states, which makes it particularly well suited to exploiting the temporal emotional information in EEG sequence data. To achieve a real-time AI-edge affective computing system, the LRCN model was implemented on a chip in 16nm FinFET technology. The core area and total power consumption of the LRCN chip are 1.13 × 1.14 mm² and 48.24 mW, respectively. The computation time was 1.9 µs, meeting the requirement of running inference on every sample. The training process cost 5.5 µs per sample at a clock frequency of 125 Hz, more than 20 times faster than the 128 µs achieved with a GeForce GTX 1080 Ti using Python. The proposed model was evaluated on 52 subjects with cross-subject validation and achieved average accuracies of 88.34% and 75.92% for 2-class and 3-class classification, respectively.
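    The memory-unit behavior the abstract attributes to the LRCN — gates that decide whether to discard or update the previous state — can be sketched with a minimal scalar LSTM cell. All weight values below are illustrative placeholders, not parameters from the paper or the chip:

    ```python
    import math

    def sigmoid(x):
        """Logistic function: squashes any real value into (0, 1)."""
        return 1.0 / (1.0 + math.exp(-x))

    def lstm_step(x, h_prev, c_prev, w):
        """One LSTM memory-unit update for a scalar input.

        w is a dict of scalar weights (hypothetical, for illustration):
        each gate has an input weight w["*_x"], a recurrent weight
        w["*_h"], and a bias w["*_b"].
        """
        f = sigmoid(w["f_x"] * x + w["f_h"] * h_prev + w["f_b"])    # forget gate
        i = sigmoid(w["i_x"] * x + w["i_h"] * h_prev + w["i_b"])    # input gate
        o = sigmoid(w["o_x"] * x + w["o_h"] * h_prev + w["o_b"])    # output gate
        g = math.tanh(w["g_x"] * x + w["g_h"] * h_prev + w["g_b"])  # candidate state
        c = f * c_prev + i * g   # discard (via f) or update (via i) the memory
        h = o * math.tanh(c)     # new hidden state exposed to the classifier
        return h, c

    # Run the cell over a short toy "EEG feature" sequence.
    w = {k: 0.5 for k in ("f_x", "f_h", "f_b", "i_x", "i_h", "i_b",
                          "o_x", "o_h", "o_b", "g_x", "g_h", "g_b")}
    h, c = 0.0, 0.0
    for x in [0.1, -0.3, 0.8, 0.2]:
        h, c = lstm_step(x, h, c, w)
    print(h, c)
    ```

    In the full LRCN, the scalar input here would instead be a feature vector produced by convolutional layers at each EEG time step, with matrix-valued weights; the gating logic is the same.
    
    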

    Slides
    • Real-Time EEG-Based Affective Computing Using On-Chip Learning Long-Term Recurrent Convolutional Network (PDF)