Real-time facial affective computing on mobile devices

Yuanyuan Guo, Yifan Xia, Jing Wang*, Hui Yu*, Rung-Ching Chen

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Convolutional Neural Networks (CNNs) have become one of the state-of-the-art methods for various computer vision and pattern recognition tasks, including facial affective computing. Although impressive results have been obtained in facial affective computing using CNNs, their computational complexity has also increased significantly, which typically makes high-performance hardware indispensable. Most existing CNNs are thus not well suited to mobile devices, where storage, memory and computational power are limited. In this paper, we focus on the design and implementation of CNNs on mobile devices for real-time facial affective computing tasks. We propose a light-weight CNN architecture that strikes a good balance between performance and computational complexity. The experimental results show that the proposed architecture achieves high performance while retaining low computational complexity compared with state-of-the-art methods. We demonstrate the feasibility of the CNN architecture in terms of speed, memory and storage consumption for mobile devices by implementing a real-time facial affective computing application on an actual mobile device.
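    The abstract does not spell out how a light-weight CNN reduces computational cost. One common technique in mobile-oriented CNNs (e.g., the MobileNet family) is replacing standard convolutions with depthwise separable convolutions; the sketch below compares parameter counts for the two, purely as a hedged illustration — the function names and layer sizes are hypothetical, not taken from the paper.

    ```python
    def conv_params(k, c_in, c_out):
        # Weight count of a standard k x k convolution (bias omitted):
        # every output channel has a k x k filter over all input channels.
        return k * k * c_in * c_out

    def dw_separable_params(k, c_in, c_out):
        # Depthwise separable alternative: one k x k filter per input
        # channel, followed by a 1 x 1 pointwise convolution that mixes
        # channels. Parameter count drops roughly by a factor of c_out
        # for large channel widths.
        return k * k * c_in + c_in * c_out

    # Example layer: 3x3 kernel, 32 input channels, 64 output channels.
    standard = conv_params(3, 32, 64)            # 18432 weights
    separable = dw_separable_params(3, 32, 64)   # 2336 weights
    print(standard, separable, round(standard / separable, 1))
    ```

    For this example layer the separable variant uses roughly an eighth of the weights, which is the kind of trade-off a light-weight architecture exploits to fit mobile storage and compute budgets.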
    Original language: English
    Article number: 870
    Number of pages: 15
    Journal: Sensors
    Volume: 20
    Issue number: 3
    DOIs
    Publication status: Published - 6 Feb 2020

    Keywords

    • facial affective computing
    • convolutional neural networks
    • deep learning
    • mobile development
    • RCUK
    • EPSRC
    • EP/N025849/1
