Realistic facial expression reconstruction for VR HMD users

Jianwen Lou, Yiming Wang, Charles Nduka, Mahyar Hamedi, Ifigeneia Mavridou, Fei-Yue Wang, Hui Yu

    Research output: Contribution to journal › Article › peer-review


    Abstract

    We present a system for sensing and reconstructing facial expressions of the virtual reality (VR) head-mounted display (HMD) user. The HMD occludes a large portion of the user’s face, which makes most existing facial performance capture techniques intractable. To tackle this problem, a novel hardware solution with electromyography (EMG) sensors attached to the headset frame is applied to track facial muscle movements. For realistic facial expression recovery, we first reconstruct the user’s 3D face from a single image and generate personalized blendshapes associated with seven facial action units (AUs) on the most emotionally salient facial parts (ESFPs). We then use preprocessed EMG signals to measure the activations of AU-coded facial expressions, which drive the pre-built personalized blendshapes. Since facial expressions are important nonverbal cues to the subject’s internal emotional state, we further investigate the relationship between six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) and the detected AUs using a fern classifier. Experiments show that the proposed system can accurately sense and reconstruct high-fidelity common facial expressions while providing useful information about the emotional state of the HMD user.
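The abstract describes driving pre-built personalized blendshapes with AU activations measured from EMG. A minimal sketch of the standard linear blendshape model this implies is shown below; the function and variable names (`drive_blendshapes`, `au_weights`, etc.) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of linear blendshape driving: the reconstructed
# expression is the neutral mesh plus AU-activation-weighted deltas.
# All names here are illustrative, not taken from the paper.

def drive_blendshapes(neutral, blendshapes, au_weights):
    """Blend a neutral mesh with per-AU target shapes.

    neutral:      flattened list of vertex coordinates.
    blendshapes:  dict mapping AU name -> target shape (same length as neutral).
    au_weights:   dict mapping AU name -> activation in [0, 1],
                  e.g. derived from preprocessed EMG signals.
    """
    out = list(neutral)
    for au, shape in blendshapes.items():
        # Clamp the activation so an out-of-range EMG reading
        # cannot over- or under-drive the expression.
        w = max(0.0, min(1.0, au_weights.get(au, 0.0)))
        for i, v in enumerate(shape):
            out[i] += w * (v - neutral[i])
    return out

# Toy example: a single AU displacing one coordinate at half activation.
neutral = [0.0, 0.0]
shapes = {"AU12": [1.0, 0.0]}
print(drive_blendshapes(neutral, shapes, {"AU12": 0.5}))  # [0.5, 0.0]
```

In this formulation each AU contributes independently; real systems often add corrective shapes for AU combinations, which this sketch omits.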
    Original language: English
    Pages (from-to): 730-743
    Number of pages: 14
    Journal: IEEE Transactions on Multimedia
    Volume: 22
    Issue number: 3
    DOIs
    Publication status: Published - 8 Aug 2019

    Keywords

    • RCUK
    • EPSRC
    • EP/N025849/1

