Realistic facial animation generation based on facial expression mapping

Hui Yu, Oliver Garrod, Rachael Jack, Philippe Schyns

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


    Abstract

    Facial expressions reflect a character's internal emotional state or its responses in social communication. Although much effort has been devoted to generating realistic facial expressions, the task remains challenging because humans are highly sensitive to subtle facial movements. In this paper, we present a method for generating facial animation that reflects true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static Action Unit (AU) peak frames, based on the Facial Action Coding System (FACS), to the conformed target face. Dynamic parameters derived using a psychophysical method are then integrated to generate facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation.
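    The pipeline described above can be illustrated with a minimal blendshape-style sketch (a hypothetical illustration, not the authors' implementation): captured AU peak frames are represented as per-vertex displacements from a neutral mesh in an intermediate model space, and time-varying activation curves (standing in for the psychophysically derived dynamic parameters) weight and combine them into an animation sequence. All names, shapes, and curves here are assumptions for illustration.

    ```python
    import numpy as np

    # Hypothetical sketch of the abstract's pipeline (not the authors' code).
    # Assumed sizes for illustration only.
    rng = np.random.default_rng(0)
    n_vertices = 100   # vertices in the intermediate face mesh (assumed)
    n_aus = 3          # number of FACS Action Units used (assumed)
    n_frames = 60      # animation length in frames (assumed)

    # Neutral mesh and captured AU peak frames, stored as per-vertex
    # displacements from the neutral mesh in the intermediate model space.
    neutral = rng.standard_normal((n_vertices, 3))
    au_peaks = rng.standard_normal((n_aus, n_vertices, 3))

    # Stand-in dynamic parameters: one smooth activation curve per AU in [0, 1]
    # (the paper derives these psychophysically; here we use sin^2 ramps).
    t = np.linspace(0.0, 1.0, n_frames)
    activations = np.stack([np.sin(np.pi * t) ** 2 for _ in range(n_aus)])

    # Blend: each frame is the neutral mesh plus the activation-weighted sum
    # of the AU peak displacements (a: AU, t: frame, v: vertex, c: coordinate).
    animation = neutral + np.einsum("at,avc->tvc", activations, au_peaks)
    ```

    A final retargeting step would then map each frame of `animation` from the intermediate model space onto the target face mesh, e.g. via a precomputed vertex correspondence; that mapping is omitted here for brevity.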
    Original language: English
    Title of host publication: Fifth International Conference on Graphic and Image Processing (ICGIP 2013)
    Editors: Yulin Wang, Xudong Jiang, Ming Yang, David Zhang, Xie Yi
    Publisher: Society of Photographic Instrumentation Engineers
    Pages: 906903
    DOIs
    Publication status: Published - 2014

    Publication series

    Name: SPIE proceedings
    Publisher: SPIE
    Volume: 9069
