
Facial expression-aware face frontalization

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Face frontalization is an emerging technique for view-invariant face analysis: it recovers the general frontal-view appearance of a face from a non-frontal image. A few pioneering works have been proposed very recently; however, face frontalization that also recovers detailed facial expressions remains very challenging due to the non-linear relationship between head-pose and expression variations. In this paper, we propose a novel facial expression-aware face frontalization method that reconstructs the frontal view while preserving vivid facial-expression appearances. First, we design multiple face shape models as reference templates in order to accommodate the varied shapes of facial expressions; each template describes a set of typical facial actions defined by the Facial Action Coding System (FACS). A template matching strategy is then applied, measuring a weighted Chi Square error so that the input image is matched to the closest template. Finally, the Robust Statistical face Frontalization (RSF) method is employed to recover the frontal view. The method is validated on a spontaneous facial expression database, and the experimental results show that it outperforms state-of-the-art methods.
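The template matching step in the abstract selects, via a weighted Chi Square error, the face shape template closest to the input. A minimal sketch of that selection is shown below; the feature encoding, the toy templates, and the uniform weights are illustrative assumptions, not the paper's actual shape models:

```python
import numpy as np

def weighted_chi_square(x, t, w, eps=1e-10):
    """Weighted Chi Square error between feature vector x and template t.

    The weight vector w and the feature encoding are assumptions made
    for illustration; eps guards against division by zero.
    """
    return float(np.sum(w * (x - t) ** 2 / (x + t + eps)))

def match_template(features, templates, weights):
    """Return the index of the template with the smallest weighted error."""
    errors = [weighted_chi_square(features, t, weights) for t in templates]
    return int(np.argmin(errors))

# Toy example: three hypothetical shape templates and one input vector.
templates = [np.array([0.2, 0.5, 0.3]),
             np.array([0.6, 0.2, 0.2]),
             np.array([0.1, 0.1, 0.8])]
weights = np.array([1.0, 1.0, 1.0])
features = np.array([0.15, 0.45, 0.4])

best = match_template(features, templates, weights)  # index of closest template
```

In the paper the matched template then serves as the reference shape for the RSF frontal-view recovery; here the weights are uniform, whereas the method presumably weights facial actions by their relevance to expression shape.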
Original language: English
Title of host publication: LNCS Proceedings of ACCV16
Publisher: Springer
Pages: 375-388
Number of pages: 14
DOIs: 10.1007/978-3-319-54187-7_25
Publication status: Early online - 11 Mar 2017
Event: ACCV2016 - Taipei International Conference Center, Taipei, Taiwan, Province of China
Duration: 20 Nov 2016 - 24 Nov 2016
http://www.accv2016.org/

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743

Conference

Conference: ACCV2016
Country: Taiwan, Province of China
City: Taipei
Period: 20/11/16 - 24/11/16
Internet address: http://www.accv2016.org/

Documents

  • Facial_Exp_Aware_Frontalization_ACCV16

    Rights statement: The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-54187-7_25.

    Accepted author manuscript (Post-print), 1.01 MB, PDF document


ID: 5035340