
A novel convolutional neural network for facial expression recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Facial expression recognition is becoming a hot topic due to its wide applications in computer vision research fields. Traditional methods adopt hand-crafted features combined with classifiers to achieve the recognition goal. However, the accuracy of these methods often relies heavily on the extracted features and the classifier's parameters, so they generalize poorly to unseen data. Recently, deep learning, which simulates the mechanism of the human brain to interpret data, has shown remarkable results in visual object recognition. In this paper, we present a novel convolutional neural network which consists of local binary patterns and improved Inception-ResNet layers for automatic facial expression recognition. We apply the proposed method to three expression datasets, i.e., the Extended Cohn-Kanade Dataset (CK+), the Japanese Female Facial Expression (JAFFE) Database, and the FER2013 Dataset. The experimental results demonstrate the feasibility and effectiveness of our proposed network.
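The abstract names local binary patterns (LBP) as the hand-crafted front end to the network, but gives no implementation details. The sketch below is a minimal, assumed implementation of the basic 8-neighbor LBP operator and its normalized histogram descriptor using NumPy; the function names and the choice of neighbor ordering are illustrative, not taken from the paper.

```python
import numpy as np

def lbp_8neighbors(img):
    """Basic 3x3 local binary pattern over the interior pixels of a
    2-D grayscale image.

    Each of the 8 neighbors is compared with the center pixel; a
    neighbor >= center contributes a 1-bit at a fixed position,
    yielding one 8-bit texture code per pixel."""
    img = np.asarray(img, dtype=np.int32)
    center = img[1:-1, 1:-1]
    # Clockwise neighbor offsets, starting at the top-left pixel
    # (the ordering is an assumption; any fixed ordering works).
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy : img.shape[0] - 1 + dy,
                       1 + dx : img.shape[1] - 1 + dx]
        codes |= (neighbor >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(img, bins=256):
    """Normalized histogram of LBP codes -- the descriptor that is
    typically fed to a classifier or, as here, a CNN stage."""
    codes = lbp_8neighbors(img)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / max(codes.size, 1)
```

In the paper's pipeline the LBP response presumably serves as an illumination-robust input representation for the improved Inception-ResNet layers; the histogram step above is shown only to illustrate how LBP codes are conventionally summarized.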
Original language: English
Title of host publication: Proceedings of the 3rd International Conference on Cognitive Systems and Information Processing
Publication status: Accepted for publication - 16 Oct 2018
Event: International Conference on Cognitive Systems and Information Processing - Beijing, China
Duration: 24 Nov 2018 – 26 Nov 2018


Conference: International Conference on Cognitive Systems and Information Processing
Abbreviated title: ICCSIP 2018


  • ICCSIP_PostPrint

    Rights statement: The embargo end date of 2050 is a temporary measure until the publication date is known. Once the publication date is known, the full text of this article will be made available to view shortly afterwards.

    Accepted author manuscript (Post-print), 8 MB, PDF-document

    Due to the publisher's copyright restrictions, this document is not freely available to download from this website until 1/01/50.


ID: 11992684