Relation-aware facial expression recognition

Yifan Xia, Hui Yu*, Xiao Wang, Muwei Jian, Fei-Yue Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Research on facial expression recognition has been moving from constrained laboratory scenarios to in-the-wild settings and has made progress in recent years. However, recognizing facial expressions in the wild remains very challenging due to large poses and occlusion as well as illumination and intensity variations. Existing methods generally treat the whole face as a uniform source of features for facial expression analysis. However, research in physiology and psychology shows that certain crucial regions, such as the eyes and mouth, reflect the differences between facial expressions and are closely related to emotion expression. Inspired by this observation, a novel relation-aware facial expression recognition method called the Relation Convolutional Neural Network (ReCNN) is proposed in this paper, which adaptively captures the relationship between crucial regions and facial expressions, allowing the network to focus on the most discriminative regions for recognition. We have evaluated the proposed ReCNN on two large in-the-wild databases: AffectNet and RAF-DB. Extensive experiments on these databases show that our method achieves superior recognition accuracy compared with state-of-the-art methods, and that modelling the relationship between crucial regions and facial expressions improves facial expression recognition performance.
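The abstract does not give implementation details, but the core idea it describes — weighting features from crucial facial regions by how discriminative they are, then classifying the pooled result — can be sketched in a few lines. The following NumPy sketch is illustrative only: the region features, the attention scoring vector, and the linear classifier are all hypothetical stand-ins, not the actual ReCNN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-region feature vectors (e.g. eye and mouth crops),
# each a d-dimensional embedding produced by a CNN backbone.
num_regions, d, num_classes = 4, 8, 7
region_feats = rng.standard_normal((num_regions, d))

# Hypothetical learned parameters: an attention scoring vector and a
# linear classifier over the pooled feature (placeholders, not ReCNN's).
w_attn = rng.standard_normal(d)
W_cls = rng.standard_normal((num_classes, d))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Attention weights: how discriminative each region is for this face.
scores = region_feats @ w_attn          # shape (num_regions,)
alpha = softmax(scores)                 # non-negative, sums to 1

# Relation-aware pooling: attention-weighted combination of regions.
pooled = alpha @ region_feats           # shape (d,)

# Expression logits over, e.g., seven basic emotion categories.
logits = W_cls @ pooled
pred = int(np.argmax(logits))
```

In this scheme, regions that score highly against the attention vector dominate the pooled representation, which is one common way to let a network emphasize the most discriminative facial regions.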
Original language: English
Number of pages: 12
Journal: IEEE Transactions on Cognitive and Developmental Systems
Early online date: 26 Jul 2021
Publication status: Early online - 26 Jul 2021


  • relation-aware
  • facial expression recognition
  • deep convolutional neural networks
  • ReCNN
  • facial expression in the wild


