MGEED: a multimodal genuine emotion and expression detection database

Yiming Wang, Hui Yu, Weihong Gao, Yifan Xia, Charles Nduka

Research output: Contribution to journal › Article › peer-review

Abstract

Multimodal emotion recognition has attracted increasing interest from academia and industry in recent years, since it enables emotion detection using various modalities such as facial expression images, speech and physiological signals. Although research in this field has grown rapidly, it remains challenging to create a multimodal database containing facial electrical information, owing to the difficulty of capturing natural and subtle facial expression signals such as optomyography (OMG) signals. To this end, this paper presents the newly developed Multimodal Genuine Emotion and Expression Detection (MGEED) database, the first publicly available database containing facial OMG signals. MGEED contains data from 17 subjects, with over 150K facial images, 140K depth maps and multiple modalities of physiological signals, including OMG, electroencephalography (EEG) and electrocardiography (ECG) signals. The participants' emotions are evoked by video stimuli and the data are collected with a multimodal sensing system. With the collected data, an emotion recognition method is developed based on multimodal signal synchronisation, feature extraction, fusion and emotion prediction. The results show that superior performance can be achieved by fusing the visual, EEG and OMG features. The database can be obtained from https://github.com/YMPort/MGEED.
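The abstract describes the recognition pipeline only at a high level (synchronisation, feature extraction, fusion, prediction). The sketch below illustrates one common realisation of feature-level fusion, concatenating per-sample visual, EEG and OMG feature vectors before a single classifier; the feature dimensions, random placeholder data and logistic-regression classifier are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of feature-level fusion for multimodal emotion recognition.
# All array shapes, the placeholder data and the classifier choice are
# assumptions for illustration; the paper's actual method is not specified
# in the abstract beyond synchronisation, extraction, fusion and prediction.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder per-sample feature vectors for three synchronised modalities
# (e.g. visual embeddings, EEG band powers, OMG waveform statistics).
n_samples = 500
visual = rng.normal(size=(n_samples, 128))   # hypothetical visual features
eeg = rng.normal(size=(n_samples, 32))       # hypothetical EEG features
omg = rng.normal(size=(n_samples, 16))       # hypothetical OMG features
labels = rng.integers(0, 6, size=n_samples)  # hypothetical emotion classes

# Feature-level fusion: concatenate the modality features of each
# synchronised sample into one joint feature vector.
fused = np.concatenate([visual, eeg, omg], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.2, random_state=0
)

# A simple classifier over the fused features serves as the emotion predictor.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.3f}")
```

With real data, the placeholder arrays would be replaced by features extracted from the synchronised image, EEG and OMG streams; the fusion-by-concatenation step itself is unchanged.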

Original language: English
Number of pages: 13
Journal: IEEE Transactions on Affective Computing
Early online date: 15 Jun 2023
Publication status: Early online - 15 Jun 2023

Keywords

  • affective sensing and analysis
  • Databases
  • Electrocardiography
  • Electroencephalography
  • Electromyography
  • Emotion recognition
  • facial expression analysis
  • Feature extraction
  • multi-modal emotion database
  • Physiology
  • UKRI
  • EPSRC
  • EP/N025849/1
