Towards an Affect Space for robots to display emotional body language

Aryel Beck*, Lola Cañamero, Kim A. Bard

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


In order for robots to be socially accepted and to generate empathy, it is necessary that they display rich emotions. For robots such as Nao, body language is the best medium available, given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve its sociability. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by robots. To create an Affect Space for body language, one has to establish the contribution of the different joint positions to the emotional expression. The experiment reported in this paper investigated the effect of varying a robot's head position on the interpretation, Valence, Arousal and Stance of emotional key poses. It was found that participants performed better than chance in interpreting the key poses. This finding confirms that body language is an appropriate medium for robots to express emotions. Moreover, the results of this study support the conclusion that Head Position is an important body-posture variable. Head Position up increased correct identification for some emotion displays (pride, happiness, and excitement), whereas Head Position down increased correct identification for others (anger, sadness). Fear, however, was identified well regardless of Head Position. Head up was always evaluated as more highly Aroused than Head straight or down. Evaluations of Valence (degree of negativity to positivity) and Stance (degree to which the robot was avoidant or approaching), however, depended on both Head Position and the emotion displayed. The effects of varying this single body-posture variable were complex.
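The idea of an Affect Space in which joint positions contribute to perceived emotion dimensions can be sketched in code. The mapping below is purely illustrative: the function name, the weights, and the pitch range are hypothetical assumptions, not the authors' model; the paper's contribution is establishing empirically how head position influences perceived Valence, Arousal and Stance.

```python
def head_pitch(valence: float, arousal: float, stance: float) -> float:
    """Map an affect-space point (each dimension in [-1, 1]) to a head pitch
    in degrees: positive = head up, negative = head down.

    Hypothetical mapping, loosely following the reported trends: head up reads
    as more aroused, and head down suits displays such as anger and sadness.
    """
    for dim in (valence, arousal, stance):
        if not -1.0 <= dim <= 1.0:
            raise ValueError("affect dimensions must lie in [-1, 1]")
    # Weighted blend; the weights are illustrative, chosen so that arousal
    # dominates, consistent with head-up poses being rated as more aroused.
    pitch = 20.0 * arousal + 10.0 * valence + 5.0 * stance
    # Clamp to a plausible head-joint range (assumed, not from the paper).
    return max(-30.0, min(30.0, pitch))
```

For example, a highly aroused, positive, approaching pose (such as excitement) yields a head-up pitch, while a negative, low-arousal, avoidant pose (such as sadness) yields a head-down pitch.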

Original language: English
Title of host publication: 19th International Symposium in Robot and Human Interactive Communication, RO-MAN 2010
Number of pages: 6
Publication status: Published - 13 Dec 2010
Event: 19th IEEE International Conference on Robot and Human Interactive Communication - Viareggio, Italy
Duration: 12 Sept 2010 - 15 Sept 2010


Conference: 19th IEEE International Conference on Robot and Human Interactive Communication
Abbreviated title: RO-MAN 2010


