Abstract
Complex and natural social interaction between artificial agents (computer-generated or robotic) and humans requires the display of rich emotions if the agents are to be believable, socially relevant and accepted, and to elicit the natural emotional responses that humans show in social interaction, such as engagement or empathy. Whereas some robots use faces to display (simplified) emotional expressions, for robots such as Nao, which cannot convey facial expressions, body language is the best medium available. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve the naturalness of the interaction. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by humanoid robots. To this end, three experiments were conducted to investigate how emotional body language displayed by agents is interpreted.
The first experiment compared the interpretation of emotional body language displayed by humans and by agents. The results showed that emotional body language is recognised similarly whether it is displayed by an agent or by a human. Following these results, emotional key poses were extracted from an actor's performances and implemented on a Nao robot. The interpretation of these key poses was validated in a second study, which found that participants interpreted the displayed key poses at better-than-chance levels. Finally, an Affect Space was generated by blending key poses and validated in a third study.
Overall, these experiments confirm that body language is an appropriate medium for robots to display emotions and suggest that an Affect Space for body expressions can be used to improve the expressiveness of humanoid robots.
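The abstract does not specify how key poses are blended; as a minimal sketch of one plausible reading, the Python fragment below linearly interpolates between two key poses expressed as joint-angle dictionaries. The joint names follow Nao's documented conventions, but the example poses, the angle values, and the `blend_poses` helper are assumptions for illustration, not the authors' implementation.

```python
"""Illustrative sketch (not the paper's code) of blending two emotional
key poses into intermediate body expressions for a Nao robot.

Each pose maps Nao joint names to angles in radians. The joint names
follow Nao's conventions; the key poses and angle values are invented
placeholders."""


def blend_poses(pose_a, pose_b, alpha):
    """Linearly interpolate between two key poses.

    alpha = 0.0 returns pose_a, alpha = 1.0 returns pose_b; intermediate
    values trace one axis of an Affect Space over body postures.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return {joint: (1.0 - alpha) * pose_a[joint] + alpha * pose_b[joint]
            for joint in pose_a}


# Hypothetical key poses: a closed, head-down "sad" posture and an open,
# head-up "happy" posture (angle values are illustrative only).
POSE_SAD = {"HeadPitch": 0.45, "LShoulderPitch": 1.40, "RShoulderPitch": 1.40}
POSE_HAPPY = {"HeadPitch": -0.35, "LShoulderPitch": 0.60, "RShoulderPitch": 0.60}

if __name__ == "__main__":
    # Sample three intermediate postures between the two key poses.
    for alpha in (0.25, 0.5, 0.75):
        blended = blend_poses(POSE_SAD, POSE_HAPPY, alpha)
        print(f"alpha={alpha:.2f} -> HeadPitch={blended['HeadPitch']:+.3f} rad")
```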
| Original language | English |
| --- | --- |
| Pages (from-to) | 2 |
| Number of pages | 1 |
| Journal | ACM Transactions on Interactive Intelligent Systems |
| Volume | 2 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Mar 2012 |