Human-human interaction recognition based on spatial and motion trend feature

Bangli Liu, Haibin Cai, Xiaofei Ji, Honghai Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Human-human interaction recognition has attracted increasing attention in recent years due to its wide applications in computer vision. Currently, few publicly available RGBD-based human-human interaction datasets exist. This paper introduces a new dataset for human-human interaction recognition. Furthermore, a novel feature descriptor based on the spatial relationship and the semantic motion trend similarity between body parts is proposed for human-human interaction recognition. The motion trend of each skeleton joint is first quantized into a specific semantic word, and a kernel is then built for measuring the similarity of both intra- and inter-body parts by histogram intersection. Finally, the proposed feature descriptor is evaluated on the SBU interaction dataset and the newly collected dataset. Experimental results demonstrate that our method outperforms state-of-the-art methods.
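The abstract's similarity measure can be illustrated with a short sketch, assuming joint motion trends have already been quantized into semantic words from a fixed vocabulary (the function names, vocabulary size, and toy word sequences below are illustrative, not taken from the paper):

```python
import numpy as np

def word_histogram(words, vocab_size):
    """Normalized histogram of quantized semantic motion-trend words."""
    h = np.bincount(words, minlength=vocab_size).astype(float)
    return h / max(h.sum(), 1.0)

def histogram_intersection(h1, h2):
    """Histogram-intersection kernel: sum of element-wise minima.

    For two normalized histograms the value lies in [0, 1],
    with 1.0 meaning identical word distributions.
    """
    return float(np.minimum(h1, h2).sum())

# Toy example: motion-trend word sequences of two body parts,
# with a hypothetical vocabulary of 8 semantic words.
part_a = word_histogram(np.array([0, 1, 1, 3, 7]), vocab_size=8)
part_b = word_histogram(np.array([0, 1, 2, 3, 3]), vocab_size=8)
sim = histogram_intersection(part_a, part_b)
print(sim)  # prints 0.6
```

The same kernel can be applied either within one person's body parts (intra) or across the two interacting people (inter), which matches the intra/inter comparison the abstract describes.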
Original language: English
Title of host publication: 2017 IEEE International Conference on Image Processing (ICIP)
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
ISBN (Electronic): 978-1-5090-2175-8
ISBN (Print): 978-1-5090-2176-5
Publication status: Published - 22 Feb 2018
Event: 2017 IEEE International Conference on Image Processing - Beijing, China
Duration: 17 Sept 2017 - 20 Sept 2017

Publication series

Name: IEEE ICIP Proceedings Series
ISSN (Electronic): 2381-8549


Conference: 2017 IEEE International Conference on Image Processing
Abbreviated title: ICIP 2017


