Human-human interaction recognition based on spatial and motion trend feature

Bangli Liu, Haibin Cai, Xiaofei Ji, Honghai Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Human-human interaction recognition has attracted increasing attention in recent years due to its wide range of applications in computer vision. However, few RGBD-based human-human interaction datasets are currently publicly available. This paper introduces a new dataset for human-human interaction recognition. Furthermore, a novel feature descriptor based on the spatial relationship and the semantic motion trend similarity between body parts is proposed for human-human interaction recognition. The motion trend of each skeleton joint is first quantized into a specific semantic word, and a kernel is then built to measure the similarity of both intra- and inter-body parts by histogram intersection. Finally, the proposed feature descriptor is evaluated on the SBU interaction dataset and the newly collected dataset. Experimental results demonstrate that our method outperforms state-of-the-art methods.
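
To make the abstract's pipeline concrete, the sketch below illustrates the general idea of quantizing per-joint motion into semantic direction words and comparing body parts with a histogram intersection kernel. This is not the authors' implementation: the vocabulary size, the 2D direction-based quantization, and all function names are assumptions for illustration only.

```python
import numpy as np

VOCAB_SIZE = 8  # assumed number of semantic motion-trend words (e.g. 8 directions)

def motion_trend_words(joint_positions):
    """Quantize frame-to-frame 2D joint displacements into direction words 0..VOCAB_SIZE-1.

    joint_positions: (T, 2) array of one joint's positions over T frames.
    """
    disp = np.diff(joint_positions, axis=0)       # frame-to-frame motion vectors
    angles = np.arctan2(disp[:, 1], disp[:, 0])   # motion direction in radians
    words = ((angles + np.pi) / (2 * np.pi) * VOCAB_SIZE).astype(int) % VOCAB_SIZE
    return words

def word_histogram(words):
    """Normalized histogram of semantic words for one body part."""
    hist = np.bincount(words, minlength=VOCAB_SIZE).astype(float)
    return hist / (hist.sum() + 1e-12)

def histogram_intersection(h1, h2):
    """Histogram intersection kernel: sum of bin-wise minima."""
    return np.minimum(h1, h2).sum()

# Usage: similarity between the motion trends of two (synthetic) body parts,
# e.g. one person's right hand versus the other person's torso.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    part_a = np.cumsum(rng.normal(size=(30, 2)), axis=0)
    part_b = np.cumsum(rng.normal(size=(30, 2)), axis=0)
    h_a = word_histogram(motion_trend_words(part_a))
    h_b = word_histogram(motion_trend_words(part_b))
    print("intersection similarity:", histogram_intersection(h_a, h_b))
```

The intersection kernel is bounded by 1 for normalized histograms, so similarities from different body-part pairs can be compared or aggregated directly.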
Original language: English
Title of host publication: 2017 IEEE International Conference on Image Processing (ICIP)
Publisher: IEEE
Pages: 4547-4551
ISBN (Electronic): 978-1-5090-2175-8
ISBN (Print): 978-1-5090-2176-5
DOIs
Publication status: Published - 22 Feb 2018
Event: 2017 IEEE International Conference on Image Processing: ICIP 2017 - Beijing, China
Duration: 17 Sep 2017 - 20 Sep 2017
http://2017.ieeeicip.org/

Publication series

Name: IEEE ICIP Proceedings Series
ISSN (Electronic): 2381-8549

Conference

Conference: 2017 IEEE International Conference on Image Processing
Country/Territory: China
City: Beijing
Period: 17/09/17 - 20/09/17
