Dynamic hand gesture recognition based on 3D hand pose estimation for human-robot interaction

Qing Gao, Yongquan Chen, Yi Liang, Zhaojie Ju

Research output: Contribution to journal › Article › peer-review


Abstract

Dynamic hand gesture recognition is a challenging problem in hand-based human-robot interaction (HRI), owing to issues such as complex environments and dynamic perception. To address this problem, we draw on the principle of data-glove-based hand gesture recognition and propose a dynamic hand gesture recognition method based on 3D hand pose estimation. The method combines 3D hand pose estimation, data fusion and deep neural networks to improve the recognition accuracy of dynamic hand gestures. First, a 2D hand pose estimation method based on OpenPose is extended to obtain a fast 3D hand pose estimation method. Second, weighted sum fusion is used to combine the RGB, depth and 3D skeleton data of hand gestures. Finally, a 3DCNN + ConvLSTM framework identifies and classifies the fused dynamic hand gesture data. In experiments, the proposed method is verified on a dynamic hand gesture database developed for HRI and achieves 92.4% accuracy. Comparative experiments confirm the reliability and efficiency of the proposed method.
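The weighted sum fusion step described in the abstract can be sketched as follows. This is a minimal illustration only: the modality weights, feature shapes, and the assumption that all three streams are pre-aligned and normalized to a common scale are ours, not values taken from the paper.

```python
import numpy as np

def weighted_sum_fusion(rgb, depth, skeleton, weights=(0.4, 0.3, 0.3)):
    """Fuse per-frame features from three modalities by weighted sum.

    rgb, depth, skeleton: arrays of identical shape (e.g. T x H x W),
    assumed already normalized to a common scale. The default weights
    are illustrative assumptions, not the paper's tuned values.
    """
    w_rgb, w_depth, w_skel = weights
    assert abs(sum(weights) - 1.0) < 1e-6, "weights should sum to 1"
    return w_rgb * rgb + w_depth * depth + w_skel * skeleton

# Example: fuse three dummy 8-frame, 32x32 modality streams.
T, H, W = 8, 32, 32
fused = weighted_sum_fusion(np.ones((T, H, W)),
                            np.zeros((T, H, W)),
                            np.full((T, H, W), 0.5))
# fused has shape (8, 32, 32); every element is 0.4*1 + 0.3*0 + 0.3*0.5 = 0.55
```

The fused tensor would then be fed to the recognition network (a 3DCNN followed by ConvLSTM layers in the paper's pipeline).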
Original language: English
Journal: IEEE Sensors Journal
Early online date: 10 May 2021
DOIs
Publication status: Early online - 10 May 2021
