
Dynamic hand gesture recognition based on 3D hand pose estimation for human-robot interaction

Research output: Contribution to journal › Article › peer-review

Dynamic hand gesture recognition is a challenging problem in hand-based human-robot interaction (HRI), owing to issues such as complex environments and dynamic perception. To address this problem, we draw on the principle of data-glove-based hand gesture recognition and propose a dynamic hand gesture recognition method based on 3D hand pose estimation. The method combines 3D hand pose estimation, data fusion and a deep neural network to improve the recognition accuracy of dynamic hand gestures. First, a 2D hand pose estimation method based on OpenPose is improved to obtain a fast 3D hand pose estimation method. Second, a weighted sum fusion method is used to combine the RGB, depth and 3D skeleton data of the hand gestures. Finally, a 3DCNN + ConvLSTM framework identifies and classifies the fused dynamic hand gesture data. In the experiments, the proposed method is evaluated on a dynamic hand gesture database developed for HRI and achieves 92.4% accuracy. Comparative experiments verify the reliability and efficiency of the proposed method.
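To illustrate the pipeline described in the abstract, the sketch below shows a minimal weighted-sum fusion of per-frame RGB, depth and rendered 3D-skeleton maps followed by a 3DCNN + ConvLSTM classifier. It is not the authors' code: the framework (TensorFlow/Keras), the clip size, the number of gesture classes, the fusion weights and the layer configuration are all hypothetical assumptions made for illustration only.

```python
# Minimal sketch of weighted-sum fusion + 3DCNN + ConvLSTM classification.
# All sizes, weights and layer choices are assumptions, not the paper's settings.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 16                    # hypothetical number of gesture classes
FRAMES, H, W, C = 32, 112, 112, 3   # hypothetical clip size after fusion

def weighted_sum_fusion(rgb, depth, skeleton_map, w=(0.5, 0.3, 0.2)):
    """Fuse per-frame RGB, depth and rendered 3D-skeleton maps by a weighted sum.
    Inputs are float arrays of shape (frames, H, W, C); weights are illustrative."""
    return w[0] * rgb + w[1] * depth + w[2] * skeleton_map

def build_model():
    inp = layers.Input(shape=(FRAMES, H, W, C))
    # 3D convolutions extract short-range spatio-temporal features from the clip.
    x = layers.Conv3D(32, (3, 3, 3), padding="same", activation="relu")(inp)
    x = layers.MaxPooling3D((1, 2, 2))(x)
    x = layers.Conv3D(64, (3, 3, 3), padding="same", activation="relu")(x)
    x = layers.MaxPooling3D((2, 2, 2))(x)
    # ConvLSTM models longer-range temporal dependencies over the feature maps.
    x = layers.ConvLSTM2D(64, (3, 3), padding="same", return_sequences=False)(x)
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(inp, out)

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Example: fuse one synthetic clip and run a forward pass.
rgb = np.random.rand(FRAMES, H, W, C).astype("float32")
depth = np.random.rand(FRAMES, H, W, C).astype("float32")   # depth replicated to 3 channels
skel = np.random.rand(FRAMES, H, W, C).astype("float32")    # rendered 3D-skeleton maps
clip = weighted_sum_fusion(rgb, depth, skel)[None]          # add batch dimension
probs = model.predict(clip)
```

In this sketch the three modalities are fused at the input level before the network, matching the abstract's description of a weighted-sum fusion of RGB, depth and 3D skeleton data; how the 3D skeleton is rendered into image-like maps, and the actual weights and architecture, are not specified here and are assumptions.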
Original language: English
Journal: IEEE Sensors Journal
Early online date: 10 May 2021
DOIs
Publication status: Early online - 10 May 2021

Documents

  • Dynamic hand gesture

    Rights statement: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

    Accepted author manuscript (Post-print), 3.62 MB, PDF document


ID: 26499085