TY - JOUR
T1 - Dynamic hand gesture recognition based on 3D hand pose estimation for human-robot interaction
AU - Gao, Qing
AU - Chen, Yongquan
AU - Liang, Yi
AU - Ju, Zhaojie
PY - 2021/5/10
Y1 - 2021/5/10
N2 - Dynamic hand gesture recognition is a challenging problem in hand-based human-robot interaction (HRI) owing to issues such as complex environments and dynamic perception. To address this problem, we draw on the principle of data-glove-based hand gesture recognition and propose a dynamic hand gesture recognition method based on 3D hand pose estimation. The method combines 3D hand pose estimation, data fusion and a deep neural network to improve the recognition accuracy of dynamic hand gestures. First, a 2D hand pose estimation method based on OpenPose is extended to obtain a fast 3D hand pose estimation method. Second, a weighted-sum fusion method is used to combine the RGB, depth and 3D skeleton data of hand gestures. Finally, a 3DCNN + ConvLSTM framework identifies and classifies the fused dynamic hand gesture data. In experiments, the proposed method is evaluated on a dynamic hand gesture database developed for HRI and achieves 92.4% accuracy. Comparative experiments verify the reliability and efficiency of the proposed method.
U2 - 10.1109/JSEN.2021.3059685
DO - 10.1109/JSEN.2021.3059685
M3 - Article
SN - 1530-437X
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
ER -
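
For readers who want a concrete picture of the 3DCNN + ConvLSTM pipeline the abstract describes, below is a minimal sketch in Keras. It is not the authors' implementation: the fusion weights, layer sizes, frame count, and number of gesture classes are illustrative assumptions, and the weighted-sum fusion is shown only in the simplified per-pixel form the abstract suggests.

```python
# Hypothetical sketch of a weighted-sum fusion step followed by a 3DCNN + ConvLSTM
# classifier for dynamic hand gesture clips. All hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 16          # assumption: number of gesture classes in the HRI database
FRAMES, H, W = 32, 112, 112


def weighted_sum_fusion(rgb, depth, skeleton_map, w=(0.5, 0.3, 0.2)):
    """Fuse per-frame RGB, depth and rendered 3D-skeleton maps (all broadcastable to
    the same shape) by a weighted sum; the weights are placeholders, not the paper's."""
    return w[0] * rgb + w[1] * depth + w[2] * skeleton_map


def build_model():
    # Fused clip: FRAMES frames of H x W x 3 data after weighted-sum fusion.
    inp = layers.Input(shape=(FRAMES, H, W, 3))
    # 3D convolutions capture short-range spatio-temporal patterns.
    x = layers.Conv3D(32, 3, padding="same", activation="relu")(inp)
    x = layers.MaxPooling3D(pool_size=(1, 2, 2))(x)
    x = layers.Conv3D(64, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling3D(pool_size=(2, 2, 2))(x)
    # ConvLSTM aggregates the remaining temporal dimension into one feature map.
    x = layers.ConvLSTM2D(64, 3, padding="same", return_sequences=False)(x)
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(inp, out)


model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The sketch only mirrors the high-level structure named in the abstract (fusion, then 3DCNN, then ConvLSTM, then classification); the paper's actual network depth, fusion weights and training setup should be taken from the article itself (DOI 10.1109/JSEN.2021.3059685).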