Abstract
In recent years, hand gesture recognition has played a crucial role in human-robot interaction (HRI). This paper proposes a skeleton-based serial-parallel dynamic hand gesture recognition network. A set of skeleton-based physical features is designed to model the spatial relationships among joints and construct the skeletal spatial configuration. A slow-fast double-scale parallel network is proposed to extract the temporal dynamics of gestures. An attention mechanism fuses the spatiotemporal information of the gestures, and the recognition result is obtained through a serial 1D CNN structure. In addition, transformation-based data augmentation is used to improve the generalization of the network. The proposed method is evaluated on the SHREC14 and SHREC28 datasets, where it achieves superior performance with accuracies of 95.11% and 92.98%, respectively. The network is fine-tuned on the custom dataset HRIGes, and the recognition results are mapped to a five-fingered dexterous manipulator to realize real-time human-robot interaction.
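The abstract describes a serial-parallel layout: a slow-fast parallel network extracts temporal dynamics at two scales, attention fuses the branch features, and a serial 1D CNN produces the classification. The sketch below illustrates that overall structure in PyTorch; the layer widths, branch strides, the 66-D per-frame skeleton feature dimension, and all module names are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch (PyTorch) of a slow-fast dual-branch temporal network with
# attention fusion and a serial 1D-CNN head, in the spirit of the abstract.
# All sizes and names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SlowFastGestureNet(nn.Module):
    def __init__(self, feat_dim=66, num_classes=14):
        super().__init__()
        # Fast branch: full temporal resolution, narrow channels.
        self.fast = nn.Sequential(
            nn.Conv1d(feat_dim, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Slow branch: temporally subsampled (stride 2), wider channels.
        self.slow = nn.Sequential(
            nn.Conv1d(feat_dim, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        fused = 64 + 128
        # Channel attention over the concatenated branch features.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(fused, fused, kernel_size=1),
            nn.Sigmoid(),
        )
        # Serial 1D-CNN classification head.
        self.head = nn.Sequential(
            nn.Conv1d(fused, 256, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(256, num_classes),
        )

    def forward(self, x):
        # x: (batch, feat_dim, frames) skeleton feature sequence.
        f = self.fast(x)
        s = self.slow(x)
        # Upsample the slow branch back to the fast branch's length.
        s = F.interpolate(s, size=f.shape[-1], mode="linear", align_corners=False)
        z = torch.cat([f, s], dim=1)
        z = z * self.attn(z)   # attention-weighted fusion
        return self.head(z)    # class logits


if __name__ == "__main__":
    model = SlowFastGestureNet(feat_dim=66, num_classes=14)
    clip = torch.randn(2, 66, 32)   # 2 clips, 66-D features, 32 frames
    print(model(clip).shape)        # torch.Size([2, 14])
```

In this sketch the fast branch keeps the full frame rate with narrow channels while the slow branch subsamples time with wider channels; channel attention weights the concatenated features before the serial 1D CNN head, mirroring the serial-parallel organization the abstract describes.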
Original language | English |
---|---|
Title of host publication | Proceedings of 29th IEEE International Conference on Mechatronics and Machine Vision in Practice |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Number of pages | 6 |
ISBN (Electronic) | 9798350325621 |
ISBN (Print) | 9798350325638 |
DOIs | |
Publication status | Published - 2 Feb 2024 |
Event | 29th IEEE International Conference on Mechatronics and Machine Vision in Practice, Queenstown, New Zealand (21 Nov 2023 → 24 Nov 2023) |
Conference
Conference | 29th IEEE International Conference on Mechatronics and Machine Vision in Practice |
---|---|
Country/Territory | New Zealand |
City | Queenstown |
Period | 21/11/23 → 24/11/23 |
Keywords
- serial-parallel network
- hand gesture recognition
- human-robot interaction