Gesture recognition based on multi-modal feature weight

Research output: Contribution to journal › Article

  • Haojie Duan
  • Ying Sun
  • Wentao Cheng
  • Du Jiang
  • Juntong Yun
  • Ying Liu
  • Dr Dalin Zhou
With the continuous development of sensor technology, the acquisition cost of RGB-D images has fallen steadily, and gesture recognition based on depth and RGB images has become an active research direction in the field of pattern recognition. However, most current processing methods for RGB-D gesture images are relatively simple: they ignore the relationship and mutual influence between the two modalities and cannot fully exploit the correlated factors between them. To address these problems, this paper improves RGB-D information processing by considering both the independent features and the correlated features of the multi-modal data, constructing a weight-adaptive algorithm to fuse the different features. Simulation experiments show that the proposed method outperforms traditional RGB-D gesture image processing methods and achieves a higher gesture recognition rate. Compared with more advanced current gesture recognition methods, the proposed method also achieves higher recognition accuracy, which verifies its feasibility and robustness.
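As a rough illustration of the idea of weight-adaptive fusion of multi-modal features (this is a hypothetical sketch, not the paper's actual algorithm; the weighting scheme and feature names are assumptions for demonstration only):

```python
import numpy as np

def adaptive_fuse(rgb_feat: np.ndarray, depth_feat: np.ndarray) -> np.ndarray:
    """Fuse RGB and depth feature vectors with adaptive weights.

    Illustrative only: weights here are a softmax over each modality's
    L2 norm, so the modality with the stronger response contributes
    more to the fused representation.
    """
    energies = np.array([np.linalg.norm(rgb_feat), np.linalg.norm(depth_feat)])
    exp = np.exp(energies - energies.max())  # numerically stable softmax
    w = exp / exp.sum()                      # adaptive per-modality weights
    return w[0] * rgb_feat + w[1] * depth_feat

# Example: fuse two toy feature vectors of the same dimensionality.
rgb = np.array([0.9, 0.1, 0.4])
depth = np.array([0.2, 0.8, 0.3])
fused = adaptive_fuse(rgb, depth)
```

Because the weights form a convex combination, each fused component stays between the corresponding RGB and depth components; in a real system the weights would instead be learned from the data, as the paper's adaptive scheme does.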
Original language: English
Journal: Concurrency and Computation: Practice and Experience
Early online date: 10 Sep 2020
Publication status: Early online - 10 Sep 2020

Documents

  • Gesture recognition based on multi-modal feature weight

    Rights statement: This is the peer reviewed version of the following article: Haojie Duan, Ying Sun, Wentao Cheng, Du Jiang, Juntong Yun, Ying Liu, Yibo Liu & Dalin Zhou. 'Gesture recognition based on multi-modal feature weight'. Concurrency and Computation: Practice and Experience, which has been published in final form at https://doi.org/10.1002/cpe.5991. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Use of Self-Archived Versions.

    Accepted author manuscript (Post-print), 955 KB, PDF document

    Due to publisher’s copyright restrictions, this document is not freely available to download from this website until: 10/09/21


ID: 21945162