A novel feature extraction method for machine learning based on surface electromyography from healthy brain

Research output: Contribution to journal › Article › peer-review

Feature extraction is one of the most important steps in the control of a multifunctional prosthesis based on surface electromyography (sEMG) pattern recognition. In this paper, a new sEMG feature extraction method based on muscle active regions is proposed. An experiment classifying four hand motions with different features was designed to evaluate the classification performance of the new feature. The experimental results show that the new feature, active muscle regions (AMR), achieves better classification performance than the traditional features mean absolute value (MAV), waveform length (WL), zero crossing (ZC) and slope sign changes (SSC): the average classification errors of AMR, MAV, WL, ZC and SSC are 13%, 19%, 26%, 24% and 22%, respectively. The new EMG feature is based on the mapping between hand movements and the active muscle regions of the forearm, a relationship that has been confirmed in medicine. The active muscle region data are extracted from the original EMG signal by the new feature extraction algorithm, and the results obtained from this algorithm represent hand motions well. Moreover, the new feature vector is much smaller than those of the other features, which reduces the computational cost. This demonstrates that AMR can improve the accuracy rate of sEMG pattern recognition.
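The four baseline time-domain features named in the abstract (MAV, WL, ZC, SSC) have standard definitions in the sEMG literature. The sketch below shows one common formulation, computed over a single analysis window; the `thresh` parameter for ZC and SSC is an assumption (a noise-suppression threshold often used in practice), and this is not the paper's own AMR algorithm, which is not detailed in the abstract.

```python
def mav(x):
    # Mean absolute value: average rectified amplitude of the window.
    return sum(abs(v) for v in x) / len(x)

def wl(x):
    # Waveform length: cumulative absolute difference between samples.
    return sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))

def zc(x, thresh=0.0):
    # Zero crossings: sign changes whose amplitude step exceeds thresh
    # (thresh > 0 suppresses crossings caused by low-level noise).
    return sum(1 for i in range(len(x) - 1)
               if x[i] * x[i + 1] < 0 and abs(x[i] - x[i + 1]) >= thresh)

def ssc(x, thresh=0.0):
    # Slope sign changes: interior points where the slope reverses direction.
    return sum(1 for i in range(1, len(x) - 1)
               if (x[i] - x[i - 1]) * (x[i] - x[i + 1]) > thresh)

# Example on a toy alternating window:
window = [1.0, -1.0, 1.0, -1.0]
print(mav(window), wl(window), zc(window), ssc(window))  # 1.0 6.0 3 2
```

Each feature collapses a window of raw samples to a single scalar per channel, so a four-feature vector over, say, eight electrodes has 32 components; the abstract's point is that the AMR representation is smaller still.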

Original language: English
Journal: Neural Computing and Applications
Early online date: 16 Mar 2019
DOI: 10.1007/s00521-019-04147-3
Publication status: Early online - 16 Mar 2019

Documents

  • NCAA-D-18-01928_Postprint

    Rights statement: This is a post-peer-review, pre-copyedit version of an article published in Neural Computing and Applications. The final authenticated version is available online at: http://dx.doi.org/10.1007%2Fs00521-019-04147-3.

    Accepted author manuscript (Post-print), 5.37 MB, PDF document


ID: 13590933