TY - JOUR
T1 - A wearable multisensor fusion system for neuroprosthetic hand
AU - Yin, Zongtian
AU - Meng, Jianjun
AU - Shi, Shang
AU - Guo, Weichao
AU - Yang, Xingchen
AU - Ding, Han
AU - Liu, Honghai
N1 - Publisher Copyright:
© 2001-2012 IEEE.
PY - 2025/4/15
Y1 - 2025/4/15
N2 - A neural interface that translates human motor intentions into control commands for prosthetic hands helps amputees restore upper limb function. However, commercial neural interfaces with only a few surface electromyography (sEMG) sensors are constrained by limitations such as low spatiotemporal resolution, a limited number of recognizable hand gestures, and sensitivity to arm position. Multimodal sensor fusion presents a viable approach to overcome these challenges, offering improved accuracy, versatility, and robustness in gesture recognition. In this study, we developed a wearable multisensor fusion system compact enough to be integrated into a prosthetic socket. The fusion probe had dimensions of 38.5 × 20.5 × 13.5 mm, and the signal acquisition/processing device measured 50 × 40 × 15 mm. The fusion system incorporated three types of sensors, capturing muscle movements in terms of morphology (A-mode ultrasound), electrophysiology (sEMG), and kinematics (inertial measurement unit, IMU). Gesture recognition experiments were conducted with 20 subjects, including both healthy individuals and amputees, achieving classification accuracies of 94.8% ± 1.1% and 96.9% ± 1.3%, respectively, for six common gestures. Furthermore, we proposed a new control strategy based on the characteristics of sensor fusion to enhance the stability of online gesture classification. Practical online testing with amputees wearing prostheses indicated that the designed fusion system achieved high classification accuracy and stability during gesture recognition. These results demonstrate that the wearable multisensor fusion system is well-suited for integration into prostheses, offering a robust solution for amputees' practical use.
AB - A neural interface that translates human motor intentions into control commands for prosthetic hands helps amputees restore upper limb function. However, commercial neural interfaces with only a few surface electromyography (sEMG) sensors are constrained by limitations such as low spatiotemporal resolution, a limited number of recognizable hand gestures, and sensitivity to arm position. Multimodal sensor fusion presents a viable approach to overcome these challenges, offering improved accuracy, versatility, and robustness in gesture recognition. In this study, we developed a wearable multisensor fusion system compact enough to be integrated into a prosthetic socket. The fusion probe had dimensions of 38.5 × 20.5 × 13.5 mm, and the signal acquisition/processing device measured 50 × 40 × 15 mm. The fusion system incorporated three types of sensors, capturing muscle movements in terms of morphology (A-mode ultrasound), electrophysiology (sEMG), and kinematics (inertial measurement unit, IMU). Gesture recognition experiments were conducted with 20 subjects, including both healthy individuals and amputees, achieving classification accuracies of 94.8% ± 1.1% and 96.9% ± 1.3%, respectively, for six common gestures. Furthermore, we proposed a new control strategy based on the characteristics of sensor fusion to enhance the stability of online gesture classification. Practical online testing with amputees wearing prostheses indicated that the designed fusion system achieved high classification accuracy and stability during gesture recognition. These results demonstrate that the wearable multisensor fusion system is well-suited for integration into prostheses, offering a robust solution for amputees' practical use.
KW - A-mode ultrasound
KW - fusion system
KW - inertial measurement unit (IMU)
KW - neural prostheses
KW - surface electromyography (sEMG)
UR - http://www.scopus.com/inward/record.url?scp=105003047346&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2025.3546214
DO - 10.1109/JSEN.2025.3546214
M3 - Article
AN - SCOPUS:105003047346
SN - 1530-437X
VL - 25
SP - 12547
EP - 12558
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 8
ER -