Abstract
The ultimate goal of machine learning-based myoelectric control is simultaneous and independent control of multiple degrees of freedom (DOFs), including wrist and digit artificial joints. For prosthetic finger control, regression-based methods are typically used to reconstruct position/velocity trajectories from surface electromyogram (EMG) signals. Although such methods have produced highly accurate results in offline analyses, their success in real-time prosthesis control settings has been rather limited. In this work, we propose action decoding, a paradigm-shifting approach for independent, multi-digit movement intent decoding based on multi-label, multi-class classification. At each moment in time, our algorithm classifies the movement action for each available DOF into one of three categories: open, close, or stall (i.e., no movement). Despite using a classifier as the decoder, arbitrary hand postures are possible with our approach. We analyse a public dataset previously recorded and published by us, comprising measurements from 10 able-bodied and two transradial amputee participants. We demonstrate the feasibility of using our proposed action decoding paradigm to predict movement action for all five digits as well as rotation of the thumb. We perform a systematic offline analysis investigating the effect of various algorithmic parameters on decoding performance, such as feature selection and the choice of classification algorithm and multi-output strategy. The outcomes of the offline analysis presented in this study will be used to inform the real-time implementation of our algorithm. In the future, we will further evaluate its efficacy with real-time control experiments involving upper-limb amputees.
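To make the action-decoding paradigm concrete, the sketch below frames it as multi-label, multi-class classification: each time window of EMG features receives one label per DOF from the set {open, stall, close}, and a posture emerges by integrating the predicted actions over time. Everything here is illustrative — the synthetic features, the choice of a random-forest base classifier, scikit-learn's `MultiOutputClassifier` as the multi-output strategy, and the integration step size are all assumptions, not the authors' implementation.

```python
# Illustrative sketch of per-DOF action decoding (not the paper's code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(0)

N_DOFS = 6                      # five digits + thumb rotation
OPEN, STALL, CLOSE = -1, 0, 1   # three action classes per DOF

# Synthetic stand-in for windowed EMG features (e.g. RMS per channel).
X = rng.normal(size=(500, 16))
# One action label per DOF per window (multi-label, multi-class target).
y = rng.integers(-1, 2, size=(500, N_DOFS))

# One independent classifier per DOF via the multi-output wrapper.
clf = MultiOutputClassifier(
    RandomForestClassifier(n_estimators=50, random_state=0)
)
clf.fit(X, y)

actions = clf.predict(X[:5])    # shape (5, N_DOFS), values in {-1, 0, 1}

# Arbitrary postures arise by integrating actions over time:
# each DOF position moves a small step per window in the decoded direction.
position = np.zeros(N_DOFS)
for a in actions:
    position = np.clip(position + 0.05 * a, 0.0, 1.0)
```

Because the decoder outputs a discrete action rather than a target position, any posture reachable by accumulating open/close steps is attainable, which is how a classifier can still support arbitrary hand configurations.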
Output title: 'Myoelectric digit action decoding with multi-label, multi-class classification: an offline analysis'.