On Tiny Feature Engineering: Towards an Embedded EMG-Based Hand Gesture Recognition Model

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Robotic-based therapy is becoming a popular treatment for the rehabilitation of hand function impairment following stroke. Because of the high-dimensional and poorly interpretable nature of surface electromyography (sEMG) signals, feature engineering becomes particularly important when machine learning models are used to estimate the intention of upper limb movements, especially when an embedded on-board hardware implementation is expected, given the strong computational, energy, and latency constraints. Our work compares the performance of four state-of-the-art feature selection techniques (random forest, minimum redundancy maximum relevance (MRMR), Davies-Bouldin index, and t-tests) when evaluated on an sEMG dataset intended for training hand gesture classifiers. The results of three machine learning algorithms (neural networks, k-nearest neighbors, and bagged forest) are used as a reference to validate the analysis. This ongoing research has revealed valuable information on the potential and constraints of these four feature selection methods for real embedded gesture recognition applications.
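The paper itself is not reproduced on this page, but to illustrate the kind of filter-style feature ranking the abstract refers to, here is a minimal sketch of one of the four listed techniques, a two-sample t-test ranking, applied to synthetic stand-in data (the data, function name, and two-class setup are illustrative assumptions, not the paper's actual dataset or pipeline):

```python
import numpy as np

def t_statistic_ranking(X, y):
    """Rank features by absolute Welch two-sample t-statistic.

    X: (n_samples, n_features) feature matrix; y: binary labels (0/1).
    Returns feature indices sorted from most to least discriminative.
    """
    a, b = X[y == 0], X[y == 1]
    ma, mb = a.mean(axis=0), b.mean(axis=0)
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    t = (ma - mb) / np.sqrt(va / len(a) + vb / len(b))
    return np.argsort(-np.abs(t))

# Synthetic stand-in for windowed sEMG features: feature 0 separates
# the two gesture classes strongly, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.repeat([0, 1], 100)
X[y == 1, 0] += 3.0  # inject class separation into feature 0 only
ranking = t_statistic_ranking(X, y)
print(ranking[0])  # feature 0 should rank first
```

Filter methods like this are attractive for embedded targets because the ranking is computed once offline; only the selected features need to be extracted on-device at inference time.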
Original language: English
Title of host publication: 2024 IEEE/ACM Symposium on Edge Computing (SEC)
Pages: 437-442
Number of pages: 7
ISBN (Electronic): 979-8-3503-7828-3
DOIs
State: Published - 04 Dec 2024
Event: 2024 IEEE/ACM Symposium on Edge Computing (SEC) - Roma, Italy
Duration: 04 Dec 2024 - 07 Dec 2024

Publication series

Name: 2024 IEEE/ACM Symposium on Edge Computing (SEC)

Conference

Conference: 2024 IEEE/ACM Symposium on Edge Computing (SEC)
Country/Territory: Italy
City: Roma
Period: 04/12/24 - 07/12/24
