Decoding Gestures in Electromyography: Spatiotemporal Graph Neural Networks for Generalizable and Interpretable Classification

In recent years, significant strides in deep learning have propelled the advancement of electromyography (EMG)-based upper-limb gesture recognition systems, yielding notable successes across a spectrum of domains, including rehabilitation, orthopedics, robotics, and human-computer interaction. Despite these achievements, prevailing methodologies often overlook the intrinsic physical configurations and interconnectivity of multi-channel sensory inputs, resulting in a failure to adequately capture relational information embedded within the connections of deployed EMG sensor network topology. This oversight poses a significant challenge, impeding the extraction of crucial features from collaborative multi-channel EMG inputs and subsequently constraining model performance, generalizability, and interpretability. To address these limitations, we introduce novel graph structures meticulously crafted to encapsulate the spatial proximity of distributed EMG sensors and the temporal adjacency of EMG signals. Harnessing these tailored graph structures, we present Graph Convolution Network (GCN)-based classification models adept at effectively extracting and aggregating key features associated with various gestures. Our methodology exhibits remarkable efficacy, achieving state-of-the-art performance across five publicly available datasets, thus underscoring its prowess in gesture recognition tasks. Furthermore, our approach provides interpretable insights into muscular activation patterns, thereby reaffirming the practical effectiveness of our GCN model. Moreover, we show the effectiveness of our graph-based input structure and GCN-based classifier in maintaining high accuracy even with reduced sensor configurations, suggesting their potential for seamless integration into AI-powered rehabilitation strategies utilizing EMG-based gesture classification systems.
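
As a rough illustration of the graph construction described in the abstract, the sketch below builds a spatiotemporal adjacency matrix for a hypothetical 8-electrode ring and applies one standard Kipf-style graph convolution step. The electrode layout, window length, feature sizes, and NumPy implementation are assumptions for illustration only, not the authors' configuration or code.

```python
# Illustrative sketch (not the paper's implementation): spatiotemporal graph
# for multi-channel surface EMG plus one graph-convolution step.
import numpy as np

N_CHANNELS = 8      # assumed 8-electrode armband in a ring layout
N_FRAMES = 5        # assumed temporal nodes per channel in one analysis window
N_NODES = N_CHANNELS * N_FRAMES


def spatiotemporal_adjacency():
    """Nodes are (channel, frame) pairs. Spatial edges connect physically
    neighbouring electrodes within a frame; temporal edges connect the same
    electrode across consecutive frames."""
    A = np.zeros((N_NODES, N_NODES))
    idx = lambda c, t: c * N_FRAMES + t
    for c in range(N_CHANNELS):
        for t in range(N_FRAMES):
            # spatial proximity: adjacent electrodes on the assumed ring
            for nc in ((c - 1) % N_CHANNELS, (c + 1) % N_CHANNELS):
                A[idx(c, t), idx(nc, t)] = 1.0
            # temporal adjacency: same electrode, next frame
            if t + 1 < N_FRAMES:
                A[idx(c, t), idx(c, t + 1)] = 1.0
                A[idx(c, t + 1), idx(c, t)] = 1.0
    return A


def gcn_layer(X, A, W):
    """One graph convolution: ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((N_NODES, 16))    # per-node EMG features (e.g. RMS per frame)
    W = rng.standard_normal((16, 32)) * 0.1   # stand-in for learnable weights
    H = gcn_layer(X, spatiotemporal_adjacency(), W)
    print(H.shape)  # (40, 32): node embeddings, which a classifier would pool per gesture
```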

Bibliographic Details
Main Authors: Hunmin Lee, Ming Jiang, Jinhui Yang, Zhi Yang, Qi Zhao
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Transactions on Neural Systems and Rehabilitation Engineering
Subjects: Explainable AI; generalizability; graph neural network; graph representation; hand gesture classification; surface electromyography
Online Access: https://ieeexplore.ieee.org/document/10818442/
author Hunmin Lee
Ming Jiang
Jinhui Yang
Zhi Yang
Qi Zhao
collection DOAJ
format Article
id doaj-art-de97e4d4e9c34adfa25d9f5f61916838
institution Kabale University
issn 1534-4320
1558-0210
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Transactions on Neural Systems and Rehabilitation Engineering
volume 33
pages 404-419
doi 10.1109/TNSRE.2024.3523943
author_orcid Hunmin Lee https://orcid.org/0000-0001-7595-9791
Ming Jiang https://orcid.org/0000-0001-6439-5476
Jinhui Yang https://orcid.org/0000-0001-8322-1121
Zhi Yang https://orcid.org/0009-0004-8396-2666
Qi Zhao https://orcid.org/0000-0003-3054-8934
author_affiliation Hunmin Lee: Department of Computer Science, College of Engineering and Science, University of Minnesota, Minneapolis, MN, USA
Ming Jiang: Department of Computer Science, College of Engineering and Science, University of Minnesota, Minneapolis, MN, USA
Jinhui Yang: Department of Computer Science, College of Engineering and Science, University of Minnesota, Minneapolis, MN, USA
Zhi Yang: Department of Biomedical Engineering, College of Engineering and Science, University of Minnesota, Minneapolis, MN, USA
Qi Zhao: Department of Computer Science, College of Engineering and Science, University of Minnesota, Minneapolis, MN, USA
title Decoding Gestures in Electromyography: Spatiotemporal Graph Neural Networks for Generalizable and Interpretable Classification
topic Explainable AI
generalizability
graph neural network
graph representation
hand gesture classification
surface electromyography
url https://ieeexplore.ieee.org/document/10818442/
work_keys_str_mv AT hunminlee decodinggesturesinelectromyographyspatiotemporalgraphneuralnetworksforgeneralizableandinterpretableclassification
AT mingjiang decodinggesturesinelectromyographyspatiotemporalgraphneuralnetworksforgeneralizableandinterpretableclassification
AT jinhuiyang decodinggesturesinelectromyographyspatiotemporalgraphneuralnetworksforgeneralizableandinterpretableclassification
AT zhiyang decodinggesturesinelectromyographyspatiotemporalgraphneuralnetworksforgeneralizableandinterpretableclassification
AT qizhao decodinggesturesinelectromyographyspatiotemporalgraphneuralnetworksforgeneralizableandinterpretableclassification