Explainable AI-Enhanced Human Activity Recognition for Human–Robot Collaboration in Agriculture
Main Authors: Lefteris Benos, Dimitrios Tsaopoulos, Aristotelis C. Tagarakis, Dimitrios Kateris, Patrizia Busato, Dionysis Bochtis
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Applied Sciences
Subjects: XGBoost classification; SHapley Additive exPlanations (SHAP); feature importance; field experiment; material handling tasks; wearable sensors
Online Access: https://www.mdpi.com/2076-3417/15/2/650
author | Lefteris Benos; Dimitrios Tsaopoulos; Aristotelis C. Tagarakis; Dimitrios Kateris; Patrizia Busato; Dionysis Bochtis |
collection | DOAJ |
description | This study addresses a critical gap in human activity recognition (HAR) research by enhancing both the explainability and efficiency of activity classification in collaborative human–robot systems, particularly in agricultural environments. While traditional HAR models often prioritize improving overall classification accuracy, they typically lack transparency in how sensor data contribute to decision-making. To fill this gap, this study integrates explainable artificial intelligence, specifically SHapley Additive exPlanations (SHAP), thus enhancing the interpretability of the model. Data were collected from 20 participants who wore five inertial measurement units (IMUs) at various body positions while performing material handling tasks involving an unmanned ground vehicle in a field collaborative harvesting scenario. The results highlight the central role of torso-mounted sensors, particularly in the lumbar region, cervix, and chest, in capturing core movements, while wrist sensors provided useful complementary information, especially for load-related activities. The XGBoost-based model, selected mainly for allowing an in-depth analysis of feature contributions by considerably reducing the complexity of calculations, demonstrated strong performance in HAR. The findings indicate that future research should focus on enlarging the dataset, investigating the use of additional sensors and sensor placements, and performing real-world trials to enhance the model’s generalizability and adaptability for practical agricultural applications. |
format | Article |
id | doaj-art-bce27c104a8741b7b85e2428eb0e5f77 |
institution | Kabale University |
issn | 2076-3417 |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | Applied Sciences 15(2), article 650, published 2025-01-01. DOI: 10.3390/app15020650. Affiliations: Lefteris Benos, Dimitrios Tsaopoulos, Aristotelis C. Tagarakis, Dimitrios Kateris, Dionysis Bochtis: Institute for Bio-Economy and Agri-Technology (IBO), Centre of Research and Technology-Hellas (CERTH), 6th km Charilaou-Thermi Rd., 57001 Thessaloniki, Greece; Patrizia Busato: Interuniversity Department of Regional and Urban Studies and Planning (DIST), Polytechnic of Turin, Viale Mattioli 39, 10125 Torino, Italy |
title | Explainable AI-Enhanced Human Activity Recognition for Human–Robot Collaboration in Agriculture |
topic | XGBoost classification; SHapley Additive exPlanations (SHAP); feature importance; field experiment; material handling tasks; wearable sensors |
url | https://www.mdpi.com/2076-3417/15/2/650 |