Human–Robot Interaction through Dynamic Movement Recognition for Agricultural Environments
In open-field agricultural environments, inherently unpredictable conditions pose significant challenges for effective human–robot interaction. This study aims to enhance natural communication between humans and robots under such conditions by converting the detection of a range of dynamic human movements into specific robot actions. Various machine learning models were evaluated for classifying these movements, with Long Short-Term Memory (LSTM) networks demonstrating the highest performance. Furthermore, the Robot Operating System (ROS, Melodic version) was employed to translate the recognized movements into actions performed by an unmanned ground vehicle (UGV). The novel interaction framework, which exploits vision-based human activity recognition, was successfully tested in three orchard scenarios: (a) the UGV following an authorized participant; (b) GPS-based navigation to a specified site of the orchard; and (c) a combined harvesting scenario in which the UGV follows participants and assists by transporting crates from the harvest site to designated locations. The main challenge was the precise detection of the dynamic hand gesture “come”, alongside navigation through intricate environments with complex backgrounds and obstacle avoidance. Overall, this study lays a foundation for future advancements in human–robot collaboration in agriculture, offering insights into how integrating dynamic human movements can enhance natural communication, trust, and safety.
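To make the pipeline described in the abstract more concrete, the following minimal sketch (not the authors' code) shows how an LSTM could classify fixed-length sequences of body keypoints into gesture classes. The sequence length, keypoint count, class set, and layer sizes are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, assuming gestures are classified from fixed-length sequences of
# 2-D body keypoints. All dimensions and class names below are illustrative.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN, N_KEYPOINTS, N_CLASSES = 30, 33, 4   # e.g., 30 frames of 33 (x, y) keypoints

model = keras.Sequential([
    layers.Input(shape=(SEQ_LEN, N_KEYPOINTS * 2)),   # flattened (x, y) per frame
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(32),
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),    # e.g., come / stop / go / none
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data only to show the expected tensor shapes.
X = np.random.rand(8, SEQ_LEN, N_KEYPOINTS * 2).astype("float32")
y = np.random.randint(0, N_CLASSES, size=8)
model.fit(X, y, epochs=1, verbose=0)
print(model.predict(X[:1]).argmax(axis=-1))
```

On the robot side, the abstract states that ROS (Melodic) was used to interpret the recognized movements as UGV actions. The sketch below illustrates one plausible way to do this mapping as a small ROS node; the topic names (/gesture/label, /cmd_vel), message types, and velocity values are hypothetical assumptions, and the paper's actual integration may differ.

```python
#!/usr/bin/env python
# Minimal sketch, assuming a perception node publishes the classified gesture label
# as a std_msgs/String and the UGV base accepts velocity commands on /cmd_vel.
import rospy
from std_msgs.msg import String
from geometry_msgs.msg import Twist

class GestureToAction(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/gesture/label", String, self.on_gesture)

    def on_gesture(self, msg):
        cmd = Twist()
        if msg.data == "come":      # approach/follow the operator
            cmd.linear.x = 0.3
        elif msg.data == "stop":    # halt immediately
            cmd.linear.x = 0.0
        # Other gestures (e.g., "go to site") would instead trigger a move_base goal.
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("gesture_to_action")
    GestureToAction()
    rospy.spin()
```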
| Main Authors: | Vasileios Moysiadis, Lefteris Benos, George Karras, Dimitrios Kateris, Andrea Peruzzi, Remigio Berruto, Elpiniki Papageorgiou, Dionysis Bochtis |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-08-01 |
| Series: | AgriEngineering |
| Subjects: | human–robot collaboration; natural communication framework; vision-based human activity recognition; situation awareness |
| Online Access: | https://www.mdpi.com/2624-7402/6/3/146 |
| ISSN: | 2624-7402 |
| DOI: | 10.3390/agriengineering6030146 |
| Volume/Issue/Pages: | Vol. 6, No. 3, pp. 2494–2512 |
| Author Affiliations: | Vasileios Moysiadis, Lefteris Benos, Dimitrios Kateris, and Dionysis Bochtis: Institute for Bio-Economy and Agri-Technology (IBO), Centre of Research and Technology-Hellas (CERTH), 6th km Charilaou-Thermi Rd., 57001 Thessaloniki, Greece; George Karras: Department of Informatics and Telecommunications, University of Thessaly, 35100 Lamia, Greece; Andrea Peruzzi: Department of Agronomy and Agroecosystem Management, University of Pisa, Via S. Michele degli Scalzi 2, 56124 Pisa, Italy; Remigio Berruto: Interuniversity Department of Regional and Urban Studies and Planning, University of Torino, Viale Mattioli 39, 10125 Torino, Italy; Elpiniki Papageorgiou: Department of Energy Systems, University of Thessaly, Gaiopolis Campus, 41500 Larisa, Greece |