Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors
Aerial manipulation is becoming increasingly important for practical applications of unmanned aerial vehicles (UAVs) that select, transport, and place objects in global space. In this paper, an aerial manipulation system consisting of a UAV, two onboard cameras, and a multi-fingered robotic hand with proximity sensors is developed...
Main Authors: | Ryuki Sato; Etienne Marco Badard; Chaves Silva Romulo; Tadashi Wada; Aiguo Ming |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2025-01-01 |
Series: | Sensors |
Subjects: | aerial manipulation; unmanned aerial vehicle flight control; tracking camera; depth camera; robotic hand; proximity sensors |
Online Access: | https://www.mdpi.com/1424-8220/25/2/470 |
_version_ | 1832587501878902784 |
---|---|
author | Ryuki Sato; Etienne Marco Badard; Chaves Silva Romulo; Tadashi Wada; Aiguo Ming |
author_facet | Ryuki Sato; Etienne Marco Badard; Chaves Silva Romulo; Tadashi Wada; Aiguo Ming |
author_sort | Ryuki Sato |
collection | DOAJ |
description | Aerial manipulation is becoming increasingly important for practical applications of unmanned aerial vehicles (UAVs) that select, transport, and place objects in global space. In this paper, an aerial manipulation system consisting of a UAV, two onboard cameras, and a multi-fingered robotic hand with proximity sensors is developed. To achieve self-contained autonomous navigation to a targeted object, onboard tracking and depth cameras are used to detect the target and to guide the UAV to it, even in a Global Positioning System-denied environment. The robotic hand can stably perform proximity-sensor-based grasping of an object that lies within a position error tolerance (a circle with a radius of 50 mm) around the center of the hand. Therefore, to grasp the object successfully, the position error of the hand (i.e., of the UAV) while hovering after reaching the target must remain below this tolerance. To meet this requirement, an object detection algorithm that supports accurate target localization by combining information from both cameras was developed. In addition, the camera mount orientation and the UAV attitude sampling rate were determined experimentally, and it was confirmed that these measures reduced the UAV position error to within the grasping tolerance of the robotic hand. Finally, aerial manipulation experiments using the developed system demonstrated successful grasping of the targeted object. |
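The description quantifies the grasping condition (the hovering position error of the hand, i.e., of the UAV, must stay inside a 50 mm radius circle around the hand center) and mentions target localization from the onboard depth camera. The sketch below only illustrates that check under assumed names and values: the pinhole back-projection, the camera intrinsics, and every identifier are hypothetical and are not taken from the paper's implementation.

```python
import numpy as np

# Grasping tolerance stated in the abstract: a circle with a 50 mm radius
# around the center of the multi-fingered hand.
GRASP_TOLERANCE_M = 0.050

def pixel_to_camera_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a detected pixel (u, v) with its measured depth into a 3D
    point in the depth-camera frame using a pinhole model (assumed intrinsics)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def within_grasp_tolerance(target_xy, hand_center_xy, tol=GRASP_TOLERANCE_M):
    """Return True if the horizontal error between the localized target and the
    hand center lies inside the tolerance circle while the UAV hovers."""
    error = np.linalg.norm(np.asarray(target_xy, dtype=float) -
                           np.asarray(hand_center_xy, dtype=float))
    return error <= tol

# Hypothetical example: a detection at pixel (412, 305) with 0.90 m measured depth.
target = pixel_to_camera_point(412, 305, 0.90, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
print(within_grasp_tolerance(target[:2], hand_center_xy=(0.0, 0.03)))
```

In the actual system the check would presumably be made after transforming the camera-frame point into the UAV body or world frame; that frame handling is omitted here for brevity.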
format | Article |
id | doaj-art-7447a49abfa641bfb4f7e2b42f1437ca |
institution | Kabale University |
issn | 1424-8220 |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj-art-7447a49abfa641bfb4f7e2b42f1437ca | 2025-01-24T13:49:03Z | eng | MDPI AG | Sensors | 1424-8220 | 2025-01-01 | vol. 25, no. 2, 470 | 10.3390/s25020470 | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors | Ryuki Sato; Etienne Marco Badard; Chaves Silva Romulo; Tadashi Wada; Aiguo Ming (all: Department of Mechanical and Intelligent Systems Engineering, The University of Electro-Communications, Tokyo 1828585, Japan) | Aerial manipulation is becoming increasingly important for practical applications of unmanned aerial vehicles (UAVs) that select, transport, and place objects in global space. In this paper, an aerial manipulation system consisting of a UAV, two onboard cameras, and a multi-fingered robotic hand with proximity sensors is developed. To achieve self-contained autonomous navigation to a targeted object, onboard tracking and depth cameras are used to detect the target and to guide the UAV to it, even in a Global Positioning System-denied environment. The robotic hand can stably perform proximity-sensor-based grasping of an object that lies within a position error tolerance (a circle with a radius of 50 mm) around the center of the hand. Therefore, to grasp the object successfully, the position error of the hand (i.e., of the UAV) while hovering after reaching the target must remain below this tolerance. To meet this requirement, an object detection algorithm that supports accurate target localization by combining information from both cameras was developed. In addition, the camera mount orientation and the UAV attitude sampling rate were determined experimentally, and it was confirmed that these measures reduced the UAV position error to within the grasping tolerance of the robotic hand. Finally, aerial manipulation experiments using the developed system demonstrated successful grasping of the targeted object. | https://www.mdpi.com/1424-8220/25/2/470 | aerial manipulation; unmanned aerial vehicle flight control; tracking camera; depth camera; robotic hand; proximity sensors |
spellingShingle | Ryuki Sato; Etienne Marco Badard; Chaves Silva Romulo; Tadashi Wada; Aiguo Ming | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors | Sensors | aerial manipulation; unmanned aerial vehicle flight control; tracking camera; depth camera; robotic hand; proximity sensors |
title | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors |
title_full | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors |
title_fullStr | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors |
title_full_unstemmed | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors |
title_short | Development of an Aerial Manipulation System Using Onboard Cameras and a Multi-Fingered Robotic Hand with Proximity Sensors |
title_sort | development of an aerial manipulation system using onboard cameras and a multi fingered robotic hand with proximity sensors |
topic | aerial manipulation; unmanned aerial vehicle flight control; tracking camera; depth camera; robotic hand; proximity sensors |
url | https://www.mdpi.com/1424-8220/25/2/470 |
work_keys_str_mv | AT ryukisato developmentofanaerialmanipulationsystemusingonboardcamerasandamultifingeredrobotichandwithproximitysensors AT etiennemarcobadard developmentofanaerialmanipulationsystemusingonboardcamerasandamultifingeredrobotichandwithproximitysensors AT chavessilvaromulo developmentofanaerialmanipulationsystemusingonboardcamerasandamultifingeredrobotichandwithproximitysensors AT tadashiwada developmentofanaerialmanipulationsystemusingonboardcamerasandamultifingeredrobotichandwithproximitysensors AT aiguoming developmentofanaerialmanipulationsystemusingonboardcamerasandamultifingeredrobotichandwithproximitysensors |