Terrain Traversability via Sensed Data for Robots Operating Inside Heterogeneous, Highly Unstructured Spaces

Bibliographic Details
Main Authors: Amir Gholami, Alejandro Ramirez-Serrano
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/25/2/439
Description
Summary: This paper presents a comprehensive approach to evaluating the ability of multi-legged robots to traverse confined, geometrically complex, unstructured environments. The approach integrates advanced point cloud processing techniques, including voxel filtering, boundary and mesh generation, and dynamic traversability analysis, to enhance the robot’s terrain perception and navigation. The framework was validated through rigorous simulation and experimental testing with humanoid robots, demonstrating its potential for applications in environments with complex features, such as navigation inside collapsed buildings. The results show that the framework gives the robot an enhanced capability to perceive and interpret its surroundings and to adapt to dynamic changes in the environment. The paper contributes to the advancement of robotic navigation and path-planning systems by providing a scalable and efficient framework for environment analysis. Integrating multiple point cloud processing techniques into a single architecture not only improves computational efficiency but also enhances the robot’s interaction with its environment, making it better suited to operating in complex, hazardous, unstructured settings.
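To make the pipeline named in the abstract concrete, the following is a minimal sketch of a generic voxel-filter → mesh-generation → traversability-labeling sequence. It is not the authors' implementation: the use of Open3D, the synthetic terrain, Poisson reconstruction, and all parameter values (5 cm voxel size, 30° slope limit) are illustrative assumptions.

```python
# Hypothetical sketch of a point-cloud traversability pipeline:
# voxel filtering -> normal estimation -> mesh generation -> slope labels.
# Library choice (Open3D) and all parameters are assumptions, not the
# settings used in the paper.
import numpy as np
import open3d as o3d

# Synthetic terrain: a noisy undulating surface standing in for sensed data.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(20000, 2))
z = 0.3 * np.sin(2.0 * xy[:, 0]) + 0.05 * rng.standard_normal(len(xy))
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(np.column_stack([xy, z]))

# 1) Voxel-filtered cloud: thin the raw points to a regular density.
pcd = pcd.voxel_down_sample(voxel_size=0.05)

# 2) Surface normals, required for mesh reconstruction.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.15, max_nn=30))
pcd.orient_normals_to_align_with_direction(np.array([0.0, 0.0, 1.0]))

# 3) Mesh generation (here via Poisson surface reconstruction).
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=7)
mesh.compute_triangle_normals()

# 4) Simple traversability label: a facet counts as walkable when its
# slope relative to gravity stays under an assumed 30-degree limit.
normals = np.asarray(mesh.triangle_normals)
slope = np.degrees(np.arccos(np.clip(np.abs(normals[:, 2]), 0.0, 1.0)))
traversable = slope < 30.0
print(f"{traversable.mean():.1%} of facets under the 30-degree slope limit")
```

A per-facet slope test is only the simplest possible traversability criterion; the dynamic analysis described in the abstract would additionally account for the robot's geometry and re-evaluate labels as the sensed cloud updates.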
ISSN: 1424-8220