Calibration between a panoramic LiDAR and a limited field-of-view depth camera

Abstract Depth cameras and LiDARs are sensing devices widely used in fields such as autonomous driving, navigation, and robotics. Precise calibration between the two is crucial for accurate environmental perception and localization. Methods that use the point cloud features of both sensors to estimate the extrinsic parameters can also be extended to calibrating limited field-of-view (FOV) LiDARs against panoramic LiDARs, which gives them significant research value. However, registering point clouds from two sensors with different fields of view and densities is challenging. This paper proposes methods for automatic calibration of the two sensors by extracting and registering features in three scenarios: environments containing one, two, or three planes. For the one-plane and two-plane scenarios, we propose constructing feature histogram descriptors, based on plane constraints, for the remaining points and using them alongside the planar features for registration. Experimental results on simulated and real-world data demonstrate that the proposed methods achieve precise calibration in all three scenarios, keeping average rotation and translation errors within 2° and 0.05 m, respectively, for a 360° linear LiDAR and a depth camera with a 100° vertical and 70° horizontal field of view.

Bibliographic Details
Main Authors: Weijie Tang, Bin Wang, Longxiang Huang, Xu Yang, Qian Zhang, Sulei Zhu, Yan Ma
Format: Article
Language: English
Published: Springer 2024-12-01
Series: Complex & Intelligent Systems
Subjects: Calibration; LiDAR; Depth camera; Three scenarios; Feature histogram descriptor
Online Access: https://doi.org/10.1007/s40747-024-01710-x
author Weijie Tang
Bin Wang
Longxiang Huang
Xu Yang
Qian Zhang
Sulei Zhu
Yan Ma
collection DOAJ
description Abstract Depth cameras and LiDARs are sensing devices widely used in fields such as autonomous driving, navigation, and robotics. Precise calibration between the two is crucial for accurate environmental perception and localization. Methods that use the point cloud features of both sensors to estimate the extrinsic parameters can also be extended to calibrating limited field-of-view (FOV) LiDARs against panoramic LiDARs, which gives them significant research value. However, registering point clouds from two sensors with different fields of view and densities is challenging. This paper proposes methods for automatic calibration of the two sensors by extracting and registering features in three scenarios: environments containing one, two, or three planes. For the one-plane and two-plane scenarios, we propose constructing feature histogram descriptors, based on plane constraints, for the remaining points and using them alongside the planar features for registration. Experimental results on simulated and real-world data demonstrate that the proposed methods achieve precise calibration in all three scenarios, keeping average rotation and translation errors within 2° and 0.05 m, respectively, for a 360° linear LiDAR and a depth camera with a 100° vertical and 70° horizontal field of view.
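The record stops at the abstract, but the three-plane scenario it mentions admits a standard closed-form initialization once a matched plane is segmented in each sensor. The sketch below illustrates that generic idea only and is not the authors' pipeline: the function name three_plane_extrinsic, the NumPy-only implementation, and the assumption that the three planes are already segmented with consistently oriented unit normals are illustrative choices, and the histogram-descriptor registration used in the one- and two-plane cases is not shown.

# Minimal sketch (not the paper's method): closed-form extrinsics from three matched
# planes, assuming unit normals oriented consistently (e.g. toward each sensor).
import numpy as np

def three_plane_extrinsic(normals_lidar, d_lidar, normals_cam, d_cam):
    """Estimate R, t such that x_lidar = R @ x_cam + t.

    Plane i is n . x + d = 0, observed as (normals_lidar[i], d_lidar[i]) in the
    LiDAR frame and (normals_cam[i], d_cam[i]) in the depth-camera frame.
    """
    N_L = np.asarray(normals_lidar, dtype=float)  # shape (3, 3), one unit normal per row
    N_C = np.asarray(normals_cam, dtype=float)
    d_L = np.asarray(d_lidar, dtype=float)
    d_C = np.asarray(d_cam, dtype=float)

    # Rotation: n_lidar_i ~= R @ n_cam_i, solved by Kabsch/SVD alignment of the normal sets.
    H = N_C.T @ N_L
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1.0
        R = Vt.T @ U.T

    # Translation: substituting x_lidar = R x_cam + t into the LiDAR plane equations
    # gives n_lidar_i . t = d_cam_i - d_lidar_i, a 3x3 linear system in t.
    t = np.linalg.solve(N_L, d_C - d_L)
    return R, t

if __name__ == "__main__":
    # Toy check against a known ground-truth transform.
    R_true = np.array([[0.0, -1.0, 0.0],
                       [1.0,  0.0, 0.0],
                       [0.0,  0.0, 1.0]])
    t_true = np.array([0.1, -0.2, 0.05])
    n_cam = np.eye(3)                 # three mutually orthogonal planes in the camera frame
    d_cam = np.array([1.0, 2.0, 3.0])
    n_lidar = (R_true @ n_cam.T).T
    d_lidar = d_cam - n_lidar @ t_true
    R_est, t_est = three_plane_extrinsic(n_lidar, d_lidar, n_cam, d_cam)
    print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True

Three non-parallel planes fully determine both rotation and translation; with only one or two planes the extrinsic remains under-constrained (e.g. translation along the planes' intersection line is free), which is consistent with the abstract adding plane-constrained feature histogram descriptors over the remaining points in those scenarios.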
format Article
id doaj-art-a1ca524969384d3397a503e52e054ed5
institution Kabale University
issn 2199-4536
2198-6053
language English
publishDate 2024-12-01
publisher Springer
record_format Article
series Complex & Intelligent Systems
spelling Complex & Intelligent Systems, Springer, 2024-12-01, doi:10.1007/s40747-024-01710-x
Author affiliations:
Weijie Tang, Bin Wang, Qian Zhang, Sulei Zhu, Yan Ma: The College of Information, Mechanical, and Electrical Engineering, Shanghai Normal University
Longxiang Huang, Xu Yang: Shenzhen Guangjian Technology Co., Ltd
title Calibration between a panoramic LiDAR and a limited field-of-view depth camera
topic Calibration
LiDAR
Depth camera
Three scenarios
Feature histogram descriptor
url https://doi.org/10.1007/s40747-024-01710-x