
Extrinsic Calibration And Odometry For Camera-LiDAR Systems

Posted on: 2020-04-08
Degree: Master
Type: Thesis
Country: China
Candidate: C H Shi
Full Text: PDF
GTID: 2518306548995299
Subject: Control Science and Engineering
Abstract/Summary:
Most autonomous mobile robots are equipped with monocular cameras and 3D LiDARs to perform vital tasks such as localization and mapping. Camera-LiDAR systems can often provide more accurate and more complete localization and mapping results than an individual sensor. However, to achieve improved performance with such a multi-sensor system, two important problems need to be addressed. First, to register information from multiple sensors, the spatial relationship between the individual sensors has to be determined; this task is called extrinsic calibration. Second, the sensor data must be fused so as to take full advantage of each individual sensor and thus provide improved results; this task is called data fusion. In this paper, we address these two problems and present a two-stage extrinsic calibration method as well as a hybrid-residual-based odometry approach for camera-LiDAR systems.

Our extrinsic calibration method combines a motion-based approach and a mutual-information-based approach. The motion-based approach, referred to as the first-stage calibration, is designed to provide initial calibration results without a given initial guess. The mutual-information-based approach, referred to as the second-stage calibration, takes the results of the first stage as the initial guess and further refines the extrinsic parameters by registering the LiDAR reflectivity information to the camera image intensity information using a metric called mutual information. Furthermore, to achieve high estimation accuracy, we apply a novel occlusion detection algorithm in the second-stage calibration. Our extrinsic calibration method can estimate the relative transformation between the camera and the LiDAR with high accuracy and without requiring an initial guess, allowing us to better register the image and point cloud data.

After calibration, our hybrid-residual-based odometry can be used to provide real-time, accurate odometry estimates. Our approach exploits both direct and indirect image features: the sensor motions are estimated by jointly minimizing reprojection residuals and photometric residuals in a nonlinear optimization procedure. The occlusion detection algorithm is also applied in the odometry pipeline to improve accuracy. Furthermore, we utilize a color-based depth interpolation method to solve the missing-depth problem. Our hybrid-residual-based odometry can generate accurate and complete 3D color maps, which provide an important foundation for advanced applications such as human-computer interaction and 3D reconstruction.

Experiments on both public and self-collected real-world datasets demonstrate the accuracy and robustness of our extrinsic calibration and odometry algorithms. The results suggest that our calibration method can provide accurate extrinsic parameter estimates without using initial values, and that our odometry approach can achieve competitive estimation accuracy and robustness.
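As a rough illustration of the mutual-information-based second-stage calibration described above, the objective can be sketched as follows. This is generic notation assumed for illustration, not the exact formulation from the thesis: X denotes the LiDAR reflectivity values, Y(θ) the image intensities sampled at the pixels onto which the LiDAR points project under a candidate extrinsic transform θ, and H(·) the (joint) entropy estimated from their histograms:

\hat{\theta} = \arg\max_{\theta}\, \mathrm{MI}\big(X, Y(\theta)\big)
             = \arg\max_{\theta}\, \Big[ H(X) + H\big(Y(\theta)\big) - H\big(X, Y(\theta)\big) \Big].

The extrinsic parameters that maximize this mutual information are taken as the refined calibration.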
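Similarly, the hybrid-residual odometry objective can be sketched in generic notation (again an assumption for illustration, not the thesis's exact cost). With T the sensor motion to be estimated, π(·) the camera projection function, u_i the observed pixel location of an indirect feature with 3D point p_i, I_k and I_{k-1} consecutive image intensities, ρ(·) a robust kernel, and λ a weight balancing the two residual types:

\hat{T} = \arg\min_{T}\, \sum_{i \in \mathcal{F}} \rho\Big( \big\| \mathbf{u}_i - \pi(T\,\mathbf{p}_i) \big\|^2 \Big)
        \; + \; \lambda \sum_{j \in \mathcal{P}} \rho\Big( \big( I_k\big(\pi(T\,\mathbf{p}_j)\big) - I_{k-1}(\mathbf{u}_j) \big)^2 \Big),

where \mathcal{F} indexes matched indirect features (reprojection residuals), \mathcal{P} indexes direct intensity measurements (photometric residuals), and the cost is minimized with a nonlinear least-squares solver.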
Keywords/Search Tags: Data Fusion, Extrinsic Calibration, Hybrid Residuals, Odometry, Camera-LiDAR Systems