
Research On Simultaneous Localization And Mapping Based On LiDAR-RGB-IMU Fusion

Posted on: 2023-12-06    Degree: Master    Type: Thesis
Country: China    Candidate: F Z Dong    Full Text: PDF
GTID: 2558306839495574    Subject: Control Science and Engineering
Abstract/Summary:
Simultaneous localization and mapping (SLAM) gives intelligent unmanned systems the ability to localize themselves and build scene maps in unknown environments, and plays an important role in warehousing, logistics, and autonomous driving. In recent years, SLAM algorithms based on a single sensor, especially LiDAR or an RGB camera, have been widely studied. However, in tunnels, open ground, and other scenes lacking structural information, LiDAR-based SLAM fails, and RGB-camera-based SLAM performs poorly under low illumination or sharp illumination changes. To improve the accuracy and robustness of SLAM in a variety of complex environments, this thesis studies a LiDAR-RGB-IMU fusion SLAM algorithm. A comparative analysis of current SLAM algorithms suggests that existing LiDAR-RGB-IMU fusion SLAM has four shortcomings: poor scalability, insufficient data fusion, high computational complexity, and poor robustness. This thesis therefore aims to address these four shortcomings.

First, to address poor scalability, we study the Kalman filter and its limitations from the perspective of the fusion framework and select the Error-State Iterated Kalman Filter (ESIKF) as the core algorithm for multi-sensor fusion. The proposed fusion SLAM algorithm consists of two parts, a LiDAR-IMU odometry (LIO) and an RGB-IMU odometry (VIO). It processes sensor data in time order, so it is suitable for LiDAR and RGB cameras running at arbitrary frequencies.

Second, to address insufficient data fusion and high computational complexity, we study the fusion of LiDAR and RGB data and propose a two-stage landmark initialization algorithm based on triangulation and local-map projection, together with a dynamic factor graph. These methods exploit the constraints between visual landmarks and the LiDAR local point-cloud map to fully integrate the trajectory and map information of LIO and VIO with little computation, ensure consistency between the LIO and VIO trajectories and maps, and alleviate the insufficient fusion of LiDAR and RGB data. In addition, because the LiDAR-RGB fusion algorithm is designed around the local point-cloud map, it imposes no restrictions on the fields of view of the LiDAR and the camera, further improving the generality of the algorithm.

Third, to address poor robustness, we study sensor degradation and propose a degradation detection and optimization algorithm together with a dynamic update strategy. By analyzing a variety of LiDAR degradation scenarios and the state update methods derived from the observation equations, a degradation detection and state update method based on the eigenvalues of the Kalman gain matrix is proposed to reduce the localization error of LIO in degraded scenes. By studying the characteristics of LiDAR point clouds and RGB images, a dynamic update strategy is proposed: only when the LiDAR is degraded is the system state updated with the RGB observation equation. As a result, the proposed algorithm can localize accurately when either the LiDAR or the RGB camera degrades, which solves the robustness problem and avoids the growth of localization error caused by blindly fusing all available data.

Finally, the experimental conditions and data sets are introduced and the evaluation metrics are selected. Experiments on several sensor-degraded and non-degraded data sets, compared against a variety of state-of-the-art methods, demonstrate the effectiveness of the proposed methods. They maintain high accuracy in both degraded and non-degraded scenes, are applicable to LiDAR and RGB cameras of different types and fields of view, and keep the computational cost low.
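To illustrate the first contribution, the following is a minimal Python sketch of how a time-ordered ESIKF fusion loop over LIO and VIO measurements might be organized. The class name SequentialFusion and the esikf.propagate_to / esikf.iterated_update interface are assumptions for illustration only, not the thesis implementation.

```python
import heapq
import itertools

class SequentialFusion:
    """Process LiDAR and camera measurements strictly in timestamp order,
    each one triggering IMU propagation followed by an iterated ESIKF update.
    All interfaces below are assumed for the sake of the sketch."""

    def __init__(self, esikf, lio_model, vio_model):
        self.esikf = esikf            # error-state iterated Kalman filter (assumed API)
        self.lio_model = lio_model    # LiDAR-IMU observation model
        self.vio_model = vio_model    # RGB-IMU observation model
        self._queue = []              # min-heap of (timestamp, tie-breaker, kind, data)
        self._counter = itertools.count()

    def add_measurement(self, timestamp, kind, data):
        # kind is "lidar" or "camera"; the counter breaks timestamp ties
        heapq.heappush(self._queue, (timestamp, next(self._counter), kind, data))

    def spin_once(self):
        if not self._queue:
            return None
        timestamp, _, kind, data = heapq.heappop(self._queue)
        self.esikf.propagate_to(timestamp)    # IMU forward propagation to measurement time
        model = self.lio_model if kind == "lidar" else self.vio_model
        self.esikf.iterated_update(model, data)
        return self.esikf.state()
```

Because each measurement is handled independently at its own timestamp, the loop places no constraint on the relative rates of the LiDAR and the camera, which is the scalability property claimed above.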
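For the second contribution, the sketch below shows one way a two-stage landmark initialization coupling the LiDAR local point-cloud map with triangulation could look. The abstract does not specify the ordering of the two stages; this sketch assumes LiDAR-map projection is tried first and multi-view triangulation is the fallback. The camera.project / camera.backproject interface and all thresholds are hypothetical.

```python
import numpy as np

def initialize_landmark(pixel, camera, local_map_points, observations,
                        max_pixel_dist=3.0):
    """Return a 3D landmark position in the world frame, or None.

    pixel:            (2,) image coordinates of the tracked feature
    local_map_points: (k, 3) points from the LiDAR local map (world frame)
    observations:     list of (3x4 projection matrix, (u, v)) feature tracks
    """
    # Stage 1: depth from the LiDAR local map. Project map points into the
    # current image and take the depth of the closest projection.
    if len(local_map_points) > 0:
        uv, depth = camera.project(local_map_points)   # (k, 2) pixels, (k,) depths
        dist = np.linalg.norm(uv - pixel, axis=1)
        nearest = np.argmin(dist)
        if dist[nearest] < max_pixel_dist:
            return camera.backproject(pixel, depth[nearest])
    # Stage 2: fall back to multi-view triangulation when no LiDAR depth is found.
    if len(observations) >= 2:
        return triangulate(observations)
    return None

def triangulate(observations):
    """Linear (DLT) triangulation from (3x4 projection matrix, pixel) pairs."""
    rows = []
    for P, (u, v) in observations:
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]
    return X[:3] / X[3]
```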
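For the third contribution, the following sketch illustrates eigenvalue-based degeneracy detection combined with the dynamic update strategy (RGB observations are used only when the LiDAR is degraded). The thesis derives its test from the eigenvalues of the Kalman gain matrix; as a simplified stand-in, this sketch inspects the observation information matrix H^T R^{-1} H, a common proxy for LiDAR degeneracy. The threshold and the observation/filter interfaces are illustrative assumptions.

```python
import numpy as np

def is_degenerate(H, R, eig_threshold=100.0):
    """Flag degeneracy when the weakest observable direction is poorly constrained.

    H: (m, n) stacked observation Jacobian (e.g. point-to-plane residuals)
    R: (m, m) measurement noise covariance
    """
    info = H.T @ np.linalg.inv(R) @ H      # observation information matrix
    eigvals = np.linalg.eigvalsh(info)     # ascending eigenvalues (symmetric matrix)
    return eigvals[0] < eig_threshold

def dynamic_update(esikf, lidar_obs, rgb_obs, eig_threshold=100.0):
    """Dynamic update strategy: always apply the LiDAR update; bring in the RGB
    observation only when the LiDAR is degenerate, to avoid blind fusion."""
    H, R = lidar_obs.jacobian(), lidar_obs.noise_cov()   # assumed interface
    lidar_degenerate = is_degenerate(H, R, eig_threshold)
    esikf.iterated_update(lidar_obs)                     # LiDAR update always runs
    if lidar_degenerate and rgb_obs is not None:
        esikf.iterated_update(rgb_obs)                   # RGB used only in degraded scenes
    return lidar_degenerate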
Keywords/Search Tags: intelligent unmanned system, SLAM, ESIKF, multi-sensor fusion, sensor degeneration, factor graph optimization