
Multi-Sensor Based Simultaneous Localization And Mapping Techniques

Posted on: 2021-05-16
Degree: Master
Type: Thesis
Country: China
Candidate: K X Wang
Full Text: PDF
GTID: 2428330611496471
Subject: Instrument Science and Technology

Abstract/Summary:
Simultaneous localization and mapping (SLAM) is one of the key techniques in the fields of mobile robotics and intelligent driving. However, as scenes grow more complex and the requirements on localization and mapping rise, SLAM systems built on a single sensor can no longer meet the demands for localization stability and mapping accuracy. This thesis therefore proposes a multi-sensor SLAM technique that fuses a multi-line lidar, a camera, and an IMU, aiming at simultaneous localization and mapping with better stability and accuracy in complex environments.

First, the multi-line lidar and the camera are jointly calibrated to solve for the extrinsic parameters between the lidar coordinate system and the camera coordinate system. Owing to the sparsity of the camera's sampled pixels and of the multi-line lidar point clouds, it is difficult to establish point-to-point correspondences between them. A surface-to-surface matching method is therefore designed: a specific plane is segmented from the point cloud, the 3D points corresponding to the plane recovered from the image are reconstructed, and a point cloud registration algorithm then yields the 3D-to-3D transformation matrix, from which the extrinsic parameter matrix between the camera coordinate system and the multi-line lidar coordinate system is obtained. This 3D-to-3D calibration reduces the errors caused by the sparseness of image pixels and point clouds.

Next, a linear interpolation algorithm is used to align the timestamps, resolving the temporal inconsistency caused by the different working frequencies of the sensors.

Finally, the odometry obtained by combining the front-end odometers of the monocular visual SLAM and the multi-line lidar SLAM serves as the front-end odometry of the system. An error function is built from image feature information and point cloud geometric feature information and taken as the back-end optimization objective; the optimal pose estimate is then obtained with a graph optimization algorithm, and a sliding window is used to maintain the global map.

In summary, this thesis builds a multi-sensor fusion SLAM system integrating the camera, IMU, and multi-line lidar through joint camera-lidar calibration and linear timestamp interpolation, combining the monocular SLAM and multi-line lidar SLAM algorithms. Experimental results show that, compared with the LOAM algorithm, the proposed system estimates localization more stably and maps with higher accuracy in complex environments.
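The abstract does not specify which registration algorithm recovers the 3D-to-3D transformation between the segmented lidar plane points and the plane points recovered from the image. As a minimal sketch under that assumption, the closed-form SVD (Kabsch) solution below estimates the rotation and translation from known point correspondences; the function name and array layout are illustrative, not from the thesis:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ R @ src + t,
    from N corresponding 3D points (both arrays are N x 3), via SVD."""
    src_c = src.mean(axis=0)                    # centroids of each cloud
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)         # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

The recovered (R, t) pair plays the role of the extrinsic parameter matrix between the lidar and camera coordinate systems; in practice the correspondences would come from the segmented plane, not from arbitrary points.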
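The timestamp alignment step can likewise be sketched. Assuming each sensor stream is a sorted list of (timestamp, measurement) samples, the snippet below linearly interpolates one stream at a query time taken from another sensor's clock; the function name and 1-D measurement type are illustrative simplifications:

```python
import bisect

def interpolate_at(stamps, values, t):
    """Linearly interpolate a measurement stream at query time t.

    stamps : sorted timestamps (seconds) of one sensor
    values : scalar measurement at each timestamp
    t      : query timestamp from the other sensor's clock
    """
    if t <= stamps[0]:                  # clamp outside the recorded range
        return values[0]
    if t >= stamps[-1]:
        return values[-1]
    i = bisect.bisect_right(stamps, t)  # first sample strictly after t
    t0, t1 = stamps[i - 1], stamps[i]
    w = (t - t0) / (t1 - t0)            # interpolation weight in [0, 1)
    return (1.0 - w) * values[i - 1] + w * values[i]
```

For pose-like quantities the same weighting would be applied per component (with spherical interpolation for orientations), but the scalar case is enough to show how the differing sensor frequencies are reconciled.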
Keywords/Search Tags: Multi-line lidar, Camera, IMU, Multi-sensor fusion, SLAM, Pose estimation