
Research On SLAM Algorithm Of Mobile Robot Fusion Of 2D Laser And Depth Camera

Posted on: 2021-05-20
Degree: Master
Type: Thesis
Country: China
Candidate: S Z He
Full Text: PDF
GTID: 2428330605956937
Subject: Mechanical engineering
Abstract/Summary:
Simultaneous Localization and Mapping (SLAM) is one of the key technologies that enable autonomous robot motion. From early lidar-based systems to today's popular visual SLAM, most SLAM methods rely on a single sensor to collect environmental information. However, a 2D lidar captures only one plane of the environment and cannot fully reconstruct the real scene, while a visual camera provides rich information and can recover local details of the environment but consumes substantial computing resources; the maps it generates cannot be applied directly to subsequent tasks such as navigation, and vision-only methods suffer from low accuracy, susceptibility to interference, and poor robustness. Multi-sensor fusion is therefore the development trend of SLAM. In this thesis, multi-sensor fusion is based on a lidar, an RGB-D camera, wheel encoders, an IMU, and related sensors, and theoretical and experimental research is conducted on multi-sensor calibration, state estimation, and graph-optimization SLAM. The main contents are as follows.

(1) Construction and modeling of a multi-sensor mobile robot experimental platform. First, the experimental platform is built, the robot coordinate systems are established, and the differential-drive kinematics model is derived. Second, the sensor models are introduced, including the odometry model, the lidar model, and the depth camera model. Finally, the intrinsic parameters of the depth camera are calibrated.

(2) Extrinsic calibration of the multi-sensor system. The key problem of unifying data across the multi-sensor system is explained, the principles of joint extrinsic calibration between the odometry and the lidar, and between the lidar and the RGB-D camera, are derived, and the extrinsic calibration of the robot is completed through designed experiments.

(3) Multi-sensor fusion localization. Starting from Bayes' rule, the Unscented Kalman Filter (UKF) is applied to the multi-sensor SLAM problem: multi-sensor data are fused with the UKF, the state equation of the system is derived, and sensor information is incorporated into the system through a pose measurement model. The designed algorithm shows that UKF-fused localization is much more accurate than odometry dead reckoning, and further improves on the EKF algorithm. Loop-closure test results show that UKF localization accuracy is 1.9% higher than pure laser localization.

(4) Research on a multi-sensor SLAM scheme. Within a graph-optimization framework, 3D point-cloud information generated by the RGB-D camera is added on top of the 2D lidar. In the sequential registration stage, the 2D laser point cloud and the 3D RGB-D point cloud are matched with the CSM method; in the loop-detection stage, the 3D point cloud is encoded by a descriptor, and candidate loops found by 2D laser matching are further verified with the 3D descriptor. Experimental verification shows that the multi-sensor SLAM framework designed in this thesis produces good maps and that the algorithm is strongly robust.

Figures: 48; Tables: 10; References: 76.
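The differential-drive kinematics model mentioned in item (1) can be sketched as a simple dead-reckoning update; the function name and parameters below are illustrative assumptions, not taken from the thesis:

```python
import math

def diff_drive_update(x, y, theta, v_l, v_r, wheel_base, dt):
    """Propagate a differential-drive pose (x, y, theta) one time step.

    v_l, v_r: left/right wheel linear velocities (m/s)
    wheel_base: distance between the two drive wheels (m)
    dt: time step (s)
    """
    v = (v_r + v_l) / 2.0              # forward velocity of the chassis
    omega = (v_r - v_l) / wheel_base   # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Equal wheel speeds drive the robot straight ahead, while opposite speeds rotate it in place; this simple Euler integration is what odometry dead reckoning accumulates, and its drift is what the UKF fusion in item (3) is meant to correct.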
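The UKF predict-update cycle described in item (3) can be sketched generically as follows. This is a minimal unscented-transform implementation under assumed interfaces (a motion model `f(x, u)` and a measurement model `h(x)` supplied by the caller); it is not the thesis's specific state equation or pose measurement model:

```python
import numpy as np

def sigma_points(mu, P, kappa=1.0):
    """Generate 2n+1 sigma points and their weights for state (mu, P)."""
    n = mu.size
    S = np.linalg.cholesky((n + kappa) * P)   # matrix square root, columns scaled
    pts = np.vstack([mu, mu + S.T, mu - S.T]) # center point plus +/- spread points
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return pts, w

def ukf_step(mu, P, u, z, f, h, Q, R, kappa=1.0):
    """One UKF predict-update cycle.

    f(x, u): motion model (e.g. odometry propagation)
    h(x):    measurement model (e.g. pose observed by laser matching)
    Q, R:    process and measurement noise covariances
    """
    # Predict: push sigma points through the motion model.
    X, w = sigma_points(mu, P, kappa)
    Xp = np.array([f(x, u) for x in X])
    mu_p = w @ Xp
    P_p = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, Xp - mu_p))
    # Update: push predicted sigma points through the measurement model.
    X2, w2 = sigma_points(mu_p, P_p, kappa)
    Z = np.array([h(x) for x in X2])
    z_hat = w2 @ Z
    S = R + sum(wi * np.outer(d, d) for wi, d in zip(w2, Z - z_hat))
    C = sum(wi * np.outer(dx, dz) for wi, dx, dz in zip(w2, X2 - mu_p, Z - z_hat))
    K = C @ np.linalg.inv(S)                  # Kalman gain
    mu_new = mu_p + K @ (z - z_hat)
    P_new = P_p - K @ S @ K.T
    return mu_new, P_new
```

Unlike the EKF, no Jacobians are needed: the sigma points carry the mean and covariance through the (possibly nonlinear) models directly, which is one reason a UKF can outperform an EKF on the same fusion problem.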
Keywords/Search Tags: mobile robot, simultaneous localization and mapping, multi-sensor fusion, lidar, RGB-D camera