
SLAM Based on Multi-Sensor Fusion for Mobile Robots

Posted on: 2021-06-22    Degree: Master    Type: Thesis
Country: China    Candidate: C Zhou
GTID: 2518306551952639    Subject: Master of Engineering
Abstract/Summary:
Ego-motion estimation is a basic requirement for most mobile robot applications. Through sensor fusion, the shortcomings of individual sensors can be compensated for, yielding more reliable estimates. This paper introduces a new method that tightly couples a lidar, an inertial measurement unit (IMU), and a wheel-speed odometer to achieve real-time six-degree-of-freedom pose estimation of the robot, outputting accurate state estimates at a high update frequency. By jointly minimizing a cost function over the lidar, wheel-odometer, and IMU measurements, the drift of the lidar-wheel-IMU odometry remains within an acceptable range after long-term operation, even in challenging situations such as degraded lidar measurements. Using the wheel odometer's velocity measurements instead of the IMU accelerometer makes the whole system more stable and yields more reliable lidar pose estimates. Experimental results show that the proposed method estimates the sensor pose with high accuracy even under fast motion or when feature points are scarce, and outputs at the IMU update rate.

The method is ground-optimized in that it exploits segmentation and optimization of the ground plane. First, point cloud segmentation is applied to filter out noise; feature extraction is then performed to obtain distinctive planar and edge features. A graph-optimization method based on Bayesian estimation is then used to fuse the sensors (minimal illustrative sketches of these two steps follow the abstract). The author compares this scheme with the state-of-the-art method LOAM on variable terrain with ground vehicles and shows that it achieves similar or higher accuracy while reducing computational cost. GPS is also integrated into the SLAM framework to eliminate drift in the pose estimate, and the system is tested on the KITTI dataset.

This paper proposes to fuse information from a 3D lidar, an IMU, and a wheel-speed sensor using tightly coupled state estimation. Taking the prior information of the lidar-IMU odometry into account, a rotation-constraint refinement method further optimizes the final pose and the generated point cloud map. This guarantees the consistency and robustness of the estimates, even in the case of lidar degradation. The main research results of this thesis include:
1. The frame-to-frame lidar matching constraints are solved with a closed-form analytical solution instead of a numerical one, avoiding the unsolvable states that laser degradation can cause.
2. The combined data of the IMU and the wheel-speed odometer is used as a prior for laser registration and optimized jointly with it.
3. GPS is added, and its position measurements are used as constraints to recover the global position of the robot.
4. A complete, unified SLAM system based on laser point cloud registration is designed to meet the real-time localization and high-precision mapping needs of outdoor mobile robots.
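As an illustration of the ground-segmentation step described above, the following is a minimal RANSAC-style ground-plane fit in Python. The abstract does not specify the thesis's segmentation algorithm, so the function name segment_ground_plane, the thresholds, and the toy data below are assumptions for illustration only, not the author's implementation.

# Hypothetical sketch: RANSAC-style ground-plane segmentation of a lidar scan.
# Names, thresholds, and toy data are illustrative, not from the thesis.
import numpy as np

def segment_ground_plane(points, n_iters=100, dist_thresh=0.15, rng=None):
    """Fit the dominant plane a*x + b*y + c*z + d = 0 (assumed to be the ground).

    points: (N, 3) array of lidar points in the sensor frame.
    Returns (plane, ground_mask) where plane = (a, b, c, d) with unit normal.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Sample three points and build a candidate plane from them.
        idx = rng.choice(len(points), size=3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-6:          # nearly collinear sample, skip
            continue
        normal /= norm
        d = -normal @ p0
        # Points whose distance to the plane is below the threshold are inliers.
        dist = np.abs(points @ normal + d)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            best_plane = np.append(normal, d)
    return best_plane, best_inliers

if __name__ == "__main__":
    # Toy scan: a flat ground patch plus some scattered non-ground points.
    rng = np.random.default_rng(0)
    ground = np.column_stack([rng.uniform(-10, 10, 500),
                              rng.uniform(-10, 10, 500),
                              rng.normal(0.0, 0.02, 500)])
    clutter = rng.uniform([-10, -10, 0.5], [10, 10, 3.0], size=(100, 3))
    plane, mask = segment_ground_plane(np.vstack([ground, clutter]))
    print("plane:", np.round(plane, 3), "ground points:", mask.sum())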
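The graph-optimization and GPS-constraint ideas can be sketched in the same spirit: relative-pose factors from the fused odometry and unary GPS position factors are stacked into a single residual vector and minimized jointly, so that the GPS fixes pull the trajectory back toward its true global position. This is a simplified 2D toy built on scipy's least_squares, not the thesis system; the residual weights and the data are assumptions.

# Hypothetical sketch of graph-based fusion: between factors from the fused
# odometry plus unary GPS position factors, minimized jointly over all poses.
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def residuals(x, odom, gps, w_odom=10.0, w_gps=1.0):
    poses = x.reshape(-1, 3)              # each row: (x, y, yaw)
    res = []
    # Relative-pose (between) factors from the lidar/IMU/wheel odometry.
    for i, (dx, dy, dyaw) in enumerate(odom):
        xi, yi, ti = poses[i]
        xj, yj, tj = poses[i + 1]
        c, s = np.cos(ti), np.sin(ti)
        # Predicted motion from pose i to pose i+1, expressed in frame i.
        pred = np.array([ c * (xj - xi) + s * (yj - yi),
                         -s * (xj - xi) + c * (yj - yi),
                         wrap(tj - ti)])
        res.extend(w_odom * (pred - np.array([dx, dy, dyaw])))
    # Unary GPS factors constrain absolute position and remove drift.
    for i, (gx, gy) in gps.items():
        res.extend(w_gps * (poses[i, :2] - np.array([gx, gy])))
    return np.array(res)

if __name__ == "__main__":
    # Toy trajectory: four poses, 1 m forward per step, slightly drifting odometry.
    odom = [(1.0, 0.0, 0.0), (1.0, 0.05, 0.02), (1.0, 0.05, 0.02)]
    gps = {0: (0.0, 0.0), 3: (3.0, 0.0)}   # GPS fixes at the first and last pose
    x0 = np.zeros(4 * 3)                   # initial guess: all poses at the origin
    sol = least_squares(residuals, x0, args=(odom, gps))
    print(sol.x.reshape(-1, 3).round(3))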
Keywords/Search Tags: SLAM, multi-sensor fusion, point cloud registration, graph optimization