
Simultaneous Localization And 3D Environment Mapping Based On Visual-Inertial-Lidar Sensor Fusion

Posted on: 2020-08-20
Degree: Master
Type: Thesis
Country: China
Candidate: J Chen
Full Text: PDF
GTID: 2428330578954855
Subject: Control engineering

Abstract/Summary:
With the development of science and technology in China, unmanned systems are widely used in workshops and factories, and new intelligent mobile robots are gradually entering everyday life. SLAM (Simultaneous Localization and Mapping) is the core technology behind the autonomous localization and navigation of unmanned systems. At present, localization and mapping that relies only on lidar, or only on Visual-Inertial Odometry (VIO), is constrained by the environment. Against this background, this thesis studies the autonomous localization of mobile robots through multi-sensor fusion.

First, to address the motion distortion that lidar suffers under rapid motion, the thesis improves an existing method for fusing lidar with an Inertial Measurement Unit (IMU). The lidar is tightly coupled with the IMU to correct the point cloud, which effectively removes motion-induced distortion, makes lidar scan matching more accurate, and improves the localization accuracy of the system.

In addition, to counter odometry drift caused by insufficient feature constraints in structured scenes and by a lack of visual texture, the thesis fuses a monocular camera with the lidar: depth maps built from lidar points recover the depth of visual feature points, and the resulting visual odometry serves as a prior estimate for the pose of the lidar point cloud. This vision-aided localization compensates for the weaknesses of the individual sensors, overcomes the limitations each sensor faces in certain scenarios, and effectively improves the robustness of the localization system.

The thesis first introduces the hardware platform of the experimental system and the low-level data fusion, including the design of the lidar platform and the principles of extrinsic calibration between the sensors. Building on the hardware, it presents the pipeline of lidar feature extraction, matching, and tightly coupled odometry, expounds the basic algorithms of vision-lidar fusion and visual odometry, and finally describes the mapping process.

Finally, the algorithm is verified through field tests in indoor and outdoor campus scenes. On the Robot Operating System (ROS), visualization tools display the localization and mapping results. The experiments show that the proposed algorithm outperforms the previous lidar-only localization algorithm, with improved localization accuracy and mapping quality.
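As a concrete illustration of the de-skewing step described in the abstract, the following is a minimal Python sketch of motion-distortion correction for one lidar sweep, assuming per-point timestamps and start/end poses obtained from IMU integration. The function name, data layout, and interpolation scheme are illustrative assumptions, not the thesis's implementation.

```python
# Minimal sketch of IMU-aided lidar de-skewing (illustrative only; the
# data layout and function name are assumptions, not the thesis's API).
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew_sweep(points, times, rot_start, rot_end, trans_start, trans_end):
    """Re-express every point of one lidar sweep in the frame at sweep start.

    points        : (N, 3) raw points in the sensor frame at capture time
    times         : (N,) per-point timestamps normalized to [0, 1] over the sweep
    rot_*, trans_*: sensor orientation (scipy Rotation) and position (3,)
                    at sweep start/end, e.g. from IMU integration
    """
    # Interpolate the orientation over the sweep with spherical interpolation
    slerp = Slerp([0.0, 1.0], Rotation.concatenate([rot_start, rot_end]))
    rots = slerp(times)                                  # one rotation per point
    # Linearly interpolate the translation over the sweep
    trans = (1.0 - times)[:, None] * trans_start + times[:, None] * trans_end

    # World coordinates of each point at its own capture instant ...
    world_pts = rots.apply(points) + trans
    # ... re-expressed in the sensor frame at the start of the sweep
    return rot_start.inv().apply(world_pts - trans_start)
```

In a tightly coupled system, the start and end poses would come from IMU pre-integration between scan boundaries, and the corrected cloud is what feeds scan matching.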
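The depth-recovery idea, projecting lidar points into the camera image so that tracked visual features can be assigned metric depth, can be sketched in the same spirit. The intrinsics, extrinsics, and nearest-neighbor association radius below are placeholder assumptions, not values from the thesis.

```python
# Sketch of recovering visual feature depth from a lidar point cloud
# (illustrative; K, R_cl, t_cl, and the association radius are assumptions).
import numpy as np
from scipy.spatial import cKDTree

def recover_feature_depth(features_uv, lidar_pts, K, R_cl, t_cl, radius=3.0):
    """Assign depth to 2D visual features using projected lidar points.

    features_uv : (M, 2) pixel coordinates of tracked visual features
    lidar_pts   : (N, 3) lidar points in the lidar frame
    K           : (3, 3) camera intrinsic matrix
    R_cl, t_cl  : rotation (3, 3) and translation (3,), lidar -> camera
    radius      : pixel search radius for feature-to-lidar association
    """
    # Transform the cloud into the camera frame; keep points in front of it
    pts_cam = lidar_pts @ R_cl.T + t_cl
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]

    # Project onto the image plane to form a sparse depth map
    proj = pts_cam @ K.T
    uv = proj[:, :2] / proj[:, 2:3]

    # For each feature, take the depth of the nearest projected lidar point
    tree = cKDTree(uv)
    dist, idx = tree.query(features_uv, k=1)
    depth = pts_cam[idx, 2]
    depth[dist > radius] = np.nan   # no lidar support near this feature
    return depth
```

Features that receive a valid depth become metric 3D landmarks, which is what allows the monocular visual odometry to serve as a metric prior for the lidar pose estimate.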
Keywords/Search Tags: SLAM, Lidar, Sensor Fusion, IMU