
Simultaneous Localization And Mapping For Mobile Robots Based On Multi-Sensor Data

Posted on: 2018-10-10  Degree: Master  Type: Thesis
Country: China  Candidate: M X Chen  Full Text: PDF
GTID: 2428330623450673  Subject: Computer Science and Technology
Abstract/Summary:
Nowadays, mobile robots are widely used in many fields such as disaster rescue, industrial manufacturing, and domestic service. To achieve autonomous navigation, the problems of precise positioning and map construction must be solved. Simultaneous localization and mapping (SLAM) is a fundamental technique for autonomous navigation and subsequent tasks, and the robustness and accuracy of localization and mapping are particularly important. Lidars, cameras, and other sensors are essential for obtaining motion information and perceiving the environment. However, because of their limitations in environmental perception, SLAM systems based on a single sensor have difficulty obtaining reliable environmental features, especially in unknown scenes, which may cause tracking failure and degrade the quality of map construction. Sensor fusion methods can be applied to SLAM in order to exploit the advantages of different sensors. In addition, since three-dimensional lidars are too costly and bulky for small mobile robots, how to use a two-dimensional laser to build a three-dimensional point cloud map of the environment is a question worth exploring.

Building on existing multi-sensor fusion SLAM methods, we present a new solution for a mobile robot SLAM system based on multi-modal sensor data. Using an extended Kalman filter (EKF), inertial measurements, pose estimates from visual SLAM, and two-dimensional laser data are fused to build a three-dimensional environment map in real time, improving the robustness and accuracy of the SLAM system.

In this thesis, we present an IMU-aided visual SLAM pose estimation scheme based on the EKF. Inertial measurements are used for propagation and prediction, and the localization core of visual SLAM is used for correction: the pose estimates of visual SLAM serve as observations to optimize the state. Based on the pose estimate of the EKF, the data obtained by the two-dimensional lidar are projected, transformed into a common coordinate frame, and aggregated into point cloud units corresponding to certain keyframes of the visual SLAM. Finally, the three-dimensional laser point cloud map is built incrementally.

A ground mobile robot running ROS, an open-source robot operating system, is used as the experimental platform, with an IMU, an RGB-D camera, and a two-dimensional lidar mounted on it. Experiments are conducted in several real scenes. The results show that the IMU-aided visual SLAM achieves robust localization, and that, combined with the pose estimation results, the two-dimensional laser data can be used to construct an accurate 3D laser point cloud map in real time.
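The prediction/correction structure described above can be sketched as follows. This is a minimal hypothetical illustration, not the thesis implementation: the state is simplified to a planar pose [x, y, theta], IMU-integrated motion increments drive the EKF prediction step, and a visual-SLAM pose estimate serves directly as the observation in the correction step (so the measurement Jacobian is the identity). The noise values are arbitrary placeholders.

```python
import numpy as np

class PoseEKF:
    """Sketch of EKF fusion: IMU increments predict, visual SLAM poses correct."""

    def __init__(self):
        self.x = np.zeros(3)          # state: [x, y, theta]
        self.P = np.eye(3) * 0.1      # state covariance
        self.Q = np.eye(3) * 0.01     # process noise (IMU integration error)
        self.R = np.eye(3) * 0.05     # measurement noise (visual SLAM pose)

    def predict(self, dx, dy, dtheta):
        """Propagate the state with IMU-integrated motion increments in the body frame."""
        c, s = np.cos(self.x[2]), np.sin(self.x[2])
        self.x = self.x + np.array([c * dx - s * dy, s * dx + c * dy, dtheta])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -s * dx - c * dy],
                      [0.0, 1.0,  c * dx - s * dy],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def correct(self, z):
        """Correct with a full pose observation from visual SLAM (H = I)."""
        y = z - self.x                   # innovation
        S = self.P + self.R              # innovation covariance
        K = self.P @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K) @ self.P
```

A real system would also handle the full 6-DoF pose, IMU biases, and time alignment between the asynchronous sensor streams; this sketch only shows the filter structure.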
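The projection-and-aggregation step for the 2D lidar can likewise be sketched. In this hypothetical example, each 2D scan (ranges and beam angles) is converted to points in the sensor plane, transformed into the world frame with the rotation and translation of the pose estimated for the corresponding keyframe, and stacked into a single point cloud; the function names are illustrative, not from the thesis.

```python
import numpy as np

def scan_to_points(ranges, angles):
    """Convert a 2D laser scan to Nx3 points in the sensor frame (z = 0)."""
    ranges = np.asarray(ranges, dtype=float)
    angles = np.asarray(angles, dtype=float)
    return np.column_stack([ranges * np.cos(angles),
                            ranges * np.sin(angles),
                            np.zeros_like(ranges)])

def transform_points(points, R, t):
    """Apply a rigid-body transform (3x3 rotation R, translation t) to Nx3 points."""
    return points @ R.T + t

def build_cloud(scans, poses):
    """Aggregate scans taken at different keyframe poses into one world-frame cloud.

    scans: list of (ranges, angles) tuples
    poses: list of (R, t) tuples, one pose per scan
    """
    cloud = []
    for (ranges, angles), (R, t) in zip(scans, poses):
        cloud.append(transform_points(scan_to_points(ranges, angles), R, t))
    return np.vstack(cloud)
```

Because each scan lies in a single plane, the cloud only gains vertical extent as the robot's pose changes, which is why accurate keyframe pose estimates from the EKF are essential for the incremental 3D map.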
Keywords/Search Tags: Mobile Robot, Simultaneous Localization and Mapping, Multi-Sensor Fusion, Extended Kalman Filter, Three-Dimensional Point Cloud Mapping