
Simultaneous Localization And Mapping Of Mobile Robot Based On Multi-Source Perception Information Fusion

Posted on: 2021-03-13    Degree: Master    Type: Thesis
Country: China    Candidate: T Xu
GTID: 2428330602489069    Subject: Control Science and Engineering
Abstract/Summary:
Simultaneous localization and mapping (SLAM) of mobile robots is an important research direction in the field of robotics and one of the key technologies for achieving autonomous navigation. With the continuous progress of artificial intelligence and robot hardware, SLAM has gradually diversified, mainly into two-dimensional laser SLAM and visual SLAM. In relatively complex environments, a mobile robot can perform simultaneous localization and mapping with multiple onboard sensors, such as a two-dimensional laser rangefinder, a depth camera and an IMU. This thesis combines the characteristics of two-dimensional laser SLAM and visual SLAM, together with the respective advantages and disadvantages of the sensors used, to study and implement SLAM for a mobile robot based on multi-source perception information fusion. The main research contents are as follows:

(1) Two-dimensional laser SLAM based on Rao-Blackwellized particle filtering (RBPF) is studied. To address the limitations of the traditional RBPF-SLAM algorithm, namely large odometry error and frequent resampling, an odometry/IMU information fusion algorithm based on the extended Kalman filter is proposed (a sketch of such a fusion step is given after this abstract). Combining the fused odometry/IMU estimate with the observation information, a hybrid proposal distribution based on multi-sensor information is constructed, and selective resampling is performed to improve the RBPF-SLAM algorithm. Finally, the mapping accuracy of the RBPF-SLAM algorithm with odometry/IMU fusion is verified in simulation, its localization accuracy is verified experimentally, and the algorithm is used to construct a grid map of a real scene, which further verifies the effectiveness of the laser SLAM approach.

(2) To address the facts that a two-dimensional laser rangefinder cannot detect obstacles outside its scanning plane, and that a depth camera produces large errors in environments with few features or with overly bright (or dark) lighting, a laser and visual information fusion method is proposed. First, pseudo-laser information is extracted from the depth image obtained by the depth camera; the scan obtained by the two-dimensional laser rangefinder is filtered; a laser information fusion algorithm then merges the two scans, and the fused scan is used for obstacle detection and map construction (see the second sketch below). Simulation verifies that SLAM based on laser and visual information fusion can detect obstacles in the vertical direction and represent them in the grid map, while preserving the localization accuracy of the robot.

(3) To address the failure of inter-frame estimation when the mobile robot rotates rapidly in visual SLAM, the two-dimensional pose of the robot obtained by laser SLAM is used to improve the PnP algorithm, and a Fusion-PnP inter-frame estimation algorithm is proposed (see the third sketch below). Visual SLAM based on the Fusion-PnP algorithm is then realized within the ORB_SLAM2 framework. Experiments were performed to compare ORB_SLAM2 with the Fusion-PnP based visual SLAM while the robot performs rapid rotational motion, and the effectiveness and robustness of the proposed method are verified.
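The abstract does not give the EKF-based odometry/IMU fusion of item (1) in code form; the following is a minimal illustrative sketch, not the thesis's actual implementation. It assumes a planar pose state [x, y, theta], an odometry velocity motion model and an absolute yaw measurement from the IMU; the class name, noise values and interfaces are all hypothetical.

```python
import numpy as np

# Minimal 2D EKF sketch: fuse wheel-odometry motion with an IMU yaw reading.
# State x = [px, py, theta]; all noise values below are illustrative only.

def wrap_angle(a):
    return (a + np.pi) % (2 * np.pi) - np.pi

class OdomImuEKF:
    def __init__(self):
        self.x = np.zeros(3)                    # pose estimate [px, py, theta]
        self.P = np.eye(3) * 1e-3               # state covariance
        self.Q = np.diag([1e-3, 1e-3, 1e-2])    # odometry (process) noise
        self.R = np.array([[1e-4]])             # IMU yaw measurement noise

    def predict(self, v, w, dt):
        """Propagate the pose with the odometry velocities (v, w) over dt."""
        px, py, th = self.x
        self.x = np.array([px + v * dt * np.cos(th),
                           py + v * dt * np.sin(th),
                           wrap_angle(th + w * dt)])
        # Jacobian of the motion model w.r.t. the state
        F = np.array([[1, 0, -v * dt * np.sin(th)],
                      [0, 1,  v * dt * np.cos(th)],
                      [0, 0,  1]])
        self.P = F @ self.P @ F.T + self.Q

    def update_imu_yaw(self, yaw_meas):
        """Correct the heading with an absolute yaw angle from the IMU."""
        H = np.array([[0, 0, 1]])               # measurement observes theta only
        y = wrap_angle(yaw_meas - self.x[2])    # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ np.array([y])).ravel()
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P
```

In an RBPF-SLAM setting, the fused pose from such a filter would replace the raw odometry when drawing particles from the proposal distribution; the exact coupling used in the thesis is not specified here.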
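For item (2), one plausible form of the pseudo-laser extraction and scan fusion is sketched below; this is an assumption about the approach, not the thesis's code. One row of the depth image is projected through a pinhole model onto the lidar's angular grid, and the two scans are merged by keeping the nearer return on each beam. All function names, parameters and sign conventions are illustrative.

```python
import numpy as np

def depth_to_pseudo_scan(depth_row, fx, cx, num_beams, angle_min, angle_max):
    """Project one row of a depth image (metres) into a planar 'pseudo-laser' scan."""
    cols = np.arange(depth_row.shape[0])
    # Bearing of each pixel column under a pinhole model; the sign convention
    # depends on how the camera and laser frames are defined.
    angles = np.arctan2(cols - cx, fx)
    ranges = depth_row / np.cos(angles)          # range along each bearing
    # Re-bin onto the lidar's angular grid, keeping the closest return per beam
    beam_angles = np.linspace(angle_min, angle_max, num_beams)
    scan = np.full(num_beams, np.inf)
    idx = np.clip(np.searchsorted(beam_angles, angles), 0, num_beams - 1)
    for i, r in zip(idx, ranges):
        if np.isfinite(r) and r > 0:
            scan[i] = min(scan[i], r)
    return scan

def fuse_scans(lidar_scan, pseudo_scan):
    """Keep the nearer obstacle reported by either sensor on each beam."""
    return np.minimum(lidar_scan, pseudo_scan)
```

Taking the minimum range per beam lets obstacles seen only by the depth camera (e.g. above or below the lidar plane) enter the fused scan used for grid mapping; the thesis's filtering of the raw lidar scan is omitted here.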
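For item (3), one possible reading of Fusion-PnP, offered here only as an assumption rather than the thesis's exact algorithm, is to seed the iterative PnP solver with an initial camera pose derived from the laser SLAM 2D pose, so that inter-frame estimation does not diverge during fast rotation. The sketch uses OpenCV's solvePnP with useExtrinsicGuess; the transform names (T_base_cam, etc.) and the overall wiring are hypothetical.

```python
import cv2
import numpy as np

def fusion_pnp(obj_pts, img_pts, K, dist, laser_pose_2d, T_base_cam):
    """Iterative PnP seeded with the planar pose (x, y, yaw) from laser SLAM.

    obj_pts:      Nx3 float array of 3D map points.
    img_pts:      Nx2 float array of matched image keypoints.
    laser_pose_2d: (x, y, yaw) of the robot base in the map frame (from laser SLAM).
    T_base_cam:   4x4 calibrated pose of the camera in the robot base frame.
    """
    x, y, yaw = laser_pose_2d
    # Build the 4x4 base pose in the map frame from the planar estimate
    T_map_base = np.eye(4)
    T_map_base[:3, :3] = cv2.Rodrigues(np.array([0.0, 0.0, yaw]))[0]
    T_map_base[:3, 3] = [x, y, 0.0]
    # Camera pose in the map frame, then invert to the map->camera transform PnP expects
    T_map_cam = T_map_base @ T_base_cam
    T_cam_map = np.linalg.inv(T_map_cam)
    rvec0, _ = cv2.Rodrigues(T_cam_map[:3, :3])
    tvec0 = T_cam_map[:3, 3].reshape(3, 1)
    # Refine from the laser-SLAM initial guess instead of starting from scratch
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist,
                                  rvec0, tvec0, useExtrinsicGuess=True,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    return ok, rvec, tvec
```

In a pipeline like ORB_SLAM2, such a seeded solve would be attempted when the standard inter-frame estimate fails or when the gyro reports a high rotation rate; how the thesis integrates Fusion-PnP into the tracking thread is not detailed in this abstract.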
Keywords/Search Tags: multi-source perception information fusion, simultaneous localization and mapping, two-dimensional laser rangefinder, inertial measurement unit, depth camera