
Design And Implementation Of Indoor Autonomous Navigation System Based On Multi-sensor Fusion

Posted on: 2022-01-24
Degree: Master
Type: Thesis
Country: China
Candidate: Y Zhang
Full Text: PDF
GTID: 2518306725969229
Subject: Master of Engineering
Abstract/Summary:
In recent years, robot technology has developed rapidly and now plays a role in military, agricultural, and everyday applications. Autonomous navigation, one of the core technologies of robotics, is steadily maturing. Simultaneous Localization and Mapping (SLAM) is the foundation of autonomous navigation and the key to a robot completing its assigned tasks efficiently in varied environments; accurate positioning is likewise indispensable. In complex indoor environments, however, single-sensor mapping and positioning still have many shortcomings: during mapping, a single lidar cannot detect obstacles outside its scanning plane, and wheel-odometry positioning is too inaccurate to guarantee that the robot completes its tasks reliably. To address these shortcomings, this thesis studies robot mapping and positioning methods suited to indoor environments. The specific research contents are as follows:

(1) A mapping algorithm based on the fusion of lidar and a depth camera. To overcome the inability of a single 2D lidar to detect obstacles outside its detection plane, this thesis proposes a mapping algorithm that fuses lidar and depth-camera data, using a Bayesian inference formula as the fusion scheme. The data from both sensors are preprocessed: the radar data are converted to a common data format, and the depth-camera point cloud is filtered. The depth-camera and lidar coordinate systems are then jointly calibrated, and finally a new SLAM framework is built on top of the Cartographer algorithm. The effectiveness of the fused mapping is verified experimentally.

(2) A multi-sensor fusion positioning module within the SLAM process. Within the established SLAM framework, an extended Kalman filter (EKF) is used to fuse the wheel odometer, IMU, and UWB measurements. First, the UWB data are parsed so that the Jetson Nano onboard computer can read the UWB position data. Two positioning frameworks are then designed for the UWB module, one for map building and one for navigation. Finally, the EKF algorithm is studied and its mathematical model is substituted into the fusion framework, yielding the fusion positioning algorithm used during SLAM.

(3) A positioning module for the navigation process. Building on the EKF fusion result, the adaptive Monte Carlo localization algorithm fuses that result with the lidar scan-matching position estimate; its mathematical model is likewise substituted into the fusion framework, further improving positioning accuracy. To remove the human intervention that global positioning normally requires, a new method uses the UWB module to provide initial position information, records the navigation pose, and automatically updates the particles to refine the initial attitude estimate, giving the robot an accurate initial pose for global positioning without human intervention.

Finally, the autonomous navigation system is verified on a mobile robot in an indoor scenario. The results show that the proposed fusion mapping algorithm constructs a map containing more comprehensive environmental information, and that the UWB-fused positioning algorithm provides higher positioning accuracy and achieves global positioning without human intervention.
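The abstract does not reproduce the Bayesian fusion formula used in contribution (1). A standard way to combine occupancy evidence from two independent sensors is additive log-odds fusion per grid cell; the sketch below illustrates that idea with illustrative probabilities (the function names and values are assumptions, not the thesis's implementation):

```python
import math

def log_odds(p):
    """Convert an occupancy probability to log-odds form."""
    return math.log(p / (1.0 - p))

def prob(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

def fuse_cell(p_lidar, p_depth, p_prior=0.5):
    """Bayesian fusion of per-cell occupancy evidence from the 2D lidar
    and the depth camera: with independent sensors, log-odds add after
    subtracting the prior's contribution once per sensor."""
    l = (log_odds(p_lidar) - log_odds(p_prior)
         + log_odds(p_depth) - log_odds(p_prior)
         + log_odds(p_prior))
    return prob(l)

# Example: the lidar's scanning plane looks free, but the depth camera
# detects an obstacle above that plane; the fused cell leans occupied.
print(round(fuse_cell(p_lidar=0.4, p_depth=0.9), 3))
```

With a uniform prior this reduces to summing each sensor's log-odds, which is why occupancy-grid backends such as Cartographer's can incorporate evidence from multiple sources cheaply.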
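The EKF equations behind contribution (2) are likewise not given in the abstract. The predict/update cycle it relies on can be illustrated with a deliberately simplified one-axis example (position-only state, so the model is linear; the thesis's filter also carries heading and fuses IMU data, and all numbers below are illustrative assumptions):

```python
def kf_predict(x, P, u, Q):
    """Predict step: apply an odometry displacement u, inflating the
    state variance P by the process noise Q."""
    return x + u, P + Q

def kf_update(x, P, z, R):
    """Update step: correct the prediction with a UWB position
    measurement z of variance R via the Kalman gain."""
    K = P / (P + R)                 # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

# One axis of the planar state; the real filter stacks x, y, heading.
x, P = 0.0, 1.0                     # initial position and variance
x, P = kf_predict(x, P, u=0.5, Q=0.1)    # wheel odometer: moved 0.5 m
x, P = kf_update(x, P, z=0.62, R=0.05)   # UWB measurement: 0.62 m
print(round(x, 3), round(P, 3))
```

Because the UWB variance here is much smaller than the predicted variance, the gain is close to 1 and the estimate is pulled strongly toward the UWB measurement, which mirrors why fusing UWB bounds the drift of odometry-only positioning.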
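Contribution (3) seeds global localization from UWB instead of a human-supplied initial pose. One minimal way to realize that idea is to draw the particle set around the UWB fix with uniformly random headings, letting scan matching resolve the orientation as the robot moves; the function below is a hypothetical sketch under that assumption, not the thesis's code (`pos_std` stands in for an assumed UWB ranging accuracy):

```python
import math
import random

def init_particles_from_uwb(uwb_x, uwb_y, n=500, pos_std=0.3):
    """Seed the particle filter around the UWB position so that global
    localization needs no human-supplied initial pose. Heading is
    unknown to UWB, so it is drawn uniformly; subsequent scan matching
    refines it. pos_std (metres) reflects assumed UWB accuracy."""
    particles = []
    for _ in range(n):
        x = random.gauss(uwb_x, pos_std)
        y = random.gauss(uwb_y, pos_std)
        theta = random.uniform(-math.pi, math.pi)
        particles.append((x, y, theta, 1.0 / n))  # equal initial weights
    return particles

particles = init_particles_from_uwb(3.2, 1.5)
print(len(particles))
```

Compared with spreading particles over the whole map, concentrating them near the UWB fix lets the filter converge in a few updates, which matches the abstract's claim of global positioning without human intervention.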
Keywords/Search Tags:SLAM, multi-sensor fusion, robot autonomous positioning, autonomous navigation system