
Research On Autonomous Navigation Of Mobile Robot Based On Fusion Of Vision And Lidar

Posted on: 2022-03-27
Degree: Master
Type: Thesis
Country: China
Candidate: M Zhong
Full Text: PDF
GTID: 2518306572452904
Subject: Mechanical engineering
Abstract/Summary:
Mobile robots can replace human beings in high-risk and arduous tasks, ensuring safety and improving work efficiency. In post-disaster rescue work, damaged buildings contain large numbers of random obstacles, the space is relatively closed, and the lighting is poor; a rescue robot equipped with a single sensor cannot achieve good positioning, mapping, and navigation in such an environment. In view of this, this paper designs and implements an autonomous navigation system for mobile robots based on the fusion of vision and lidar. On this basis, the paper completes the positioning and mapping of the mobile robot and realizes autonomous navigation in a complex environment.

First, data preprocessing is carried out for the multi-sensor fusion of odometry, vision, and lidar. Motor speeds are obtained from the encoders of the mobile robot's wheel motors, and odometry data are generated by dead reckoning and published as a ROS topic. For odometry, vision, and lidar data arriving at different frequencies, two synchronization algorithms, exact synchronization and approximate synchronization, are designed to unify the data timestamps and reduce the consumption of computing and storage resources. The camera and lidar are jointly calibrated to correct camera distortion and obtain a more accurate relative pose between the two sensors. A program is written to test the extraction and matching speed of different visual features, and appropriate features are selected for the visual-lidar SLAM algorithm of this paper.

Then, in view of the four major difficulties in positioning and mapping inside damaged post-disaster buildings, a SLAM system fusing vision and lidar is designed. Visual features extracted from camera images are combined with a bag-of-words model to build a visual vocabulary and create graph nodes. Node weights are updated according to the number of visual-word matches between nodes, and the higher-weighted nodes are evaluated by discrete Bayesian filtering for loop-closure detection. At the same time, proximity relationships between nodes are constructed from lidar data; when localization becomes inaccurate or fails, proximity detection is performed on the nodes with such relationships. To address the defects of the occupancy grid map built by 2D laser SLAM and the fact that the point cloud map built by visual SLAM cannot be used directly for navigation, a conversion algorithm from point cloud map to occupancy grid map is designed.

Then, for the complex post-disaster environment, an autonomous navigation scheme is designed that combines a hybrid A* global path-planning algorithm with the TEB local obstacle-avoidance algorithm. The global path planned by the commonly used A* algorithm is a chain of polyline segments, which forces the mobile robot to perform excessive acceleration, deceleration, and in-place rotation during navigation. The hybrid A* algorithm designed and implemented in this paper takes the robot's kinematic constraints into account in the path-search phase and constructs a path-smoothing objective function, minimized with the Ceres library, to smooth the generated path. The TEB local obstacle-avoidance algorithm interpolates the global path to construct temporary target points, imposes constraints such as time, obstacle avoidance, curvature, and speed on the robot's current pose and the temporary target points, and plans a smooth local path that brings the mobile robot to the target pose safely.
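As an illustration of the odometry step described above, the following is a minimal sketch assuming a differential-drive robot and ROS 1 (rospy): wheel angular speeds from the motor encoders are integrated by dead reckoning and published as a nav_msgs/Odometry topic. The wheel constants and the read_wheel_speeds stub are hypothetical placeholders, not values from the thesis.

```python
import math
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion

WHEEL_RADIUS = 0.05  # m, assumed
WHEEL_BASE = 0.30    # m, assumed distance between drive wheels

def read_wheel_speeds():
    # Hypothetical encoder interface: returns (left, right) wheel angular
    # speeds in rad/s; the real system reads these from the motor encoders.
    return 0.0, 0.0

def yaw_to_quaternion(yaw):
    # Planar orientation: rotation about the z axis only.
    return Quaternion(0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def dead_reckoning():
    rospy.init_node("wheel_odometry")
    pub = rospy.Publisher("odom", Odometry, queue_size=10)
    x = y = theta = 0.0
    rate = rospy.Rate(50)
    last = rospy.Time.now()
    while not rospy.is_shutdown():
        now = rospy.Time.now()
        dt = (now - last).to_sec()
        last = now
        w_l, w_r = read_wheel_speeds()
        v = WHEEL_RADIUS * (w_l + w_r) / 2.0         # linear velocity, m/s
        w = WHEEL_RADIUS * (w_r - w_l) / WHEEL_BASE  # angular velocity, rad/s
        # Dead reckoning: integrate the differential-drive kinematics.
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += w * dt
        msg = Odometry()
        msg.header.stamp = now
        msg.header.frame_id = "odom"
        msg.child_frame_id = "base_link"
        msg.pose.pose.position.x = x
        msg.pose.pose.position.y = y
        msg.pose.pose.orientation = yaw_to_quaternion(theta)
        msg.twist.twist.linear.x = v
        msg.twist.twist.angular.z = w
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    dead_reckoning()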
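For the two timestamp-synchronization policies, ROS provides ready-made implementations in message_filters; the sketch below shows both as an analogy. The thesis designs its own synchronization algorithms, so this is not the author's code, and the topic names are assumptions.

```python
import rospy
import message_filters
from sensor_msgs.msg import Image, LaserScan
from nav_msgs.msg import Odometry

def fused_callback(image, scan, odom):
    # The three messages arrive with (near-)identical timestamps and can be
    # handed to the fusion front end together.
    pass

rospy.init_node("sensor_sync")
image_sub = message_filters.Subscriber("camera/image_raw", Image)
scan_sub = message_filters.Subscriber("scan", LaserScan)
odom_sub = message_filters.Subscriber("odom", Odometry)

# Exact synchronization fires only when stamps are identical, e.g. for
# hardware-triggered sensors:
#   sync = message_filters.TimeSynchronizer([image_sub, scan_sub, odom_sub], 10)
# Approximate synchronization matches messages whose stamps fall within a
# small window (here 50 ms), suiting sensors that run at different rates:
sync = message_filters.ApproximateTimeSynchronizer(
    [image_sub, scan_sub, odom_sub], queue_size=10, slop=0.05)
sync.registerCallback(fused_callback)
rospy.spin()
```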
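A feature-speed test like the one described can be sketched with OpenCV. Which feature types the thesis actually compared is not stated, so ORB, AKAZE, and SIFT here are illustrative choices (SIFT_create requires OpenCV >= 4.4), and the image path is a placeholder.

```python
import time
import cv2
import numpy as np

img = cv2.imread("sample.png", cv2.IMREAD_GRAYSCALE)  # placeholder image

detectors = {
    "ORB": cv2.ORB_create(),
    "AKAZE": cv2.AKAZE_create(),
    "SIFT": cv2.SIFT_create(),
}
for name, det in detectors.items():
    t0 = time.perf_counter()
    kps, desc = det.detectAndCompute(img, None)
    t1 = time.perf_counter()
    # Binary descriptors (ORB, AKAZE) are compared with Hamming distance,
    # float descriptors (SIFT) with L2.
    norm = cv2.NORM_HAMMING if desc.dtype == np.uint8 else cv2.NORM_L2
    matches = cv2.BFMatcher(norm, crossCheck=True).match(desc, desc)
    t2 = time.perf_counter()
    print(f"{name}: {len(kps)} keypoints, "
          f"extract {1000 * (t1 - t0):.1f} ms, "
          f"match {1000 * (t2 - t1):.1f} ms ({len(matches)} matches)")
```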
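The loop-detection step can be illustrated with a generic discrete Bayes filter over loop-closure hypotheses, where bag-of-words match counts act as likelihoods. The transition model, scores, and acceptance threshold below are illustrative assumptions, not the thesis's actual parameters.

```python
import numpy as np

def bayes_update(belief, likelihood, stay_prob=0.9):
    # Prediction: with probability stay_prob the current hypothesis persists,
    # otherwise belief diffuses uniformly over all candidate nodes.
    n = len(belief)
    predicted = stay_prob * belief + (1.0 - stay_prob) / n
    # Correction: weight each node by its bag-of-words match likelihood.
    posterior = predicted * likelihood
    s = posterior.sum()
    return posterior / s if s > 0 else np.full(n, 1.0 / n)

# Usage: accept a loop closure only when one hypothesis clearly dominates.
belief = np.full(5, 0.2)                          # uniform prior over 5 nodes
likelihood = np.array([0.1, 0.1, 3.0, 0.2, 0.1])  # node 2 matches strongly
belief = bayes_update(belief, likelihood)
if belief.max() > 0.8:                            # assumed threshold
    print("loop closure hypothesis at node", belief.argmax())
```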
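The point-cloud-to-occupancy-grid conversion might look like the following sketch: points are filtered to the height band a 2D planner cares about and projected onto grid cells, using the ROS convention of -1 for unknown and 100 for occupied. The resolution, height thresholds, and map size are assumptions, and free-space ray tracing is omitted.

```python
import numpy as np

def cloud_to_grid(points, resolution=0.05, z_min=0.1, z_max=1.5,
                  size_x=200, size_y=200):
    # points: (N, 3) array of map-frame x, y, z coordinates.
    grid = np.full((size_y, size_x), -1, dtype=np.int8)  # -1 = unknown
    # Keep only returns between floor and ceiling that a 2D robot could hit.
    band = (points[:, 2] > z_min) & (points[:, 2] < z_max)
    xy = points[band, :2]
    # Project onto grid indices, with the map origin at the grid center.
    ix = (xy[:, 0] / resolution + size_x / 2).astype(int)
    iy = (xy[:, 1] / resolution + size_y / 2).astype(int)
    keep = (ix >= 0) & (ix < size_x) & (iy >= 0) & (iy < size_y)
    grid[iy[keep], ix[keep]] = 100  # 100 = occupied (ROS convention)
    return grid
```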
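Finally, the key difference between hybrid A* and grid A* noted above, kinematically feasible expansion, can be sketched as follows: successors are generated by integrating straight and circular-arc motion primitives rather than by stepping between grid cells. The step length and curvature set are assumptions; the heuristic, collision checking, and Ceres-based smoothing from the thesis are omitted.

```python
import math

STEP = 0.2                     # arc length per expansion, m (assumed)
CURVATURES = [-0.5, 0.0, 0.5]  # steering commands, 1/m (assumed)

def successors(x, y, theta):
    # Expand a continuous (x, y, heading) state: every successor lies on a
    # straight segment or a circular arc, so the path is drivable by design.
    out = []
    for kappa in CURVATURES:
        if abs(kappa) < 1e-9:          # straight motion primitive
            nx = x + STEP * math.cos(theta)
            ny = y + STEP * math.sin(theta)
            nt = theta
        else:                          # arc of constant curvature kappa
            nt = theta + STEP * kappa
            nx = x + (math.sin(nt) - math.sin(theta)) / kappa
            ny = y - (math.cos(nt) - math.cos(theta)) / kappa
        out.append((nx, ny, nt, STEP))  # successor state plus motion cost
    return out

# Each expanded state would then be bucketed into a discretized
# (x, y, heading) cell for the closed set, as in standard hybrid A*.
print(successors(0.0, 0.0, 0.0))
```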
Finally, simulation experiments and physical experiments are carried out on the simulation platform and the physical platform built in this paper. A Gazebo simulation model is established using URDF and SolidWorks to verify the mapping quality, navigation speed, and obstacle-avoidance performance of the algorithms designed in this paper. A real scene is then arranged, and the autonomous navigation algorithm is verified on the sibot experimental platform. The simulation and experimental results are analyzed and summarized.
Keywords/Search Tags: mobile robot, multi-sensor fusion, mapping, autonomous navigation, hybrid A* algorithm