
Research On Mobile Robot Navigation System Based On Information Fusion

Posted on: 2020-07-06    Degree: Master    Type: Thesis
Country: China    Candidate: D X Zhang    Full Text: PDF
GTID: 2428330572481041    Subject: Signal and Information Processing
Abstract/Summary:
In indoor robot navigation, accurate positioning is the prerequisite for mapping, obstacle avoidance and path planning. Because indoor environments are complex and changeable, a visual sensor is usually used as the main source of navigation and positioning information. However, vision depends heavily on image quality and suffers from feature mismatches in certain scenes, and a single vision sensor cannot eliminate the system's cumulative error on its own. This thesis therefore combines a visual sensor with an inertial navigation sensor to achieve robot navigation and positioning, and studies simultaneous localization and mapping, sensor fusion and path planning for a mobile robot. The main contents are as follows:

(1) Research on simultaneous localization and mapping. First, the camera model, the transformations between coordinate systems, and the way a depth camera acquires depth information are explained. Then the Xtion depth camera is calibrated with Zhang Zhengyou's calibration method using the MATLAB calibration toolbox to obtain the camera intrinsics. Finally, the ORB-SLAM2 algorithm is introduced in detail: the color and depth images captured by the Xtion camera are used to estimate the pose of the mobile robot and build a sparse point cloud map, and loop closure detection is used to reduce the cumulative error.

(2) A positioning scheme based on the fusion of Xtion and INS information. First, the error characteristics of the strapdown inertial navigation system and the depth camera are analyzed, and the inertial navigation unit is calibrated to reduce its error. Then the prediction and observation equations are established from the strapdown INS and the Xtion depth camera, and a Kalman filtering scheme that fuses the visual and inertial information is given. Experimental results show that this scheme improves positioning accuracy compared with a single vision sensor.

(3) Path planning based on the combination of vision and inertial navigation. First, the RGB-D and INS information are used for autonomous positioning with the ORB-SLAM2 algorithm, and a dense point cloud map is constructed. The robot's estimated pose and the generated grid map are then passed to the move_base framework. Finally, global and local path planning are performed within move_base using the A* algorithm and the dynamic window approach, respectively. Experimental results show that the proposed method performs path planning efficiently while the mobile robot completes its own positioning and map building.
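To illustrate the kind of Kalman filtering scheme outlined in (2), the sketch below fuses a strapdown-INS prediction with a position observation from ORB-SLAM2 for a single axis. This is only a minimal illustration: the 1-D state layout, the class and function names, and the noise values are assumptions for exposition, not parameters or equations taken from the thesis.

```python
# Minimal sketch of a loosely coupled visual-inertial Kalman filter:
# the strapdown INS acceleration drives the prediction step, and the
# position reported by ORB-SLAM2 from the Xtion images is the observation.
import numpy as np

class VisualInertialKF:
    def __init__(self, dt=0.01):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [pos, vel]
        self.B = np.array([[0.5 * dt**2], [dt]])     # control input for INS acceleration
        self.H = np.array([[1.0, 0.0]])              # ORB-SLAM2 observes position only
        self.Q = np.diag([1e-4, 1e-3])               # process noise (INS drift, assumed)
        self.R = np.array([[1e-2]])                  # measurement noise (visual pose, assumed)
        self.x = np.zeros((2, 1))                    # state estimate [position; velocity]
        self.P = np.eye(2)                           # estimate covariance

    def predict(self, a_ins):
        """Prediction step driven by the strapdown INS acceleration (m/s^2)."""
        self.x = self.F @ self.x + self.B * a_ins
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z_slam):
        """Correction step using the position estimated by ORB-SLAM2 (m)."""
        y = np.array([[z_slam]]) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

In use, predict() would be called at the INS rate and update() whenever a new ORB-SLAM2 pose arrives, so the visual observation corrects the drift accumulated by the inertial prediction.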
Keywords/Search Tags:Navigation, Inertial navigation, Depth camera, Sensor fusion, Path planning