
Research On Indoor Mobile Robot Localization Based On Visual And Inertial Navigation Fusion

Posted on: 2020-09-10
Degree: Master
Type: Thesis
Country: China
Candidate: X R Gong
Full Text: PDF
GTID: 2428330572478126
Subject: Control Science and Engineering
Abstract/Summary:
The localization of mobile robots is one of the most important research topics in robotics, and it is also the basis of autonomous navigation for mobile robots. Because GPS signals are limited in indoor environments, positioning methods based on GPS cannot be widely applied to indoor mobile robot localization. Therefore, vision sensors, inertial sensors, wheel odometers, and other sensors that do not depend on external signals are used for indoor mobile robot localization. At present, visual sensors suffer from tracking loss and inaccurate prediction due to environmental effects, while the short-term prediction of inertial sensors is stable but their cumulative error is severe. To solve this problem, this paper combines visual positioning and inertial positioning technology into a visual-inertial fusion localization method, so as to achieve both accuracy and stability in indoor positioning. The specific research contents are as follows:

Firstly, the basic theory of the strapdown inertial navigation system is studied, including coordinate-system transformation, attitude calculation, and dead reckoning. At the same time, the coordinate-system transformations of the mobile robot are derived, and the transformation relationships between coordinate frames are constructed, which lays the foundation for the subsequent research.

Secondly, the visual localization of the mobile robot is studied. The vision sensor used in this paper, the Kinect, is introduced, and its imaging model is established. Then, the ORB-SLAM2 visual odometry pipeline based on the Kinect camera and the robot positioning model are described. ORB-SLAM2 is limited by environment and illumination conditions: a sharp decrease in contrast easily reduces the number of detected feature points, which leads to tracking failure. To address this, a feature point detection method with an adjustable threshold, based on the image's contrast information, is proposed, which improves the robustness of ORB-SLAM2.

Thirdly, visual-inertial integrated positioning is studied, and an integrated positioning scheme that fuses ORB-SLAM2 with inertial data is presented. A robot motion prediction model is constructed from the inertial data, and the camera pose output by ORB-SLAM2 serves as the update of the robot pose. A loosely coupled visual-inertial fusion framework based on the extended Kalman filter estimates the pose from the visual odometry and the IMU output.

Fourthly, the hardware and software of the whole platform are introduced. To handle the large computational load, a distributed multi-computer software architecture is constructed under ROS, and the robot model and simulation environment are built on ROS. For the proposed feature point detection and integrated positioning methods, a number of experiments are designed to verify their feasibility and reliability in both the simulation environment and on the actual platform.
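The dead reckoning mentioned above can be sketched as follows. This is a minimal illustration assuming a planar unicycle motion model with linear velocity `v` and angular velocity `omega`; the function name and model are illustrative and not taken from the thesis, which uses a full strapdown formulation.

```python
import math

def dead_reckon(pose, v, omega, dt):
    """Propagate a planar pose (x, y, theta) one time step forward
    from linear velocity v and angular velocity omega."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    # Integrate heading and wrap it into (-pi, pi]
    theta = (theta + omega * dt + math.pi) % (2.0 * math.pi) - math.pi
    return (x, y, theta)
```

Because the pose is obtained only by integrating velocities, any sensor bias accumulates over time, which is exactly the drift problem the visual update is meant to correct.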
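The contrast-adjustable feature threshold could take a form like the sketch below: when image contrast drops, the corner detection threshold is lowered so that enough feature points survive for tracking. The function name, the use of intensity standard deviation as the contrast measure, and all constants are assumptions for illustration, not the thesis's actual formulation.

```python
import numpy as np

def adaptive_fast_threshold(gray, t_max=20, t_min=7, c_ref=50.0):
    """Scale a FAST-style corner threshold with image contrast.

    gray  : 2-D array of pixel intensities
    t_max : threshold used at or above the reference contrast c_ref
    t_min : floor so detection never becomes degenerate
    """
    contrast = float(np.std(gray))            # crude global contrast measure
    t = t_max * min(contrast / c_ref, 1.0)    # shrink threshold as contrast drops
    return int(max(t_min, t))
```

Lowering the threshold in low-contrast frames trades some feature quality for quantity, which is the intended behavior: a few weaker corners are preferable to a tracking failure.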
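The loosely coupled EKF fusion described above can be sketched with a minimal three-state filter: the IMU/odometry drives the prediction step, and the ORB-SLAM2 camera pose is treated as a direct measurement of the robot pose. The planar state `[px, py, theta]`, the identity measurement model, and the function names are simplifying assumptions for illustration only.

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q):
    """Motion step driven by inertial data u = (v, omega)
    for the planar state x = [px, py, theta]."""
    v, w = u
    px, py, th = x
    x_pred = np.array([px + v * np.cos(th) * dt,
                       py + v * np.sin(th) * dt,
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Loosely coupled update: the ORB-SLAM2 pose z = [px, py, theta]
    is used as a direct measurement of the state (H = I)."""
    H = np.eye(3)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

Loose coupling keeps the two estimators independent: ORB-SLAM2 runs unmodified and only its output pose enters the filter, which is simpler (though less accurate) than tightly coupled schemes that fuse raw feature measurements.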
Keywords/Search Tags:Indoor mobile robot localization, Inertial navigation, ORB-SLAM2, Extended Kalman filter