
Research On Visual Inertial Combined Positioning Of Indoor Mobile Robot

Posted on: 2022-06-14    Degree: Master    Type: Thesis
Country: China    Candidate: Z G Wang    Full Text: PDF
GTID: 2518306539491934    Subject: Control Science and Engineering
Abstract/Summary:
Robotic technology is a high technology formed by integrating control theory, computer science, sensor technology, and bionics, and the application of robotics is an important indicator of a country's level of industrial automation. As robotics research deepens, the scope of robot applications continues to expand, covering the medical, industrial, logistics, and other fields. Among these applications, indoor navigation and positioning of robots is a current research hotspot, because traditional satellite navigation is limited in indoor environments, which calls for autonomous positioning and navigation technology. Among the common autonomous positioning and navigation technologies, visual SLAM stands out for its powerful image processing capability and low cost. Given the shortcomings of any single sensor, and the excellent complementarity and autonomy of the vision sensor and the inertial measurement unit (IMU), multi-sensor fusion is a clear development trend. This thesis proposes a visual-inertial combined positioning algorithm that fuses multiple sensors in a tightly coupled manner. Several key technologies of vision sensors and inertial measurement units are studied, and experiments show that the method is suitable for indoor positioning of mobile robots. The main research contents of this thesis are as follows:

First, the key technologies of vision sensors are studied, and two improvements are made to address the limitations of the classic ORB algorithm. The first uses the SURF algorithm to improve the scale invariance of ORB; in image feature matching experiments at different scales, the average matching accuracy is improved by 57.5%. The second uses filtering and a model pre-check operation to improve the RANSAC algorithm used by ORB in the fine-matching stage, increasing the average matching optimization rate by 32.6%. Finally, the calculation method of the visual measurement constraint is given, which provides data support for the subsequent sensor fusion.

Second, the key technology of the inertial measurement unit is studied. Preintegration is used to fuse camera data and IMU data sampled at different frequencies and to obtain the attitude change in the IMU body coordinate system. An error-state recursive equation is established to derive the covariance matrix, so that the attitude change can be applied in the subsequent nonlinear optimization. The constraint relationship between two moments is described by the attitude change, yielding the expression of the inertial measurement constraint. In addition, an IMU internal parameter calibration method using the Vicon system is proposed; it removes the dependence on a turntable and is easy to operate. In the experiment, a trajectory deviation of about 10 m before IMU calibration is corrected.

Finally, the key technologies of the visual-inertial combined positioning method are studied, including joint initialization, tight coupling, the sliding window, and marginalization. The visual measurement constraint and the inertial measurement constraint obtained above are placed into the same state vector, and an optimization function is constructed to iteratively optimize the real-time pose of the carrier. In the linear trajectory experiment, the average error is 0.03 m in the X-axis direction and 0.04 m in the Y-axis direction; in the polygonal trajectory experiment, the average error is 0.06 m in the X-axis direction and 0.07 m in the Y-axis direction.
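As an illustration of the first improvement, the sketch below pairs a scale-invariant SURF detector with ORB's binary descriptor in OpenCV and matches the results with Hamming distance. This is only a minimal sketch of the general idea, not the thesis's implementation: SURF requires the opencv-contrib (xfeatures2d) module, and the image file names are placeholders.

```python
import cv2

# Hypothetical sketch: detect scale-invariant keypoints with SURF and
# describe them with ORB's fast binary descriptor, then brute-force match.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # needs opencv-contrib
orb = cv2.ORB_create()

img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)  # placeholder inputs
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

kp1 = surf.detect(img1, None)        # scale-invariant keypoints
kp2 = surf.detect(img2, None)
kp1, des1 = orb.compute(img1, kp1)   # binary ORB descriptors at those keypoints
kp2, des2 = orb.compute(img2, kp2)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
```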
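The second improvement targets the RANSAC-based fine-matching stage. The sketch below shows only the baseline that such an improvement builds on: a simple descriptor-distance filter followed by RANSAC geometric verification with a fundamental matrix. The filtering and model pre-check operations described in the thesis are not reproduced here, and `refine_matches` is a hypothetical helper name.

```python
import numpy as np
import cv2

def refine_matches(kp1, kp2, matches, max_dist_ratio=2.0, ransac_thresh=3.0):
    """Keep matches that pass a distance filter and a RANSAC geometry check."""
    if not matches:
        return []
    # coarse filter: keep matches close to the best descriptor distance
    min_dist = min(m.distance for m in matches)
    coarse = [m for m in matches if m.distance <= max_dist_ratio * max(min_dist, 1e-6)]
    if len(coarse) < 8:
        return coarse
    pts1 = np.float32([kp1[m.queryIdx].pt for m in coarse])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in coarse])
    # geometric verification: fundamental matrix estimated with RANSAC
    _, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, ransac_thresh, 0.99)
    if mask is None:
        return coarse
    return [m for m, ok in zip(coarse, mask.ravel()) if ok]
```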
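For the inertial side, the following sketch illustrates the basic idea of preintegration: accumulating the relative rotation, velocity, and position change of the IMU body frame between two camera frames from high-rate gyroscope and accelerometer samples. It is a minimal outline under stated assumptions (bias-corrected measurements, fixed sample period); gravity handling, bias Jacobians, and the error-state covariance recursion mentioned in the abstract are omitted.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def preintegrate(gyro, accel, dt):
    """Accumulate relative rotation, velocity, and position between two frames.

    gyro:  iterable of 3-vectors, bias-corrected angular rates (rad/s)
    accel: iterable of 3-vectors, bias-corrected accelerations (m/s^2)
    dt:    IMU sample period (s)
    """
    R = Rotation.identity()  # relative rotation of the IMU body frame
    v = np.zeros(3)          # preintegrated velocity change
    p = np.zeros(3)          # preintegrated position change
    for w, a in zip(gyro, accel):
        a_body = R.apply(a)                   # acceleration in the start frame
        p = p + v * dt + 0.5 * a_body * dt**2
        v = v + a_body * dt
        R = R * Rotation.from_rotvec(np.asarray(w) * dt)  # incremental rotation
    return R, v, p
```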
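Finally, the tightly coupled sliding-window optimization can be summarized by a cost of the form below, which stacks a marginalization prior, the inertial measurement constraints, and the visual measurement constraints over the window states. The notation is a common visual-inertial formulation written down here for illustration and is not copied from the thesis.

```latex
\min_{\mathcal{X}} \;\Big\{
\big\| r_p - H_p \mathcal{X} \big\|^2
+ \sum_{k\in\mathcal{B}} \big\| r_{\mathcal{B}}\big(\hat{z}_{b_{k+1}}^{\,b_k},\,\mathcal{X}\big) \big\|_{P_{b_{k+1}}^{\,b_k}}^{2}
+ \sum_{(l,j)\in\mathcal{C}} \big\| r_{\mathcal{C}}\big(\hat{z}_{l}^{\,c_j},\,\mathcal{X}\big) \big\|_{P_{l}^{\,c_j}}^{2}
\Big\}
```

Here $\mathcal{X}$ collects the sliding-window states (poses, velocities, IMU biases, feature depths); the first term is the marginalization prior, the second sums the IMU preintegration residuals, and the third sums the visual reprojection residuals, each weighted by its covariance.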
Keywords/Search Tags:indoor positioning, improved ORB algorithm, IMU internal parameter calibration, multi-sensor fusion