
Research On Stereo-vision / IMU Deep Coupling Indoor Navigation Technology

Posted on: 2021-12-21    Degree: Master    Type: Thesis
Country: China    Candidate: J Tan    Full Text: PDF
GTID: 2518306503973299    Subject: Instrumentation engineering
Abstract/Summary:
With the widespread application of intelligent robots, their operating environments have become more complex and diverse. Precise navigation and positioning is not only the key to ensuring stability in complex environments, but also the basis of control-system design. However, standalone inertial navigation cannot operate for long periods because of continually accumulating errors, while stereo cameras are easily affected by environmental factors such as illumination. Combining the two sensors offers two specific advantages: the camera can correct the drift of the IMU, and the IMU can provide short-term, reliable motion information when the camera fails. In view of this, this paper proposes a stereo-vision/IMU deeply coupled navigation method for indoor robots. The main research contents are summarized as follows:

1) Characteristic analysis of stereo cameras and IMU sensors. The measurement models and output characteristics of the sensors used in this work are examined in depth, and the noise and zero-bias characteristics of the inertial sensors are analyzed. In addition, the description of rigid-body motion and the mapping between Lie groups and Lie algebras are introduced. This fundamental knowledge provides the theoretical basis for the design of the fusion algorithm.

2) Research on a visual tracking algorithm with deep coupling of IMU and image information. To address the problem of densely distributed feature points, an optimized extraction method based on corner response intensity is proposed, and an image mask is employed to add new feature points efficiently. To improve feature-matching performance, a multi-layer optical-flow tracking algorithm combined with an IMU-based prior position prediction is exploited. Finally, a rotation-compensation method for eliminating mismatches between consecutive image frames is developed; it simplifies the parameter-estimation model of the epipolar-geometry constraint equation and effectively removes mismatches. The resulting algorithm achieves effective extraction and fast tracking of visual information.

3) Research on a motion-state estimation algorithm fusing IMU and visual information. First, to avoid re-integrating raw measurements at every optimization iteration, a measurement-constraint construction model based on accumulated IMU integration (pre-integration) is derived. The objective function of state estimation is built from the reprojection error of the visual information and the incremental integration error of the IMU, and the optimal carrier state is obtained with the Levenberg-Marquardt (L-M) iterative algorithm. Second, a sliding-window algorithm based on marginalization is introduced, which achieves high-precision, real-time motion estimation without discarding historical prior information. Finally, median integration is discussed for trajectory estimation of the carrier, to satisfy the requirement of high-frequency motion-information output in some scenarios.

4) The proposed method is validated on public datasets and in real scenarios. Results on public datasets show that the accuracy of the algorithm is greatly improved compared with two state-of-the-art methods, with an estimation accuracy reaching 0.096 m. Real-scene experiments under different lighting conditions also verify the algorithm's estimation of extrinsic parameters and motion trajectories. These experiments show that the proposed algorithm offers good accuracy, real-time performance, and robustness, and can satisfy the requirements of indoor robot navigation.
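The Lie-group/Lie-algebra mapping mentioned in point 1) can be illustrated by the exponential and logarithm maps between so(3) and SO(3). The sketch below is a generic implementation via the Rodrigues formula, not code from the thesis:

```python
import numpy as np

def hat(w):
    """Map a 3-vector in so(3) to its skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Exponential map so(3) -> SO(3) via the Rodrigues formula."""
    theta = np.linalg.norm(w)
    if theta < 1e-8:                    # small-angle approximation
        return np.eye(3) + hat(w)
    K = hat(w / theta)                  # skew matrix of the unit axis
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def log_so3(R):
    """Logarithm map SO(3) -> so(3), returning a rotation vector."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-8:
        return np.zeros(3)
    return theta / (2.0 * np.sin(theta)) * np.array(
        [R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

# Example: a 90-degree rotation about the z-axis
R = exp_so3(np.array([0.0, 0.0, np.pi / 2]))
# R maps the x-axis to the y-axis, and log_so3(R) recovers the vector.
```

This round-trip property is what lets an optimizer parameterize rotation updates as small 3-vectors while keeping the state on the SO(3) manifold.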
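The rotation-compensated mismatch elimination in point 2) can be sketched as follows. If the relative rotation R between two frames is taken from gyro integration, the epipolar constraint x2ᵀ[t]ₓR x1 = 0 depends only on the translation direction t, so checking a match reduces to a scalar residual. Function and variable names below are illustrative, not from the thesis:

```python
import numpy as np

def rotation_compensated_residual(x1, x2, R, t):
    """Epipolar residual after compensating rotation with the IMU prior.
    x1, x2: normalized image coordinates (3-vectors with z = 1) of a
    candidate match; R: IMU-predicted relative rotation; t: translation
    direction. With R fixed, only t remains in x2^T [t]_x (R x1) = 0."""
    x1r = R @ x1                    # rotate the first ray into frame 2
    return x2 @ np.cross(t, x1r)    # scalar coplanarity residual

def is_inlier(x1, x2, R, t, thresh=1e-3):
    """Keep a match if its rotation-compensated residual is small."""
    return abs(rotation_compensated_residual(x1, x2, R, t)) < thresh
```

Because the rotation is supplied by the IMU rather than estimated, the remaining model has only the two degrees of freedom of t, which is what "simplifies the parameter estimation model" of the epipolar constraint.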
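The pre-integration idea in point 3) is to accumulate the IMU samples between two keyframes once, into relative rotation, velocity, and position increments, so the optimizer does not re-integrate raw measurements at every iteration. The sketch below shows the standard form of this accumulation under simplifying assumptions (biases, noise, and gravity handling omitted); it is not the thesis's implementation:

```python
import numpy as np

def hat(w):
    """3-vector -> skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """so(3) -> SO(3) via the Rodrigues formula."""
    theta = np.linalg.norm(w)
    if theta < 1e-8:
        return np.eye(3) + hat(w)
    K = hat(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(accels, gyros, dt):
    """Accumulate raw IMU samples between two keyframes into relative
    rotation dR, velocity dv, and position dp increments, expressed in
    the first keyframe's body frame (biases, noise, gravity omitted)."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for a, w in zip(accels, gyros):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt ** 2   # position increment
        dv = dv + (dR @ a) * dt                        # velocity increment
        dR = dR @ exp_so3(w * dt)                      # rotation increment
    return dR, dv, dp
```

In the fusion objective, these increments become a single relative-motion measurement between keyframes, whose error term is combined with the visual reprojection error and minimized by the L-M iteration.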
Keywords/Search Tags: Deep Coupling, Visual Tracking, Data Fusion, Vision/Inertial Navigation