
Research On Visual/Inertial Navigation Technologies Of Micro/small Air Vehicle In Special Vision Applied Environment

Posted on: 2019-08-12
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y S Wang
Full Text: PDF
GTID: 1368330590966674
Subject: Navigation, guidance and control
Abstract/Summary:
Visual/inertial integrated navigation is of great significance and value for Micro/Small Air Vehicles (MAVs) operating in GNSS-denied environments such as indoors and dense urban areas, and it has attracted wide attention from domestic and overseas researchers. However, in sparse-feature environments (e.g., insufficient light or deficient texture) and in highly dynamic flight, the feature information captured by the visual sensor is often insufficient for environment reconstruction and navigation, which restricts the application of visual/inertial navigation. Therefore, focusing on sparse-feature and highly dynamic environments, which are typical and common special vision applied environments, this dissertation studies visual/inertial integrated navigation technology for MAVs in such conditions.

Environment reconstruction is important for the indoor autonomous navigation of an MAV, so 3D reconstruction of the environment in sparse-feature conditions is studied first. The calibration of the structured-light visual sensor is investigated, and the influence of its configuration parameters on ranging accuracy is analyzed. On this basis, visual/inertial 3D environment reconstruction methods based on image matching and on cross structured light are proposed. The rotation angle between two frames is calculated by image matching and by the horizontal structured-light stripe respectively, so that no additional driving mechanism is needed for the onboard structured-light sensor. An outlier-removal strategy based on plane constraints is proposed, which effectively removes erroneous points caused by ambient-light interference. In addition, a Kalman filter is designed to fuse the rotation angles measured by the inertial sensor and the structured-light visual sensor, enhancing the robustness and accuracy of the reconstruction method.

Next, visual/inertial integrated navigation in sparse-feature environments is studied. To address the problem that current structured-light visual sensors can measure only part of the position and attitude information, a pose-measurement method for a monocular visual sensor aided by cross structured light is proposed: only one line in the environment is needed to measure the full position and attitude. To enhance the robustness of navigation in sparse-feature environments, a filter model for visual/inertial integrated navigation aided by cross structured light is built according to the characteristics of the cross structured-light visual model, which avoids the accumulated height error. The observability of the cross structured-light visual measurement model is analyzed, and on this basis a self-Adaptive Sage-Husa Kalman Filter (ASHKF) based on a fading-factor matrix is designed, improving the accuracy of visual/inertial integrated navigation aided by cross structured light.

Furthermore, to address the problem that image pairs with large viewpoint changes, caused by highly dynamic MAV motion, are difficult to track, fully affine-invariant matching is studied. After an analysis of fully affine-invariant matching, an affine-invariant matching method based on the Artificial Bee Colony (ABC) algorithm is proposed. The affine transformation model is replaced by a perspective transformation model for accurate viewpoint simulation, and the ABC algorithm is improved to replace the exhaustive sampling strategy, so that the viewpoint-simulation parameters can be obtained quickly and accurately. For images with repetitive patterns, the repetitive areas are normalized to estimate the affine transformation matrix, which reduces the number of viewpoint-simulation transformations; a mixed feature descriptor combining local and global descriptors is designed to improve matching accuracy on such images. To increase computational efficiency for MAV applications, the inertial information of the MAV is used to calculate the viewpoint change between images, and the speed of ORB is improved by avoiding the construction of the Gaussian pyramid. The result is a real-time affine-invariant matching method that can run onboard the MAV, and the number of keyframes generated during navigation is reduced. Inertial information is then added to the ORB-SLAM system: an optimization problem comprising visual, inertial, and prior residuals is built, and a tightly-coupled visual/inertial method is realized by solving for the system state that minimizes these residuals, improving the accuracy and reliability of ORB-SLAM.

Finally, an indoor quad-rotor test platform is built, and a motion-capture system is used to verify the visual/inertial integrated navigation method aided by cross structured light. An outdoor six-rotor test platform is also built, and RTK is used to verify the visual/inertial SLAM method based on the fast affine image-matching algorithm. The results show that the proposed methods improve the accuracy and robustness of MAV navigation in sparse-feature and highly dynamic environments, and they provide an important technical reference for MAV applications in complex environments.
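The abstract states that a Kalman filter fuses the rotation angles obtained from the inertial sensor and from the structured-light visual sensor. The dissertation's actual filter model is not reproduced here; as an illustration only, a minimal scalar Kalman filter that propagates a gyro rate and corrects it with an absolute angle measurement might look like this (function name, noise variances, and the NaN convention for missing measurements are all assumptions):

```python
import numpy as np

def fuse_rotation_kf(gyro_rates, vision_angles, dt, q=1e-4, r=1e-2):
    """Fuse gyro-integrated rotation with absolute angle measurements.

    gyro_rates:    angular-rate samples (rad/s), used in the prediction step.
    vision_angles: absolute angle measurements (rad), np.nan where the
                   vision sensor provides no measurement.
    q, r:          assumed process / measurement noise variances.
    """
    theta, p = 0.0, 1.0          # state (angle) and its variance
    estimates = []
    for rate, z in zip(gyro_rates, vision_angles):
        # Predict: integrate the gyro rate; variance grows by process noise.
        theta += rate * dt
        p += q
        # Update: correct with the vision angle when one is available.
        if not np.isnan(z):
            k = p / (p + r)              # Kalman gain
            theta += k * (z - theta)
            p *= (1.0 - k)
        estimates.append(theta)
    return np.array(estimates)
```

The gyro drives the high-rate prediction while the structured-light angle bounds the drift, which is the qualitative behavior the abstract attributes to its fusion filter.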
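The ASHKF is described only as a Sage-Husa adaptive filter with a fading-factor matrix; its equations are not given in the abstract. The following is a generic scalar Sage-Husa-style sketch, not the dissertation's filter, in which the measurement-noise variance R is re-estimated recursively with a fading factor b (all names and parameter values are illustrative assumptions):

```python
import numpy as np

def adaptive_kf(measurements, q=1e-3, r0=1.0, b=0.95):
    """Scalar Kalman filter with a Sage-Husa-style recursive estimate of the
    measurement-noise variance R, weighted by a fading factor b.
    Toy setup: a constant state observed with noise of unknown variance."""
    x, p, r = 0.0, 1.0, r0
    xs, rs = [], []
    for k, z in enumerate(measurements):
        p_pred = p + q                      # predict (state is constant)
        e = z - x                           # innovation
        d = (1 - b) / (1 - b ** (k + 1))    # fading-factor weight
        # Recursive R estimate from the innovation; floor keeps it positive.
        r = max((1 - d) * r + d * (e * e - p_pred), 1e-6)
        kg = p_pred / (p_pred + r)
        x = x + kg * e
        p = (1 - kg) * p_pred
        xs.append(x)
        rs.append(r)
    return np.array(xs), np.array(rs)
```

The fading factor down-weights old innovations, so R tracks the current measurement quality; this adaptivity is what the abstract credits for the accuracy improvement of the cross-structured-light-aided navigation.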
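The tightly-coupled formulation is described only as minimizing an objective built from visual, inertial, and prior residuals. As a toy one-dimensional analogue of that idea (not the ORB-SLAM-based system itself), the three residuals can be whitened by assumed noise levels and stacked into a single linear least-squares problem; all names and sigma values below are assumptions:

```python
import numpy as np

def fuse_residuals(z_vis, z_imu, x_prior, s_vis=0.1, s_imu=0.05, s_prior=0.2):
    """Find the scalar state x minimizing the sum of squared visual,
    inertial, and prior residuals, each whitened by its standard deviation."""
    # Each whitened residual contributes one row: (x - z) / sigma.
    A = np.array([[1 / s_vis], [1 / s_imu], [1 / s_prior]])
    b = np.array([z_vis / s_vis, z_imu / s_imu, x_prior / s_prior])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(x[0])
```

The solution is the precision-weighted mean of the three sources, which is the essence of solving for "the system state that minimizes the residuals" in a tightly-coupled scheme; the real system does this over poses, velocities, and biases with nonlinear residuals.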
Keywords/Search Tags: visual navigation, inertial navigation, structured light, environment reconstruction, affine-invariant image matching, self-adaptive filter, tightly-coupled, SLAM