
Research On Short-range Pose Measurement Technology Of Space Non-cooperative Target Based On Vision Information

Posted on: 2021-07-28
Degree: Doctor
Type: Dissertation
Country: China
Candidate: G L Hu
Full Text: PDF
GTID: 1488306455463194
Subject: Signal and Information Processing

Abstract/Summary:
Vision measurement technology is widely used in industry and aerospace. This dissertation conducts comprehensive research on the key algorithms for close-range, high-precision pose measurement of space non-cooperative targets. The main research contents and innovations are as follows:

For the optimization of intrinsic and extrinsic parameters in camera calibration, a non-linear optimization algorithm that updates the damping coefficient by exponential growth is proposed. Multiple thresholds are used during optimization, overcoming shortcomings of traditional optimization algorithms such as excessive iteration counts and slow convergence. Extensive practical experiments show that the calibration results obtained by the proposed algorithm are more robust and accurate.

To refine the initial guess obtained in binocular camera calibration, a calibration algorithm based on Singular Value Decomposition (SVD) and non-linear optimization is proposed. A random sampling algorithm with multiple thresholds is proposed to compute high-precision fundamental and essential matrices. A new expression is derived for computing the relative pose matrix of the binocular cameras, which is solved using the Frobenius norm. An algorithm for optimizing the initial guess with a simplified residual function is proposed, which greatly simplifies the calculation of the Jacobian matrix. Extensive experiments demonstrate that the proposed method is more robust and accurate than the widely used Hartley and Bouguet methods and other non-linear least-squares minimization methods.

For the joint measurement of binocular vision and lidar, a robust method to calibrate the relative pose between the camera and the lidar using a 3D spherical target is proposed. The lidar is treated as a camera with fixed, given parameters, so that its 3D point cloud can be directly projected into a two-dimensional image. For the first time, the lidar-to-camera calibration problem is thereby converted into a camera-to-camera calibration problem. In contrast to existing methods, the calibration of the camera and the lidar is completed entirely by means of two-dimensional image processing, which greatly reduces the amount and complexity of the data. The re-projection error of the proposed method is less than 0.13 pixels, and the method achieves a road-crack width measurement error of less than 1.5 mm. Compared with the Levenberg-Marquardt algorithm, the number of iterations is reduced by 33%, and the accuracy is improved by 1.96% compared with the traditional non-linear least-squares minimization algorithm.

For high-precision pose measurement and dense 3D reconstruction of non-cooperative targets, this dissertation combines feature-based and pixel-based matching methods. To meet practical aerospace application requirements, an ellipse mark extraction method is designed, and its high accuracy is verified through data comparison. Extensive practical experiments demonstrate that the proposed method significantly improves calibration accuracy and ultimately yields higher measurement precision: at a range of 1 meter, the measurement error is less than 1 mm for length and less than 0.3 degrees for angle. Binocular vision was used to reconstruct the point cloud of the non-cooperative target, while data was simultaneously acquired with the lidar. Finally, the two systems were fused to obtain accurate and dense three-dimensional (3D) information of the target; the average fusion error is less than 1 cm at 1 meter and less than 4.5 cm at 20 meters.
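The abstract does not give the details of the proposed damped optimization, but the general idea of a Levenberg-Marquardt-style step whose damping coefficient grows or shrinks by an exponential factor can be sketched as follows. This is a minimal illustration under assumptions, not the dissertation's actual algorithm; `residual_fn`, `jac_fn`, and the growth factor of 10 are hypothetical names and values.

```python
import numpy as np

def lm_step(x, residual_fn, jac_fn, lam, growth=10.0):
    """One damped Gauss-Newton step. If the step reduces the residual,
    accept it and shrink the damping coefficient exponentially;
    otherwise reject it and grow the damping coefficient."""
    r = residual_fn(x)
    J = jac_fn(x)
    # Solve the damped normal equations: (J^T J + lam * I) dx = -J^T r
    A = J.T @ J + lam * np.eye(x.size)
    dx = np.linalg.solve(A, -J.T @ r)
    if np.sum(residual_fn(x + dx) ** 2) < np.sum(r ** 2):
        return x + dx, lam / growth   # accept step, relax damping
    return x, lam * growth            # reject step, damp harder
```

For a well-behaved residual, repeated calls drive `x` toward the least-squares minimum while the damping coefficient adapts automatically.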
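As background for the SVD-based binocular calibration step, the standard textbook recovery of the four candidate relative poses (rotation and translation direction) from an essential matrix can be sketched as below. This illustrates the general SVD technique, not the dissertation's specific expression or Frobenius-norm solution.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x, so that skew(t) @ v == cross(t, v)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

def decompose_essential(E):
    """Return the four (R, t) candidates from an essential matrix
    E = [t]_x R, using the standard SVD decomposition."""
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (determinant +1)
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]                      # translation direction (up to sign)
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```

In practice the physically correct candidate is selected by a cheirality check (triangulated points must lie in front of both cameras).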
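The key idea of the camera-lidar contribution, treating the lidar as a virtual camera with fixed intrinsics so that its point cloud becomes an ordinary 2D image, amounts to a pinhole projection. A minimal sketch follows; the intrinsic matrix `K` is an assumed example value, not one from the dissertation.

```python
import numpy as np

def project_points(points_3d, K):
    """Project Nx3 points (expressed in the virtual camera frame,
    with Z > 0) to Nx2 pixel coordinates via the pinhole model."""
    pts = np.asarray(points_3d, dtype=float)
    uvw = (K @ pts.T).T                 # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]     # perspective division by depth

# Example intrinsics for the virtual lidar-camera (assumed values):
# focal lengths fx = fy = 800 px, principal point (320, 240)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
```

Once the cloud is rendered this way, standard 2D image-to-image calibration machinery can be applied between the real camera and the virtual one.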
Keywords/Search Tags: Binocular vision, Camera calibration, Stereo vision, 3D reconstruction, Point cloud fusion