In 2013, the Chang'e-3 mission successfully delivered a lander to the Moon. The lander carried the "Yutu" rover, which patrols and surveys the lunar surface and acquires the geological structure and topographic features of the landing area and of the area it traverses. The successful completion of the Chang'e-3 mission depended on key steps such as lunar terrain reconstruction, rover positioning, robotic arm detection, and traverse path planning. Among these, relative positioning of the lunar rover is a key technology and the cornerstone of the rover's scientific exploration of the lunar surface. This thesis therefore focuses on the relative positioning of the rover. The main research work is as follows:

(1) The intrinsic parameters, distortion parameters, and the relative pose between the left and right cameras of the navigation camera are obtained by ground calibration. The calibrated parameters are used in the subsequent rover positioning experiments.

(2) The stereo images acquired by the binocular navigation camera are processed with global stereo matching under the epipolar constraint. The disparity of each pixel is obtained by minimizing an energy cost function, and the three-dimensional coordinates of the corresponding point are recovered from the disparity.

(3) The point cloud generated by stereo matching contains noise that degrades the accuracy of point cloud stitching. The point cloud is therefore denoised with a bilateral filter, which reduces surface noise on the 3D model while preserving the geometric features of the point cloud data, protecting the accuracy of the subsequent stitching.

(4) Most existing point cloud stitching algorithms are designed for laser point clouds and are not well suited to point clouds generated by stereo matching; moreover, the point cloud of each station is expressed in its own camera coordinate system. Therefore, three pairs of same-name feature points between the stereo images of adjacent stations are used as tie points, and the transformation computed from them is taken as the initial value of the point cloud stitching.

Minimal code sketches of these steps, and of composing the station-to-station transformations into a trajectory, follow this list; the file names, parameters, and library calls in them are illustrative assumptions rather than details taken from the thesis.
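As a sketch of step (1), the fragment below calibrates the left and right navigation cameras from checkerboard stereo images and then estimates their relative pose, assuming OpenCV in Python; the board geometry, square size, and image paths are hypothetical stand-ins rather than values from the thesis.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 7)     # inner corners per row / column (assumed board layout)
SQUARE = 0.03        # checkerboard square size in metres (assumed)

# 3D corner coordinates in the board frame (z = 0 plane).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("calib/left_*.png")),      # 24 stereo pairs (assumed paths)
                  sorted(glob.glob("calib/right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gl, PATTERN)
    ok_r, corners_r = cv2.findChessboardCorners(gr, PATTERN)
    if ok_l and ok_r:
        obj_pts.append(objp)
        left_pts.append(corners_l)
        right_pts.append(corners_r)

size = gl.shape[::-1]   # image size as (width, height)

# Intrinsics and distortion of each camera, then the relative pose R, T of the
# right camera with respect to the left one.
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
_, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
```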
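For step (2), the sketch below rectifies a stereo pair so that the epipolar lines become horizontal image rows, computes a disparity map, and reprojects it to 3D points. OpenCV's semi-global matcher stands in here for the global energy-minimization matcher used in the thesis; the code reuses K1, d1, K2, d2, R, T from the calibration sketch above, and the file names and matcher parameters are assumptions.

```python
import cv2
import numpy as np

left = cv2.imread("station01_left.png", cv2.IMREAD_GRAYSCALE)     # assumed file names
right = cv2.imread("station01_right.png", cv2.IMREAD_GRAYSCALE)
size = left.shape[::-1]

# Epipolar rectification: after remapping, corresponding points lie on the same row.
Rl, Rr, Pl, Pr, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)
maps_l = cv2.initUndistortRectifyMap(K1, d1, Rl, Pl, size, cv2.CV_32FC1)
maps_r = cv2.initUndistortRectifyMap(K2, d2, Rr, Pr, size, cv2.CV_32FC1)
left_r = cv2.remap(left, *maps_l, cv2.INTER_LINEAR)
right_r = cv2.remap(right, *maps_r, cv2.INTER_LINEAR)

# Disparity from an energy-based matcher (semi-global here); parameters are illustrative.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5,
                                P1=8 * 5 * 5, P2=32 * 5 * 5, uniquenessRatio=10)
disparity = matcher.compute(left_r, right_r).astype(np.float32) / 16.0

# Disparity -> metric 3D coordinates in the rectified left camera frame.
points = cv2.reprojectImageTo3D(disparity, Q)
cloud = points[disparity > 0]                # N x 3 array of valid 3D points
```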
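For step (3), the sketch below is one common formulation of a point-cloud bilateral filter: each point is displaced along its estimated normal by a weighted average of its neighbours' offsets, with a spatial Gaussian and a range Gaussian controlling the weights. It assumes Open3D and NumPy; the kernel widths and search radius are illustrative and may differ from the parameters used in the thesis.

```python
import numpy as np
import open3d as o3d

def bilateral_filter(pcd, radius=0.10, sigma_s=0.05, sigma_r=0.02):
    """Move each point along its normal by a bilaterally weighted neighbour average."""
    pcd.estimate_normals(o3d.geometry.KDTreeSearchParamRadius(radius))
    tree = o3d.geometry.KDTreeFlann(pcd)
    pts = np.asarray(pcd.points)
    normals = np.asarray(pcd.normals)
    out = pts.copy()
    for i, (p, n) in enumerate(zip(pts, normals)):
        _, idx, _ = tree.search_radius_vector_3d(p, radius)
        nbr = pts[np.asarray(idx)[1:]]             # neighbours, excluding the point itself
        if len(nbr) == 0:
            continue
        diff = nbr - p
        d_spatial = np.linalg.norm(diff, axis=1)   # Euclidean distance (smoothing term)
        d_range = diff @ n                         # offset along the normal (feature term)
        w = (np.exp(-d_spatial**2 / (2 * sigma_s**2))
             * np.exp(-d_range**2 / (2 * sigma_r**2)))
        out[i] = p + n * (np.sum(w * d_range) / np.sum(w))
    return o3d.geometry.PointCloud(o3d.utility.Vector3dVector(out))

# Example: denoise the cloud produced by stereo matching (variable `cloud` from above).
# pcd_denoised = bilateral_filter(o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cloud)))
```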
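For step (4), the sketch below estimates a rigid transformation from three pairs of same-name feature points by the SVD (Kabsch/Umeyama) method and uses it as the initial value of an ICP refinement. Open3D's point-to-point ICP stands in for the stitching algorithm of the thesis; the feature-point coordinates and the placeholder clouds are illustrative only.

```python
import numpy as np
import open3d as o3d

def rigid_from_correspondences(src_pts, dst_pts):
    """Least-squares rigid transform (SVD) from >= 3 corresponding point pairs."""
    src_c, dst_c = src_pts.mean(0), dst_pts.mean(0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, dst_c - R @ src_c
    return T

# Three same-name feature points seen from two adjacent stations (illustrative coordinates).
src3 = np.array([[1.2, 0.4, 5.0], [0.8, -0.3, 4.2], [-0.5, 0.1, 6.1]])
dst3 = np.array([[1.0, 0.5, 4.6], [0.6, -0.2, 3.8], [-0.7, 0.2, 5.7]])
T_init = rigid_from_correspondences(src3, dst3)

# Placeholder clouds; in practice these are the denoised clouds of the two adjacent stations.
src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(np.random.rand(1000, 3)))
dst = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(np.random.rand(1000, 3)))
result = o3d.pipelines.registration.registration_icp(
    src, dst, max_correspondence_distance=0.2, init=T_init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
T_station = result.transformation            # relative pose between the adjacent stations
```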
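Finally, the relative poses of the 15 adjacent station pairs can be composed into rover positions in the first station's frame and compared against a reference trajectory such as the bundle adjustment result. The sketch below shows only that bookkeeping, with placeholder transforms and a synthetic reference in place of real data.

```python
import numpy as np

def chain_positions(relative_transforms):
    """Compose 4x4 station-to-station transforms into positions in the first station's frame."""
    pose = np.eye(4)
    positions = [pose[:3, 3].copy()]
    for T in relative_transforms:            # T maps station k+1 coordinates into station k
        pose = pose @ T
        positions.append(pose[:3, 3].copy())
    return np.array(positions)

# Placeholders: 15 adjacent-pair transforms for 16 stations, and a synthetic reference.
relative_transforms = [np.eye(4) for _ in range(15)]
ours = chain_positions(relative_transforms)
reference = ours + np.random.normal(scale=0.02, size=ours.shape)   # stands in for the bundle adjustment trajectory
mean_diff = np.linalg.norm(ours - reference, axis=1).mean()
print(f"average mutual difference: {mean_diff:.3f} m")
```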
The ground calibration experiment of the navigation camera uses 24 stereo images of a checkerboard calibration board captured by the navigation camera, and the resulting camera parameters serve as the data source for the subsequent rover positioning experiment. In the relative positioning experiment, the stereo images of 16 stations on the lunar surface acquired by the rover's navigation camera are used as experimental data. First, the point cloud model of each station is obtained by global stereo matching; the point clouds are then denoised by bilateral filtering; finally, the relative pose between two adjacent stations is obtained by point cloud stitching, and the relative positioning result of the rover is obtained by chaining the relative poses of adjacent stations. The positioning results are compared with those obtained by DOM matching and by bundle adjustment. According to the positioning-route polyline charts and the heading-difference charts of the three methods, the average difference between the relative positioning results of the proposed method and the higher-accuracy bundle adjustment results is less than 0.1 m, which shows that the proposed method obtains accurate relative positioning of the rover while keeping the computation simple. The thesis contains 44 figures, 12 tables, and 72 references.