
Research On The Fusion Technology Of Depth Information And Color Information Of The Full Scene Based On The Pose Relation Of Binocular Cameras

Posted on: 2020-05-14
Degree: Master
Type: Thesis
Country: China
Candidate: S T Yan
Full Text: PDF
GTID: 2518306563967649
Subject: Instrumentation engineering

Abstract/Summary:
In the three-dimensional reconstruction of large scenes, with the continuous progress of depth-sensor and CCD technology, the means of acquiring depth information and color information have become increasingly mature. However, the quality of the data fusion between depth and color information directly affects the subsequent texture mapping, and thus the display quality of the entire 3D model. At present, most research focuses on data fusion between RGB-D depth sensors and ordinary cameras; research on data fusion between laser radar, which offers higher precision and a longer measuring distance, and fisheye cameras, which offer a larger field of view and depth of field, is still limited. Therefore, this thesis focuses on the key techniques for fusing depth data from laser radar with color data from a fisheye camera.

From the perspective of machine vision, the data fusion between laser radar and a fisheye camera amounts to obtaining a one-to-one, point-to-point mapping between depth data and color data. However, because the fisheye camera is a non-similar imaging system with deliberately introduced, severe barrel distortion, directly establishing the mapping between the fisheye camera coordinate system and the laser radar coordinate system produces evident error. Therefore, after obtaining the binocular pose relationship between an ordinary camera and the fisheye camera, this thesis proposes to convert the pose relationship between the fisheye camera and the laser radar into the pose relationship between the ordinary camera and the laser radar.

Firstly, this thesis completes the binocular calibration between the ordinary camera and the fisheye camera, proposing a novel method for pose calibration between the two. Because the imaging and distortion models of the ordinary camera and the fisheye camera differ, existing binocular calibration methods suit neither pair directly. Analysis of the imaging models shows that the ordinary and fisheye models can be converted into each other through their shared incident-angle relation, so images captured by the ordinary camera can be converted into simulated fisheye images. Binocular calibration of the ordinary camera and the fisheye camera is then completed under the binocular fisheye calibration model, unifying the coordinate systems of the two.

Secondly, the pose calibration between the ordinary camera and the laser radar is completed. Combining the two pose relationships above unifies the coordinate systems of the fisheye camera and the laser radar, thereby establishing the mapping between depth data and color data.

Finally, this mapping is used to complete the 360° full-scene data fusion. We fuse and visualize the 360° horizontal color panoramic data with the 360° horizontal depth data to show that the mapping between the laser radar and the fisheye camera is accurate. The proposed calibration method can therefore be considered a feasible solution for large-scene 3D reconstruction using laser radar and a fisheye camera.
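The ordinary-to-fisheye image conversion at the heart of the calibration step can be sketched concretely. Below is a minimal Python/NumPy sketch, assuming the ordinary camera follows the ideal pinhole model (radius r_p = f_p·tan θ from the principal point) and the fisheye follows the equidistant model (r_f = f_f·θ), so that both are linked by the incident angle θ. The focal lengths, principal points, and image sizes are hypothetical placeholders, not values from the thesis, and a real pipeline would also undistort with the cameras' calibrated distortion coefficients.

```python
import numpy as np

def simulate_fisheye(pinhole_img, f_pin, f_fish, out_size):
    """Resample a pinhole image into a simulated equidistant fisheye image.

    A fisheye pixel at radius r_f from the principal point sees incident
    angle theta = r_f / f_fish; under the pinhole model the same ray lands
    at radius r_p = f_pin * tan(theta), so we sample the pinhole image there.
    """
    h_out, w_out = out_size
    h_in, w_in = pinhole_img.shape[:2]
    cx_f, cy_f = w_out / 2.0, h_out / 2.0   # assumed fisheye principal point
    cx_p, cy_p = w_in / 2.0, h_in / 2.0     # assumed pinhole principal point
    out = np.zeros((h_out, w_out) + pinhole_img.shape[2:],
                   dtype=pinhole_img.dtype)

    # Incident angle of the ray seen by every output (fisheye) pixel.
    v, u = np.mgrid[0:h_out, 0:w_out]
    dx, dy = u - cx_f, v - cy_f
    r_f = np.hypot(dx, dy)
    theta = r_f / f_fish

    # A pinhole camera cannot image rays at or beyond 90 degrees.
    valid = theta < np.deg2rad(89.0)
    r_p = f_pin * np.tan(np.minimum(theta, np.deg2rad(89.0)))
    scale = np.divide(r_p, r_f, out=np.zeros_like(r_f), where=r_f > 0)

    # Nearest-neighbor sampling of the pinhole image along the same azimuth.
    us = np.round(cx_p + dx * scale).astype(int)
    vs = np.round(cy_p + dy * scale).astype(int)
    inside = valid & (us >= 0) & (us < w_in) & (vs >= 0) & (vs < h_in)
    out[inside] = pinhole_img[vs[inside], us[inside]]
    return out
```

A calibration-board image captured by the ordinary camera, converted this way, can then be treated like a genuine fisheye view of the same board, which is what allows both cameras to be calibrated under one binocular fisheye model.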
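Once both pose calibrations are available, the fusion step reduces to composing the two rigid transforms and projecting every laser-radar point into the fisheye image to pick up its color. The following sketch illustrates that chain under the same equidistant-model assumption; the 4×4 homogeneous matrices `T_fish_from_ord` and `T_ord_from_lidar` stand in for the thesis's two calibration results, and all names and parameters are illustrative.

```python
import numpy as np

def fisheye_project(p_cam, f_fish, cx, cy):
    """Project a 3D point given in the fisheye camera frame onto the image
    using the equidistant model r = f * theta, with theta measured from
    the optical axis (+Z)."""
    x, y, z = p_cam
    theta = np.arctan2(np.hypot(x, y), z)   # incident angle of the ray
    phi = np.arctan2(y, x)                  # azimuth around the optical axis
    r = f_fish * theta
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

def colorize_cloud(points_lidar, T_fish_from_ord, T_ord_from_lidar,
                   image, f_fish, cx, cy):
    """Attach an RGB color to every laser-radar point that falls inside
    the fisheye image, realizing the point-to-point depth-color mapping."""
    # Compose the two calibrated poses: lidar -> ordinary camera -> fisheye.
    T_fish_from_lidar = T_fish_from_ord @ T_ord_from_lidar

    h, w = image.shape[:2]
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_fish = (T_fish_from_lidar @ pts_h.T).T[:, :3]

    fused = []
    for p_lidar, p_fish in zip(points_lidar, pts_fish):
        u, v = fisheye_project(p_fish, f_fish, cx, cy)
        if 0 <= u < w and 0 <= v < h:
            fused.append((*p_lidar, *image[int(v), int(u)]))
    return np.array(fused)   # rows of (x, y, z, r, g, b)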
Keywords/Search Tags:laser radar, fisheye lens, binocular calibration, pose calibration, data fusion