
Study On Camera Calibration And Point Cloud Registration In Computer Vision

Posted on: 2016-08-01
Degree: Doctor
Type: Dissertation
Country: China
Candidate: R Y Wang
GTID: 1108330488957657
Subject: Communication and Information System
Abstract/Summary:
In computer vision, camera calibration and 3D (three-dimensional) point cloud registration are two key problems. The former connects 2D (two-dimensional) images with the 3D world, while the latter provides the relationships between different 3D reconstructions. Both problems have a long research history; however, with the rapid development of hardware and the spread of commercial computer vision applications, traditional solutions no longer satisfy the demands of everyday use. This dissertation studies camera calibration and point cloud registration from the standpoint of practical application, and its main contributions are as follows.

Traditional camera calibration methods usually rely on specially manufactured calibration objects, and applications demanding high precision typically require commercial calibration targets, which are expensive and inconvenient. In this dissertation, two novel calibration methods are presented that take familiar industrial products from daily life, such as bottles and other mass-produced objects, as the calibration objects.

1. A calibration method based on the rigid motion between identical objects. The dissertation analyzes the invariants of a rigid motion and gives them a new geometrical interpretation in Euclidean space: a pair of conjugate invariants of the rigid motion is a pair of circular points in 3D Euclidean space, and the projections of this pair in an image are exactly the images of the circular points used in the method (see the sketch after these two calibration methods). Based on this result, the algorithm takes several identical industrial products as the calibration objects and calibrates the intrinsic and extrinsic parameters of the camera from the rigid motion between them. The manufacturing accuracy of the industrial products guarantees the precision of the calibration objects, and the experimental results demonstrate that the method is accurate and robust.

2. A calibration method based on the lengths of corresponding line segments between identical objects. Using vision geometry, the method analyzes the relationship between the intrinsic and extrinsic parameters and expresses the extrinsic parameters through the intrinsic ones. The intrinsic parameters are then calibrated from the constraints given by the geometric invariants between the identical objects, namely the lengths of corresponding line segments. Only two identical objects are needed as calibration objects, and only two images of them are required, so the method is very easy to use.
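In standard vision geometry, each image of a circular point lies on the image of the absolute conic (IAC), omega = K^{-T} K^{-1}, so it contributes two linear constraints on omega, and the intrinsic matrix K is then recovered from omega by a Cholesky-type factorization. The sketch below shows only this textbook step under illustrative assumptions (Python/NumPy, at least three circular-point images, data clean enough for the estimated omega to be positive definite); it is not the dissertation's full rigid-motion-based algorithm, and the function and variable names are mine.

```python
import numpy as np

def intrinsics_from_circular_points(circ_pt_images):
    """Recover K from images of circular points via the IAC (textbook step).

    circ_pt_images: list of complex homogeneous 3-vectors m, each the image
    of a circular point.  Each m satisfies m^T omega m = 0 with
    omega = K^{-T} K^{-1}; the real and imaginary parts give two real
    linear constraints on the six entries of the symmetric matrix omega.
    """
    rows = []
    for m in circ_pt_images:
        x, y, w = m
        # m^T omega m written linearly in (w11, w12, w13, w22, w23, w33).
        v = np.array([x * x, 2 * x * y, 2 * x * w, y * y, 2 * y * w, w * w])
        rows.append(v.real)
        rows.append(v.imag)
    A = np.asarray(rows)
    # Null vector of A (smallest right singular vector) is omega up to scale.
    _, _, Vt = np.linalg.svd(A)
    w11, w12, w13, w22, w23, w33 = Vt[-1]
    omega = np.array([[w11, w12, w13],
                      [w12, w22, w23],
                      [w13, w23, w33]])
    if omega[2, 2] < 0:            # fix the overall sign; omega must be positive definite
        omega = -omega
    # omega = B^T B with B = K^{-1} upper triangular, so the Cholesky factor
    # L (lower triangular, omega = L L^T) equals B^T and K = inv(L)^T.
    L = np.linalg.cholesky(omega)
    K = np.linalg.inv(L).T
    return K / K[2, 2]
```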
For 3D point cloud registration, traditional fine-registration methods can only reach a local optimum, so they usually require another method to provide a rough registration in advance. Moreover, as point cloud acquisition advances, the accuracy and density of the data are increasing rapidly, and the quantity of data is growing with them. When processing such large point clouds, most previous methods cannot meet practical demands in efficiency and performance. In this dissertation, several registration methods are presented for different kinds of point clouds, ensuring both efficiency and good registration performance.

1. A fine registration method based on corresponding spheres. The method searches for rough corresponding points between the point clouds using 3D corresponding spheres. Once the first pair of corresponding spheres is found, a hierarchical propagation of corresponding spheres is introduced, which guarantees that the rough corresponding spheres are well distributed throughout the overlapping area. From all the propagated spheres, a large number of corresponding points can be obtained, and an accurate rigid motion is then computed by applying RANSAC (RANdom SAmple Consensus) and the least-squares algorithm to all detected corresponding points (a sketch of this estimation step is given after this list). This yields the fine registration result, and the method can register point clouds in arbitrary initial positions.

2. A registration method based on planar structures. In large-scale city point clouds, planes are the dominant geometric elements, so the registration can be computed from planar structures. First, to extract planes from the point clouds accurately, a novel co-clustering method is presented that clusters data points and plane hypotheses simultaneously. The extracted planes are then matched by random sampling, and the transformation between the point clouds is computed from the matched planes (a plane-based recovery sketch is given after this list). Since the method works only on a sparse representation of the point clouds, it reduces the computational complexity and is well suited to large-scale city reconstruction.

3. A fast registration method based on image information. Existing laser scanners usually carry a coaxial camera that captures images at the scanning site, and these images provide supplementary information for registration. Using vision geometry, the method first computes the rotation between the point clouds directly from the images. The translation is then obtained by an improved ICP (Iterative Closest Point) algorithm proposed in this dissertation, in which only the three components of the translation vector are updated iteratively (a translation-only ICP sketch is given after this list). Compared with the traditional ICP algorithm, which updates six variables per iteration, the method reduces the computational complexity and converges faster.
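The last step of the corresponding-sphere method, estimating the rigid motion from many putative corresponding points with RANSAC and a least-squares fit, is a standard combination that can be sketched as follows. This Python/NumPy sketch uses the SVD-based (Kabsch) solution for the least-squares step; the function names, the 3-point minimal sample, the iteration count, and the inlier threshold are illustrative assumptions, and the corresponding-sphere search and propagation themselves are not shown.

```python
import numpy as np

def rigid_motion_lsq(P, Q):
    """Least-squares rigid motion (R, t) with Q[i] ~ R @ P[i] + t (SVD/Kabsch)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    t = cQ - R @ cP
    return R, t

def ransac_rigid_motion(P, Q, iters=1000, thresh=0.05, seed=None):
    """RANSAC over 3-point samples, then refinement on the best inlier set.

    P, Q: (N, 3) arrays of putative corresponding points, P[i] <-> Q[i].
    """
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(iters):
        idx = rng.choice(len(P), size=3, replace=False)
        R, t = rigid_motion_lsq(P[idx], Q[idx])
        err = np.linalg.norm(Q - (P @ R.T + t), axis=1)
        inliers = err < thresh
        if best is None or inliers.sum() > best.sum():
            best = inliers
    # Final least-squares fit on all inliers of the best hypothesis.
    return rigid_motion_lsq(P[best], Q[best])
```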
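For the plane-based method, once planes have been matched, the transformation can be recovered from the plane parameters alone: the rotation aligns the matched normals and the translation follows from the plane offsets. The sketch below illustrates only this generic recovery step, assuming each plane is stored as (n, d) with unit normal n and n.x + d = 0 and that at least three matched planes with linearly independent normals are available; the dissertation's co-clustering extraction and random-sampling matching are not reproduced.

```python
import numpy as np

def transform_from_matched_planes(planes_src, planes_dst):
    """Rigid transform (R, t) mapping the source onto the destination cloud.

    Each plane is (n, d) with unit normal n and offset d (n . x + d = 0).
    Under x' = R x + t a plane maps to n' = R n and d' = d - n' . t, so the
    rotation aligns the normals and t is a least-squares solution of the
    offset equations (needs >= 3 planes with independent normals).
    """
    N = np.array([n for n, _ in planes_src])        # source normals (rows)
    M = np.array([n for n, _ in planes_dst])        # destination normals
    d = np.array([off for _, off in planes_src])    # source offsets
    e = np.array([off for _, off in planes_dst])    # destination offsets
    # Rotation: orthogonal Procrustes on the normal correspondences.
    U, _, Vt = np.linalg.svd(N.T @ M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    # Translation: e_i = d_i - m_i . t  =>  M @ t = d - e.
    t, *_ = np.linalg.lstsq(M, d - e, rcond=None)
    return R, t
```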
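The improved ICP idea in the third method, keeping the rotation fixed (it already comes from the images) and iterating only over the translation, can be sketched as follows. This assumes SciPy's cKDTree for nearest-neighbour search and a plain mean-residual update; the dissertation's actual refinements, outlier handling, and stopping criteria are not reproduced, and the names and parameters are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def translation_only_icp(P, Q, R, t0=None, iters=30, tol=1e-6):
    """ICP variant that keeps the rotation R fixed and refines only t.

    P, Q: (N, 3) and (M, 3) point clouds; R: known 3x3 rotation with
    Q ~ R @ P + t.  Each iteration matches the transformed source points to
    their nearest neighbours in Q and updates t with the closed-form
    least-squares step, i.e. the mean residual of the matched pairs.
    """
    t = np.zeros(3) if t0 is None else np.asarray(t0, dtype=float)
    P_rot = P @ R.T                      # rotation applied once, outside the loop
    tree = cKDTree(Q)                    # nearest-neighbour structure on the target
    for _ in range(iters):
        _, nn = tree.query(P_rot + t)    # indices of nearest target points
        delta = (Q[nn] - (P_rot + t)).mean(axis=0)
        t = t + delta
        if np.linalg.norm(delta) < tol:  # translation update has converged
            break
    return t
```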
Keywords/Search Tags: Vision Geometry, Camera Calibration, Point Cloud Registration, Corresponding Spheres