
Remote Cooperation System Based On Mixed Reality

Posted on: 2022-05-16    Degree: Master    Type: Thesis
Country: China    Candidate: L X Li    Full Text: PDF
GTID: 2518306605470464    Subject: Master of Engineering
Abstract/Summary:
In 1994, the concept of the virtual reality continuum was proposed, marking the beginning of the virtual reality field. With the continuous development of computer vision and chip technology, mixed reality systems have become increasingly integrated, and various mixed reality devices have appeared in many fields. As an emerging field accompanying industrial transfer, remote cooperation has received considerable attention. Mixed reality devices offer high immersion, intuitive display, and natural modes of human interaction, so combining mixed reality with remote cooperation promises great practical value. This thesis focuses on the design and implementation of a remote cooperation system based on mixed reality, and describes and tests in detail the multi-camera external parameter calibration, RGBD data compression and transmission, and parallel acceleration involved in the system. The main research contents are as follows:

(1) This thesis proposes a sphere-based method for calibrating the external parameters of multiple RGBD cameras. Common approaches to inter-camera external parameter calibration, such as those based on natural scenes or on checkerboards, are usually limited by scene texture and by the viewing angles that can be calibrated, and cannot handle application scenarios in which multiple cameras form a 360-degree surround. This thesis therefore first uses the depth cameras to capture point clouds of the scene and applies a distance threshold to discard irrelevant points, then fits a sphere with RANSAC to obtain the sphere center in each camera's coordinate system, computes the pose transformation between cameras from the corresponding sphere centers, and finally merges the point clouds into a common coordinate system. The method minimizes manual operation during calibration while preserving the accuracy of the external parameters.

(2) This thesis proposes an RGBD data compression and transmission method based on video encoding and decoding. Because color maps and depth maps have different properties, the color map is converted from RGB to YUV420 format; since the human eye is not sensitive to chroma, this halves the amount of transmitted data without perceptible color distortion. Because errors in the depth map alter the reconstructed geometry, each 16-bit depth value is split into three parts that are shifted toward the high bits of their channels, and the spare low bits absorb losses introduced by compression. Together these steps achieve high-accuracy compression and transmission of RGBD data.

(3) This thesis uses parallel computing to accelerate the system. Many of the algorithms above are independent of one another, so complex tasks can be decomposed into multiple parts and assigned to multiple cores for parallel processing, effectively reducing the per-frame computation time. Specifically, a Compute Shader in Unity restores the spatial point cloud from the depth map and assigns it color textures in parallel, and the Visual Effect Graph renders the point cloud as particles in the virtual environment.
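The abstract does not give implementation details for point (1), but a minimal sketch of the sphere-based extrinsic estimation, written here in Python with NumPy, could look as follows. It assumes the point cloud has already been distance-thresholded, and that the calibration sphere is captured at several positions so that each camera pair has at least three corresponding sphere centers (a single center only constrains translation, not rotation); all function names and parameters are illustrative, not from the thesis.

```python
import numpy as np

def fit_sphere_ransac(points, radius=None, iters=500, inlier_tol=0.005, rng=None):
    """RANSAC sphere fit on an Nx3 point cloud; returns (center, radius, inlier_mask).

    Each iteration fits a sphere to 4 random points by solving the linear system
    |p|^2 = 2 p.c + (r^2 - |c|^2), then counts points whose distance to the
    fitted surface is within inlier_tol metres.
    """
    rng = np.random.default_rng() if rng is None else rng
    best = (None, None, np.zeros(len(points), bool))
    for _ in range(iters):
        sample = points[rng.choice(len(points), 4, replace=False)]
        A = np.hstack([2 * sample, np.ones((4, 1))])
        b = (sample ** 2).sum(axis=1)
        try:
            x = np.linalg.solve(A, b)           # x = [cx, cy, cz, r^2 - |c|^2]
        except np.linalg.LinAlgError:
            continue                            # degenerate (coplanar) sample
        c = x[:3]
        r = np.sqrt(x[3] + c @ c)
        if radius is not None and abs(r - radius) > inlier_tol:
            continue                            # reject fits far from the known ball radius
        dist = np.abs(np.linalg.norm(points - c, axis=1) - r)
        inliers = dist < inlier_tol
        if inliers.sum() > best[2].sum():
            best = (c, r, inliers)
    return best


def rigid_transform(src_centers, dst_centers):
    """Kabsch/SVD alignment: rotation R and translation t mapping src -> dst.

    src_centers / dst_centers are Nx3 arrays of the same sphere positions as
    seen by two different cameras (N >= 3, non-collinear).
    """
    src_c, dst_c = src_centers.mean(0), dst_centers.mean(0)
    H = (src_centers - src_c).T @ (dst_centers - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```

The returned R and t map points from one camera's coordinate system into the other's, which is the pose transformation used to merge the per-camera point clouds into a common frame.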
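For point (2), the abstract states that each 16-bit depth value is split into three parts shifted toward the high bits, with the spare low bits absorbing codec loss, but it does not give the exact bit allocation. The sketch below assumes a 6/5/5 split across three 8-bit planes and rounds the guard bits away on decode; the split, the rounding constants, and the function names are assumptions made for illustration.

```python
import numpy as np

def pack_depth16(depth):
    """Split a 16-bit depth map into three 8-bit planes for video encoding.

    Each plane carries one slice of the depth value in its HIGH bits
    (6 + 5 + 5 = 16), so the unused low "guard" bits of every plane can
    absorb small lossy-codec errors without corrupting the reconstruction.
    """
    d = depth.astype(np.uint16)
    hi = (d >> 10).astype(np.uint8) << 2            # top 6 bits   -> bits 7..2
    mid = ((d >> 5) & 0x1F).astype(np.uint8) << 3   # middle 5 bits -> bits 7..3
    lo = (d & 0x1F).astype(np.uint8) << 3           # bottom 5 bits -> bits 7..3
    return hi, mid, lo


def unpack_depth16(hi, mid, lo):
    """Inverse of pack_depth16: round each plane back to its slice and recombine."""
    h = (hi.astype(np.uint16) + 2) >> 2             # +2 / +4 round away codec noise
    m = (mid.astype(np.uint16) + 4) >> 3
    l = (lo.astype(np.uint16) + 4) >> 3
    return (np.clip(h, 0, 63) << 10) | (np.clip(m, 0, 31) << 5) | np.clip(l, 0, 31)
```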
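Point (3) is implemented in the thesis as a Unity Compute Shader, with the Visual Effect Graph rendering the result as particles. As a language-neutral illustration of the per-pixel kernel being parallelized, the sketch below back-projects a depth map to a colored point cloud under the standard pinhole model; fx, fy, cx, cy stand for the depth camera intrinsics, and the vectorized NumPy form stands in for the per-thread GPU kernel rather than reproducing the actual shader code.

```python
import numpy as np

def depth_to_point_cloud(depth_m, color_rgb, fx, fy, cx, cy):
    """Back-project a depth map (metres) to a colored point cloud.

    Every pixel is independent, which is what makes this step trivially
    parallel: NumPy vectorizes it over the whole image here, while the
    thesis runs the same per-pixel computation in Compute Shader threads.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = color_rgb.reshape(-1, 3)
    valid = points[:, 2] > 0            # drop pixels with no depth reading
    return points[valid], colors[valid]
```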
Keywords/Search Tags: Remote cooperation, mixed reality, multi-RGBD camera external parameter calibration, RGBD data compression transmission, parallel acceleration