
Real-Time Face Modeling System Running On Mobile Device With RGB-D Cameras

Posted on: 2020-06-12    Degree: Master    Type: Thesis
Country: China    Candidate: J Y Shen    Full Text: PDF
GTID: 2428330572496596    Subject: Computer Science and Technology
Abstract/Summary:
We present a real-time face modeling system based on an RGB-D camera that runs interactively on mobile devices. The system captures the geometry, 3D landmarks, and texture of a human face. An ordinary user holds a mobile phone and follows simple instructions to turn the head, completing the face modeling process; the system responds quickly and provides a friendly interactive experience.

The input to the system is a stream of synchronized RGB-D frames. A 2D face alignment algorithm first locates the face in the initial frame. We then use KinectFusion for 3D geometric reconstruction: face geometry is reconstructed in real time on mobile devices at 30 fps.

The acquired face model is next used for 3D landmark localization. Building on PointCNN, a deep learning architecture designed for 3D point clouds, we design FacePointNet, which locates 3D landmarks with a coarse-to-fine, segmentation-based strategy. Experiments show that FacePointNet is more accurate than state-of-the-art algorithms while also running efficiently.

For texture modeling, colors are first sampled from images captured at three different viewpoints and projected into a unified 2D image space via cylindrical projection; a 2D image fusion algorithm then produces a seamless texture. Because texture mapping and synthesis are highly parallel, the result can be computed in real time using GPGPU.
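The coarse-to-fine landmark strategy can be illustrated by the cropping step that typically sits between the two stages: a coarse network predicts an approximate landmark position, a local patch of the point cloud is segmented around it, and a fine network refines the estimate on that patch. The helper below is a minimal, hypothetical sketch of that segmentation step, not the thesis's actual FacePointNet code; all names are illustrative.

```python
import numpy as np

def crop_local_patch(points, coarse_landmark, radius=0.03):
    """Select the points within `radius` (in the cloud's units) of a
    coarse landmark estimate; a fine network would then refine the
    landmark on this patch. Hypothetical helper illustrating the
    coarse-to-fine segmentation idea."""
    dist = np.linalg.norm(points - coarse_landmark, axis=1)
    return points[dist < radius]

# Toy cloud: only the first two points lie within 0.03 of the origin.
cloud = np.array([[0.0, 0.0, 0.0],
                  [0.01, 0.0, 0.02],
                  [0.5, 0.5, 0.5]])
patch = crop_local_patch(cloud, np.array([0.0, 0.0, 0.0]))
```

Cropping shrinks the fine network's input to a small, landmark-centered region, which is what makes the refinement stage cheap enough for mobile inference.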
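The cylindrical projection used for texture mapping can be sketched as follows. This is a minimal illustration under the assumption that the head's vertical axis is aligned with the y axis and the mesh is centered on the cylinder axis; function names and normalization choices are illustrative, not taken from the thesis.

```python
import numpy as np

def cylindrical_project(vertices):
    """Map 3D face vertices to 2D texture coordinates by cylindrical
    projection (sketch; assumes the head's vertical axis is y and the
    mesh is centered on the cylinder axis)."""
    x, y, z = vertices[:, 0], vertices[:, 1], vertices[:, 2]
    u = np.arctan2(x, z)                          # azimuth around the axis, in (-pi, pi]
    u = (u + np.pi) / (2.0 * np.pi)               # normalize azimuth to [0, 1)
    v = (y - y.min()) / (y.max() - y.min() + 1e-9)  # normalize height to [0, 1]
    return np.stack([u, v], axis=1)

# Example: a point straight ahead of the camera (positive z) lands at u = 0.5.
pts = np.array([[0.0, 0.0, 1.0],
                [1.0, 0.5, 0.0],
                [0.0, 1.0, -1.0]])
uv = cylindrical_project(pts)
```

Projecting every view's samples into this shared (u, v) space is what lets the three viewpoint images be fused as ordinary 2D images afterwards.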
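Once all views live in the unified 2D space, the simplest form of fusion is a per-pixel weighted blend. The thesis uses a 2D image fusion algorithm for seamlessness; the sketch below shows only a generic weighted blend as a stand-in, with assumed array shapes and a hypothetical confidence weighting (e.g. from viewing angle).

```python
import numpy as np

def blend_views(samples, weights):
    """Blend colors sampled from several views into one texture.

    samples: (V, H, W, 3) colors from V views, already projected into
             the shared cylindrical texture space.
    weights: (V, H, W) per-pixel confidences (assumed, e.g. derived
             from the angle between view direction and surface normal).
    Generic weighted average -- a stand-in for the thesis's fusion step."""
    w = weights[..., None]                        # broadcast over color channels
    return (samples * w).sum(axis=0) / (w.sum(axis=0) + 1e-9)

# Two constant-color 1x1 views with equal weight blend to the midpoint.
views = np.array([np.zeros((1, 1, 3)), np.ones((1, 1, 3))])
out = blend_views(views, np.ones((2, 1, 1)))
```

Because the blend is independent per pixel, it parallelizes trivially, which is consistent with the abstract's claim that texture synthesis runs in real time on the GPU.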
Keywords/Search Tags: mobile interaction, real-time, RGB-D, face, 3D reconstruction, texture, landmarks