
Research on a Robust Model of Gender Transfer in Facial Images Based on MUNIT

Posted on: 2024-05-25    Degree: Master    Type: Thesis
Country: China    Candidate: W Lu    Full Text: PDF
GTID: 2568307076495424    Subject: Operational Research and Cybernetics
Abstract/Summary:
Gender transfer in facial images refers to generating a face image of the opposite gender without changing the identity information of the original image; it is a form of facial attribute transfer. Existing facial gender transfer models are usually built on image style transfer. Because no paired dataset exists for facial gender transfer and the key features that distinguish gender are difficult to capture, background distortion and facial blur are common in the generated images. To address these problems, this thesis focuses on building an improved facial gender transfer model and on improving the quality of the converted images. After a thorough study of unsupervised style transfer algorithms based on generative adversarial networks, an improved facial gender transfer model based on MUNIT (Multimodal UNsupervised Image-to-image Translation) is proposed. The model encodes the content and style features of face images more effectively and achieves stronger feature extraction through an attention mechanism, yielding higher-quality gender transfer results. In addition, on the basis of this improved model, a robust facial gender transfer model is proposed. Through a face parsing preprocessing step and a new facial skin color loss function, it preserves the skin color of the original image and the irrelevant background region, further improving the effect and quality of facial gender transfer. The main work of this thesis includes:
1. First, the current mainstream facial gender transfer algorithms at home and abroad are reviewed: gender transfer based on facial attribute editing, and gender transfer based on style transfer. Then four typical unsupervised style transfer algorithms are studied: AdaIN (Adaptive Instance Normalization), UGATIT (Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation), CycleGAN (Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks), and MUNIT, with emphasis on MUNIT's network architecture and loss functions. The difficulties and challenges of the facial gender transfer task and its practical value are also pointed out.
2. A new facial gender transfer model (IMUNIT) based on an improved MUNIT is proposed. In this model, DIN (Dynamic Instance Normalization) is added to the encoder architecture, making the normalization of facial content and style more accurate. Moreover, a CBAM (Convolutional Block Attention Module) is added to the residual network of the content encoder, enabling the model to extract richer facial gender features (a minimal sketch of such an attention-augmented residual block is given after this overview). In addition, face images in the CelebA dataset were screened and cropped according to their attributes, which reduced the impact of the image background on generation and made the model focus more on learning facial features. Experiments show that this method generates more precise gender-converted face images and effectively reduces background distortion.
3. On the basis of the improved MUNIT gender transfer model, a robust facial gender transfer model is proposed. First, face parsing is performed on each input image so that only the face region is fed into the model for training, eliminating the interference of the background domain with model training. Second, a new loss function is constructed that performs color histogram matching between the faces before and after generation, ensuring that skin color stays consistent across the gender transfer (one possible form of such a loss is sketched after this overview). Finally, attribute screening is carried out on the public CelebA face dataset to reduce factors that hinder training, such as facial occlusion and glasses, and thereby improve the quality of the generated images. The experimental results show that, compared with other classical algorithms, the proposed method effectively preserves the background region of the image and the skin color of the face, and generates better gender-converted face images.
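The abstract does not give implementation details for the attention-augmented residual blocks mentioned in item 2. The following is a minimal sketch, assuming a PyTorch implementation, of how a CBAM module could be inserted into a residual block of the content encoder; the class names, reduction ratio, and kernel size are illustrative assumptions, not the thesis's actual configuration.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention: pool spatial dims (avg and max), pass through a shared MLP."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))            # (B, C) from average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))             # (B, C) from max pooling
        return torch.sigmoid(avg + mx).view(b, c, 1, 1)


class SpatialAttention(nn.Module):
    """Spatial attention: pool over channels, then a 7x7 convolution."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)             # (B, 1, H, W)
        mx = x.amax(dim=1, keepdim=True)              # (B, 1, H, W)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAMResBlock(nn.Module):
    """Residual block with CBAM applied to the residual branch before the skip addition."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.InstanceNorm2d(channels),
        )
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        out = self.body(x)
        out = out * self.ca(out)                      # re-weight channels
        out = out * self.sa(out)                      # re-weight spatial positions
        return x + out


if __name__ == "__main__":
    block = CBAMResBlock(256)
    print(block(torch.randn(1, 256, 64, 64)).shape)   # torch.Size([1, 256, 64, 64])
```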
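Likewise, the exact form of the histogram-matching skin color loss in item 3 is not specified in the abstract. The sketch below shows one differentiable approximation: kernel-smoothed color histograms are computed inside a face-parsing skin mask for the original and the generated image, and their L1 distance is penalized. The function names, bin count, and kernel bandwidth are assumptions for illustration, not the thesis's actual loss.

```python
import torch


def soft_histogram(values, bins=32, sigma=0.02):
    """Differentiable (kernel-smoothed) histogram of pixel values in [0, 1]."""
    centers = torch.linspace(0.0, 1.0, bins, device=values.device)
    # Gaussian kernel weight between every pixel value and every bin center: (N, bins)
    weights = torch.exp(-0.5 * ((values.unsqueeze(1) - centers) / sigma) ** 2)
    hist = weights.sum(dim=0)
    return hist / (hist.sum() + 1e-8)                  # normalize to a distribution


def skin_color_loss(real, fake, skin_mask, bins=32):
    """L1 distance between per-channel color histograms inside the skin region.

    real, fake: (B, 3, H, W) images in [0, 1]; skin_mask: (B, 1, H, W) binary mask
    produced by a face-parsing network marking skin pixels.
    """
    loss = real.new_zeros(())
    mask = skin_mask.bool().expand_as(real)
    for c in range(3):
        real_pix = real[:, c][mask[:, c]]              # skin pixels of channel c
        fake_pix = fake[:, c][mask[:, c]]
        loss = loss + torch.abs(
            soft_histogram(real_pix, bins) - soft_histogram(fake_pix, bins)
        ).sum()
    return loss / 3.0


if __name__ == "__main__":
    real = torch.rand(2, 3, 64, 64)
    fake = torch.rand(2, 3, 64, 64, requires_grad=True)
    mask = (torch.rand(2, 1, 64, 64) > 0.5).float()
    l = skin_color_loss(real, fake, mask)
    l.backward()                                       # gradients flow back to the generator output
    print(float(l))
```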
Keywords/Search Tags: Gender transfer in facial images, Generative adversarial network, Convolutional block attention module, Face parsing, Histogram matching