
Unpaired Image-to-Image Translation Based on Generative Adversarial Network

Posted on: 2021-01-11    Degree: Master    Type: Thesis
Country: China    Candidate: Z H Xiao    Full Text: PDF
GTID: 2428330605472052    Subject: Applied Mathematics
Abstract/Summary:
The goal of unpaired image-to-image translation is to learn a mapping from a source domain to a target domain without any paired training examples. The problem can be cast as learning the conditional distribution of images in the target domain given a source image. A major limitation of existing unpaired image-to-image translation algorithms is that they produce unrealistic results that are over-colored and lack detail, whereas realistic translation requires rich detail. To address this limitation, a random-reconstruction unpaired image-to-image translation (RRUIT) framework is proposed, which uses random reconstruction to preserve the high-level features of the source image and adopts an adversarial strategy to learn the distribution of the target domain. The RRUIT framework proposed in this paper mainly consists of the following parts:
(1) Within the generative adversarial network framework, the model is trained with two loss functions: an auxiliary loss guides the generator to produce a coarse image, while a coarse-to-fine block following the generator produces an image that obeys the distribution of the target domain.
(2) The coarse-to-fine block contains two sub-modules based on densely connected atrous spatial pyramid pooling, which enriches the detail of the generated images.
(3) Building on the existing feature reconstruction loss, a random feature reconstruction loss is proposed; when applied to unpaired image-to-image translation, it effectively retains the high-frequency features of the input image while discarding some low-frequency features (an illustrative sketch follows this abstract).
Extensive experiments on photorealistic stylization and artistic stylization confirm the superiority of the proposed RRUIT.
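The abstract does not give the exact formulation of the random feature reconstruction loss, but the idea of reconstructing only a randomly chosen subset of encoder features can be sketched in Python as follows. The encoder choice (a fixed pretrained network such as VGG), the channel-wise random selection, the keep_ratio parameter, and the L1 distance are all assumptions made for illustration, not the author's exact design.

import torch
import torch.nn.functional as F

def random_feature_reconstruction_loss(feat_src, feat_gen, keep_ratio=0.5):
    # Hypothetical sketch of a random feature reconstruction loss.
    # feat_src, feat_gen: (N, C, H, W) feature maps of the source image and the
    # translated image, e.g. taken from a fixed pretrained encoder such as VGG.
    # Each call randomly keeps a subset of channels and reconstructs only those,
    # so part of the feature content is preserved and the rest is discarded.
    n, c, h, w = feat_src.shape
    keep = torch.rand(c, device=feat_src.device) < keep_ratio   # random channel mask
    if not keep.any():                                          # keep at least one channel
        keep[torch.randint(c, (1,), device=feat_src.device)] = True
    # L1 reconstruction error on the randomly selected channels only.
    return F.l1_loss(feat_gen[:, keep], feat_src[:, keep])

In training, such a term would be added to the adversarial and auxiliary losses; because the channel mask is redrawn at every iteration, the generator is never required to reproduce all source features, which is consistent with the abstract's claim that some low-frequency content is discarded while high-frequency detail is retained.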
Keywords/Search Tags:image-to-image translation, GANs, random feature reconstruction