
Research on a Locally Controllable Style Transfer Neural Network Model

Posted on: 2022-09-23
Degree: Master
Type: Thesis
Country: China
Candidate: W R Gao
Full Text: PDF
GTID: 2518306335971529
Subject: Circuits and Systems
Abstract/Summary:
Since the late 20th century, computer graphics has developed rapidly, and image style transfer has attracted increasing attention from researchers at home and abroad. Image style transfer uses a computer to combine the semantic content of one image with the style of another: it reshapes the style of an image while retaining its semantic information, giving the image a different artistic effect.

Traditional image style transfer methods include rendering based on physical models, strokes, texture synthesis, and filtering. Physical-model-based and stroke-based rendering simulate the painting process, such as the size and color of brush strokes, and generate images through interactive or parametric control. Because these methods rely on low-level image information such as pixels, color, and edges, they cannot fully express the semantic information and style texture of an image, and the visual effect is often unsatisfactory. Texture-synthesis-based methods synthesize images by copying pixels or pixel blocks, and are suitable for images containing many repeated textures.

In recent years, thanks to advances in computer hardware and software, applying deep learning to image style transfer has become a research hotspot; this approach is called Neural Style Transfer (NST). This paper introduces the development and research status of image style transfer, summarizes the main methods in the field, and briefly discusses the shortcomings of traditional methods and the advantages of deep learning methods. Building on NST, an image style transfer method based on feature synthesis is proposed: in the deep feature space, style samples and the content image are used to synthesize the feature map of the target image, and the feature map is then restored to an image
through an inverse feature transformation, realizing the style transfer. The main contributions of this paper are as follows:

(1) We propose a style transfer framework based on feature synthesis. The style transfer process is carried out in feature space: combining the advantages of traditional texture synthesis with the feature maps of the content and style images, we synthesize a target feature map that contains both the semantic features of the content image and the texture features of the style image. Style transfer is then realized by inverse-transforming the target feature map back into an image. Experiments show that this method offers good interactivity and local controllability.

(2) We give a layer-by-layer synthesis method that proceeds from deep to shallow layers. First, a VGG network is used to extract feature maps of the content and style images at different layers. Then, a greedy search is used to select feature points at the initial layer (the deepest layer of the network), and the resulting feature map is transferred to the next layer by upsampling. Finally, the feature maps of the remaining layers are synthesized using coherence synthesis together with semantic constraints from the content image. Compared with other NST models, our approach achieves a reasonable texture distribution that preserves the semantic structure of the content image while changing its stylistic texture, producing a more realistic artistic effect.
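The layer-by-layer synthesis described above can be sketched as a coarse-to-fine nearest-neighbor assignment in feature space. The snippet below is a minimal illustration under stated assumptions, not the thesis implementation: random arrays stand in for VGG feature maps, a brute-force nearest-neighbor search replaces the greedy search, and a simple bias toward the coarse-layer assignment loosely stands in for the coherence synthesis and semantic constraints. All function names are hypothetical.

```python
import numpy as np

def nn_assign(content_feats, style_feats, guess=None, alpha=0.5):
    """For each content feature vector, pick the index of the closest
    style feature vector.  When a coarse-layer `guess` is given, bias
    the choice toward style vectors similar to the guessed one (a
    loose stand-in for the coherence/semantic constraints)."""
    idx = np.empty(len(content_feats), dtype=int)
    for i, f in enumerate(content_feats):
        cost = np.sum((style_feats - f) ** 2, axis=1)
        if guess is not None:
            cost = cost + alpha * np.sum(
                (style_feats - style_feats[guess[i]]) ** 2, axis=1)
        idx[i] = int(np.argmin(cost))
    return idx

def upsample_assignment(idx, hc, wc, ws):
    """Propagate an assignment on an (hc, wc) content grid, indexing
    an (hs, ws) style grid, to the next layer, where both grids double
    in resolution (nearest-neighbor upsampling of the index map)."""
    rows, cols = np.divmod(idx.reshape(hc, wc), ws)
    rows = np.repeat(np.repeat(rows, 2, axis=0), 2, axis=1) * 2
    cols = np.repeat(np.repeat(cols, 2, axis=0), 2, axis=1) * 2
    return (rows * (2 * ws) + cols).ravel()

def synthesize(content_layers, style_layers, alpha=0.5):
    """Coarse-to-fine synthesis.  Layers are listed deepest-first as
    (features, height, width) tuples, each layer twice the spatial
    resolution of the previous one.  Returns the target feature map of
    the shallowest layer: style vectors arranged by content position."""
    guess = None
    for (cf, hc, wc), (sf, hs, ws) in zip(content_layers, style_layers):
        idx = nn_assign(cf, sf, guess, alpha)
        guess = upsample_assignment(idx, hc, wc, ws)  # for the next layer
    return sf[idx]
```

In the thesis, the returned target feature map would then be inverted back to an image by the inverse feature transformation; that decoder step is omitted here.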
Keywords/Search Tags:Non-Photorealistic Rendering, Image style transfer, Texture synthesis, Deep neural network, Deep learning