Image style transfer is a computer vision technique that recomposes the content of one image in the style of another. As China's principal social contradiction transforms and people's demand for cultural life grows, the country is paying increasing attention to the development of its cultural industry. Image style transfer has broad application prospects in AI art creation, computer-aided design, image inpainting, and image-processing fields such as film, animation, and games. Traditional image style transfer changes the style of an image mainly by building mathematical and statistical models and sampling neighborhoods to generate new pixels or pixel blocks; in essence, it is a texture transfer technique. Such methods generate images in only a single style and cannot match the image content well. As deep learning has grown in popularity, various image style transfer methods based on it have emerged, and the strong learning ability of convolutional neural networks has driven rapid progress in the field. Although these newer methods perform better, three aspects can still be improved: style diversity, image quality, and transfer speed. Some algorithms are not flexible enough and can generate only one or a few styles. Some cannot fully preserve the content structure of the image, so the generated image is severely distorted and visually poor. Others use complex network structures that produce higher-quality images but suffer from too many parameters and slow transfer.

The main work of this paper is as follows:

(1) To address the problems of single-style output and poor image quality in current research, we propose an arbitrary style transfer network incorporating a self-attention mechanism. First, the content image and the style image are fed into an encoder for multiscale feature extraction. Inspired by non-local networks, we propose a style transfer module that captures long-range dependencies: by computing the similarity between content features and style features, it embeds style features into semantically similar content features, realizing arbitrary style embedding with a coordinated overall style. The style-embedded features are then fed into a decoder symmetric to the encoder to reconstruct the stylized image, with skip connections between the encoder and decoder to integrate multiscale information and adjust local style. Finally, a perceptual loss is computed from the extracted high-level features; in particular, a feature loss is introduced to avoid image distortion. Experimental results show that this method can generate images of arbitrary style with globally and locally coordinated style, well-preserved content information, and high quality.

(2) To address slow transfer speed, we propose a style feature matching and fusion module based on the criss-cross attention mechanism and normalization techniques, which reduces the additional computation and memory usage and improves the speed of the proposed method. First, the content features and style features output by the encoder are fed into a set of parallel channel-spatial attention modules to efficiently capture the key features of the content and style images. Second, a sparse recurrent criss-cross scheme captures the global correlation between content features and style features, yielding two normalization parameters for adjusting the global style; a style instance normalization module is then constructed to match and fuse the content and style features into stylized features. Finally, the features are reconstructed to generate the target stylized image. Experimental results show that the proposed method reduces computation and memory usage while ensuring the quality of
stylized images and improves the transfer speed, making it an effective contribution to the field of image style transfer.
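The attention-based style embedding described above — computing the similarity between content and style features and injecting style into semantically similar content positions — can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification: the learned 1x1 query/key/value projections of a real non-local module are omitted, and features are assumed to be already flattened to (positions, channels).

```python
import numpy as np

def style_attention(content, style):
    """Sketch of a non-local style embedding step.

    content: (Nc, C) flattened content feature map (Nc spatial positions)
    style:   (Ns, C) flattened style feature map
    Similarity = softmax(content @ style.T); each content position receives
    a weighted mix of the style features most similar to it.
    """
    def instance_norm(x):
        # normalize each position's feature vector before measuring similarity
        return (x - x.mean(axis=1, keepdims=True)) / (x.std(axis=1, keepdims=True) + 1e-5)

    sim = instance_norm(content) @ instance_norm(style).T / np.sqrt(content.shape[1])
    sim = np.exp(sim - sim.max(axis=1, keepdims=True))
    attn = sim / sim.sum(axis=1, keepdims=True)  # row-wise softmax over style positions
    stylized = attn @ style                      # (Nc, C) style mixed per content position
    return content + stylized                    # residual keeps content structure
```

In a full network, this module would sit between encoder and decoder, with the output reshaped back to a spatial map before reconstruction.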
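The perceptual loss with an added feature (content) term used in the first method can be sketched as follows. The layer choices, weighting, and Gram-matrix style term are assumptions based on common practice with a fixed encoder such as VGG; the paper's exact loss formulation may differ.

```python
import numpy as np

def gram(feat):
    """Gram matrix of a (C, H, W) feature map: channel co-occurrence statistics."""
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def perceptual_loss(gen_feats, content_feats, style_feats, style_weight=10.0):
    """Content loss on high-level features plus Gram-based style loss per layer.

    gen_feats / content_feats / style_feats: lists of (C, H, W) arrays
    extracted from a fixed encoder at several layers (hypothetical setup).
    """
    # feature (content) loss: match deepest-layer features to avoid distortion
    content_loss = np.mean((gen_feats[-1] - content_feats[-1]) ** 2)
    # style loss: match channel statistics at every layer
    style_loss = sum(np.mean((gram(g) - gram(s)) ** 2)
                     for g, s in zip(gen_feats, style_feats))
    return content_loss + style_weight * style_loss
```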
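The style instance normalization in the second method matches content features to style via two per-channel parameters. A minimal sketch in the spirit of adaptive instance normalization is shown below; here the two parameters are simply the style features' channel-wise mean and standard deviation, whereas the paper derives them through criss-cross attention.

```python
import numpy as np

def style_instance_norm(content, style, eps=1e-5):
    """Align the channel-wise mean/std of content features with style features.

    content, style: (C, H, W) feature maps. The style mean and std act as the
    two global-style normalization parameters (simplified assumption).
    """
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    # whiten content statistics, then re-color with style statistics
    return s_std * (content - c_mean) / c_std + s_mean
```

Because this operates with closed-form channel statistics rather than dense position-to-position attention, it costs only O(CHW), which is consistent with the stated goal of reducing computation and memory usage.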