Style transfer is an image editing technique that extracts content features from a content reference image and style features from a style reference image, and blends them to generate a synthesized image. The synthesized image matches the content reference in object type, structure, relative position, and outline shape, while its colors and textures follow the style reference. Although recent advances in texture enhancement and encoder-decoder disentanglement have improved style transfer performance, most methods overlook the crucial role of structural information in image feature generation, leading to structured style transfer problems: the object structure in the synthesized image deviates from the content reference, and object textures spill over and invade neighboring objects. This issue stems from the neglect of structural information in existing generators and from inadequate structural constraints.

This paper addresses these structured style transfer problems, with the following main contents:

1. To restore structural information in the generator, we propose a plug-and-play style transfer layer, the Depth Feature Transfer Layer (DFT Layer). The DFT Layer takes depth information from depth maps, adaptively extracts spatial-structure modulation parameters, and provides structural guidance to the backbone network during image feature synthesis. This supplies the spatial relationships, edge shapes, and internal layout of objects, and guides the rendering of textures in the corresponding regions (a minimal sketch follows this abstract).

2. To remedy the lack of structural constraints, and considering that existing architectures cannot apply a structural loss directly to the transfer result because its pixel values differ greatly from those of the content reference, while existing perceptual losses cannot precisely characterize structure, we propose the Depth Structure Loss (DS Loss). This loss imposes constraints in depth-map space and thereby directly constrains the style transfer task, improving the transfer results (a companion sketch of such a loss also follows).

3. We design and implement a style transfer system that incorporates depth information. The system is platform-independent and user-friendly: with a few keystrokes and parameter settings in a graphical user interface, users can perform real-time style transfer, model deployment, training monitoring, and study of the underlying principles.

Experiments apply the proposed methods to several baseline methods and to datasets covering different scenes. The results show that the proposed plug-and-play style transfer layer and DS Loss effectively alleviate the structured style transfer problem and improve style transfer quality in terms of structure, texture, and authenticity.
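To make the idea of depth-conditioned structural guidance concrete, the following is a minimal PyTorch sketch of a modulation layer in the spirit of the DFT Layer described above. The specific architecture (instance normalization, a single shared convolutional trunk, separate heads for per-pixel scale and shift) is an assumption made for illustration, not the paper's exact implementation.

```python
# Minimal sketch (assumed architecture, not the paper's implementation) of a
# depth-conditioned modulation layer: depth maps are embedded and used to
# predict per-pixel scale/shift parameters that modulate backbone features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DFTLayerSketch(nn.Module):
    def __init__(self, feature_channels: int, hidden_channels: int = 64):
        super().__init__()
        self.norm = nn.InstanceNorm2d(feature_channels, affine=False)
        # Shared trunk that embeds the single-channel depth map.
        self.depth_embed = nn.Sequential(
            nn.Conv2d(1, hidden_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Heads predicting per-pixel scale (gamma) and shift (beta) parameters.
        self.to_gamma = nn.Conv2d(hidden_channels, feature_channels, 3, padding=1)
        self.to_beta = nn.Conv2d(hidden_channels, feature_channels, 3, padding=1)

    def forward(self, features: torch.Tensor, depth: torch.Tensor) -> torch.Tensor:
        # Resize the depth map to the spatial size of the backbone features.
        depth = F.interpolate(depth, size=features.shape[-2:],
                              mode="bilinear", align_corners=False)
        h = self.depth_embed(depth)
        gamma, beta = self.to_gamma(h), self.to_beta(h)
        # Spatially modulate normalized features with depth-derived parameters.
        return self.norm(features) * (1.0 + gamma) + beta
```

Being plug-and-play, such a layer could be inserted after each decoder or upsampling block of an existing generator, with the same depth map resized and reused at every feature scale.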
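The DS Loss can likewise be sketched as a constraint computed in depth-map space. In the sketch below, `depth_estimator` stands for any frozen monocular depth network, and the use of an L1 distance between the depth maps of the stylized output and the content reference is an assumption for illustration rather than the paper's exact formulation.

```python
# Minimal sketch (assumed formulation) of a depth-based structural loss:
# both images are mapped to depth space by a frozen estimator, so the
# penalty targets structure rather than color or texture differences.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthStructureLossSketch(nn.Module):
    def __init__(self, depth_estimator: nn.Module):
        super().__init__()
        self.depth_estimator = depth_estimator.eval()
        for p in self.depth_estimator.parameters():
            p.requires_grad_(False)  # the estimator is a fixed measuring tool

    def forward(self, stylized: torch.Tensor, content: torch.Tensor) -> torch.Tensor:
        # Gradients flow through the estimator into the stylized image only.
        depth_stylized = self.depth_estimator(stylized)
        with torch.no_grad():
            depth_content = self.depth_estimator(content)
        return F.l1_loss(depth_stylized, depth_content)
```

In training, a term of this kind would be weighted and added to the usual content and style losses, constraining the generator toward depth-consistent, structure-preserving results.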