
Text Style Transfer Based On Prompt Learning

Posted on: 2024-07-22
Degree: Master
Type: Thesis
Country: China
Candidate: A Q Li
Full Text: PDF
GTID: 2568307157982309
Subject: Computer Science and Technology
Abstract/Summary:
Style representation, recognition, and transfer are important aspects of natural language processing with many applications. For example, style transfer can be used to improve user expression, or to identify negative or aggressive language and soften it in a comment-based community. The task of text style transfer has therefore attracted much attention in recent years. However, it still faces challenges: 1) parallel corpora pairing the same content in different styles are scarce; 2) some loss of semantic content is unavoidable during transfer. How to perform the style transfer correctly while minimizing the loss of style-independent content has become a key issue in current research.

As the learning ability of large language models grows, prompt learning has quickly become a key method for adapting them to downstream tasks by mining the internal knowledge of the pre-trained model. Inspired by prompt learning, this thesis applies it to the problems of text style transfer and makes the following contributions.

First, to improve transfer quality with non-parallel style corpora, a new method is proposed that guides a fill-mask model to rewrite sentences in the target style. The method broadly follows the Delete-Retrieve-Generate style transfer framework, but employs a large unsupervised pre-trained language model with a Transformer architecture. Based on the working mechanism of the Transformer, the step that filters style attributes out of source sentences is improved, raising transfer accuracy. The internal knowledge of the pre-trained model is then mined with prompt learning to generate target-style words. Experiments on two sentiment benchmark datasets show that this method outperforms existing editing-based methods on the text style transfer task, improving the relative composite score by more than 14% on average.

Second, previous encoder-decoder transfer methods separate the vector representations of style and content in the hidden space, but removing the style information also erroneously discards style-independent content. Retaining content words improves the quality of text style transfer and limits this loss. Because prompt learning can supplement domain knowledge on top of the source input, it helps generate target sentences that fit the context. This thesis therefore presents a text style transfer method based on content word preservation and prompt learning: content words are retained from the source sentence and from domain-related sentences obtained via prompt learning; an attention mechanism filters key semantic information from them, which is fused with the hidden representation of the source sentence in the decoder to generate the target sentence. In addition, embedding the target-style words derived from prompt learning further improves the accuracy of the style transfer. Experiments on two benchmark datasets show that the proposed content-word-retention method based on prompt learning outperforms the baseline algorithms.
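The Delete step of the Delete-Retrieve-Generate framework mentioned above can be illustrated with a minimal sketch: words whose relative frequency ratio between the two style corpora (their "salience") exceeds a threshold are treated as style attributes and removed, keeping the style-independent content. The corpora, the unigram-only salience, and the threshold value here are illustrative assumptions, not the thesis's exact filtering procedure.

```python
from collections import Counter

def style_salience(word, pos_counts, neg_counts, smooth=1.0):
    # Smoothed relative frequency of a word in the positive-style
    # corpus versus the negative-style one; a high ratio marks the
    # word as a (positive) style attribute rather than content.
    return (pos_counts[word] + smooth) / (neg_counts[word] + smooth)

def delete_style_words(sentence, pos_counts, neg_counts, threshold=3.0):
    """Drop words whose salience toward the positive style exceeds
    the threshold, keeping style-independent content words."""
    kept = [w for w in sentence.split()
            if style_salience(w, pos_counts, neg_counts) < threshold]
    return " ".join(kept)

# Toy unigram counts for each style corpus (hypothetical values).
pos_counts = Counter({"delicious": 10, "great": 8, "food": 12, "the": 30})
neg_counts = Counter({"terrible": 9, "bland": 7, "food": 11, "the": 29})

print(delete_style_words("the food was delicious", pos_counts, neg_counts))
# → "the food was"
```

The remaining content skeleton is what the fill-mask model, steered by prompts, later completes with target-style words.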
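The second contribution's fusion step can be sketched as dot-product attention over the retained content-word vectors, with the resulting context mixed back into the decoder hidden state. This is a minimal plain-Python sketch; the function name `attention_fuse`, the additive mixing coefficient `alpha`, and the toy vectors are illustrative assumptions, not the thesis's actual decoder architecture.

```python
import math

def attention_fuse(decoder_hidden, content_vectors, alpha=0.5):
    # Score each retained content-word vector against the decoder
    # hidden state (dot-product attention).
    scores = [sum(h * c for h, c in zip(decoder_hidden, vec))
              for vec in content_vectors]
    # Softmax-normalise the scores into attention weights.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of content vectors: the filtered key semantics.
    context = [sum(w * vec[i] for w, vec in zip(weights, content_vectors))
               for i in range(len(decoder_hidden))]
    # Additive fusion with the hidden state (alpha is illustrative).
    return [(1 - alpha) * h + alpha * c
            for h, c in zip(decoder_hidden, context)]

h = [1.0, 0.0]                     # toy decoder hidden state
content = [[1.0, 0.0], [0.0, 1.0]] # toy retained content-word vectors
fused = attention_fuse(h, content)
```

The content word aligned with the current hidden state receives the larger attention weight, so its semantics dominate the fused representation passed on to generation.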
Keywords/Search Tags: Text generation, Text style transfer, Prompt learning, Attention mechanism, Sentiment transfer