
Multi-Conditional Generation Of Personalized Texts Based On Deep Learning

Posted on: 2020-07-21
Degree: Master
Type: Thesis
Country: China
Candidate: Z W Wang
Full Text: PDF
GTID: 2428330575456632
Subject: Mathematics
Abstract/Summary:
Automatic text generation has received much attention owing to the rapid development of deep learning. The application of deep neural networks to natural language generation has brought a breakthrough in the quality of generated texts. However, machine-generated texts still lack individual characteristics compared with human-written texts. In general, text generation systems based on statistical language models do not consider anthropomorphic characteristics, which makes the generated texts read as machine-like.

To address this problem, we propose a conditional language generation model that takes Big Five personality feature vectors as input context and writes human-like short texts. The short-text generator consists of a single layer of long short-term memory (LSTM) cells, where a Big Five personality feature vector is concatenated to the input of each cell. To enable supervised training of the generation model, a text classification model based on a convolutional neural network is used to prepare Chinese micro-blog corpora tagged with Big Five personality features. Experimental results show that the generated Chinese short texts exhibit discriminative personality styles and perform better in quality assessment than an unconditional generation model.

Furthermore, because existing literary generation systems offer no control over structure and emotion, we propose a lyric generation model based on linguistic rules and emotional control. We use a sequence-to-sequence model as the baseline. The content, structure, and rhyme of the generated lyrics are controlled by changing the input mode. A convolutional neural network is added to the decoder to classify the emotion of the generated lyrics, and the emotion is controlled by back-propagating the classification error. We also propose solutions to the repetition and out-of-vocabulary problems, which are among the most common problems in natural language generation. Experimental results show that our model can control the content, structure, rhyme, and emotion of the lyrics according to the given requirements, and that its performance on text quality, repetition, and out-of-vocabulary handling is significantly better than the baseline.
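As an illustration of the first model, the following is a minimal sketch of a personality-conditioned generator of the kind described above, assuming PyTorch; the class name, layer sizes, and vocabulary handling are illustrative assumptions rather than the thesis implementation. The Big Five feature vector is concatenated to the token embedding at every LSTM timestep.

import torch
import torch.nn as nn

class ConditionalLSTMGenerator(nn.Module):
    # Token-level generator: a Big Five personality vector is concatenated
    # to the word embedding at every timestep of a single LSTM layer.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, personality_dim=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim + personality_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, personality, state=None):
        # tokens: (batch, seq_len) token ids; personality: (batch, 5) Big Five scores
        emb = self.embed(tokens)                                      # (B, T, E)
        cond = personality.unsqueeze(1).expand(-1, emb.size(1), -1)   # (B, T, 5)
        out, state = self.lstm(torch.cat([emb, cond], dim=-1), state)
        return self.proj(out), state                                  # next-token logits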
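The emotion control in the lyric model could similarly be sketched as a convolutional classification head attached to the decoder, whose cross-entropy loss is back-propagated into the generator. The kernel sizes, channel count, number of emotion classes, and loss weight below are assumed for illustration and are not specified in the abstract.

import torch
import torch.nn as nn

class EmotionHead(nn.Module):
    # CNN classifier over decoder hidden states; its classification error is
    # back-propagated to steer the emotion of the generated lyrics.
    def __init__(self, hidden_dim=256, num_emotions=2, kernel_sizes=(2, 3, 4), channels=64):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden_dim, channels, k) for k in kernel_sizes])
        self.fc = nn.Linear(channels * len(kernel_sizes), num_emotions)

    def forward(self, decoder_states):
        # decoder_states: (batch, seq_len, hidden_dim)
        x = decoder_states.transpose(1, 2)                            # (B, H, T)
        pooled = [torch.relu(c(x)).max(dim=-1).values for c in self.convs]
        return self.fc(torch.cat(pooled, dim=-1))                     # emotion logits

# Training would combine the usual generation loss with the emotion loss, e.g.
# loss = generation_nll + emotion_weight * cross_entropy(emotion_logits, target_emotion)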
Keywords/Search Tags: natural language generation, deep learning, convolutional neural network, long short-term memory network, encoder-decoder