
Research On Short Text Dialogue Generation Method Based On Multiple Information Fusion

Posted on: 2020-12-16    Degree: Master    Type: Thesis
Country: China    Candidate: N Yu    Full Text: PDF
GTID: 2438330572479808    Subject: Computer Science and Technology
Abstract/Summary:
With the rapid development of social platforms such as Sina Weibo, Baidu Tieba, Twitter, and the Ubuntu community, a large amount of short-text conversation data in question-and-answer form has appeared on the Internet. This provides a good data basis for the study of short text conversation and has made it a recent research hotspot in natural language processing. In view of the characteristics of short text conversation, this thesis combines syntactic and emotional information in a sequence-to-sequence framework based on the Transformer neural network to improve the quality of short Chinese dialogues. Specifically, the research content of this thesis covers the following three aspects:

1. Efficient short text conversation generation model: To improve the efficiency of dialogue generation, this thesis adopts a sequence-to-sequence framework based on the Transformer neural network to model short text conversation generation. First, the Transformer model is used for automatic feature combination and extraction, and its performance is further improved by pre-training. In addition, a beam search strategy is used during decoding to reduce error propagation. To verify the effectiveness of our model, a Seq2Seq dialogue model based on a GRU+Attention neural network is adopted as the baseline. Experimental results show that the Transformer model outperforms the GRU+Attention model on short text conversation.

2. Short text conversation generation model with syntactic information: To improve the syntactic and lexical diversity of the generation model, we further integrate syntactic information into the model. Mainstream neural models use a Tree GRU to extract syntactic information; in this thesis, we use the hidden layer of a neural dependency parser instead. In particular, the parser is trained to obtain both the syntax tree and the hidden layer. Experimental results show that using the hidden layer not only effectively improves the quality of the generated sentences but also effectively reduces error propagation.

3. Short text conversation generation model with emotional information: To effectively control the emotional polarity of dialogue generation, we further incorporate emotional information on top of the syntactically enhanced model. In particular, this thesis compares two ways of fusing emotional information in the neural network setting: simple emotion combination and emotion combination based on a conditional variational autoencoder. Experimental results show that using emotional information effectively reduces the perplexity of the model, and that the conditional variational autoencoder can produce responses with the desired emotional polarity.
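The beam search decoding strategy described in part 1 keeps the top-k partial hypotheses at each step instead of committing greedily, which is how it reduces error propagation. A minimal sketch of the idea in pure Python; `toy_step` stands in for the Transformer decoder's next-token distribution, whose details the abstract does not give, so all names here are illustrative:

```python
import math

def beam_search(step_fn, start_token, end_token, beam_size=3, max_len=10):
    """Generic beam search: step_fn(seq) returns {token: log-prob}."""
    beams = [([start_token], 0.0)]          # (sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:        # completed hypothesis
                finished.append((seq, score))
                continue
            for tok, logp in step_fn(seq).items():
                candidates.append((seq + [tok], score + logp))
        if not candidates:
            break
        candidates.sort(key=lambda x: x[1], reverse=True)
        beams = candidates[:beam_size]      # keep only the top-k expansions
    finished.extend(b for b in beams if b[0][-1] == end_token)
    if not finished:                        # nothing terminated: fall back
        finished = beams
    return max(finished, key=lambda x: x[1])

# Toy stand-in for the decoder: prefers "a" then "b" then end-of-sequence.
def toy_step(seq):
    if seq[-1] == "<s>":
        return {"a": math.log(0.6), "b": math.log(0.4)}
    if seq[-1] == "a":
        return {"b": math.log(0.7), "</s>": math.log(0.3)}
    return {"</s>": math.log(0.9), "a": math.log(0.1)}

best_seq, best_score = beam_search(toy_step, "<s>", "</s>")
print(best_seq)  # → ['<s>', 'a', 'b', '</s>']
```

With `beam_size=1` this degenerates to greedy decoding; a larger beam lets a locally weaker token survive if it leads to a globally better response.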
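Part 2 feeds the hidden layer of a trained neural dependency parser into the generation model in place of a Tree GRU. The abstract does not specify how that layer enters the model; one common and simple option, sketched here purely as an assumption, is to concatenate each token's embedding with the parser's hidden-state vector for that token before encoding:

```python
def fuse_syntactic_features(token_embeddings, parser_hidden_states):
    """Concatenate each token's embedding with the hidden-layer vector a
    dependency parser produced for the same token (illustrative fusion)."""
    assert len(token_embeddings) == len(parser_hidden_states), \
        "one parser state per token is required"
    return [emb + hid for emb, hid in zip(token_embeddings, parser_hidden_states)]

# Toy example: two tokens, 2-dim embeddings, 1-dim parser states.
emb = [[1.0, 2.0], [3.0, 4.0]]
hid = [[0.5], [0.6]]
fused = fuse_syntactic_features(emb, hid)
print(fused)  # → [[1.0, 2.0, 0.5], [3.0, 4.0, 0.6]]
```

Using the parser's continuous hidden states rather than its discrete tree output is what lets the model avoid propagating hard parsing errors into generation, which matches the error-propagation claim in the abstract.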
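The conditional variational autoencoder in part 3 conditions the latent variable on the desired emotion label and is trained with a reconstruction term plus a KL regularizer. The abstract gives no formulas, so the sketch below shows only the two standard CVAE building blocks (the reparameterization trick and the KL term against a standard normal prior), not the thesis's actual objective:

```python
import math
import random

def reparameterize(mu, logvar, eps=None):
    """z = mu + sigma * eps: sampling that stays differentiable in training.
    mu/logvar would come from the recognition network, conditioned on the
    dialogue context and the emotion label."""
    if eps is None:
        eps = [random.gauss(0.0, 1.0) for _ in mu]
    return [m + math.exp(0.5 * lv) * e for m, lv, e in zip(mu, logvar, eps)]

def kl_divergence(mu, logvar):
    """KL(q(z|x, emotion) || N(0, I)), the CVAE regularization term,
    computed per dimension and summed."""
    return -0.5 * sum(1 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, logvar))

# When q equals the prior (mu=0, logvar=0), the KL term is zero.
print(kl_divergence([0.0, 0.0], [0.0, 0.0]))  # → 0.0
```

At generation time, sampling different `z` for the same context is what gives the model its ability to vary the response, while the emotion condition steers its polarity.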
Keywords/Search Tags:Short text conversation, response generation, sentiment analysis, syntax analysis, neural network