
Research on Dialogue Generation Methods Based on LSTM Neural Networks

Posted on: 2021-03-03
Degree: Master
Type: Thesis
Country: China
Candidate: S L Qin
Full Text: PDF
GTID: 2428330614958505
Subject: Control Science and Engineering
Abstract/Summary:
With artificial intelligence elevated to a national development strategy, chat-oriented human-machine dialogue, as a typical application of natural language processing within the AI industry, has long been an important criterion for judging progress toward machine intelligence. This thesis is oriented to open-domain dialogue scenarios. For the text-generation problem in the dialogue process, deep neural network models are constructed to learn the language rules, knowledge structure, and semantic information contained in dialogue text, so that replies can be expressed more diversely during interaction while the semantic relevance of human-machine dialogue is enhanced. The research proceeds along two lines.

First, for deep semantic feature extraction from retrieval-based dialogue text, a dialogue matching model is built on a recurrent neural network, with long short-term memory (LSTM) units used to alleviate the long-term dependency problem of recurrent networks. The input sentence and the candidate replies are semantically encoded to obtain deep semantic feature vectors; a semantic matching algorithm then computes the matching score between the input-sentence vector and each reply vector, and the best-matching reply is selected.

Second, for the diversity of generative dialogue text, the sequence-to-sequence encoder-decoder model is taken as the basis for dialogue generation. The LSTM's strength in handling long dialogue sequences is exploited to obtain semantic representations of long sentences, and an attention mechanism adjusts the weight that keywords in the input sentence contribute to the generated sequence, improving the accuracy of the semantic expression in the generated reply. To address the diversity of generated responses, a bidirectional LSTM strengthens the model's sentence-encoding ability, and a diverse beam search algorithm together with a maximum mutual information training objective is introduced to keep generated sentences from settling into local optima: a top-k computation at each decoding step yields the top-k candidate sequences, achieving diverse dialogue generation.

Experiments show that retrieval-based dialogue achieves higher semantic relevance and more complete sentence structure but is less novel and diverse, while generative dialogue is more scalable, produces more novel replies, and offers greater research prospects.
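To make the retrieval-based branch concrete, the sketch below illustrates one plausible shape of the matching step in PyTorch, assuming a shared LSTM encoder whose final hidden state serves as the deep semantic feature vector and cosine similarity as the matching score; the class name, layer sizes, and vocabulary size are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch of retrieval-based reply matching (assumed, not the thesis code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMMatcher(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def encode(self, token_ids):
        # Final hidden state as the sentence-level semantic feature vector.
        _, (h_n, _) = self.encoder(self.embedding(token_ids))
        return h_n[-1]                              # (batch, hidden_dim)

    def match(self, query_ids, candidate_ids):
        # Matching score between the input sentence and each candidate reply.
        q = self.encode(query_ids)                  # (1, hidden_dim)
        c = self.encode(candidate_ids)              # (num_candidates, hidden_dim)
        return F.cosine_similarity(q, c)            # (num_candidates,)

# Rank candidate replies for one input utterance and keep the best match.
matcher = LSTMMatcher()
query = torch.randint(0, 10000, (1, 10))
candidates = torch.randint(0, 10000, (5, 10))
best_index = matcher.match(query, candidates).argmax().item()
```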
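For the generative branch, the abstract describes a sequence-to-sequence model with a bidirectional LSTM encoder and an attention mechanism that reweights input keywords at each decoding step. The following PyTorch sketch shows one possible arrangement of those components; all names, dimensions, and the simple dot-product attention are assumptions for illustration, and the diverse beam search and maximum mutual information objective mentioned above are not implemented here.

```python
# Minimal sketch of a seq2seq generator with a bidirectional LSTM encoder and
# dot-product attention (assumed layout, not the thesis implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2SeqWithAttention(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM encoder strengthens sentence encoding.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Decoder consumes the previous target token plus the attention context.
        self.decoder = nn.LSTM(embed_dim + 2 * hidden_dim, hidden_dim,
                               batch_first=True)
        self.attn_proj = nn.Linear(hidden_dim, 2 * hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        enc_out, _ = self.encoder(self.embedding(src_ids))        # (B, S, 2H)
        hidden, logits = None, []
        for t in range(tgt_ids.size(1)):                          # teacher forcing
            tok = self.embedding(tgt_ids[:, t:t + 1])             # (B, 1, E)
            query = (hidden[0][-1] if hidden is not None
                     else enc_out.new_zeros(src_ids.size(0), self.hidden_dim))
            # Attention reweights encoder states (input keywords) at each step.
            scores = torch.bmm(enc_out, self.attn_proj(query).unsqueeze(2))
            context = (F.softmax(scores, dim=1) * enc_out).sum(dim=1, keepdim=True)
            dec_out, hidden = self.decoder(torch.cat([tok, context], dim=-1), hidden)
            logits.append(self.out(dec_out))
        return torch.cat(logits, dim=1)                           # (B, T, vocab)

# Example: two input utterances of length 12, teacher-forced replies of length 8.
model = Seq2SeqWithAttention()
src = torch.randint(0, 10000, (2, 12))
tgt = torch.randint(0, 10000, (2, 8))
print(model(src, tgt).shape)            # torch.Size([2, 8, 10000])
```

In such a setup, decoding diversity would come from the search and training side rather than the architecture: diverse beam search keeps several top-k hypotheses per step, and an MMI-style objective rescores them toward replies that are informative about the input.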
Keywords/Search Tags:natural language processing, deep learning, dialogue generation, long short-term memory