
Research On Natural Language Generation In Task-based Dialogue System

Posted on: 2020-05-12
Degree: Master
Type: Thesis
Country: China
Candidate: J H Zhang
Full Text: PDF
GTID: 2428330590461160
Subject: Engineering
Abstract/Summary:
With the rapid development of artificial intelligence and the continuous growth of data, data-driven deep learning techniques have advanced quickly, and the ways people interact with machines are changing with them. Dialogue systems have emerged to provide convenient, fast services: more and more smart products let users book airline tickets, reserve restaurants, plan driving routes, or reach intelligent customer service simply by talking to the system. A natural language generation (NLG) module underlies these products. In dialogue systems designed for specific domains, the NLG module must convert structured semantic information into natural sentences the user is accustomed to. Traditional rule-based template generation is gradually being replaced because of its poor portability and monotonous responses, while neural network-based generation has attracted increasing attention for its flexibility and generality.

To address the monotony of system queries in task-oriented dialogue systems, we explore a template-based sequence-to-sequence (Seq2Seq) generation model that produces diverse questions. To address the poor semantic alignment and inaccurate semantic representation that some neural generation models exhibit in multi-turn dialogue, this thesis proposes an NLG model with a multi-level attention mechanism based on LSTM and Seq2Seq. A context encoder encodes the dialogue context to improve the model's semantic control, so that generated utterances inherit the preceding semantics and the problem of inaccurate semantic expression across multiple turns is alleviated. A hierarchical attention mechanism combines word-level attention, which records and preserves the current input sequence information, with utterance-level attention, which encodes the sequence of historical attention information; together they increase the information available at decoding time, strengthen the model's semantic alignment and control, ensure the accuracy of the generated words, and further improve the consistency of semantic expression. The dialogue act is encoded simultaneously to ensure that slot-value information is output accurately.

For diversity of system queries, we validated the effectiveness and diversity of the proposed method on a question-answering dataset. For generality, the thesis validates the proposed model on task-oriented dialogue data from two different domains. The final experiments show that the proposed model improves BLEU-4 by about 3% on both datasets and generates text of better quality.
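The two-level attention described above can be illustrated with a minimal sketch. This is not the thesis's implementation: the dot-product scoring, the vector dimensions, and the names `attend` and `hierarchical_context` are illustrative assumptions; the actual model uses learned LSTM states and trained attention parameters.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attend(query, keys):
    """Dot-product attention: weight each key vector by its
    similarity to the query and return the weighted sum."""
    weights = softmax([dot(query, k) for k in keys])
    dim = len(keys[0])
    return [sum(w * k[i] for w, k in zip(weights, keys)) for i in range(dim)]

def hierarchical_context(decoder_state, word_encodings, utterance_encodings):
    """Word-level attention over the current input tokens plus
    utterance-level attention over encoded dialogue history;
    the concatenated context feeds the next decoder step."""
    word_ctx = attend(decoder_state, word_encodings)
    utt_ctx = attend(decoder_state, utterance_encodings)
    return word_ctx + utt_ctx  # list concatenation

# Toy example with 2-dimensional encodings.
state = [1.0, 0.0]                      # current decoder hidden state
words = [[0.9, 0.1], [0.1, 0.9]]        # encodings of current input tokens
history = [[0.5, 0.5], [1.0, 0.0]]      # per-utterance context vectors
ctx = hierarchical_context(state, words, history)
print(len(ctx))  # 4
```

In the full model both attention outputs would be projected and combined with the dialogue-act encoding before predicting the next word, so that slot values from the semantic input are reproduced faithfully.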
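The BLEU-4 metric used in the evaluation scores n-gram overlap between generated and reference text. A minimal single-reference, unsmoothed sentence-level sketch (the thesis likely uses a standard corpus-level implementation; function names here are illustrative):

```python
from collections import Counter
import math

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu4(candidate, reference):
    """Geometric mean of modified 1..4-gram precisions,
    multiplied by a brevity penalty (single reference, no smoothing)."""
    precisions = []
    for n in range(1, 5):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # any zero precision collapses the geometric mean
    log_avg = sum(math.log(p) for p in precisions) / 4
    bp = 1.0 if len(candidate) > len(reference) \
        else math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(log_avg)

cand = "which restaurant would you like to book".split()
ref = "which restaurant would you like to book".split()
print(round(bleu4(cand, ref), 2))  # 1.0
```

A perfect match scores 1.0; the reported ~3% improvement refers to relative gains in this metric on the two test domains.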
Keywords/Search Tags: natural language generation, LSTM, hierarchical attention mechanism, context encoding