
Research On Dialogue Understanding For Pretraining And Graph Structure Reasoning

Posted on: 2022-11-29    Degree: Master    Type: Thesis
Country: China    Candidate: S Y Tao    Full Text: PDF
GTID: 2518306776492884    Subject: Automation Technology
Abstract/Summary:
In the field of natural language processing, building a professional, intelligent dialogue system that can conduct targeted, semantically rich, and fluent conversations with humans is an important and difficult task. Demand for dialogue systems in the real world keeps growing, and applications such as intelligent customer service and chatbots have become social hot spots with very high practical and commercial value. Current methods for dialogue-related tasks mainly rely on building a large-scale candidate dialogue library and retrieving the content most likely to be the answer by computing text similarity. This approach requires a great deal of manpower and time to construct the candidate dialogue database, and because it considers only computed text similarity, it ignores the understanding of text semantics and reasoning over text content.

Since neural network algorithms were proposed, they have been applied to many tasks because of their powerful reasoning ability, yet they are rarely applied to the retrieval-based multi-turn dialogue task, mainly because of the current lack of a suitable dialogue-graph construction method. In response to this problem, this paper proposes a temporal graph construction algorithm tailored to the particularities of dialogue task data, which is used to strengthen the semantic connection and understanding between dialogue texts. By controlling the number of hops in the temporal graph construction algorithm, the local semantic information in the dialogue text is strengthened, and higher accuracy is obtained with traditional graph neural network algorithms: the temporal graph construction algorithm improves accuracy by 2.0% on GCN and 1.2% on GAT (a construction sketch is given below).

In addition, current dialogue models treat a multi-turn dialogue as one whole conversation from beginning to end and do not take into account that the content consists of a sequence of dialogue utterances. To solve this problem, and considering the natural temporal characteristics of dialogue tasks, this paper proposes a gated temporal graph convolution algorithm (TGCN) for dialogue tasks. It uses the temporal graph construction method proposed above to build dialogue graphs, forming multiple dialogue reasoning paths, and performs information filtering and feature integration over the temporal information and textual semantic information of the dialogue data in each reasoning chain through a gating mechanism (a layer sketch is given below). The final experiments show that this method improves performance by 1.7% compared with the baseline model.

Furthermore, the multi-turn dialogue task currently relies mainly on fine-tuning to exploit the ability of a language model. To make stronger use of the language model, and considering the strong learning ability of pre-trained language models in the field of natural language processing, this paper proposes a new pre-training method based on the traditional text pre-training method that exploits the characteristics of dialogue data. During pre-training, the unique temporal information and the strong correlation of local information are used to further strengthen the pre-trained model on dialogue-related tasks. The experimental results show that, compared with the general pre-trained language model BERT, the proposed method improves performance by 2.0%, and it also outperforms the stronger model RoBERTa by 0.7% (an illustrative objective is sketched below).
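The abstract does not give the temporal graph construction procedure itself, so the following is only a minimal sketch of one plausible reading: each utterance becomes a node, and nodes are linked to neighbors within a configurable hop window k. The function name, the symmetric window, and the self-loops are assumptions for illustration, not the thesis's exact algorithm.

```python
import numpy as np

def build_temporal_dialogue_graph(num_utterances: int, k: int = 2) -> np.ndarray:
    """Adjacency matrix over dialogue utterances (hypothetical sketch).

    Utterance i is linked to utterances within k hops (i-k .. i+k),
    so a small k emphasizes local semantic context while a larger k
    admits longer-range reasoning paths.
    """
    adj = np.eye(num_utterances)  # self-loops, as is common for GCN inputs
    for i in range(num_utterances):
        for j in range(max(0, i - k), min(num_utterances, i + k + 1)):
            adj[i, j] = 1.0
    return adj

# Example: a 6-turn dialogue with a 2-hop local window
print(build_temporal_dialogue_graph(6, k=2))
```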
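The gating mechanism that filters and integrates temporal and semantic information along each reasoning chain is likewise not specified in the abstract. The sketch below shows one common gated graph-convolution formulation, a sigmoid gate over the concatenation of a node's own features and its mean-aggregated neighborhood message; the single-layer form and the concatenation gate are assumptions, not the thesis's exact TGCN design.

```python
import torch
import torch.nn as nn

class GatedTemporalGraphConv(nn.Module):
    """One gated graph-convolution layer over a temporal dialogue graph.

    The gate decides, per node and per feature, how much of the
    aggregated neighborhood message (temporal context) to mix into
    the node's own textual representation.
    """
    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(dim, dim)       # transform neighbor messages
        self.gate = nn.Linear(2 * dim, dim)  # gate from [self; message]

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_utterances, dim), adj: (num_utterances, num_utterances)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        m = self.msg(adj @ x / deg)          # mean-aggregated message
        g = torch.sigmoid(self.gate(torch.cat([x, m], dim=-1)))
        return g * m + (1.0 - g) * x         # gated feature integration

# Example: 6 utterance encodings of dimension 128
layer = GatedTemporalGraphConv(128)
x = torch.randn(6, 128)
adj = torch.ones(6, 6)  # e.g. the k-hop adjacency from the sketch above
out = layer(x, adj)     # shape: (6, 128)
```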
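The abstract says the pre-training exploits dialogue temporal information and strong local correlation but does not name the objective. One plausible illustration, purely an assumption, is an utterance-order task alongside standard masked language modeling: swap some adjacent utterance pairs and train the encoder to detect the disturbance. All names below (UtteranceOrderHead, swap_adjacent) are hypothetical.

```python
import torch
import torch.nn as nn

class UtteranceOrderHead(nn.Module):
    """Binary head predicting whether two adjacent utterances were swapped.

    Illustrates one way to inject dialogue temporal structure into
    pre-training, used alongside standard masked language modeling.
    """
    def __init__(self, hidden: int):
        super().__init__()
        self.classifier = nn.Linear(hidden, 2)

    def forward(self, pooled_state: torch.Tensor) -> torch.Tensor:
        # pooled_state: (batch, hidden) pooled encoding of the utterance pair
        return self.classifier(pooled_state)

def swap_adjacent(utterances: list[str], i: int) -> tuple[list[str], int]:
    """Corrupt the dialogue by swapping utterances i and i+1; label 1 = swapped."""
    out = list(utterances)
    out[i], out[i + 1] = out[i + 1], out[i]
    return out, 1

# Example: build a corrupted (swapped) training sample
dialogue = ["Hi, I need help.", "Sure, what is the issue?", "My order is late."]
corrupted, label = swap_adjacent(dialogue, 0)
print(corrupted, label)
```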
In summary, this paper optimizes the multi-turn dialogue task from three angles: the dialogue graph construction method, the graph gating mechanism, and the pre-training objective of the language model. The final experimental results show that the proposed methods achieve clear performance improvements over match-based models and strong language models, and multiple comparative experiments also verify the effectiveness of the individual modules in our model.
Keywords/Search Tags:Graph neural network, Dialogue reasoning system, Dialogue-graph construction, Pre-trained language model