Research On Natural Language Generation Methods Based On Neural Sequence Learning

Posted on: 2022-10-08    Degree: Master    Type: Thesis
Country: China    Candidate: Z L Wang    Full Text: PDF
GTID: 2518306602494864    Subject: Natural language processing
Abstract/Summary:
Natural language generation is a critical research task in natural language processing. In recent years, neural networks have been widely adopted across many fields, and researchers have likewise integrated them into natural language generation. We conduct research in two directions. First, we take natural language sequences as the input of a neural network, which models the contextual relationships in text sequences, and then adopt a sequence-to-sequence model to guide the generation process. Second, we build graphs from natural language text: because structured data can capture contextual relationships, the graph can guide the generation process. Specifically, we propose several sequence-to-sequence generative models for natural language generation.

First, we propose the Sentiment Controllable Poetry Generation task, which combines style transfer with poetry generation, and we propose a Gaussian kernel-based poetry style model to solve this problem (sketched below). The model can generate poetry with consistent themes and emotions.

Second, to better connect natural language processing with practical scenarios and improve the user's search experience, we incorporate the user's query terms into the document and propose a query-focused text summarization task. To address the long-range dependency limitations of long short-term memory networks, we adopt a sequence-to-sequence model based on the Transformer.

Third, since a single document cannot provide enough context information, we propose a tip generation model based on multi-document summarization. Because encoding long documents suffers from long-range dependency problems, we propose a hierarchical Transformer that modifies the Transformer's multi-head attention to operate at the sentence level (sketched below). To model the interaction between query words and documents, we also propose a relevance-based attention mechanism. The model can consider information from multiple texts simultaneously and generate text summaries relevant to the query words. Our model achieves the best performance on multiple automatic evaluations, and we also conducted online tests: the results show that the proposed model can increase the click-through rate by 0.08% in search engines with traffic in the tens of millions.

Finally, we construct an event graph from contextual information to exploit contextual relationships in sequence data for natural language generation. We construct the nodes and edges of the event graph through information extraction. To let the event graph guide the model in generating summaries, we propose two methods for constructing a focus graph, which helps the model find the key points in the text and thereby optimizes the graph representation. We further propose a graph contrastive learning method to optimize the graph representations (sketched below). Meanwhile, to encode the source document effectively, we use a large-scale pre-trained model and combine the text representation with the graph representation to generate summaries. The experimental results show that the model proposed in this thesis achieves state-of-the-art results.
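The abstract mentions a Gaussian kernel-based poetry style model but does not detail the mechanism. The following is a minimal, hypothetical sketch of one plausible reading: biasing the decoder's next-token logits with a Gaussian kernel centered on a target sentiment value. All names here (gaussian_kernel_weight, rerank_logits, token_sentiments) are illustrative assumptions, not the thesis's API.

```python
import torch

def gaussian_kernel_weight(sentiment_scores, target, sigma=0.5):
    # Tokens whose sentiment score is close to the target get weights
    # near 1.0; distant tokens are suppressed smoothly.
    return torch.exp(-((sentiment_scores - target) ** 2) / (2 * sigma ** 2))

def rerank_logits(logits, token_sentiments, target_sentiment):
    # Bias next-token logits toward the desired sentiment by adding the
    # log of the kernel weight (a hypothetical control mechanism).
    weights = gaussian_kernel_weight(token_sentiments, target_sentiment)
    return logits + torch.log(weights + 1e-9)

# Usage: suppose each vocabulary token has a precomputed sentiment score
# in [-1, 1]; steer generation toward positive sentiment (target = 0.8).
vocab_size = 10000
logits = torch.randn(vocab_size)
token_sentiments = torch.empty(vocab_size).uniform_(-1.0, 1.0)
controlled_logits = rerank_logits(logits, token_sentiments, 0.8)
```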
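The hierarchical Transformer is described as modifying multi-head attention to operate at the sentence level. Below is a hedged sketch of that idea, assuming mean-pooled sentence vectors fed into PyTorch's standard nn.MultiheadAttention; the class name, pooling choice, and dimensions are assumptions, not the thesis's exact architecture.

```python
import torch
import torch.nn as nn

class SentenceLevelAttention(nn.Module):
    """Sketch of a hierarchical attention layer: token states are
    mean-pooled into sentence vectors, and multi-head attention is then
    applied over sentences rather than tokens."""
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, token_states, sentence_ids):
        # token_states: (batch, seq_len, d_model)
        # sentence_ids: (batch, seq_len), each token's sentence index
        n_sents = int(sentence_ids.max().item()) + 1
        sent_vecs = []
        for s in range(n_sents):
            mask = (sentence_ids == s).unsqueeze(-1).float()
            pooled = (token_states * mask).sum(1) / mask.sum(1).clamp(min=1.0)
            sent_vecs.append(pooled)
        sents = torch.stack(sent_vecs, dim=1)          # (batch, n_sents, d_model)
        out, weights = self.attn(sents, sents, sents)  # sentence-level attention
        return out, weights
```

A relevance-based query-document interaction, as the abstract describes it, could plausibly be added by biasing these sentence-level attention weights with query-sentence similarity scores, though the thesis's exact formulation is not given in the abstract.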
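The abstract names graph contrastive learning for optimizing the event-graph representations without specifying the objective. A standard assumption is an InfoNCE-style loss over two augmented views of each graph (e.g., node dropping or edge perturbation); the sketch below makes that assumption explicit and is not claimed to be the thesis's exact loss.

```python
import torch
import torch.nn.functional as F

def graph_contrastive_loss(z1, z2, temperature=0.1):
    # z1, z2: (batch, dim) graph-level embeddings of two augmented views
    # of the same batch of event graphs (assumed produced by a GNN encoder).
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                     # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    # Matching views (the diagonal) are positives; all other pairs in the
    # batch serve as negatives.
    return F.cross_entropy(logits, labels)
```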
Keywords/Search Tags: Neural Network, Natural Language Generation, Attention Mechanism, Pre-trained Model, Graph Neural Network