
Research On Abstractive Summarization Based On Sequence-to-Sequence Neural Network Model

Posted on: 2021-02-11
Degree: Master
Type: Thesis
Country: China
Candidate: D J Feng
Full Text: PDF
GTID: 2428330605974903
Subject: Software engineering
Abstract/Summary:
Facing the huge volume of data on the Internet, automatic summarization technology compresses a text in order to capture the main ideas of an article; the abstract is a high-level summary of the text's information. With the continued development of deep learning, abstractive summarization now mainly relies on the sequence-to-sequence model: within the Encoder-Decoder framework, a semantic vector encoding the text is passed to the decoder to generate the summary. To address problems in existing model frameworks, this thesis starts from fully mining the information in the text sequence to improve the model's summaries. The research work comprises the following three parts:

(1) First, this thesis proposes an automatic summarization model based on an attention-mechanism network structure. Considering the temporal characteristics of text, a recurrent neural network is used as the encoder for the input; at the same time, to extract sentence- and phrase-level features, a convolutional encoder is introduced alongside the original model. During the attention computation, the encodings from the two network structures are combined into a new context vector, which is passed to the decoder to generate the summary. Experimental results show that the model performs well under the ROUGE evaluation system.

(2) Second, this thesis proposes an automatic summarization model based on a self-attention gating network. Because of the limited size of the convolution kernel and of training time, features that are far apart lack a global contextual connection. To solve this problem, self-attention gating is applied to the information from the convolutional encoder. Experimental results show that the model improves under the ROUGE evaluation system.

(3) Finally, this thesis proposes an automatic summarization model based on a dual gating attention network, which enriches the decoding information starting from the decoder. The attention module builds gates from historical decoding information to filter the core information of the two encoder networks, and uses the traditional soft-attention calculation to obtain each encoder's context vector. The summary is generated from the results of the two attention modules. Experimental results show that the model performs well after dual gating attention is introduced.
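The dual-encoder attention in part (1) can be illustrated with a minimal sketch. This is not the thesis's implementation: the dot-product scoring, the vector sizes, and the choice to fuse the two context vectors by concatenation are all illustrative assumptions; the thesis only states that the two encodings are combined into a new context vector for the decoder.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def soft_attention(decoder_state, encoder_states):
    # Traditional soft attention: score every encoder state against the
    # current decoder state (dot product here, an assumption), then return
    # the attention-weighted sum of encoder states as the context vector.
    scores = [sum(d * h for d, h in zip(decoder_state, hs))
              for hs in encoder_states]
    weights = softmax(scores)
    dim = len(encoder_states[0])
    return [sum(w * hs[i] for w, hs in zip(weights, encoder_states))
            for i in range(dim)]

def dual_encoder_context(decoder_state, rnn_states, cnn_states):
    # One context vector per encoder, so the decoder sees both the
    # temporal (RNN) and the phrase-level (CNN) view of the input;
    # concatenation is one simple way to combine them.
    c_rnn = soft_attention(decoder_state, rnn_states)
    c_cnn = soft_attention(decoder_state, cnn_states)
    return c_rnn + c_cnn  # concatenated context vector
```

With 2-dimensional toy states, `dual_encoder_context([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[0.5, 0.5], [1.0, 1.0]])` yields a 4-dimensional vector, the concatenation of one context vector per encoder.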
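The self-attention gating of part (2) can likewise be sketched in miniature. Again this is only one plausible reading: the sigmoid gate and dot-product self-attention below are assumptions, chosen to show how every convolutional feature can be modulated by a summary of all positions, bridging features that lie far apart in the sequence.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def self_attention_gate(features):
    # For each position, attend over ALL positions (self-attention) to pull
    # in global context, then gate the original feature element-wise with a
    # sigmoid of the attended summary. Distant features can thus influence
    # local convolutional outputs despite the limited kernel size.
    dim = len(features[0])
    gated = []
    for q in features:
        # dot-product score of this position against every position
        scores = [sum(a * b for a, b in zip(q, k)) for k in features]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        attended = [sum(w * k[i] for w, k in zip(weights, features))
                    for i in range(dim)]
        # sigmoid gate in (0, 1) scales each original feature component
        gated.append([f * sigmoid(a) for f, a in zip(q, attended)])
    return gated
```

Because the gate lies strictly between 0 and 1, each output component keeps the sign of the original feature while its magnitude is rescaled by global context.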
Keywords/Search Tags:Abstractive Summarization, Sequence-to-Sequence, Neural Network, Encoder-Decoder, Attention Mechanism