
Research On Generation Of News Headlines With Neural Network

Posted on: 2019-02-23
Degree: Master
Type: Thesis
Country: China
Candidate: C Pang
Full Text: PDF
GTID: 2348330542975001
Subject: Computer Science and Technology

Abstract/Summary:
With the development of the mobile Internet, online news resources have grown exponentially. A headline that accurately reflects the content of an article lets readers browse quickly, pick out the items they are interested in, and save reading time, giving them a good reading experience. In practice, however, the headlines recommended by many news clients often fail at this, so readers cannot efficiently find the content they care about, and the reading experience of these clients suffers. Automatic text summarization can be applied here to generate an accurate, fluent and concise headline for an online news article.

Automatic text summarization uses a computer to generate summaries from original documents; a summary is a concise, coherent piece of text that accurately and comprehensively reflects the central content of a document. By implementation, summarization methods divide into extractive and abstractive approaches. Abstractive summarization re-expresses the central content and concepts of the original document in a different form, so the words in the generated summary need not appear in the original text. The neural network applied to headline generation in this thesis is an abstractive summarization system. It is built on a sequence-to-sequence model with an attention mechanism based on recurrent neural networks; this thesis describes the model structure and how summaries are generated once the final trained model is obtained.

This thesis then improves on several problems of the attention-based sequence-to-sequence summarization system. First, the baseline model often generates inaccurate words and handles unknown (out-of-vocabulary) words especially poorly. A pointer-generator network is introduced to address this: the model can both generate words from a fixed-size vocabulary and copy words directly from the source text, which effectively mitigates inaccurate word generation and the unknown-word problem of the original attention-based sequence-to-sequence model.

Second, the baseline model tends to produce repeated content during generation, a problem that becomes especially serious on longer source texts. This thesis adopts the coverage mechanism used in machine translation to address the repetition, with appropriate modifications of the mechanism for the automatic text summarization task.

In addition, based on an analysis of the headline-generation process, the model is combined with a text classification task. Training in this multi-task mode improves the quality of the generated headlines: while the classification component solves the text classification problem, it also helps the summarization component capture the characteristic writing styles of different categories of news when generating headlines.

Finally, this thesis discusses the minimal processing unit of the summarization task, analyzing the advantages and disadvantages of word-level versus Chinese-character-level text processing. To cope with the huge computational cost of training the model, hierarchical softmax and negative sampling are employed, and the performance of the two approaches is compared.
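The pointer-generator idea described above can be illustrated with a small sketch. This is a toy example with made-up probabilities, not the thesis's implementation: at each decoder step, a scalar p_gen mixes the vocabulary distribution with the attention distribution over source tokens, so probability mass can be copied onto source words, including out-of-vocabulary ones.

```python
def pointer_generator_step(p_vocab, attention, src_tokens, p_gen):
    """Combine generation and copying into one output distribution.

    p_vocab    -- dict token -> prob over the fixed vocabulary
    attention  -- attention weights, one per source token
    src_tokens -- source-side tokens (may contain OOV words)
    p_gen      -- scalar in [0, 1]: probability of generating vs copying
    """
    # Generation path: scale the vocabulary distribution by p_gen.
    final = {tok: p_gen * p for tok, p in p_vocab.items()}
    # Copy path: route (1 - p_gen) of the mass through attention,
    # which lets the model emit source words outside the vocabulary.
    for tok, a in zip(src_tokens, attention):
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * a
    return final

# Toy step: "Irma" is not in the fixed vocabulary.
p_vocab = {"the": 0.4, "storm": 0.3, "hits": 0.2, "coast": 0.1}
attention = [0.7, 0.2, 0.1]
src = ["Irma", "hits", "coast"]
dist = pointer_generator_step(p_vocab, attention, src, p_gen=0.5)

print(round(dist["Irma"], 3))        # the OOV word receives copy probability
print(round(sum(dist.values()), 3))  # the result is still a valid distribution
```

Note how the unknown word "Irma" can be emitted even though it has no vocabulary entry, which is exactly the property used to fix the unknown-word problem.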
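The coverage mechanism can likewise be sketched with toy numbers (again, an illustration rather than the thesis code). A coverage vector accumulates the attention paid to each source position so far, and a coverage loss of the form sum over positions of min(a_t, c_t) penalizes re-attending to already-covered positions, which discourages repeated phrases on long inputs.

```python
def coverage_loss(attention_steps):
    """Sum over steps t of sum_i min(a_t[i], c_t[i]),
    where c_t is the accumulated attention before step t."""
    n = len(attention_steps[0])
    coverage = [0.0] * n
    total = 0.0
    for att in attention_steps:
        # Penalize overlap between current attention and past coverage.
        total += sum(min(a, c) for a, c in zip(att, coverage))
        # Accumulate attention into the coverage vector.
        coverage = [c + a for c, a in zip(coverage, att)]
    return total

# A decoder that keeps attending to the same position is penalized
# more than one that moves on. Attention weights are made-up values.
repetitive = [[0.9, 0.05, 0.05], [0.9, 0.05, 0.05]]
spread     = [[0.9, 0.05, 0.05], [0.05, 0.9, 0.05]]
print(coverage_loss(repetitive) > coverage_loss(spread))  # True
```

Minimizing this term alongside the usual loss is what steers the model away from the repetition problem described above.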
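Negative sampling, one of the two speed-ups mentioned above, can be sketched as follows (a toy illustration with made-up logit values, not the thesis's training code). Instead of normalizing a softmax over the entire output vocabulary at O(|V|) cost per step, the loss contrasts the target word against k sampled "negative" words at O(k) cost.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def negative_sampling_loss(target_score, negative_scores):
    """-log sigma(s_target) - sum_k log sigma(-s_neg_k):
    push the target's score up and the sampled negatives' scores down."""
    loss = -math.log(sigmoid(target_score))
    loss -= sum(math.log(sigmoid(-s)) for s in negative_scores)
    return loss

# The loss falls as the target word outscores the sampled negatives.
confident = negative_sampling_loss(4.0, [-3.0, -2.0])
uncertain = negative_sampling_loss(0.1, [0.0, 0.2])
print(confident < uncertain)  # True
```

Hierarchical softmax attacks the same O(|V|) bottleneck differently, by arranging the vocabulary in a binary tree so each word's probability costs O(log |V|); comparing the two is the experiment described at the end of the abstract.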
Keywords/Search Tags: Recurrent Neural Network, Automatic Text Summarization, Pointer-Generator Network, Coverage Mechanism, Multi-Task Mechanism