
Research On Abstractive Automatic Text Summarization Methods

Posted on: 2022-11-23
Degree: Master
Type: Thesis
Country: China
Candidate: T C Huang
Full Text: PDF
GTID: 2518306770971869
Subject: Automation Technology

Abstract/Summary:
Automatic text summarization is the technology by which a computer generates a sentence (or a paragraph) from an article to capture the main meaning of the original text and effectively extract its valuable information. In the era of information explosion, this technology helps people obtain useful information from the Internet and improves work efficiency. Unlike traditional extractive text summarization, abstractive automatic text summarization is closer to the way humans write and is therefore more worthy of research. Thanks to deep learning, abstractive automatic text summarization has become a new research trend, and a series of new advances and achievements have been made.

Although abstractive summarization has achieved remarkable results, two main problems remain to be solved. First, the factual consistency problem: the facts in a summary generated by an abstractive summarization model may be inconsistent with the original text. Second, limited by long-distance dependencies in text, abstractive summarization models still lack a good solution for processing long documents. In this thesis, two deep neural network models are designed to investigate whether the structured information of the text helps solve the factual consistency problem and whether the dependency tree can help the model learn long-distance dependencies in the text. The specific models are as follows:

(1) Entity Relation-based Abstractive Text Summarization Pointer Generator Network. This model first extracts entity-relationship triples from the original document, then uses the Informative Open IE Relation Triples selection algorithm to select the triples carrying the most information and constructs an entity-relationship knowledge graph from them. A Relation-based Graph Attention Neural Network learns the features of this knowledge graph, while an entity-relation attention method enhances the basic attention mechanism to increase the probability that entity tokens in the text are selected for output.

(2) Dependency Tree-based Graph Attention Transformer. This model builds on the classic Transformer and uses a graph attention neural network over the dependency tree to learn long-distance dependencies between tokens in the text.

This thesis evaluates the effectiveness of the proposed models on multiple English and Chinese datasets and conducts ablation experiments on each module of the models. The experimental results show that using graph neural networks to learn structural information in text can improve the performance of abstractive automatic text summarization models.
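To illustrate the shared building block behind both models, the following is a minimal sketch (not the thesis code) of a single graph-attention layer applied to a small graph, such as an entity-relationship graph built from Open IE triples or a dependency-tree adjacency. The class name, feature dimensions, and toy graph are illustrative assumptions.

```python
# Minimal sketch of a graph-attention layer; layer structure, names, and
# dimensions are assumptions for illustration, not the thesis implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """One graph-attention layer: each node attends to its graph neighbours."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, node_feats, adj):
        # node_feats: (N, in_dim); adj: (N, N), 1 where an edge exists
        h = self.proj(node_feats)                              # (N, out_dim)
        N = h.size(0)
        # Concatenate every pair of node features for attention scoring
        pairs = torch.cat(
            [h.unsqueeze(1).expand(N, N, -1), h.unsqueeze(0).expand(N, N, -1)],
            dim=-1)                                            # (N, N, 2*out_dim)
        scores = F.leaky_relu(self.attn(pairs).squeeze(-1))    # (N, N)
        scores = scores.masked_fill(adj == 0, float('-inf'))   # keep only edges
        alpha = F.softmax(scores, dim=-1)                      # neighbour weights
        return alpha @ h                                       # (N, out_dim)

# Toy graph: nodes are entities (or tokens); edges follow extracted relations
# (or dependency arcs), with self-loops on the diagonal.
adj = torch.tensor([[1., 1., 0.],
                    [1., 1., 1.],
                    [0., 1., 1.]])
feats = torch.randn(3, 16)
layer = GraphAttentionLayer(16, 8)
print(layer(feats, adj).shape)  # torch.Size([3, 8])
```

In this sketch the same attention mechanism serves both settings described above: for model (1) the adjacency comes from the entity-relationship knowledge graph, and for model (2) it comes from the dependency tree, so distant but syntactically related tokens can exchange information in one step.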
Keywords/Search Tags:Abstractive Automatic Text Summarization, Graph Attention Neural Network, Entity-Relationship, Dependency Tree, Deep Learning