
Paragraph Text Generation for Contrastive Relationships

Posted on: 2021-08-12
Degree: Master
Type: Thesis
Country: China
Candidate: Q H Jiao
Full Text: PDF
GTID: 2518306521963319
Subject: Information Science
Abstract/Summary:
With the rapid development of the Internet and information technology, information resources on the network have grown exponentially. It is difficult for people to keep up with this knowledge on their own, and they increasingly expect computers to learn from massive amounts of information and write texts automatically, as humans do. Automatically generating contrastive relationship paragraph text not only helps people obtain important contrastive content quickly, but also saves writing time and effort, offers ideas for automatic text generation aimed at specific writing relationships, and provides a reference for the research and application of machine writing driven by specific writing intentions.

This thesis focuses on automatic text generation and aims to generate contrastive relationship paragraph text between a text to be compared and its multiple related texts. A paragraph text generation model for the contrastive relationship is designed, and paragraph generation is divided into two parts: contrastive relationship sentence generation and sentence organization. Drawing on related techniques of automatic text generation, a contrastive relationship sentence generation model with a Seq2seq structure is constructed. On the basis of a dynamic text representation model obtained by improving BERT, contrastive feature vectors are integrated to represent the input text, and semi-supervised self-learning is used for incremental training. The sentences are then organized by designed rule templates to obtain the contrastive relationship paragraph text. Specifically, the following work was carried out:

(1) Combined with concrete examples of generated paragraph text, we analyzed the different-granularity features of contrastive text, including part-of-speech, keyword, and numerical features, and described how these features are obtained.

(2) To address the problems of traditional text representation, we introduced commonly used dynamic text representation models and the feature extractors they involve. For the actual task of this study, the original BERT model was made more lightweight and retrained on scientific literature data to adapt it to the research task. Using perplexity as the evaluation metric, the final result was 7.439, and comparison with ELMo demonstrated the effectiveness of the improved dynamic text representation language model (a standard perplexity definition is sketched after the abstract).

(3) We constructed a contrastive relationship sentence generation model based on Seq2seq; both the encoder and the decoder used BiLSTM, and an attention mechanism was added. We optimized the input layer of the generation model: on top of a dynamic word vector representation that combines word-level and character-level features, we incorporated part-of-speech, keyword, and numerical feature vectors to represent the input text (a minimal, hypothetical sketch of this architecture and of BLEU scoring follows the abstract). Because labeled training data are scarce, a semi-supervised self-learning method was used for incremental training. We evaluated the sentence generation model on manually labeled search lists and scientific papers, using BLEU as the evaluation metric; the final score was 15.3, an improvement of 7.4 and 8.7 over the Transformer and BiLSTM + Attention baselines, respectively.

(4) We designed rule templates to organize the sentences and generate the contrastive relationship paragraph text, and then took the new-conclusion section of scientific novelty checking as an example to analyze the text generation method of this thesis, which demonstrates the effectiveness of the method.

The thesis includes 15 figures and 20 tables.
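The abstract reports a perplexity of 7.439 for the improved representation model but does not state how it is computed. As a point of reference, the standard corpus-level definition over N tokens is assumed below; a BERT-style model may use a variant such as pseudo-perplexity.

```latex
% Assumed standard definition; the thesis's exact computation is not given here.
\mathrm{PPL} = \exp\left( -\frac{1}{N} \sum_{i=1}^{N} \log p\left( w_i \mid w_{<i} \right) \right)
```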
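To illustrate the input-layer feature fusion and the attention-based Seq2seq model described in (3), here is a minimal PyTorch sketch. All class names, parameter names, and dimensions (FeatureFusionInput, Seq2SeqWithAttention, etc.) are hypothetical and not taken from the thesis; the decoder is a unidirectional LSTM cell for straightforward autoregressive generation, whereas the thesis states that both encoder and decoder use BiLSTM.

```python
# Hypothetical sketch only: names and dimensions are illustrative, not from the thesis.
import torch
import torch.nn as nn


class FeatureFusionInput(nn.Module):
    """Concatenate a dynamic word vector with part-of-speech, keyword and
    numerical feature vectors, then project to the encoder input size."""

    def __init__(self, word_dim, pos_dim, kw_dim, num_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(word_dim + pos_dim + kw_dim + num_dim, out_dim)

    def forward(self, word_vec, pos_vec, kw_vec, num_vec):
        return self.proj(torch.cat([word_vec, pos_vec, kw_vec, num_vec], dim=-1))


class Seq2SeqWithAttention(nn.Module):
    """BiLSTM encoder and an attention-based LSTM decoder (simplified)."""

    def __init__(self, in_dim, hid_dim, vocab_size):
        super().__init__()
        self.encoder = nn.LSTM(in_dim, hid_dim, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTMCell(in_dim + 2 * hid_dim, hid_dim)
        self.attn = nn.Linear(3 * hid_dim, 1)   # additive-style attention scoring
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt):
        # src: (B, S, in_dim) fused source features; tgt: (B, T, in_dim) embedded targets
        enc_out, _ = self.encoder(src)                           # (B, S, 2H)
        B, S, _ = enc_out.shape
        h = src.new_zeros(B, self.out.in_features)
        c = src.new_zeros(B, self.out.in_features)
        logits = []
        for t in range(tgt.size(1)):                             # teacher forcing
            query = h.unsqueeze(1).expand(-1, S, -1)             # (B, S, H)
            weights = torch.softmax(self.attn(torch.cat([enc_out, query], -1)), dim=1)
            context = (weights * enc_out).sum(dim=1)             # (B, 2H)
            h, c = self.decoder(torch.cat([tgt[:, t], context], dim=-1), (h, c))
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)                        # (B, T, vocab_size)


# Tiny usage example with made-up dimensions.
fuse = FeatureFusionInput(word_dim=768, pos_dim=16, kw_dim=8, num_dim=8, out_dim=256)
model = Seq2SeqWithAttention(in_dim=256, hid_dim=128, vocab_size=30000)
src = fuse(torch.randn(2, 12, 768), torch.randn(2, 12, 16),
           torch.randn(2, 12, 8), torch.randn(2, 12, 8))
logits = model(src, tgt=torch.randn(2, 9, 256))                  # (2, 9, 30000)
```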
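Point (3) reports a BLEU score of 15.3. The thesis's exact scoring setup is not given here; one plausible way to compute corpus-level BLEU with NLTK is shown below (an assumption, not the thesis's tooling).

```python
# Illustrative only; tokenization, smoothing, and scale are assumptions.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

references = [[["the", "proposed", "model", "converges", "faster"]]]  # one reference list per hypothesis
hypotheses = [["the", "proposed", "model", "converges", "quicker"]]   # model outputs

score = corpus_bleu(references, hypotheses,
                    smoothing_function=SmoothingFunction().method1)
print(f"BLEU: {score * 100:.1f}")  # reported on a 0-100 scale, as in the abstract
```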
Keywords/Search Tags: Text Generation, Contrastive Relationship, Deep Learning, Text Representation