
Research On Question Generation By Integrating Feature And Attention Mechanism

Posted on: 2020-08-17    Degree: Master    Type: Thesis
Country: China    Candidate: X Z Dong    Full Text: PDF
GTID: 2428330578980891    Subject: Software engineering
Abstract/Summary:
The question generation task aims to generate relevant questions for a given text, and it can advance fields such as intelligent medical care, question answering systems, and education. Traditional methods mainly rely on hand-crafted rules to turn declarative sentences into questions, so the resulting questions are often too simple and too domain-specific to be useful. Deep learning techniques, in particular the Sequence-to-Sequence (Seq2Seq) architecture, have been applied to this task. However, these methods take the declarative sentence directly as input and ignore the secret-token information; the secret tokens are not only continuous phrases in the sentence but also "potential objects". As a result, performance is limited by the fixed way of generating questions, which cannot adapt to the characteristics of the source sentence. Based on deep learning methods, this thesis conducts a series of studies on question generation for declarative sentences. The main contributions cover the following three aspects.

(1) Question generation based on the location information of secret tokens. Existing question generation methods take the Seq2Seq model as the basic framework. They do not pay enough attention to the secret tokens, so the generated questions cannot adapt to the linguistic characteristics of the sentence. This thesis therefore proposes a point-wise Question Generation (PQG) model based on the location information of secret tokens. By adding the location information of the secret tokens, the model automatically attends to those tokens. Experiments show that the method outperforms the baseline system by 1.98% on the SQuAD dataset.

(2) Question generation based on interrogative word recognition. The Seq2Seq model determines the question type during generation, but this determination has low accuracy. Because a classification model can determine the target type more accurately, it is better suited to interrogative word prediction. We therefore use a convolutional neural network to identify the question type and then inject the predicted type into the generation process. Compared with the benchmark system, the BLEU-4 score of the method improves by 1.66% on the SQuAD dataset.

(3) A question generation network based on a bidirectional attention mechanism. The two methods above focus on secret-token information and improve performance, but in both of them the secret tokens do not interact with the sentence information, which limits the deep semantic understanding of the sentence and the secret tokens. To solve this problem, the thesis proposes a model based on a bidirectional attention mechanism: deep semantic representations of the secret tokens and the sentence are obtained through bidirectional attention, and a decoder generates a question from these representations. Experiments show that the performance exceeds the benchmark system on the SQuAD and MARCO datasets.

To address the limitations of existing question generation methods, this thesis introduces the location information of secret tokens and question type recognition to overcome the inability to adapt to the linguistic characteristics of the source sentence, and both methods achieve measurable improvements. To make fuller use of the secret-token information, we further present a question generation method based on a bidirectional attention mechanism, which performs better on multiple corpora. This proves that the method can effectively obtain deep semantic representations of the secret tokens and the sentence.
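As a rough illustration of the bidirectional attention idea in contribution (3), the following is a minimal PyTorch sketch in which the encoded sentence and the encoded secret-token (answer) span attend to each other, and the fused, answer-aware sentence representation would then feed a Seq2Seq decoder. The module name BiAttentionFusion, the bilinear scoring function, and the BiDAF-style fusion are illustrative assumptions, not the thesis's actual implementation.

```python
# Sketch only: bidirectional attention between sentence and secret-token encodings.
# Names and scoring choices are assumptions; the thesis's concrete model may differ.
import torch
import torch.nn as nn


class BiAttentionFusion(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # A simple bilinear similarity score; a trilinear score would also work.
        self.score = nn.Bilinear(hidden_size, hidden_size, 1)
        self.fuse = nn.Linear(4 * hidden_size, hidden_size)

    def forward(self, sent_enc: torch.Tensor, ans_enc: torch.Tensor) -> torch.Tensor:
        # sent_enc: (batch, sent_len, hidden); ans_enc: (batch, ans_len, hidden)
        b, ls, h = sent_enc.shape
        la = ans_enc.size(1)
        # Similarity matrix sim[b, i, j] between sentence token i and answer token j.
        s_exp = sent_enc.unsqueeze(2).expand(b, ls, la, h)
        a_exp = ans_enc.unsqueeze(1).expand(b, ls, la, h)
        sim = self.score(s_exp.reshape(-1, h), a_exp.reshape(-1, h)).view(b, ls, la)
        # Sentence-to-answer attention: each sentence token summarizes the answer span.
        s2a = torch.softmax(sim, dim=-1) @ ans_enc                     # (b, ls, h)
        # Answer-to-sentence attention: an answer-aware summary of the sentence,
        # broadcast back to every sentence position.
        a2s_weights = torch.softmax(sim.max(dim=-1).values, dim=-1)    # (b, ls)
        a2s = (a2s_weights.unsqueeze(1) @ sent_enc).expand(b, ls, h)   # (b, ls, h)
        # Fused representation that a decoder could attend over when generating.
        fused = torch.cat([sent_enc, s2a, sent_enc * s2a, sent_enc * a2s], dim=-1)
        return torch.tanh(self.fuse(fused))


if __name__ == "__main__":
    sent = torch.randn(2, 30, 256)   # encoded source sentence
    ans = torch.randn(2, 4, 256)     # encoded secret-token (answer) span
    print(BiAttentionFusion(256)(sent, ans).shape)   # torch.Size([2, 30, 256])
```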
Keywords/Search Tags: Question Generation, Attention Mechanisms, Secret Tokens, Sequence-to-Sequence Model, Interrogative Word