
Research On Machine Reading Comprehension Methods Based On Incorporating The History Of Conversation

Posted on: 2022-07-30    Degree: Master    Type: Thesis
Country: China    Candidate: Z B Yin    Full Text: PDF
GTID: 2518306572459654    Subject: Computer technology
Abstract/Summary:
Humans ask for information in a conversational manner, and the current question is usually related to the previous questions and answers. However, most current machine reading comprehension tasks involve only a single round of question answering: questions in different rounds are unrelated and can be answered independently. To give machines a human-like ability to seek information through dialogue, researchers have proposed the conversational reading comprehension task. Given a passage and multiple rounds of questions, the answer to the current question depends not only on the content of the passage but also on the preceding questions and answers. This thesis studies machine reading comprehension methods that incorporate question-and-answer (QA) history, i.e., that integrate the QA history into the process of answering the current question over the passage. The main contributions are the following three pieces of work.

(1) A method for incorporating QA history based on pretrained language models. Prior work incorporates QA history by prepending it to the current question and then feeding the concatenated text, together with the passage, into a single-round machine reading comprehension model. However, single-round models such as BiDAF++ cannot fully model the interaction between the QA history and the current question. We build a conversational reading comprehension model on top of a pretrained language model, using its deep network to fuse the QA history with the current question. Experimental results verify the effectiveness of this method.

(2) A method for incorporating QA history based on dialogue information selection. Existing work has shown that, under the prepending approach, the model is easily misled by irrelevant QA history. To address this problem, we propose two selection-based methods. The first converts each history turn and the current question into sentence representations that capture their important information, and then selects relevant history through the gating mechanism of a gated recurrent neural network (a sketch of this gating follows). The second first lets each history turn interact with the current question to perform a preliminary selection, and then merges the interacted information into the current question to select from the QA history a second time. Experimental results show the effectiveness of these methods.
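To make the gating idea concrete, here is a minimal, hypothetical PyTorch sketch of the first selection variant, assuming per-turn sentence vectors have already been produced by some encoder; the class name, shapes, and the choice to initialize the GRU state with the question vector are illustrative assumptions, not the thesis implementation.

```python
# Hypothetical sketch (not the thesis code): each history turn and the
# current question are reduced to sentence vectors, and a GRU's gates
# decide how much of each earlier turn to keep.
import torch
import torch.nn as nn

class HistoryGate(nn.Module):
    def __init__(self, hidden_size: int = 128):
        super().__init__()
        # The GRU runs over the sequence of per-turn sentence vectors;
        # its update/reset gates act as the selection mechanism.
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, history_vecs: torch.Tensor, question_vec: torch.Tensor):
        # history_vecs: (batch, turns, hidden), one sentence vector per QA turn
        # question_vec: (batch, hidden), sentence vector of the current question
        # The current question initializes the GRU state, so the gating is
        # conditioned on what is being asked now (an assumption for exposition).
        h0 = question_vec.unsqueeze(0)                 # (1, batch, hidden)
        gated_history, _ = self.gru(history_vecs, h0)  # (batch, turns, hidden)
        return gated_history                           # filtered history states

# Toy usage with random vectors standing in for sentence encodings.
batch, turns, hidden = 2, 3, 128
gate = HistoryGate(hidden)
out = gate(torch.randn(batch, turns, hidden), torch.randn(batch, hidden))
print(out.shape)  # torch.Size([2, 3, 128])
```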
(3) A method for incorporating QA history based on latent semantic reasoning. A latent semantic reasoning mechanism reuses the intermediate representations produced while answering the previous question when answering the current one, and can thus exploit latent semantic information in the QA history. However, existing methods do not directly consider the current question when passing on this latent history, so they cannot deliver information targeted at answering it. To solve this problem, we propose a question-guided multi-granularity conversational reading comprehension model: the question guides the transfer of latent semantic information, and a sentence-granularity transfer process is introduced to further enrich the transmitted information. Experimental results prove the effectiveness of this method.
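As an illustration of question-guided transfer, the following hypothetical PyTorch sketch lets current-question token states attend over the previous round's intermediate representations, so only history features relevant to the current question are passed on; the function name, shapes, and the plain dot-product attention are assumptions for exposition, not the thesis model.

```python
# Hypothetical sketch of question-guided latent information flow.
import torch
import torch.nn.functional as F

def question_guided_flow(prev_latent: torch.Tensor,
                         question: torch.Tensor) -> torch.Tensor:
    # prev_latent: (batch, passage_len, hidden), intermediate states from
    #              the previous round's reasoning over the passage
    # question:    (batch, q_len, hidden), current-question token states
    # Attention scores between every question token and every latent state.
    scores = torch.bmm(question, prev_latent.transpose(1, 2))  # (b, q, p)
    weights = F.softmax(scores, dim=-1)
    # Each question token gathers the latent history it finds relevant.
    guided = torch.bmm(weights, prev_latent)                   # (b, q, hidden)
    return guided

b, p, q, h = 2, 50, 8, 64
out = question_guided_flow(torch.randn(b, p, h), torch.randn(b, q, h))
print(out.shape)  # torch.Size([2, 8, 64])
```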
Keywords/Search Tags:Machine Reading Comprehension, Conversational Reading Comprehension, Pretrained Language Model, Attention Mechanism, Recurrent Neural Network