
Research On Recognizing Textual Entailment And Application Based On Deep Neural Networks

Posted on: 2017-10-10
Degree: Master
Type: Thesis
Country: China
Candidate: B X Wang
Full Text: PDF
GTID: 2348330503487197
Subject: Computer Science and Technology
Abstract/Summary:
Recognizing textual entailment (RTE) is a critical task with applications in question answering, information retrieval, information extraction, and other areas. Traditional RTE approaches mainly include classification based on hand-crafted features, methods based on word similarity, and symbolic logic. These methods require large numbers of manual features and rules, as well as NLP pipeline tools such as part-of-speech tagging and named entity recognition. Deep neural networks can avoid manual feature engineering and the errors accumulated by pipeline tools. Therefore, this thesis studies recognizing textual entailment based on deep learning and applies RTE to reading comprehension.

First, we review deep learning approaches to RTE, apply the attention mechanism to a convolutional neural network model, and compare the attention-based model with other deep learning approaches.

The same text can highlight different key parts in different tasks. For recognizing textual entailment, we usually focus on the matching of events: if the text can be decomposed into events and every event in the hypothesis is covered by the premise, we can infer that the premise entails the hypothesis. Based on this idea, this thesis proposes a pre-attentive memory network model. By automatically decomposing the two sentences into representations of their corresponding events, the model can reason about the relationship between the premise and the hypothesis. The model clusters the input words in advance, which is equivalent to building a representation of each event in the source text; the relationship between the two sentences is then determined from the relationships between their events.

Natural language inference and understanding have long been major problems in NLP. As deep learning methods have achieved good results in natural language processing, research on reading comprehension has also grown. In this thesis, we transform the reading comprehension task into a recognizing textual entailment problem and then apply the RTE method based on deep neural networks to it.

We evaluate our RTE methods on the SNLI corpus, where the pre-attentive mechanism achieves 86.5% accuracy, comparable to the state of the art. The reading comprehension task is evaluated on the named entity dataset of the CBT corpus, where our textual entailment based model achieves 69.2% accuracy, the state of the art for a single model.
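The pre-attentive idea is only described in prose above; the snippet below is a minimal, illustrative PyTorch sketch of that kind of model, assuming learnable "event" query vectors that softly cluster each sentence's word embeddings into event summaries before the premise and hypothesis are compared. The class name PreAttentiveMatcher, the dimensions, and the simple feed-forward classifier are hypothetical choices for illustration, not the thesis implementation.

```python
# Illustrative sketch only: softly assign each sentence's word vectors to a
# fixed number of "event" slots via attention, then compare the premise and
# hypothesis slot summaries to classify the entailment relation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PreAttentiveMatcher(nn.Module):
    def __init__(self, embed_dim=100, n_events=4, n_classes=3):
        super().__init__()
        # learnable "event" queries that cluster words into event slots
        self.event_queries = nn.Parameter(torch.randn(n_events, embed_dim))
        self.classifier = nn.Sequential(
            nn.Linear(2 * n_events * embed_dim, 200),
            nn.ReLU(),
            nn.Linear(200, n_classes),  # entailment / contradiction / neutral
        )

    def events(self, word_vecs):
        # word_vecs: (batch, seq_len, embed_dim)
        # attention weight of each event query over the words of the sentence
        scores = torch.einsum("qd,bld->bql", self.event_queries, word_vecs)
        weights = F.softmax(scores, dim=-1)            # (batch, n_events, seq_len)
        # weighted sums of word vectors = one summary vector per event slot
        return torch.einsum("bql,bld->bqd", weights, word_vecs)

    def forward(self, premise_vecs, hypothesis_vecs):
        p = self.events(premise_vecs).flatten(1)
        h = self.events(hypothesis_vecs).flatten(1)
        return self.classifier(torch.cat([p, h], dim=-1))

# usage with random embeddings standing in for pretrained word vectors
model = PreAttentiveMatcher()
premise = torch.randn(2, 12, 100)     # batch of 2 premises, 12 words each
hypothesis = torch.randn(2, 8, 100)   # batch of 2 hypotheses, 8 words each
logits = model(premise, hypothesis)   # (2, 3) class scores
```

The soft assignment via softmax attention keeps the clustering step differentiable, so the event slots can be learned jointly with the entailment classifier rather than fixed in advance.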
Keywords/Search Tags: recognizing textual entailment, deep neural networks, reading comprehension, memory networks