
Task-oriented Machine Reading Comprehension Via Deep Learning

Posted on: 2019-12-17
Degree: Master
Type: Thesis
Country: China
Candidate: G X Zhu
Full Text: PDF
GTID: 2428330566986594
Subject: Computer Science and Technology
Abstract/Summary:
Machine reading comprehension (MRC) aims at teaching machines to answer questions after understanding texts. It is an important step toward natural language understanding and has received much attention recently. Research on MRC is of great value, as it can help promote the development of related natural language processing techniques and improve the performance of applications such as search engines and question-answering systems. Generally, research on MRC is task-oriented. In this thesis, we focus on the text-span extraction task as well as the text generation task. Moreover, as deep learning methods have shown great advantages in text feature extraction and compositional semantic representation, our research mainly focuses on MRC models based on deep learning. The major contributions of this thesis are as follows:

1. We study an MRC model called FusionNet, which performs well on the single-document text-span extraction task. We re-implement the FusionNet model and use it as the baseline for our subsequent work.

2. We modify the FusionNet model and migrate it to the multi-document text-span extraction task. Noticing that existing MRC models do not explicitly model the relationships between documents, we propose an improved model, the Hierarchical Attention Network (HieAttnNet), which adds a hierarchical attention layer on top of FusionNet. By combining local self-attention with global self-attention, the model handles intra-document and inter-document information flow respectively and thus becomes more expressive. Experiments on several datasets show that our model generally outperforms other methods.

3. We study the application of sequence-to-sequence text generation models to the MRC task. To address the problems of out-of-vocabulary (OOV) words and repetition during generation, we improve the sequence-to-sequence model by incorporating a copy mechanism and a coverage mechanism. Experimental results show that our method effectively reduces errors in text generation and makes the generated answers more accurate and natural.
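The abstract does not specify how HieAttnNet's hierarchical attention layer is built. As a rough illustration only, the NumPy sketch below shows one common way to realize local (intra-document) versus global (inter-document) self-attention: the same scaled dot-product attention is applied twice, once under a block-diagonal document mask and once unmasked, and the two outputs are fused. The mask construction, the fusion by summation, and all names here are assumptions, not the thesis's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, mask=None):
    """Scaled dot-product self-attention over token vectors X of shape (n, d)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)              # (n, n) pairwise similarities
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # suppress disallowed positions
    return softmax(scores) @ X                 # (n, d) re-weighted representations

# Toy input: two documents of 3 tokens each, concatenated (6 tokens, dim 4).
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
doc_id = np.array([0, 0, 0, 1, 1, 1])          # which document each token belongs to

# Local self-attention: tokens attend only within their own document.
local_mask = doc_id[:, None] == doc_id[None, :]
H_local = self_attention(X, local_mask)

# Global self-attention: tokens attend across all documents.
H_global = self_attention(X)

# Combine the two information flows (a simple sum; the actual fusion in
# HieAttnNet is not described in the abstract).
H = H_local + H_global
print(H.shape)  # (6, 4)
```

In this simplification, the block-diagonal mask keeps intra-document information flow separate from the inter-document flow captured by the unmasked pass, which mirrors the local/global split the abstract describes.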
Keywords/Search Tags:Machine Reading Comprehension, Deep Learning, Hierarchical Attention Mechanism, Copy Mechanism, Coverage Mechanism