
Research On Machine Reading Comprehension Methods Based On Incorporating External Knowledge

Posted on: 2021-05-19
Degree: Master
Type: Thesis
Country: China
Candidate: Y Le
Full Text: PDF
GTID: 2428330614450004
Subject: Computer Science and Technology
Abstract/Summary:
A major difference between human and machine reading comprehension is that humans are adept at drawing on external knowledge beyond the given text to help them understand it and find answers. Many current machine reading comprehension methods, by contrast, operate largely at the level of text matching, locating answers only from the passage and question provided. Real-world reading comprehension tasks, however, are far more complicated: the answer often cannot be obtained from the passage and question alone, and common-sense external knowledge is needed. This thesis takes the introduction of external knowledge into machine reading comprehension as its entry point: external knowledge relevant to the reading comprehension task is retrieved, and methods are designed to incorporate it into the answer-extraction process, thereby improving answering performance. The main research work consists of three parts:

(1) Implicit introduction of external knowledge based on pretrained language models. Thanks to their deep learning architectures, pretrained language models can exploit large amounts of unlabeled text, and this text already contains a great deal of knowledge, so building the reading comprehension model directly on a pretrained model introduces external knowledge implicitly. Compared with traditional reading comprehension methods, this approach achieves very good results on the experimental test set.

(2) Explicit introduction of external knowledge based on the attention mechanism. Since many existing knowledge bases such as NELL and WordNet contain rich knowledge, relevant knowledge is first retrieved with suitable methods, and an attention-based knowledge fusion module is then designed to inject this external knowledge explicitly into the existing reading comprehension model (a minimal illustrative sketch is given after this abstract). The experimental results demonstrate the effectiveness of the method.

(3) Introduction of external knowledge combined with entity-awareness enhancement. Current reading comprehension models built on pretrained language models split some entities during subword tokenization, while much of the retrieved external knowledge is at the entity level, which hinders the model from fusing entity-related external knowledge. To address this, a reading comprehension method combining entity-awareness enhancement with external knowledge is proposed: an auxiliary named entity recognition task is added and trained jointly with the reading comprehension task, strengthening the model's entity awareness and further improving its ability to answer questions. The experimental results show the effectiveness of the method.
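The abstract describes the attention-based knowledge fusion module of contribution (2) and the joint training of contribution (3) only at a high level. The sketch below is a minimal PyTorch-style reading of such a fusion layer, not the thesis's actual implementation: module names, tensor shapes, and the dot-product attention form are all assumptions made for illustration.

```python
import torch
import torch.nn as nn


class KnowledgeFusion(nn.Module):
    """Minimal sketch of an attention-based knowledge fusion layer.

    For each contextual token representation produced by the reading
    comprehension encoder, a set of retrieved knowledge-base embeddings
    (e.g. NELL or WordNet concepts) is attended over, and the weighted
    summary is fused back into the token representation. All names and
    dimensions are illustrative assumptions.
    """

    def __init__(self, hidden_dim: int, kb_dim: int):
        super().__init__()
        self.proj = nn.Linear(kb_dim, hidden_dim)          # map KB embeddings into the encoder space
        self.fuse = nn.Linear(2 * hidden_dim, hidden_dim)  # combine token and knowledge summary

    def forward(self, tokens: torch.Tensor, kb: torch.Tensor, kb_mask: torch.Tensor) -> torch.Tensor:
        # tokens:  (batch, seq_len, hidden_dim)   contextual token representations
        # kb:      (batch, seq_len, m, kb_dim)    m retrieved knowledge embeddings per token
        # kb_mask: (batch, seq_len, m)            1 for real knowledge entries, 0 for padding
        kb = self.proj(kb)                                    # (batch, seq_len, m, hidden_dim)
        scores = torch.einsum("bsh,bsmh->bsm", tokens, kb)    # dot-product attention scores
        scores = scores.masked_fill(kb_mask == 0, -1e9)       # ignore padded knowledge slots
        weights = torch.softmax(scores, dim=-1)               # attention over retrieved knowledge
        summary = torch.einsum("bsm,bsmh->bsh", weights, kb)  # knowledge summary per token
        return torch.tanh(self.fuse(torch.cat([tokens, summary], dim=-1)))
```

For the entity-aware variant in contribution (3), one plausible form of the joint training described in the abstract is to share the encoder between the span-prediction head and an auxiliary NER head and optimize a weighted sum of the two losses; the weighting factor here is an assumption, not a value from the thesis.

```python
def joint_loss(span_loss: torch.Tensor, ner_loss: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    # Hypothetical multi-task objective: reading comprehension loss plus a
    # weighted auxiliary named-entity-recognition loss over a shared encoder.
    return span_loss + alpha * ner_loss
```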
Keywords/Search Tags:Machine Reading Comprehension, Pretrained Language Model, Attention Method, Named Entity Recognition, Multi-task Learning