
Research On Machine Reading Comprehension Based On Enhanced BERT Representation And R-GCN Network

Posted on: 2021-08-15
Degree: Master
Type: Thesis
Country: China
Candidate: X Q Huang
Full Text: PDF
GTID: 2518306122468654
Subject: Computer Science and Technology
Abstract/Summary:
Machine reading comprehension is the task of having a machine read a given article and its related questions and predict the answers to those questions. It is one of the most important tasks in natural language processing and is considered one of the most challenging directions in the development of artificial intelligence. With the introduction of many large-scale, high-quality datasets and the application of various deep neural networks, machine reading comprehension has advanced rapidly, and on some benchmarks the answer-prediction accuracy of the best models now exceeds human performance.

In recent years, pre-trained language models have been proposed. Because of their powerful performance, most existing machine reading comprehension models use a pre-trained language model in the encoding stage to encode the given article and its related questions. In the information fusion stage, various attention mechanisms process the article information and the question information interactively, and the predicted answer is finally generated. However, this design has two shortcomings. First, the word representations produced by the pre-trained language model lack background knowledge and multi-granularity feature information. Second, in the information fusion stage, the attention mechanism alone cannot effectively extract the relationships between entities in the article and entities in the related questions.

To address these shortcomings, this paper proposes a machine reading comprehension model based on an enhanced BERT representation and an R-GCN network. The model makes two contributions:

1) In the enhanced BERT representation module, the knowledge-graph information corresponding to each word and the GloVe word-level vector representation are integrated into the BERT representation, adding background knowledge and feature information of other granularities to the model input.

2) The R-GCN network module applies an R-GCN to an entity graph constructed from the entities in the full text, so that the model can extract the relationship information between entities in the article and entities in the related questions, further improving the model's understanding ability.

The model is evaluated mainly on the SQuAD 1.1 dataset, where it achieves 85.9% EM and 92.6% F1. The experimental results show that combining enriched input information, which improves the representation of the pre-trained language model, with an entity-graph convolutional network that captures complex, multi-hop reasoning information effectively improves the machine's ability to understand text.
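The enhanced BERT representation module (contribution 1) can be made concrete with a short sketch. The following PyTorch-style code is a minimal illustration, not the thesis's actual implementation: the class name, the dimensions (768 for BERT, 300 for GloVe, 100 for the knowledge-graph embedding), and the fusion-by-concatenation-plus-projection scheme are all assumptions for exposition.

```python
import torch
import torch.nn as nn

class EnhancedRepresentation(nn.Module):
    """Fuse BERT token states with GloVe and knowledge-graph embeddings.

    A minimal sketch; dimensions and the fusion scheme are illustrative,
    since the abstract does not specify implementation details.
    """
    def __init__(self, bert_dim=768, glove_dim=300, kg_dim=100, out_dim=768):
        super().__init__()
        self.proj = nn.Linear(bert_dim + glove_dim + kg_dim, out_dim)

    def forward(self, bert_states, glove_emb, kg_emb):
        # bert_states: (batch, seq_len, bert_dim) from the BERT encoder
        # glove_emb:   (batch, seq_len, glove_dim) word-level GloVe vectors
        # kg_emb:      (batch, seq_len, kg_dim) embedding of the KG entity
        #              linked to each token (zeros when no entity matches)
        fused = torch.cat([bert_states, glove_emb, kg_emb], dim=-1)
        return torch.tanh(self.proj(fused))
```

Concatenation followed by a learned projection is only one plausible fusion choice; gating or attention-based fusion would fit the same interface.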
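The core of the R-GCN module (contribution 2) is relation-specific message passing over the entity graph. The sketch below is likewise an assumption-laden illustration: it uses dense, already-normalized per-relation adjacency matrices and a single layer with a self-loop, whereas the thesis does not specify how the entity graph is stored or how many layers are stacked.

```python
import torch
import torch.nn as nn

class RGCNLayer(nn.Module):
    """One R-GCN layer: a separate weight matrix per relation type,
    plus a self-loop transform. A sketch under stated assumptions."""
    def __init__(self, in_dim, out_dim, num_rels):
        super().__init__()
        self.rel_weight = nn.Parameter(torch.empty(num_rels, in_dim, out_dim))
        self.self_weight = nn.Linear(in_dim, out_dim)
        nn.init.xavier_uniform_(self.rel_weight)

    def forward(self, h, adj):
        # h:   (num_nodes, in_dim) entity node features
        # adj: (num_rels, num_nodes, num_nodes), one normalized adjacency
        #      matrix per relation type
        # Contract over relations r and neighbor nodes j in one step:
        # msg[i] = sum_r sum_j adj[r, i, j] * h[j] @ W_r
        msg = torch.einsum('rij,jd,rdo->io', adj, h, self.rel_weight)
        return torch.relu(msg + self.self_weight(h))
```

The single einsum is convenient for the small entity graphs built from one article; a sparse message-passing implementation would scale better to larger graphs.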
Keywords/Search Tags: Machine reading comprehension, Pre-trained language model, Knowledge graph, Graph convolutional neural network, Attention mechanism