
Research On Single-relation Knowledge Base Question Answering Based On BERT And Relation-aware Attention

Posted on: 2021-03-02
Degree: Master
Type: Thesis
Country: China
Candidate: D Luo
Full Text: PDF
GTID: 2428330611965588
Subject: Computer technology

Abstract/Summary:
In recent years, with the development of large-scale knowledge bases, knowledge base question answering (KBQA) has become a research hotspot in natural language processing. KBQA aims to answer natural language questions using structured knowledge base facts. This paper focuses on the single-relation question answering (SR-QA) task, an important branch of KBQA in which each question involves exactly one knowledge base relation; such questions are also known as simple questions. Most questions in Internet queries are simple questions, and answering them is the foundation for building complex question answering systems.

Current research on SR-QA falls into two categories: semantic-parsing-based methods and neural-network-based methods. The former rely on hand-crafted rules and therefore lack generality. The latter are highly flexible, but existing approaches focus on modeling techniques while neglecting the interaction between questions and knowledge base facts, which limits further performance improvements.

To address these problems, this paper proposes BERTRA, a method for SR-QA based on BERT and relation-aware attention. BERTRA divides the SR-QA task into two sub-tasks: entity linking and relation detection. For entity linking, BERTRA uses a pre-trained BERT model for sequence labeling to improve the accuracy of entity labeling in the question, and combines it with heuristic algorithms to reduce noise in the candidate entity set. For relation detection, BERTRA constructs "question-answer" pairs as the input of a pre-trained BERT model, which encodes the question and the answer jointly to preserve the original interaction information between them. In addition, because existing methods make insufficient use of the structural information of the knowledge base, BERTRA introduces the link-out-relation features of candidate answers and enhances the candidate representations through a relation-aware attention mechanism.

The experimental results show that BERTRA effectively improves the accuracy of single-relation KBQA, achieving the highest current accuracies of 80.9% and 80.7% on the SimpleQuestions dataset with FB2M and FB5M as the background knowledge bases, respectively.
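To make the entity-linking step concrete, below is a minimal sketch of BERT-based sequence labeling over a question, assuming the HuggingFace transformers library and a two-tag inventory (O / I-ENT). The checkpoint name, tag set, and example question are illustrative assumptions rather than the thesis's actual configuration, and the randomly initialized classification head would need fine-tuning on labeled questions before its predictions are meaningful.

    # Sketch: tag each token of the question as inside/outside an entity mention.
    import torch
    from transformers import BertTokenizerFast, BertForTokenClassification

    TAGS = ["O", "I-ENT"]  # hypothetical tag inventory, not the thesis's

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForTokenClassification.from_pretrained(
        "bert-base-uncased", num_labels=len(TAGS)
    )
    model.eval()  # head is untrained here; fine-tune before real use

    question = "who wrote the novel moby dick"
    enc = tokenizer(question, return_tensors="pt")

    with torch.no_grad():
        logits = model(**enc).logits  # shape: (1, seq_len, num_labels)

    pred = logits.argmax(dim=-1)[0].tolist()
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])

    # Tokens tagged I-ENT form the entity mention used to build the
    # candidate entity set (which heuristics would then denoise).
    mention = [t for t, p in zip(tokens, pred) if TAGS[p] == "I-ENT"]
    print(mention)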
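The relation-detection step feeds a "question-answer" pair through BERT's two-segment input so the question and the candidate are encoded jointly. The following sketch assumes the candidate fact is verbalized as text and scored with a single-logit head; the checkpoint, the verbalization format, and the example strings are assumptions for illustration.

    # Sketch: score one question-candidate pair with BERT sentence-pair input.
    import torch
    from transformers import BertTokenizerFast, BertForSequenceClassification

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    scorer = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=1  # one matching score per pair
    )
    scorer.eval()

    question = "who wrote the novel moby dick"
    candidate = "herman melville book written_by"  # hypothetical verbalized fact

    # Encodes as [CLS] question [SEP] candidate [SEP], so self-attention
    # preserves the interaction between question and answer tokens.
    enc = tokenizer(question, candidate, return_tensors="pt")
    with torch.no_grad():
        score = scorer(**enc).logits.squeeze()
    print(float(score))  # higher score = better match after fine-tuning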
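One plausible reading of the relation-aware attention mechanism is sketched below: the question representation attends over the embeddings of a candidate answer's link-out relations, and the weighted relation summary enriches the candidate's representation. The dimensions, the bilinear scoring, and the concatenation-based fusion are all assumptions; the thesis's exact formulation may differ.

    # Sketch: enrich a candidate answer with its link-out relations.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class RelationAwareAttention(nn.Module):
        def __init__(self, dim: int):
            super().__init__()
            self.proj = nn.Linear(dim, dim)  # bilinear-style scoring matrix

        def forward(self, q_vec, rel_embs, cand_vec):
            # q_vec:    (dim,)           question vector (e.g. BERT [CLS])
            # rel_embs: (num_rels, dim)  link-out-relation embeddings
            # cand_vec: (dim,)           base candidate-answer vector
            scores = rel_embs @ self.proj(q_vec)   # relevance of each relation
            weights = F.softmax(scores, dim=0)     # attention distribution
            rel_context = weights @ rel_embs       # relation-aware summary
            return torch.cat([cand_vec, rel_context])  # enriched candidate

    dim = 768
    attn = RelationAwareAttention(dim)
    enriched = attn(torch.randn(dim), torch.randn(5, dim), torch.randn(dim))
    print(enriched.shape)  # torch.Size([1536])

The enriched vector would then replace the plain candidate representation when scoring candidates, letting structurally similar answers be distinguished by how their outgoing relations match the question.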
Keywords/Search Tags: Knowledge Base Question Answering, Deep Learning, Natural Language Processing, BERT, Attention Mechanism