
Research On Knowledge Graph Completion Technology Based On Deep Learning

Posted on: 2021-01-23    Degree: Master    Type: Thesis
Country: China    Candidate: L W Ni    Full Text: PDF
GTID: 2428330623967757    Subject: Computer Science and Technology
Abstract/Summary:
A knowledge graph is a collection of fact triples, which can also be viewed as a semantic network of entities and relations. Since Google introduced knowledge graphs into its search engine, applications such as recommendation systems and intelligent question-answering systems have also begun to use knowledge graphs on a large scale. However, existing knowledge graphs are usually incomplete: many triples that hold in fact are missing from the graph. Much work is therefore devoted to knowledge graph completion, that is, using the existing triples to add new triples to the knowledge graph.

Embedding-based completion models, represented by TransE, map entities and relations into a low-dimensional dense vector space and, on this basis, construct a scoring function over triples to measure their plausibility. Such models usually address static knowledge graph completion: every entity and relation that appears in the test phase has already been seen during training, so the model holds a vector representation for each of them.

Hamaguchi raised the OOKB (out-of-knowledge-base) problem: in the completion task, new entities may appear during the test phase. We call these new entities OOKB entities. Because the model has never seen an OOKB entity, it has no vector representation for it. The OOKB problem belongs to dynamic knowledge graph completion. To solve it, Hamaguchi proposed a two-stage architecture consisting of a propagation model and an output model.

In recent years, deep learning techniques represented by convolutional neural networks, recurrent neural networks, and attention mechanisms have made great progress in natural language processing. In this thesis, we use such deep learning techniques to improve the propagation model proposed by Hamaguchi, and we also try other output models in the experiments. Based on an analysis of Hamaguchi's work, our contributions are as follows.

First, the information in the auxiliary triples can be regarded as sequence information, and recurrent neural networks are a natural architecture for processing sequences. We propose the recurrent propagation models P-LSTM and P-GRU.

Second, attention mechanisms are used to weight the entities in the sequence of connected entities differently. We first apply self-attention to model the interactions among the connected entities. Then, since each connected entity should contribute differently to the OOKB entity, we take the mean of the connected-entity vectors as a rough representation of the OOKB entity, use it as a query vector in an attention mechanism to measure each connected entity's contribution, and thereby obtain the final vector representation of the OOKB entity. We name this model P-Att.

Third, based on the WordNet11 data set, we construct a data set suited to the OOKB problem and evaluate the models with the triple classification and link prediction tasks. Experiments show that P-LSTM, P-GRU, and P-Att outperform the model proposed by Hamaguchi on this data set. For the output model, we use the convolution-based ConvKB model rather than only the simple TransE model.
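As a rough illustration of the TransE scoring function mentioned above (a minimal sketch under common assumptions, not code from the thesis): the relation vector is treated as a translation between the head and tail entity vectors, and a lower score indicates a more plausible triple.

import numpy as np

def transe_score(h, r, t, norm=1):
    # TransE score for a triple (h, r, t): distance between h + r and t.
    # h, r, t are 1-D embedding vectors; norm=1 or norm=2 selects L1 or L2 distance.
    # A smaller score indicates a more plausible triple.
    return np.linalg.norm(h + r - t, ord=norm)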
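The attention-based aggregation described for P-Att can be pictured roughly as follows; this is a hedged sketch under our own assumptions (plain dot-product attention, NumPy arrays), not the thesis's actual implementation. The mean of the propagated neighbor vectors serves as a query, and attention weights decide how much each connected entity contributes to the OOKB entity's representation.

import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def aggregate_ookb(neighbor_vecs):
    # neighbor_vecs: (n, d) array, one row per auxiliary triple that links a
    # known entity to the OOKB entity, already transformed by the propagation model.
    # Rough OOKB representation: the mean of the propagated neighbor vectors.
    query = neighbor_vecs.mean(axis=0)
    # Dot-product attention: how strongly each connected entity matches the query.
    weights = softmax(neighbor_vecs @ query)
    # Final representation: attention-weighted sum of the neighbor vectors.
    return weights @ neighbor_vecs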
Keywords/Search Tags: knowledge graph completion, deep learning, recurrent neural network, attention mechanism