Knowledge graph embedding has been a popular research topic for knowledge graph completion. However, nodes and relations interact with each other as a whole, and sparse interactions can seriously degrade the quality of their embeddings. Furthermore, existing graph neural networks based on message passing tend to use relations only for graph traversal or for selecting message encoding functions. The lack of effective learning of relation embeddings and the insufficient utilization of relation semantics can severely harm the quality of node embeddings, especially when two nodes are connected by multiple relations. In addition, node types help identify the identity and scene of a node, and differences between entities can usually be expressed through node types. Within a triple, the types of the head and tail nodes may differ and should be treated differently; ignoring this difference and treating the node types as identical not only hinders the expressiveness of node embeddings but also fails to characterize the heterogeneity of the graph. Recently, heterogeneous graph neural networks have attracted increasing attention due to their strong ability to capture graph heterogeneity. However, the extra parameters introduced by modeling heterogeneity are usually enormous, and the resulting memory consumption and computational complexity may be unbearable for common machines and detrimental to the practical deployment of the models.

To overcome these limitations, this paper proposes two knowledge graph completion methods.

(1) Node-aware Heterogeneous Graph Convolutional Network (NHGCN). NHGCN consists of a Heterogeneous Weighted Graph Convolutional Network (HWGCN) encoder that learns the graph structure (topology) and a parameter-free scoring-function decoder that considers both nodes and relations. In addition to selecting high-quality node types according to two principles, node scene and node attribute frequency, NHGCN independently parameterizes each node type and adaptively learns attention over the different types, so that when a node has multiple types each type can express a different importance to the node. To model heterogeneity, NHGCN makes full use of the node structure, node attributes, node types, and relation types of a knowledge graph; it has learnable weights, can handle heterogeneous relations, and can adapt the amount of neighbor information used in local aggregation, leading to more accurate node embeddings. In addition, NHGCN takes encoded multi-view text attributes as prior knowledge and feeds them into the graph encoder as the initialization vectors of nodes, providing rich initial semantic information that improves the expressiveness of node embeddings, especially on sparse graphs. NHGCN also uses multi-head attention to adaptively weigh the proportions of node attribute embeddings and graph embeddings. Finally, to reduce space-time complexity, NHGCN introduces basis matrix decomposition and subgraph sampling.

(2) Multi-Interaction Heterogeneous Contrastive Graph Transformer Network (MHCGT). MHCGT is composed of a heterogeneous graph Transformer encoder that learns the graph structure and a parameter-free relational scoring-function decoder. It takes noise contrastive estimation as the training objective, guiding the learned representations to map the node pairs of positive triples to closer locations and the node pairs of negative triples to farther locations; that is, in the semantic embedding space, positive nodes lie closer to the target node and negative nodes lie farther away. MHCGT adopts a two-stage, four-view interaction strategy between nodes and relations: embedding interaction, projection interaction, and sharing interaction in the graph encoder, and scoring interaction in the relational decoder. This strategy not only strengthens the multi-stage, multi-view connection between nodes and relations, but also fully exploits relation semantics throughout the interaction process. To model heterogeneity, MHCGT uses independent parameters to explicitly model node and relation types and characterizes heterogeneous attention over each triple through the Transformer multi-head self-attention mechanism, enhancing its ability to maintain dedicated representations for different types of nodes and relations.
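Both methods pair a learned graph encoder with a parameter-free scoring-function decoder over triples. The abstract does not name the exact scoring function, so the following is only an illustrative sketch using a DistMult-style score (a common parameter-free choice): the plausibility of a triple is the sum of element-wise products of the head, relation, and tail embeddings.

```python
def distmult_score(head, relation, tail):
    """Parameter-free triple scorer: sum of element-wise products.

    head, relation, and tail are embedding vectors of equal length;
    a higher score indicates a more plausible triple. DistMult is one
    common choice; the actual function used by NHGCN/MHCGT may differ.
    """
    return sum(h * r * t for h, r, t in zip(head, relation, tail))

# A triple whose head, relation, and tail embeddings align scores higher
# than one with a mismatched tail.
aligned = distmult_score([1.0, 0.5], [1.0, 1.0], [1.0, 0.5])    # 1.25
mismatch = distmult_score([1.0, 0.5], [1.0, 1.0], [-1.0, 0.0])  # -1.0
```

Because the decoder introduces no parameters of its own, all capacity lives in the encoder, which is consistent with the encoder-decoder split described above.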
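The basis matrix decomposition that NHGCN uses to cut space-time complexity can be quantified with a simple parameter count: instead of each relation owning its own d x d transform, relations share a small set of basis matrices (as in R-GCN) and keep only per-relation mixing coefficients. The numbers below are illustrative, not taken from the paper.

```python
def relation_param_count(num_relations, dim, num_bases=None):
    """Parameters needed for per-relation dim x dim transforms.

    Without decomposition, each relation owns a full matrix. With basis
    decomposition, relations share num_bases matrices plus a coefficient
    vector per relation. Hypothetical sizes, for illustration only.
    """
    if num_bases is None:
        return num_relations * dim * dim
    return num_bases * dim * dim + num_relations * num_bases

full = relation_param_count(200, 100)                  # 2,000,000
shared = relation_param_count(200, 100, num_bases=10)  # 102,000
```

With 200 relations and 100-dimensional embeddings, sharing 10 basis matrices shrinks the relation-transform parameters by roughly 20x, which is the kind of saving that makes the model deployable on common machines.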
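MHCGT's training target, noise contrastive estimation, can be sketched as a softmax-style loss over one positive triple and several negative triples: minimizing it pulls the positive node pair together and pushes negative pairs apart. This is a simplified sketch; the exact objective in MHCGT may differ.

```python
import math

def contrastive_loss(pos_score, neg_scores):
    """NCE/InfoNCE-style loss over unnormalized triple scores.

    Minimizing the loss raises the positive score relative to the
    negatives, i.e. positive node pairs map to closer locations in the
    embedding space and negative pairs to farther ones.
    """
    logits = [pos_score] + list(neg_scores)
    m = max(logits)  # subtract the max for numerical stability
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return -(pos_score - log_z)

# Raising the positive score relative to the negatives lowers the loss.
easy = contrastive_loss(5.0, [0.0, 0.0])
hard = contrastive_loss(0.0, [0.0, 0.0])  # log(3), uniform scores
```

With uniform scores the loss equals log(K+1) for K negatives; as the positive triple is scored ever higher, the loss approaches zero, matching the described behavior of mapping positive pairs closer and negative pairs farther.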