
Heterogeneous Network Embedding Model Based On Contexts

Posted on: 2019-03-15    Degree: Master    Type: Thesis
Country: China    Candidate: Z S Kang    Full Text: PDF
GTID: 2370330590473929    Subject: Computer Science and Technology
Abstract/Summary:
Complex systems in the real world can usually be modeled as networks, such as social networks and biological networks. These networks generally contain several kinds of information and therefore form heterogeneous networks. Heterogeneous network embedding aims to learn feature vectors for nodes, edges, or the network as a whole. In this paper, the diverse information in a heterogeneous network is treated as multiple contexts: in a social network, for example, a user's contexts may include friendship relations, attributes such as gender and age, and the statements the user has published. Data from the Internet poses significant challenges for context-based heterogeneous network embedding. One of the main technical difficulties is training multiple context spaces and the embedding space simultaneously, which is prone to interference between contexts and hinders the convergence of the model. In this setting, this paper studies multi-context network embedding and addresses the problem of training with multiple contexts.

This paper uses pre-trained contexts to simplify the model's otherwise complex update process, which greatly improves its convergence speed. It also uses edge sampling with different learning rates for different edge types, which avoids the need to design meta-paths and improves sampling efficiency. In addition, two further techniques are introduced to optimize the multi-context embedding model: a semi-supervised classification loss is added to the objective function, and a graph convolutional model is used to learn richer context vectors and embedding vectors. The graph convolutional model itself is also improved to speed up training.

Experiments show that context pre-training and edge sampling greatly improve the convergence speed and sampling speed of the model. With the semi-supervised classification loss added to the objective, similar nodes lie close to each other in the embedding space while dissimilar nodes are pushed apart, yielding strong clustering structure. With the improved GCN, training speed increases significantly without any loss of performance.
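To make the edge-sampling idea concrete, the following is a minimal sketch in Python, assuming a LINE-style skip-gram objective trained with negative sampling over typed edges. The node types, edge types, learning rates, and all names here are illustrative assumptions, not the thesis's actual implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    num_nodes, dim = 100, 16
    emb = rng.normal(scale=0.1, size=(num_nodes, dim))  # node embedding vectors
    ctx = rng.normal(scale=0.1, size=(num_nodes, dim))  # context vectors (could be pre-trained)

    # Edges grouped by type, e.g. user-user friendship vs. user-item interaction.
    edges = {
        "user-user": [(0, 1), (1, 2), (2, 3)],
        "user-item": [(0, 50), (1, 51), (2, 52)],
    }
    lr_by_type = {"user-user": 0.025, "user-item": 0.01}  # per-type learning rates (illustrative)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sgd_step(u, v, label, lr):
        # One negative-sampling update on the pair (u, v): label 1 = observed edge, 0 = negative.
        score = sigmoid(emb[u] @ ctx[v])
        grad = lr * (label - score)
        du, dv = grad * ctx[v], grad * emb[u]
        emb[u] += du
        ctx[v] += dv

    edge_types = list(edges)
    for step in range(10_000):
        etype = edge_types[rng.integers(len(edge_types))]     # sample an edge type
        u, v = edges[etype][rng.integers(len(edges[etype]))]  # sample an edge of that type
        sgd_step(u, v, 1.0, lr_by_type[etype])                # positive pair
        for _ in range(5):                                    # five negative samples
            sgd_step(u, int(rng.integers(num_nodes)), 0.0, lr_by_type[etype])

In the thesis's setting, the context vectors would additionally come from pre-training and, in the later variants, from a graph convolutional encoder trained with a semi-supervised classification loss; the sketch only illustrates how per-type learning rates plug into plain edge sampling.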
Keywords/Search Tags:Network Embedding, Graph Convolutional Networks, Context Embedding, Semi-Supervised