
Social Trust Network Hash Embedding

Posted on: 2022-08-26    Degree: Master    Type: Thesis
Country: China    Candidate: Z Z Xie    Full Text: PDF
GTID: 2480306500450404    Subject: Computer Science and Technology
Abstract/Summary:
In recent years, with the development of the Internet, social media platforms such as Twitter and WeChat have kept growing and attracted more and more users. To provide better services, some platforms allow users to trust or distrust each other. Like social networks, social trust networks are large in scale and contain rich user behaviors; the major difference is that the relationships in social trust networks are more complex. To analyze social trust networks and learn a low-dimensional representation for each node, conventional network embedding methods such as DeepWalk and LINE are not suitable, because they cannot model trust and distrust relationships at the same time. Moreover, as the scale of the network grows, storing node representations becomes a challenge. To address these problems, this thesis fully considers the two different kinds of relationships in social trust networks and uses two types of hash embedding methods to reduce the memory needed to store node representations.

First, a network embedding algorithm based on matrix factorization is proposed to learn a binary code for each node. Binary codes greatly reduce memory, but the quantization step causes information loss. To obtain more effective representations, we propose two ways to learn the binary codes. The first learns the binary matrix directly: the embedding matrix stays in binary form throughout training, and by adding a penalty on the distance between the embedding matrix and the set of matrices satisfying the binary constraints, the final node representations become easier to learn; the parameters are optimized alternately, with all others fixed while one is updated. The second way has two stages: it first learns a continuous representation, then chooses appropriate thresholds and applies double-bit quantization to convert it into a discrete one. For binary hash embedding, how the whole network is represented is important, so we construct three different similarity matrices from node neighborhoods. Sign prediction results on four real-world datasets show that both methods perform well and that direct binary learning is better; the three similarity matrices lead to different results under the direct method, but are almost equivalent under the two-stage method.

Second, we introduce hash functions into a random-walk-based embedding method. In a social trust network, a node trusts another for two reasons: the other node's attributes, which are captured by a latent feature, and the other node's role, which is modeled by graphlets. The latent feature is usually long, and as the network grows it takes a lot of memory to store. We therefore design a hashing procedure for the latent features: because hashing is many-to-one, multiple hash functions are used to compress each latent feature and reduce the probability of collisions. During training, nodes that appear in the same context window but are not directly connected also have strong relationships, and structural balance theory is used to infer the polarity of such relationships. In addition, a new negative sampling strategy is designed for social trust networks. Experimental results demonstrate the effectiveness of the proposed method, which performs comparably to competitive embedding algorithms including SIGNet and STNE.
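The abstract only outlines the two compression steps, so the sketches below illustrate them under stated assumptions; they are not the thesis's actual implementation. The first sketch shows one common form of double-bit quantization for the two-stage method: each embedding dimension is split by two thresholds into three regions, and adjacent regions receive codes that differ in a single bit. Choosing the thresholds from per-dimension quantiles is an assumption made here for illustration.

```python
import numpy as np

def double_bit_quantize(embeddings, low_q=0.33, high_q=0.67):
    """Quantize a real-valued embedding matrix into 2 bits per dimension.

    Each dimension is split into three regions by two thresholds; adjacent
    regions get codes that differ in one bit, so Hamming distance roughly
    follows the ordering along that dimension. Quantile thresholds are an
    illustrative assumption, not the thesis's stated choice.
    """
    low = np.quantile(embeddings, low_q, axis=0)    # per-dimension lower threshold
    high = np.quantile(embeddings, high_q, axis=0)  # per-dimension upper threshold

    n, d = embeddings.shape
    codes = np.zeros((n, 2 * d), dtype=np.uint8)
    # Region codes: left -> (0, 1), middle -> (0, 0), right -> (1, 0).
    codes[:, 1::2] = (embeddings < low).astype(np.uint8)    # second bit marks the left region
    codes[:, 0::2] = (embeddings >= high).astype(np.uint8)  # first bit marks the right region
    return codes
```

The second sketch illustrates the idea of compressing long latent features with multiple hash functions: every node id is mapped by several independent hash functions into a small shared pool of component vectors, and the node's representation is the sum of the selected components, so collisions only hurt when two nodes agree on every hash. The pool size, number of hash functions, dimension, and the simple multiplicative hashes are hypothetical parameters chosen for the example.

```python
import numpy as np

class MultiHashEmbedding:
    """Replace per-node latent features with a small shared component table.

    Several hash functions select components from a shared pool; a node's
    vector is the sum of its selected components. Multiple hashes lower the
    chance that two nodes collide on all components at once.
    """

    def __init__(self, pool_size=10_000, dim=64, num_hashes=3, seed=0):
        rng = np.random.default_rng(seed)
        self.pool = rng.normal(scale=0.1, size=(pool_size, dim))
        # Odd random multipliers give cheap, distinct hash functions over node ids.
        self.salts = rng.integers(1, 2**31 - 1, size=num_hashes) | 1
        self.pool_size = pool_size

    def indices(self, node_id):
        return [(node_id * int(s) + 97) % self.pool_size for s in self.salts]

    def vector(self, node_id):
        return sum(self.pool[i] for i in self.indices(node_id))
```

With a pool of 10,000 components shared by all nodes, the storage cost no longer grows linearly with the number of nodes, which is the memory saving the abstract refers to; the exact hashing scheme and training objective in the thesis may differ.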
Keywords/Search Tags:Social trust network, Network embedding, Hash embedding, Sign prediction, Link prediction