
Structure-Preserving Network Representation Learning: Theories and Methods

Posted on: 2022-07-28
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Z W Zhang
Full Text: PDF
GTID: 1480306746956659
Subject: Computer Science and Technology
Abstract/Summary:
Network data, also known as graph data, is ubiquitous for representing complex relationships between objects, with examples including social networks, the Internet of Things, transportation networks, and knowledge graphs. Network representation learning analyzes network data by learning vector representations of nodes, edges, or the entire network, and it has received considerable attention from both academia and industry in recent years. In network representation learning, how to preserve network structures is the core problem, and it largely determines the performance of the learned representations in downstream tasks. Structure-preserving network representation learning faces many challenges, such as handling large-scale and dynamic networks. To tackle these challenges, this dissertation studies the two mainstream families of network representation learning methods, i.e., network embedding and graph neural networks (GNNs). We summarize the main contributions as follows:

· To tackle large-scale networks, we study large-scale structure-preserving network embedding. First, based on the observation that high-order proximity is an important network structure and that different tasks and networks often require proximities of different orders, we propose an arbitrary-order-proximity-preserving network embedding method, which incurs a low marginal cost when shifting between proximities of different orders. Second, to support embedding super-large networks, we propose an iterative random projection network embedding method for billion-scale networks; its complexity is much lower than that of existing methods, and it is amenable to distributed computing. (Hedged sketches of both ideas follow the abstract.)

· To tackle dynamic networks, we study dynamic structure-preserving network embedding. Since singular value decomposition (SVD) is widely used in network representation learning yet suffers from error accumulation on dynamic networks, we propose an error-bounded SVD restart method for dynamic networks, which automatically determines when to restart the SVD and effectively avoids error accumulation while remaining efficient. (A sketch of the restart loop follows the abstract.)

· To tackle the deficiency of graph neural networks in preserving network structures, we study structure-preserving graph neural networks. First, we propose a structure-preserving plug-in for GNNs: a simple yet effective method that enhances the ability of existing GNNs to preserve network structures. Second, after showing that the message-passing framework of GNNs cannot simultaneously maintain permutation equivariance and proximity awareness, we propose a stochastic message-passing framework for GNNs; by learning stochastic node representations, the proposed method satisfies both properties simultaneously. Finally, we find that GNNs cannot maintain transformation invariance when applied to geometric data, and we therefore propose a transformation-invariant GNN, which uses a simple and general mechanism to strictly maintain transformation invariance. (An illustration of the invariance requirement follows the abstract.)

The research in this dissertation provides new theories and methods for structure-preserving network representation learning, which significantly improve the effectiveness of network representation learning on multiple tasks and deepen our understanding of the underlying mechanisms of network representation learning algorithms. This dissertation may also inspire new directions for future research on network representation learning.
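To make the arbitrary-order proximity idea concrete, below is a minimal sketch of eigen-decomposition reweighting. It assumes an undirected network with a symmetric adjacency matrix A and a polynomial proximity S = w_1 A + w_2 A^2 + ... + w_q A^q; because S shares eigenvectors with A, one truncated eigendecomposition can serve every choice of weights, which is where the low marginal cost comes from. The function name and weight values are illustrative, not taken from the dissertation.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def reweighted_embedding(A, weights, dim=32):
    """Embed for proximity S = sum_q weights[q-1] * A**q via eigen-reweighting.

    A symmetric A and the polynomial S share eigenvectors; only the
    eigenvalues change, from lam to f(lam) = sum_q w_q * lam**q, so the
    expensive step (the truncated eigendecomposition) is computed once
    and reused for any order and any weights.
    """
    lam, V = eigsh(A, k=dim, which="LM")            # top-dim eigenpairs of A
    f_lam = sum(w * lam**q for q, w in enumerate(weights, start=1))
    # In the full method, extra eigenpairs would be kept and re-ranked by
    # |f(lam)| after reweighting; this sketch simply reuses the same pairs.
    return V * np.sqrt(np.abs(f_lam))               # node embeddings, shape (n, dim)
```

For example, weights = [1.0, 0.1, 0.01] corresponds to a third-order proximity that down-weights longer paths; shifting to a different order only re-runs the cheap reweighting step, not the decomposition.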
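For billion-scale embedding, the iterative random projection idea can be sketched as follows: start from a Gaussian random projection and repeatedly multiply by the sparse adjacency matrix, so each iteration injects one more order of structure at the cost of a single sparse-dense product, and those products parallelize trivially. The weights `alphas` and all names are illustrative assumptions, not the dissertation's exact parameterization.

```python
import numpy as np

def random_projection_embedding(A, dim=128, alphas=(1.0, 1.0, 0.1, 0.01), seed=0):
    """Iterative random projection sketch: U_0 is Gaussian, U_i = A @ U_{i-1}.

    The final embedding sum_i alphas[i] * U_i mixes structures of increasing
    order. Only len(alphas) - 1 sparse-dense multiplications are needed, and
    each is row-separable, hence friendly to distributed computing.
    """
    rng = np.random.default_rng(seed)
    U = rng.normal(scale=1.0 / np.sqrt(dim), size=(A.shape[0], dim))
    emb = alphas[0] * U
    for a in alphas[1:]:
        U = A @ U                     # one sparse-dense product per order
        emb = emb + a * U
    return emb
```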
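The dynamic-network contribution can be summarized as a control loop: keep the current truncated SVD, track an online bound on how much the incoming edge changes may have degraded it, and restart the factorization only when that bound crosses a tolerance. The Frobenius-norm drift used below is a crude stand-in for the theoretically derived error bound; `theta` and the restart test are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import svds

def dynamic_svd_stream(A, updates, k=32, theta=0.2):
    """Sketch of an error-bounded SVD restart loop for a dynamic network.

    Recomputing the rank-k SVD after every change is costly; never
    recomputing lets error accumulate. Instead, restart only when an
    online error proxy grows past the tolerance `theta`.
    """
    U, s, Vt = svds(A, k=k)
    drift = 0.0
    for dA in updates:                                # each dA: sparse matrix of edge changes
        A = A + dA
        drift += np.sqrt((dA.multiply(dA)).sum())     # ||dA||_F, crude error proxy
        if drift > theta * s.sum():                   # bound exceeded -> restart SVD
            U, s, Vt = svds(A, k=k)
            drift = 0.0
        yield U * s                                   # current node embeddings
```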
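Finally, the transformation-invariance requirement on geometric data can be illustrated generically: if the network only ever sees quantities that are unchanged by translation, rotation, or reflection, such as pairwise distances, its output is strictly invariant by construction. This shows the property the dissertation's GNN enforces, not its specific mechanism.

```python
import numpy as np

def invariant_edge_features(coords, edges):
    """Per-edge distances are invariant to any rigid transformation of
    `coords`, so a message-passing GNN consuming only these features is
    strictly transformation invariant by construction."""
    src, dst = edges
    return np.linalg.norm(coords[src] - coords[dst], axis=1, keepdims=True)

# Quick check: a random rotation/reflection leaves the features unchanged.
rng = np.random.default_rng(0)
coords = rng.normal(size=(5, 3))
edges = (np.array([0, 1, 2]), np.array([3, 4, 0]))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))          # random orthogonal matrix
assert np.allclose(invariant_edge_features(coords, edges),
                   invariant_edge_features(coords @ Q.T, edges))
```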
Keywords/Search Tags: Network Representation Learning, Structure Preserving, Network Embedding, Graph Neural Network