
Discussion On Graph Convolutional Neural Networks With Deep Layers

Posted on: 2021-10-26
Degree: Master
Type: Thesis
Country: China
Candidate: J Y Quan
Full Text: PDF
GTID: 2518306107459504
Subject: Applied Mathematics
Abstract/Summary:
With the development of cloud computing, big data, and parallel computing, deep learning algorithms represented by Convolutional Neural Networks (CNN) have achieved great success in fields such as face recognition, object detection, and semantic segmentation. This success is largely due to the deep-layer structure of CNNs: by repeatedly stacking convolutional modules, low-level layers extract detailed features while high-level layers extract semantic content. To some extent, it is this deep-layer structure that gives CNNs their important position in digital image processing.

Inspired by CNNs on images, Graph Convolutional Networks (GCN), which are designed to process non-Euclidean data, have received increasing attention in recent years. To cope with the irregularity of graph-structured data, most GCNs are built on spectral graph theory, where a Fourier transform on graph signals is used to indirectly define the "convolution" operation on graphs. Such networks have found preliminary applications in node classification, social networks, point cloud data, and chemical molecule recognition. However, a large body of research shows that, due to over-smoothing and vanishing gradients, graph networks based on spectral graph theory are usually limited to extremely shallow depths (2-3 layers), which greatly restricts their performance and the room for further theoretical improvement.

In this thesis, we discuss the depth problem of Graph Convolutional Neural Networks, based on a detailed review of related work on graph networks. The main results are as follows:

(1) From the perspective of the message passing process and heat equations, we point out the over-smoothing problem in graph networks. We show that over-smoothing in Message Passing Neural Networks is inevitable and is accelerated by GCN's spectrum-based normalized propagation scheme (see the propagation sketch following this abstract).

(2) By mathematical derivation, we give the convergence results of GCN on bipartite connected graphs and non-bipartite connected graphs.

(3) Rather than making minor repairs to improve the depth of spectral-domain GCNs, we improve the residual mechanism of vertex-domain graph methods and propose a residual graph attention network, ResGAT, whose accuracy continues to increase with depth up to about 6 layers. ResGAT obtains the best results on the three mainstream citation-network datasets Cora, Citeseer, and Pubmed.

At the end of the thesis, we give a preliminary discussion of future work on graph networks.
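The following is a minimal numerical sketch, not taken from the thesis, of the over-smoothing behavior described in result (1). It assumes a toy 4-node connected graph, random node features, and the standard GCN symmetric-normalized propagation X <- D^{-1/2}(A + I)D^{-1/2} X with learned weights and nonlinearities omitted; all names and the graph itself are illustrative. As the number of propagation steps grows, the node features collapse onto a single degree-determined direction, so nodes become indistinguishable, consistent with the convergence results stated in result (2).

```python
import numpy as np

# Toy undirected connected graph on 4 nodes, edges: 0-1, 0-2, 0-3, 1-2, 2-3
A = np.array([[0., 1., 1., 1.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [1., 0., 1., 0.]])

A_tilde = A + np.eye(4)                     # add self-loops
d = A_tilde.sum(axis=1)
D_inv_sqrt = np.diag(d ** -0.5)
P = D_inv_sqrt @ A_tilde @ D_inv_sqrt       # symmetric-normalized propagation matrix

# Dominant eigenvector of P is proportional to sqrt(degree + 1);
# repeated propagation pushes all feature columns toward this direction.
v = np.sqrt(d)
v /= np.linalg.norm(v)

X = np.random.rand(4, 3)                    # random initial node features
for layer in range(1, 21):
    X = P @ X                               # one propagation step (weights/activation omitted)
    residual = X - np.outer(v, v @ X)       # component orthogonal to the limiting direction
    print(f"layer {layer:2d}: residual norm = {np.linalg.norm(residual):.2e}")
```

Running the sketch shows the residual norm shrinking geometrically with depth, i.e. after a handful of layers the features retain essentially no information beyond node degree, which is the over-smoothing phenomenon that limits plain spectral GCNs to 2-3 layers.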
Keywords/Search Tags: Graph Convolutional Networks, Deep-Layer Networks, Spectral Graph Theory, Over-smoothing, Residual Graph Attention Network