
Research And Implementation Of Graph Classification Model Based On Deep Learning

Posted on: 2020-11-16    Degree: Master    Type: Thesis
Country: China    Candidate: R G Huang    Full Text: PDF
GTID: 2428330602952128    Subject: Engineering
Abstract/Summary:
Graphs are natural data structures for representing real-world data with complex relationships. Predictive models over graphs have recently been applied in many fields, such as chemistry, computational biology, and social networking. However, complex graph structures make it hard for most graph-based methods to extract interpretable and discriminative structural features. Because the connectivity and size of graphs can vary significantly across instances, how to capture key structural information from graph-based objects remains an open problem. Traditional graph kernel methods suffer from high computational complexity and poor scalability. Furthermore, computing the similarity matrix and training the classifier are two independent steps, so the subsequent classification task cannot guide feature selection. In recent years, deep learning has achieved remarkable results in graph classification, but most deep learning methods concentrate only on global graph features and ignore local subgraph features.

To address these problems and shortcomings, this thesis proposes a new neural network architecture, the Graph Attention Model, to extract graph features and improve the generalization performance of graph classification. The key innovation of this work is the model's attention layer, which adopts the attention mechanism. The attention layer assembles subgraph structures with high contribution values in a hierarchical way to build discriminative subgraphs; subsequent pooling and fully connected layers then classify input graphs according to these discriminative subgraphs. First, the Graph Attention Model adopts a parallel, probabilistic decision process: the attention coefficient between a center node and each of its neighbors determines whether that neighbor is selected, which allows the attention layer to explore highly complex graph spaces and tune substructure features at a finer granularity. Meanwhile, the model is guided by the class label of the graph, and its parameters are continuously improved by supervised learning through back-propagation, making the assembly of subgraph structures optimizable. Finally, we compare the Graph Attention Model with several popular graph kernel methods and deep learning methods on benchmark datasets and find that it ranks first in overall performance.

Overall, the Graph Attention Model is computationally efficient: attention coefficients over all edges are computed in parallel, as are the output feature vectors of all nodes. It combines the flexibility of combinatorial pattern search with the strong optimization capability of deep learning, and delivers promising results as well as interpretable structural features for graph classification against state-of-the-art techniques.
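To make the attention mechanism described above concrete, below is a minimal, illustrative sketch (not the thesis implementation) of a GAT-style attention layer in PyTorch: each neighbor of a center node receives an attention coefficient, and neighbors with low coefficients could be dropped when assembling discriminative subgraphs. All class names, dimensions, and the toy graph are assumptions for demonstration only.

```python
# Illustrative sketch of a GAT-style attention layer; not the thesis code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared linear transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scoring vector

    def forward(self, x, adj):
        # x:   (N, in_dim) node features
        # adj: (N, N) adjacency matrix (1 where an edge exists, self-loops included)
        h = self.W(x)                                      # (N, out_dim)
        N = h.size(0)
        # Pairwise concatenation [h_i || h_j] for every (center, neighbor) pair
        h_i = h.unsqueeze(1).expand(N, N, -1)
        h_j = h.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1))  # (N, N)
        # Mask non-edges, then normalize scores over each node's neighborhood
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=1)                    # attention coefficients
        return alpha @ h, alpha                            # new features, coefficients

# Usage on a toy 4-node graph with self-loops
if __name__ == "__main__":
    x = torch.randn(4, 8)
    adj = torch.tensor([[1, 1, 0, 0],
                        [1, 1, 1, 0],
                        [0, 1, 1, 1],
                        [0, 0, 1, 1]], dtype=torch.float)
    layer = GraphAttentionLayer(8, 16)
    h_new, alpha = layer(x, adj)
    # Neighbors with small alpha could be discarded when assembling subgraphs
    print(alpha)
```

Note that both the edge-wise attention scores and the node-wise output features in this sketch are computed as dense tensor operations, which is what makes the attention step parallelizable across all edges and nodes, as claimed in the abstract.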
Keywords/Search Tags: graph classification, deep learning, neural network, graph attention model, attention mechanism