With the advent of the big data era, the growing volume of data has driven the rapid development of artificial intelligence (AI) technologies. Graph neural networks (GNNs), a class of graph representation learning methods, have advanced rapidly thanks to their strong performance and efficiency in mining node information and inter-node relationships in graph data. GNNs have been widely applied in fields such as recommendation systems, computer vision, and natural language processing, and have made important research progress. Although a variety of carefully designed GNN architectures have been developed for different application scenarios, they all follow the message-passing paradigm: node features are propagated and aggregated along the topology within a neighborhood, and higher-level node representations are learned. The message-passing process is the key to the expressive power and computational efficiency of GNNs. In this work, we therefore focus on studying and implementing the message-passing mechanism of GNNs, exploring more expressive message-passing frameworks and implementing and optimizing them in a GNN algorithm library.

First, we studied and summarized the unifying characteristic of the message-passing mechanism of GNNs: node features are repeatedly propagated and aggregated over the topology to obtain higher-level node representations. Based on this unified theoretical framework, we propose adding a label constraint matrix to the message-passing process to incorporate label information, and derive a generalized model architecture that learns more expressive node representations. We conducted experiments on representative datasets to demonstrate the effectiveness of our model.

We also surveyed the current mainstream GNN algorithm libraries, all of which depend on a specific deep learning backend and are not well compatible across multiple frameworks. We therefore designed a multi-framework message-passing module, which supports the development and implementation of over 40 models in a multi-framework GNN algorithm library. Furthermore, since message passing is the core computation in GNNs, we optimized the inefficient parts of its operators for different backends and evaluated them on multiple datasets, demonstrating the efficiency of our optimized operators.
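The message-passing pattern described above can be sketched in a few lines. The following is a minimal, framework-agnostic NumPy illustration, not the thesis's actual implementation: it uses GCN-style symmetric normalization for the propagation step, and the optional `M` argument is a hypothetical stand-in for the label constraint matrix (the thesis does not specify its form; here it is assumed to reweight the normalized adjacency elementwise).

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} with self-loops,
    as commonly used for GCN-style propagation."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def message_passing(A, H, W, num_layers=2, M=None):
    """Generic message-passing sketch.

    A: (n, n) adjacency matrix; H: (n, d) node features; W: (d, d) weights.
    M: optional (n, n) label constraint matrix -- an assumption here,
       applied elementwise to reweight propagation with label information.
    Each layer aggregates neighbor features along the topology, then
    applies a linear transform and a ReLU nonlinearity.
    """
    A_norm = normalize_adjacency(A)
    if M is not None:
        A_norm = A_norm * M
    for _ in range(num_layers):
        H = np.maximum(A_norm @ H @ W, 0.0)
    return H
```

Real GNN libraries implement the aggregation with sparse scatter/gather operators rather than dense matrix products; the dense form above is only meant to make the propagate-then-aggregate structure explicit.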