Message passing has emerged as an effective tool for designing Graph Neural Networks (GNNs). However, most existing message-passing schemes simply sum or average all neighboring features to update node representations. They are restricted by two problems: (i) a lack of interpretability, making it hard to identify which node features are significant to the GNN's prediction, and (ii) feature over-mixing, which leads to over-smoothing when capturing long-range dependencies and an inability to handle graphs under heterophily or low homophily. In this paper, we propose a Node-level Capsule Graph Neural Network (NCGNN) to address these problems with an improved message-passing scheme. Specifically, NCGNN represents nodes as groups of node-level capsules, ...
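As a minimal illustration (not from the paper itself), the plain mean-aggregation update that this abstract critiques can be sketched in NumPy; the function name and toy graph below are illustrative assumptions:

```python
import numpy as np

def mean_aggregate(X, adj):
    """One round of mean-aggregation message passing.

    X:   (n, d) node feature matrix.
    adj: (n, n) binary adjacency matrix (undirected, no self-loops).
    Each node's new representation is the average of its own and its
    neighbors' features -- every neighbor contributes equally, which is
    the indiscriminate mixing the abstract argues against.
    """
    A = adj + np.eye(adj.shape[0])       # add self-loops
    deg = A.sum(axis=1, keepdims=True)   # per-node neighborhood size
    return (A @ X) / deg                 # row-normalized neighborhood mean

# Toy path graph 0-1-2 with 2-d features
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = mean_aggregate(X, adj)
```

Because every neighbor is weighted identically, repeated application drives node representations toward each other (over-smoothing), and a node in a heterophilous neighborhood averages in mostly dissimilar features.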
Existing Graph Neural Networks (GNNs) follow the message-passing mechanism that conducts information...
Increasing the depth of GCN, which is expected to permit more expressivity, is shown to incur perfor...
Hypergraph representations are both more efficient and better suited to describe data characterized ...
Graph neural networks (GNNs) have demonstrated superior performance for semi-supervised node classif...
Node classification tasks on graphs are addressed via fully-trained deep message-passing models that...
Recent works have investigated the role of graph bottlenecks in preventing long-range information pr...
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs, i.e., graphs in whi...
Graph Neural Networks (GNNs) have been widely applied in the semi-supervised node classification tas...
For node classification, Graph Neural Networks (GNNs) assign predefined labels to graph nodes accordi...
The Graph Convolutional Network (GCN) is a pioneering model for graph-based semi-supervised learning. H...
Graph Neural Networks (GNNs) are a promising deep learning approach for circumventing many real-worl...
In recent years, we have witnessed a surge of Graph Neural Networks (GNNs), most of which can learn ...
Training deep graph neural networks (GNNs) poses a challenging task, as the performance of GNNs may ...
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and se...
The core operation of current Graph Neural Networks (GNNs) is the aggregation enabled by the graph L...