Shallow GNNs tend to have sub-optimal performance when dealing with large-scale graphs or graphs with missing features. It is therefore necessary to increase the depth (i.e., the number of layers) of GNNs to capture more latent knowledge from the input data. However, adding more layers to GNNs typically degrades their performance due to, e.g., vanishing gradients and oversmoothing. Existing methods (e.g., PairNorm and DropEdge) mainly focus on addressing oversmoothing, but they suffer from drawbacks such as requiring hard-to-acquire knowledge or introducing large training randomness. In addition, these methods simply incorporate ResNet-style residual connections to address vanishing gradients. They ignore an important fact: by stacking more and more layers with...
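As context for the remedies named above, the following is a minimal, illustrative PyTorch sketch, not the method proposed here: a DropEdge-style random edge drop on a dense adjacency matrix combined with a ResNet-style skip connection inside one GCN layer. The names drop_edge, normalize_adj, and ResidualGCNLayer, as well as the hyperparameters, are hypothetical and chosen only to keep the example self-contained.

import torch
import torch.nn as nn
import torch.nn.functional as F


def drop_edge(adj: torch.Tensor, drop_rate: float, training: bool) -> torch.Tensor:
    # DropEdge-style regularization (sketch): randomly zero out a fraction of
    # entries of a dense float adjacency matrix during training only.
    if not training or drop_rate == 0.0:
        return adj
    mask = (torch.rand_like(adj) >= drop_rate).float()
    return adj * mask


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    # Symmetric GCN normalization D^{-1/2} (A + I) D^{-1/2}.
    a_hat = adj + torch.eye(adj.size(0), device=adj.device)
    d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)


class ResidualGCNLayer(nn.Module):
    # One GCN propagation step with a residual (skip) connection.
    def __init__(self, dim: int, edge_drop: float = 0.2):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.edge_drop = edge_drop

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        adj = drop_edge(adj, self.edge_drop, self.training)
        a_norm = normalize_adj(adj)
        h = F.relu(self.linear(a_norm @ x))
        # The skip connection keeps gradients flowing when many layers are stacked.
        return x + h

Stacking several such layers yields a deep GCN in which the skip connections mitigate vanishing gradients, while the random edge dropping perturbs propagation and is commonly used to slow down oversmoothing.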
Graphs are omnipresent and GNNs are a powerful family of neural networks for learning over graphs. D...
Graph Convolutional Networks (GCNs) have emerged as powerful tools for learning on network structure...
The Graph Convolutional Network (GCN) is a pioneering model for graph-based semi-supervised learning. H...
Increasing the depth of GCN, which is expected to permit more expressivity, is shown to incur perfor...
Training deep graph neural networks (GNNs) is notoriously hard. Besides the standard plights in trai...
Training deep graph neural networks (GNNs) poses a challenging task, as the performance of GNNs may ...
End-to-end training of graph neural networks (GNN) on large graphs presents several memory and compu...
Graph Neural Networks (GNNs) are a promising deep learning approach for circumventing many real-worl...
Graph Neural Network (GNN) training and inference involve significant challenges of scalability with...
The performance limit of Graph Convolutional Networks (GCNs) and the fact that we cannot stack more ...
Recent works have investigated the role of graph bottlenecks in preventing long-range information pr...
Graph Neural Networks (GNNs) have succeeded in various computer science applic...
Recently, graph-based models designed for downstream tasks have significantly advanced research on g...
Over-smoothing is a challenging problem, which degrades the performance of deep graph convolutional ...
In recent years, hypergraph learning has attracted great attention due to its capacity in representi...