Training deep graph neural networks (GNNs) is challenging, as GNN performance often deteriorates as the number of hidden message-passing layers grows. The literature has largely relied on over-smoothing and under-reaching to explain the performance deterioration of deep GNNs. In this paper, we propose a new explanation for this phenomenon, mis-simplification: mistakenly simplifying graphs by forbidding self-loops and forcing edges to be unweighted. We show that such simplification reduces the ability of message-passing layers to capture the structural information of graphs. In view of this, we propose a new framework, the edge-enhanced graph neural network (EEGNN). EEGNN uses the structural in...
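The mis-simplification effect described above is easiest to see on a toy graph. Below is a minimal NumPy sketch, not the paper's EEGNN implementation: the graph, edge weights, and features are made up. It contrasts mean aggregation over a binarised, self-loop-free adjacency with aggregation that keeps edge weights and self-loops.

    # Hypothetical illustration of "mis-simplification": dropping self-loops
    # and edge weights discards structural signal message passing could use.
    import numpy as np

    # Toy weighted graph on 3 nodes; A[i, j] is the weight of edge j -> i.
    A_weighted = np.array([[0.0, 2.0, 0.5],
                           [2.0, 0.0, 1.0],
                           [0.5, 1.0, 0.0]])
    X = np.array([[1.0], [0.0], [-1.0]])   # one scalar feature per node

    def mean_aggregate(A, X):
        """One message-passing step: each node averages its neighbours."""
        deg = A.sum(axis=1, keepdims=True)
        return (A @ X) / np.clip(deg, 1e-12, None)

    # "Mis-simplified" view: binarise edges and forbid self-loops.
    A_simple = (A_weighted > 0).astype(float)

    # Structure-aware view: keep the weights and add self-loops so each
    # node also retains its own feature during aggregation.
    A_full = A_weighted + np.eye(3)

    print(mean_aggregate(A_simple, X))  # edge strengths and self-signal lost
    print(mean_aggregate(A_full, X))    # aggregation reflects both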
Shallow GNNs tend to perform sub-optimally on large-scale graphs or graphs with mis...
Training deep graph neural networks (GNNs) is notoriously hard. Besides the standard plights in trai...
Recently, graph-based models designed for downstream tasks have significantly advanced research on g...
Graph Neural Networks (GNNs) have succeeded in various computer science applic...
Increasing the depth of a GCN, which is expected to permit greater expressivity, is shown to incur perfor...
Recent works have investigated the role of graph bottlenecks in preventing long-range information pr...
Message-passing graph neural networks (MPNNs) emerged as powerful tools for processing graph-structu...
In designing and applying graph neural networks, we often fall into some optimization pitfalls, the ...
We analyze graph smoothing with \emph{mean aggregation}, where each node successively receives the a...
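As a concrete illustration of this smoothing dynamic, here is a minimal sketch, assuming mean aggregation over the closed neighbourhood (the node plus its neighbours); the path graph and feature values are made up for illustration.

    # Graph smoothing under repeated mean aggregation: each step replaces a
    # node's feature with the mean over its closed neighbourhood, and the
    # features contract toward a graph-wide consensus (over-smoothing).
    import numpy as np

    # Made-up path graph on 4 nodes.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    A_hat = A + np.eye(4)                         # self-loops keep the walk aperiodic
    P = A_hat / A_hat.sum(axis=1, keepdims=True)  # row-stochastic mean operator

    x = np.array([3.0, -1.0, 4.0, 0.0])
    for k in (1, 5, 50):
        print(k, np.round(np.linalg.matrix_power(P, k) @ x, 3))
    # The spread max(x) - min(x) shrinks step by step: after enough rounds
    # all nodes carry nearly the same value, i.e. the signal is over-smoothed.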
Graph Neural Networks (GNNs) are a promising deep learning approach for circumventing many real-worl...
Graph Neural Networks (GNNs) have achieved promising performance on a wide range of graph-based task...
In recent years, hypergraph learning has attracted great attention due to its capacity in representi...
The Graph Convolutional Network (GCN) is a pioneering model for graph-based semi-supervised learning. H...
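For reference, the GCN layer of Kipf and Welling applies a symmetrically normalised, self-loop-augmented adjacency before the linear transform and nonlinearity; the sketch below uses made-up sizes and random stand-in weights, not trained parameters.

    # One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).
    import numpy as np

    rng = np.random.default_rng(0)

    def gcn_layer(A, H, W):
        """Symmetric normalisation of the self-loop-augmented adjacency,
        then a linear transform and a ReLU."""
        A_hat = A + np.eye(A.shape[0])
        d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
        A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
        return np.maximum(A_norm @ H @ W, 0.0)

    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    H = rng.normal(size=(4, 8))      # 4 nodes, 8-dim input features
    W = rng.normal(size=(8, 4))      # learnable weights (random stand-ins)

    print(gcn_layer(A, H, W).shape)  # (4, 4): new 4-dim node embeddings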