Transformer-based models have recently shown success in representation learning on graph-structured data, beyond natural language processing and computer vision. However, this success has been limited to small-scale graphs due to the drawbacks of full dot-product attention on graphs, such as quadratic complexity in the number of nodes and message aggregation from numerous irrelevant nodes. To address these issues, we propose the Deformable Graph Transformer (DGT), which performs sparse attention over dynamically sampled relevant nodes to efficiently handle large-scale graphs with linear complexity in the number of nodes. Specifically, our framework first constructs multiple node sequences with various criteria to consider both structu...
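The key idea above — attending only to a small set of sampled relevant nodes per query, so cost grows as O(N · k) rather than O(N²) — can be sketched as follows. This is a minimal illustration of sparse attention over sampled candidates, not the paper's actual deformable sampling; the `candidates` input (per-node candidate id lists) and single-head, shared query/key/value features are simplifying assumptions:

```python
import numpy as np

def sparse_node_attention(x, candidates, k, rng=None):
    """Each node attends only to at most k nodes sampled from its
    candidate set, instead of all N nodes: O(N * k) instead of O(N^2).

    x          : (N, d) node features (used here as queries, keys, and values).
    candidates : candidates[i] = array of candidate node ids for node i
                 (hypothetical input; could come from neighborhoods, PPR, etc.).
    k          : number of keys sampled per query node.
    """
    rng = np.random.default_rng(rng)
    N, d = x.shape
    out = np.empty_like(x)
    for i in range(N):
        cand = np.asarray(candidates[i])
        # sample k candidates without replacement (keep all if fewer than k)
        idx = cand if len(cand) <= k else rng.choice(cand, size=k, replace=False)
        keys = x[idx]                        # sampled keys/values, shape (k, d)
        scores = keys @ x[i] / np.sqrt(d)    # scaled dot-product scores
        w = np.exp(scores - scores.max())
        w /= w.sum()                         # softmax over the sampled nodes only
        out[i] = w @ keys                    # weighted sum of sampled values
    return out
```

Because the softmax is taken over at most k sampled nodes per query, total cost is linear in N for fixed k; the learnable part of DGT lies in *which* nodes get sampled, which this sketch replaces with uniform sampling.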
In the Big Data era, large graph datasets are becoming increasingly popular due to their capability ...
Graph Neural Networks (GNNs) have shown tremendous strides in performance for graph-structured probl...
In recent years, deep learning has made a significant impact in various fields – helping to push the...
In this paper we provide, to the best of our knowledge, the first comprehensive approach for incorpo...
The Transformer architecture has achieved remarkable success in a number of domains including natura...
Transformers have made progress in a wide range of tasks, but suffer from quadratic computational and ...
Transformers have achieved great success in several domains, including Natural Language Processing a...
Transformer architectures have been applied to graph-specific data such as protein structure and sho...
A few models have tried to tackle the link prediction problem, also known as knowledge graph complet...
The Transformer architecture has gained growing attention in graph representation learning recently,...
Modelling long-range dependencies is critical for scene understanding tasks in computer vision. Alth...
Though graph representation learning (GRL) has made significant progress, it is still a challenge to...
To overcome the quadratic cost of self-attention, recent works have proposed various sparse attentio...
We introduce Performers, Transformer architectures which can estimate regular (softmax) full-rank-at...
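The Performer line of work estimates softmax attention with positive random features, so attention can be computed in time linear in sequence length. A minimal sketch of that idea (in the spirit of FAVOR+, not the official implementation; `m`, the feature count, and the Gaussian projection matrix are the usual random-feature ingredients):

```python
import numpy as np

def performer_attention(q, k, v, m=256, rng=0):
    """Linear-time approximation of softmax attention via positive
    random features: softmax(QK^T) V ~= D^-1 phi(Q) (phi(K)^T V).

    q, k : (n, d) queries and keys; v : (n, d_v) values.
    m    : number of random features; larger m -> lower variance.
    """
    n, d = q.shape
    # fold the usual 1/sqrt(d) temperature into q and k symmetrically
    q = q / d ** 0.25
    k = k / d ** 0.25
    w = np.random.default_rng(rng).normal(size=(m, d))  # random projections

    def phi(x):
        # positive feature map: exp(w.x - |x|^2 / 2) / sqrt(m),
        # whose inner products estimate exp(q.k) in expectation
        return np.exp(x @ w.T - 0.5 * np.sum(x * x, axis=-1, keepdims=True)) / np.sqrt(m)

    qp, kp = phi(q), phi(k)        # (n, m) feature maps
    kv = kp.T @ v                  # (m, d_v), computed once: O(n m d_v)
    num = qp @ kv                  # (n, d_v)
    den = qp @ kp.sum(axis=0)      # row normalizers, all positive
    return num / den[:, None]
```

The (m, d_v) summary `kp.T @ v` is built once and reused by every query, which is what removes the quadratic n-by-n attention matrix.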
Transformers have become widely used in modern models for various tasks such as natural language pro...