Machine Learning for graphs is now a well-established research topic. Common approaches in the field typically resort to complex deep neural network architectures and demanding training algorithms, highlighting the need for more efficient solutions. The class of Reservoir Computing (RC) models can play an important role in this context, enabling the development of fruitful graph embeddings through untrained recursive architectures. In this paper, we study progressive simplifications to the design strategy of RC neural networks for graphs. Our core proposal is based on shaping the organization of the hidden neurons to follow a ring topology. Experimental results on graph classification tasks indicate that ring-reservoir architectures ...
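To make the idea concrete, the following is a minimal illustrative sketch (not the paper's exact formulation) of an untrained ring-reservoir computing a graph embedding: the recurrent weights form a simple cycle, node states are updated recursively from node features and neighbor states, and pooled states yield the embedding. Function names, hyperparameters, and the pooling choice are assumptions for illustration only.

```python
import numpy as np

def ring_reservoir(n_units, ring_weight=0.9):
    """Untrained recurrent weights arranged on a ring: unit i feeds only unit i+1."""
    W = np.zeros((n_units, n_units))
    for i in range(n_units):
        W[(i + 1) % n_units, i] = ring_weight
    return W

def embed_graph(adjacency, node_features, n_units=50, n_iters=30, seed=0):
    """Iterate a recursive state update over the graph, then pool node states."""
    rng = np.random.default_rng(seed)
    n_nodes, n_feats = node_features.shape
    W_in = rng.uniform(-0.5, 0.5, size=(n_units, n_feats))  # untrained input weights
    W_hat = ring_reservoir(n_units)                          # untrained ring recurrence
    X = np.zeros((n_nodes, n_units))
    for _ in range(n_iters):
        # each node's state depends on its own features and its neighbors' states;
        # stability of the iteration depends on how the weights are scaled
        X = np.tanh(node_features @ W_in.T + adjacency @ X @ W_hat.T)
    return X.mean(axis=0)  # mean pooling over nodes gives the graph embedding

# Usage sketch: the fixed embedding would feed a simple trained readout
# (e.g., ridge or logistic regression) for graph classification.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # toy 3-node path graph
U = np.eye(3)                                                  # one-hot node features
print(embed_graph(A, U).shape)  # (50,)
```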