We propose a framework for localized learning with Reservoir Computing dynamical neural systems in pervasive environments, where data is distributed and dynamic. We use biologically plausible intrinsic plasticity (IP) learning to optimize the non-linearity of the system dynamics according to local objectives, and extend it to account for data uncertainty. We then develop two algorithms, FedIP and FedCLIP: the former extends IP to client-server federated topologies, while the latter additionally prevents catastrophic forgetting in streaming-data scenarios. Results on real-world datasets from human monitoring show that our approach improves performance and robustness while preserving privacy and efficiency.
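As a rough illustration of the ingredients named above, the sketch below shows intrinsic plasticity for a tanh Echo State Network reservoir (the standard gain/bias update toward a Gaussian target distribution, as in Schrauwen et al., 2008) together with a FedAvg-style server aggregation of the IP parameters. This is a minimal sketch under assumptions, not the paper's released implementation: class and function names, hyperparameters, and the exact aggregation scheme for FedIP are illustrative.

```python
import numpy as np

class IPReservoir:
    """Echo State Network reservoir whose tanh units adapt gain/bias via intrinsic plasticity."""

    def __init__(self, n_units=100, spectral_radius=0.9, eta=1e-4, mu=0.0, sigma=0.2, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.standard_normal((n_units, n_units))
        # Rescale the recurrent weights to the desired spectral radius.
        self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
        self.W_in = rng.uniform(-1, 1, size=(n_units, 1))
        self.gain = np.ones(n_units)    # IP gain a
        self.bias = np.zeros(n_units)   # IP bias b
        self.eta, self.mu, self.sigma2 = eta, mu, sigma ** 2
        self.state = np.zeros(n_units)

    def step(self, u, adapt=True):
        # Pre-activation x and squashed state y = tanh(a * x + b).
        x = self.W @ self.state + self.W_in @ np.atleast_1d(u)
        y = np.tanh(self.gain * x + self.bias)
        if adapt:
            # IP rule for tanh units with a Gaussian target N(mu, sigma^2):
            # gradient descent on the KL divergence between the state
            # distribution and the target (Schrauwen et al., 2008).
            db = -self.eta * (-self.mu / self.sigma2
                              + (y / self.sigma2) * (2 * self.sigma2 + 1 - y ** 2 + self.mu * y))
            da = self.eta / self.gain + db * x
            self.gain += da
            self.bias += db
        self.state = y
        return y


def fedavg_ip(clients, sample_counts):
    """FedAvg-style aggregation of IP parameters (one plausible reading of FedIP):
    the server averages client gains/biases weighted by local sample counts and
    broadcasts the result back to every client."""
    w = np.asarray(sample_counts, dtype=float)
    w /= w.sum()
    gain = sum(wi * c.gain for wi, c in zip(w, clients))
    bias = sum(wi * c.bias for wi, c in zip(w, clients))
    for c in clients:
        c.gain, c.bias = gain.copy(), bias.copy()
```

In this reading, each client adapts only the local gains and biases on its own data stream and exchanges those small vectors with the server, which is what keeps the scheme communication-efficient and privacy-preserving; the continual-learning variant would additionally constrain how far the IP parameters may drift between rounds.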