Random neural networks (RNN) have been used efficiently as learning tools in many applications of different types. The learning procedure used so far has been gradient descent. In this paper we explore the use of the Levenberg-Marquardt (LM) optimization procedure, which is more powerful when applicable, together with one of its major extensions, the LM procedure with adaptive momentum. We show how these methods can be used with RNN and run several experiments to evaluate their performance. The use of these techniques in the case of RNN leads to conclusions similar to those obtained with standard artificial neural networks: they clearly improve the learning efficiency.
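For readers unfamiliar with the method, the following is a minimal sketch (not the paper's implementation) of a single Levenberg-Marquardt update on a least-squares error, w <- w - (J^T J + damping*I)^{-1} J^T r. The names lm_step, residuals_fn and jacobian_fn are illustrative placeholders, and the adaptive-momentum extension mentioned above would further add a momentum term whose coefficient is adapted during training.

import numpy as np

def lm_step(w, residuals_fn, jacobian_fn, damping=1e-3):
    # One Levenberg-Marquardt update for the least-squares error E = 0.5 * ||r(w)||^2:
    #   w_new = w - (J^T J + damping * I)^{-1} J^T r
    # residuals_fn and jacobian_fn are placeholders (outputs minus targets, and its
    # Jacobian with respect to the weights); they are not functions from the paper.
    r = residuals_fn(w)                          # residual vector, shape (n_samples,)
    J = jacobian_fn(w)                           # Jacobian, shape (n_samples, n_weights)
    H = J.T @ J + damping * np.eye(J.shape[1])   # damped Gauss-Newton Hessian approximation
    delta = np.linalg.solve(H, J.T @ r)          # solve the linear system rather than inverting H
    return w - delta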
Random backpropagation (RBP) is a variant of the backpropagation algorithm for training neural netwo...
The prediction of complex signals is among the most important applications of ...
Designing deep neural networks is an art that often involves an expensive search over candidate arch...
Random Neural Networks (RNNs) are a class of Neural Networks (NNs) that can al...
Random Neural Networks (RNNs) are a class of Neural Networks (NNs) that can also be seen as a specific...
In big data fields, with increasing computing capability, artificial neural networks have shown grea...
The random neural network model proposed by Gelenbe has a number of interesting features in addition...
Recurrent Neural Networks (RNNs) are powerful sequence models that were believed to be difficult to ...
The Random Neural Network (RNN) has received, since its inception in 1989, considerable attention an...
Random cost simulations were introduced as a method to investigate optimization problems in systems...
The learning rate is the most crucial hyper-parameter of a neural network that has a significant imp...
In this thesis we introduce new models and learning algorithms for the Random Neural Network (RNN), ...
In this paper, a linear approximation for Gelenbe's Learning Algorithm developed for training Recurr...