In this paper, we present a new class of quasi-Newton methods for the effective learning of large multilayer perceptron (MLP) networks. The algorithms introduced in this work, named LQN, use the iterative scheme of a generalized BFGS-type method involving a suitable family of matrix algebras L. The main advantages of these methods are that they have an O(n log_2 n) complexity per step and require only O(n) memory allocations. Numerical experiments, performed on a set of standard MLP benchmarks, show the competitiveness of the LQN methods, especially for large values of n.
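To make the idea concrete, the following is a minimal, illustrative sketch of one LQN-style iteration, not the paper's exact algorithm. It assumes the circulant matrix algebra L, which is diagonalized by the unitary DFT (the published LQN methods employ a family of such low-complexity algebras, e.g. the Hartley algebra); the helper names lqn_direction and lqn_update are our own. The Hessian approximation B is stored only through its n eigenvalues (O(n) memory), and every product with B costs O(n log n) via the FFT; the two rank-one BFGS terms are projected onto L, where the best L-approximation of x x^T has eigenvalues |U x|^2.

```python
import numpy as np

def U(x):      # unitary DFT:  U x
    return np.fft.fft(x) / np.sqrt(len(x))

def Uinv(x):   # inverse transform:  U* x
    return np.fft.ifft(x) * np.sqrt(len(x))

def lqn_direction(d, g):
    """Search direction p = -B^{-1} g, with B = U* diag(d) U in the algebra L."""
    return -np.real(Uinv(U(g) / d))

def lqn_update(d, s, y, eps=1e-10):
    """BFGS-type update of B, projected onto the algebra L.

    Acts componentwise on the eigenvalue vector d, since the best
    L-approximation of a rank-one matrix x x^T has eigenvalues |U x|^2.
    """
    s_hat2 = np.abs(U(s)) ** 2        # |U s|^2
    y_hat2 = np.abs(U(y)) ** 2        # |U y|^2
    sBs = np.dot(d, s_hat2)           # s^T B s, computed in O(n)
    ys = np.dot(y, s)                 # curvature term y^T s
    d_new = d - (d ** 2) * s_hat2 / sBs + y_hat2 / ys
    return np.maximum(d_new, eps)     # keep B positive definite

# Tiny usage example on a convex quadratic f(w) = 0.5 w^T A w - b^T w.
rng = np.random.default_rng(0)
n = 256
A = rng.standard_normal((n, n)); A = A @ A.T / n + np.eye(n)
b = rng.standard_normal(n)
f = lambda w: 0.5 * w @ A @ w - b @ w
grad = lambda w: A @ w - b

w = np.zeros(n)
d = np.ones(n)                        # B_0 = I  (eigenvalues only)
g = grad(w)
for _ in range(200):
    p = lqn_direction(d, g)
    t = 1.0                           # simple Armijo backtracking
    while f(w + t * p) > f(w) + 1e-4 * t * (g @ p):
        t *= 0.5
    s = t * p
    w_new = w + s
    g_new = grad(w_new)
    y = g_new - g
    if y @ s > 1e-12:                 # update only on positive curvature
        d = lqn_update(d, s, y)
    w, g = w_new, g_new

print("final gradient norm:", np.linalg.norm(g))
```

In an MLP setting, w would collect all network weights and grad would be computed by backpropagation; the point of the sketch is simply that each iteration touches only O(n) numbers plus a few FFTs.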