Natural gradient learning is an efficient and principled method for improving on-line learning. In practical applications there will be an increased cost required in estimating and inverting the Fisher information matrix. We propose to use the matrix momentum algorithm in order to carry out efficient inversion and study the efficacy of a single-step estimation of the Fisher information matrix. We analyse the proposed algorithms in a two-layer neural network, using a statistical mechanics framework which allows us to describe analytically the learning dynamics, and compare performance with true natural gradient learning and standard gradient descent.
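The abstract above contrasts natural gradient learning with standard gradient descent: the natural gradient preconditions the ordinary gradient with the inverse Fisher information matrix. A minimal sketch of one such update step is below; the toy quadratic loss, the function names, and the choice of Fisher matrix are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def natural_gradient_step(w, grad, fisher, eta=0.1):
    """One natural gradient update: w <- w - eta * F^{-1} * grad.

    Solves the linear system F x = grad rather than explicitly
    inverting F, which is cheaper and numerically more stable.
    """
    nat_grad = np.linalg.solve(fisher, grad)
    return w - eta * nat_grad

# Toy example (an assumption for illustration): quadratic loss
# L(w) = 0.5 * w^T A w, whose gradient is A w.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
w = np.array([1.0, -2.0])
grad = A @ w       # gradient of the toy loss at w
fisher = A         # for this toy loss the curvature plays the role of F

# With eta = 1 and F equal to the true curvature, a single natural
# gradient step lands on the minimum w = 0 (up to numerical error),
# illustrating why preconditioning by F^{-1} accelerates learning.
w_new = natural_gradient_step(w, grad, fisher, eta=1.0)
```

The practical difficulty the abstract raises is exactly the `np.linalg.solve` call: for a network with many weights, estimating and solving against the Fisher matrix at every step is expensive, which motivates the matrix momentum approximation studied in the paper.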
The process of machine learning can be considered in two stages: model selection and parameter estim...
The momentum parameter is common within numerous optimization and local search algorithms, particula...
Natural gradient learning is an efficient and principled method for improving online learning. In pr...
We analyse the dynamics of a number of second order on-line learning algorithms training multi-layer...
We analyse natural gradient learning in a two-layer feed-forward neural network using a statistical ...
Natural gradient descent (NGD) is an on-line algorithm for redefining the steepest descent direction...
We analyse the matrix momentum algorithm, which provides an efficient approximation to on-line Newto...
Second-order optimization methods, such as natural gradient, are difficult to apply to high-dimensi...
This paper applies natural gradient (NG) learning to neural networks (NNs) for modeling and identificat...
When a parameter space has a certain underlying structure, the ordinary gradient of a function does ...
The natural gradient descent method is applied to train an n-m-1 multilayer perceptron. Based on an...
The parameter space of neural networks has the Riemannian metric structure. The natural Riemannian g...