For training fully-connected neural networks (FCNNs), we propose a practical approximate second-order method comprising: 1) an approximation of the Hessian matrix and 2) a conjugate gradient (CG) based method. Our approximate Hessian matrix is memory-efficient and applies to any FCNN whose activation and criterion functions are twice differentiable. We devise a CG-based method incorporating a rank-one approximation to derive Newton directions for training FCNNs, which significantly reduces both space and time complexity. This CG-based method can be employed to solve any linear system whose coefficient matrix is Kronecker-factored, symmetric, and positive definite. Empirical studies show the efficacy and efficiency of o...
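Since the CG-based method above targets linear systems whose coefficient matrix is Kronecker-factored, symmetric, and positive definite, a minimal matrix-free CG sketch may help fix ideas. The solver, the helper A_mv, and the small factors B and C below are illustrative assumptions, not code from the cited work; the point is that a Kronecker-factored product (B kron C) v can be applied without ever forming the full matrix.

    import numpy as np

    def conjugate_gradient(A_mv, b, tol=1e-10, max_iter=None):
        """Solve A x = b for a symmetric positive definite A given only
        through the matrix-vector product A_mv(v)."""
        if max_iter is None:
            max_iter = b.size
        x = np.zeros_like(b)
        r = b - A_mv(x)                 # residual
        p = r.copy()                    # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A_mv(p)
            alpha = rs_old / (p @ Ap)   # exact line search along p
            x = x + alpha * p
            r = r - alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    # Hypothetical Kronecker-factored SPD coefficient matrix A = kron(B, C).
    # With row-major flattening, kron(B, C) @ v equals (B @ X @ C.T).ravel()
    # for X = v.reshape(...), so the full Kronecker product never has to be
    # formed; this is what keeps the memory cost low.
    B = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD factor
    C = np.array([[2.0, 0.5], [0.5, 1.0]])   # SPD factor
    b = np.array([1.0, 2.0, 3.0, 4.0])

    def A_mv(v):
        X = v.reshape(B.shape[1], C.shape[1])
        return (B @ X @ C.T).ravel()

    x = conjugate_gradient(A_mv, b)
    print(np.allclose(A_mv(x), b))           # True

The same matrix-free pattern carries over to larger Kronecker factors; only A_mv changes, while the CG loop itself stays fixed.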
Minimization methods for training feed-forward networks with Backpropagation are compared. Feedforwa...
Second-order optimization methods applied to train deep neural networks use the curvature informat...
We propose a fast second-order method that can be used as a drop-in replacement for current deep lea...
Conjugate gradient methods constitute an excellent choice for efficiently training large ne...
We derive two second-order algorithms, based on the conjugate gradient method, for online training o...
In this dissertation, we are concerned with the advancement of optimization algorithms for training ...
Since the discovery of the back-propagation method, many modified and new algorithms have been propo...
Neural networks are an important class of highly flexible and powerful models inspired by the struct...
Several methods for training feed-forward neural networks require second order information from the ...
Conjugate gradient methods (CG) constitute excellent neural network training methods that are simpli...
The conjugate gradient optimization algorithm is combined with the modified back propagation algorit...
A new second order algorithm based on Sc...
Training algorithms for Multilayer Perceptrons optimize the set of W weights and biase...
First-order methods such as stochastic gradient descent (SGD) have recently become popular optimizat...