In this paper, we propose the SCORE (self-concordant regularization) framework for unconstrained minimization problems, which incorporates second-order information within the Newton-decrement framework for convex optimization. Under SCORE, we propose the Generalized Gauss-Newton with Self-Concordant Regularization (GGN-SCORE) algorithm, which updates the minimization variables each time it receives a new input batch. The proposed algorithm exploits the structure of the second-order information in the Hessian matrix, thereby reducing computational overhead. GGN-SCORE demonstrates how convergence may be sped up while model generalization is also improved for problems that involve regularized minimization. Numerical experiments show the efficiency of the proposed method and its fast convergence.
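To make the structure-exploiting step concrete, below is a minimal NumPy sketch of one generalized Gauss-Newton update with a regularizer. This is an illustrative sketch under stated assumptions, not the paper's implementation: it assumes the Gauss-Newton matrix has the usual low-rank form J^T Q J, that the regularizer's Hessian is diagonal, and it uses the Woodbury identity so that only an m-by-m system is solved (m = number of batch residuals), which is where the computational saving appears when m is much smaller than the number of parameters p. All names here (ggn_score_step, Q, hess_reg, alpha) are assumptions introduced for illustration.

```python
# A minimal sketch of one generalized Gauss-Newton step with a
# (self-concordant) regularizer. Illustrative only; names and the
# diagonal-Hessian assumption are not taken from the paper.
import numpy as np

def ggn_score_step(theta, J, Q, grad_loss, grad_reg, hess_reg, alpha=1.0):
    """One damped generalized Gauss-Newton step.

    theta     : (p,)   current parameters
    J         : (m, p) Jacobian of model outputs w.r.t. theta, so the
                Gauss-Newton matrix is G = J.T @ Q @ J
    Q         : (m, m) Hessian of the loss w.r.t. the model outputs
    grad_loss : (p,)   gradient of the data-fit term
    grad_reg  : (p,)   gradient of the regularizer
    hess_reg  : (p,)   diagonal of the regularizer's Hessian (assumed
                diagonal here purely to keep the sketch short)
    """
    g = grad_loss + grad_reg          # full gradient of the regularized objective
    D_inv = 1.0 / hess_reg            # inverse of the diagonal regularizer Hessian
    # Solve (D + J.T Q J) d = -g via the Woodbury identity:
    # (D + J.T Q J)^{-1} = D^{-1} - D^{-1} J.T (Q^{-1} + J D^{-1} J.T)^{-1} J D^{-1}
    # Only an (m x m) system is solved, cheap when m << p.
    JD = J * D_inv                    # J @ D^{-1}, shape (m, p)
    S = np.linalg.inv(Q) + JD @ J.T   # (m, m) capacitance matrix
    d = -(D_inv * g - JD.T @ np.linalg.solve(S, JD @ g))
    return theta + alpha * d
```

In a self-concordant analysis, the step size alpha would typically be chosen from the Newton decrement rather than fixed (e.g., shrinking as the decrement grows); a constant is used above only to keep the sketch self-contained.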