In this paper, we study large-scale convex optimization algorithms based on the Newton method applied to regularized generalized self-concordant losses, which include logistic regression and softmax regression. We first prove that our new simple scheme, based on a sequence of problems with decreasing regularization parameters, is globally convergent, and that this convergence is linear with a constant factor that scales only logarithmically with the condition number. In the parametric setting, we obtain an algorithm with the same scaling as regular first-order methods but with improved behavior, in particular on ill-conditioned problems. Second, in the nonparametric machine learning setting, we provide an exp...
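The decreasing-regularization scheme sketched in this abstract can be made concrete. The snippet below is a minimal, hypothetical illustration, not the paper's exact algorithm or constants: it solves l2-regularized logistic regression with plain Newton steps along a geometrically decreasing sequence of regularization parameters, warm-starting each stage at the previous solution. The function name, the geometric schedule, and the Newton-decrement stopping rule are all assumptions made for illustration.

```python
import numpy as np

def newton_logreg_path(X, y, lam_final, lam0=None, decrease=0.5,
                       newton_steps=5, tol=1e-8):
    """Sketch: Newton's method on l2-regularized logistic regression,
    following a geometrically decreasing regularization path and
    warm-starting each stage at the previous stage's solution."""
    n, d = X.shape
    w = np.zeros(d)
    # Start from a large (well-conditioned) regularization parameter.
    lam = lam0 if lam0 is not None else max(lam_final, 1.0)
    while True:
        for _ in range(newton_steps):
            z = X @ w
            p = 1.0 / (1.0 + np.exp(-y * z))        # sigmoid(y_i x_i^T w)
            grad = X.T @ (-y * (1.0 - p)) / n + lam * w
            D = p * (1.0 - p)                        # per-sample Hessian weights
            H = (X.T * D) @ X / n + lam * np.eye(d)
            step = np.linalg.solve(H, grad)
            w -= step
            if np.dot(grad, step) < tol:             # squared Newton decrement
                break
        if lam <= lam_final:
            return w
        lam = max(lam * decrease, lam_final)         # shrink toward the target
```

Under this geometric schedule the number of warm-started stages grows only logarithmically in lam0/lam_final, which mirrors the logarithmic dependence on the condition number claimed in the abstract; each stage starts close enough to its solution for Newton's method to converge in a few steps.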
We present a globally convergent method for regularized risk minimization problems. Our method appli...
In this paper, we propose the SCORE (self-concordant regularization) framework for unconstrained min...
Sparse optimization has seen an evolutionary advance in the past decade with extensive applications ...
We extend the well-known BFGS quasi-Newton method and its memory-limited variant LBFGS to the optimi...
We consider learning methods based on the regularization of a convex empirical risk by a squared Hil...
Motivated by machine learning problems over large data sets and distributed optimization over networ...
We consider the class of convex minimization problems, composed of a self-concordant function, such ...
Projection-free optimization via different variants of the Frank–Wolfe method has become one of the ...
Large-scale logistic regression arises in many applications such as document classification and natu...
Many modern applications in machine learning, image/signal processing, and statistics require to sol...
Many statistical M-estimators are based on convex optimization problems formed by the combination of...
Popular machine learning estimators involve regularization parameters that can...
We consider the problem of supervised learning with convex loss functions and propose a new form of ...