Motivated by machine learning problems over large data sets and distributed optimization over networks, we develop and analyze a new method called the incremental Newton method for minimizing the sum of a large number of strongly convex functions. We show that our method is globally convergent under a variable stepsize rule. We further show that under a gradient growth condition, the convergence rate is linear for both variable and constant stepsize rules. By means of an example, we show that without the gradient growth condition, the incremental Newton method cannot achieve linear convergence. Our analysis can be extended to study other incremental methods: in particular, we obtain a linear convergence rate result for the incremental Gauss–Newton algorithm under a variable stepsize rule.
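To make the iteration concrete, here is a minimal Python sketch of an incremental Newton-type pass over the components. The function names, the Hessian bookkeeping, and the constant damping factor are illustrative assumptions; the paper's exact aggregation and variable stepsize rule are not reproduced.

```python
import numpy as np

def incremental_newton(grads, hessians, x0, n_cycles=50, alpha=1.0):
    """Sketch of an incremental Newton-type method for min_x sum_i f_i(x).

    grads[i](x) and hessians[i](x) return the gradient and Hessian of the
    i-th strongly convex component f_i.  Curvature is aggregated one
    component at a time; this is a simplification of the bookkeeping in
    the paper, not a faithful implementation.
    """
    m, d = len(grads), x0.size
    x = x0.copy()
    H = np.eye(d)                          # crude initial curvature estimate
    for _ in range(n_cycles):
        for i in range(m):
            H = H + hessians[i](x)         # accumulate component curvature
            x = x - alpha * np.linalg.solve(H, grads[i](x))
    return x
```

Because the aggregated matrix H keeps growing as components are visited, the effective step length shrinks over time, which loosely mirrors the role of the variable stepsize rule under which global convergence is established.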
We consider a new algorithm, a reflective Newton method, for the problem of minimizing a smooth no...
We study stochastic Cubic Newton methods for solving general, possibly non-convex minimization proble...
In this paper we propose an accelerated version of the cubic regularization of Newton's method [6]. ...
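As background for the cubic-regularization entries in this list, one step of a cubic-regularized Newton method minimizes the local model $g^\top h + \tfrac{1}{2} h^\top H h + \tfrac{M}{6}\|h\|^3$ around the current point. The sketch below solves this subproblem by bisection on $r = \|h\|$; it assumes the shifted matrix stays positive definite on the bracket, and it does not reproduce the accelerated scheme proposed in the paper.

```python
import numpy as np

def cubic_newton_step(g, H, M, tol=1e-10):
    """One cubic-regularized Newton step: minimize
    g.h + 0.5*h.H.h + (M/6)*||h||^3 by finding r = ||h|| with
    (H + (M/2)*r*I) h = -g.  Illustrative sketch only."""
    d = g.size
    def h_of(r):
        return np.linalg.solve(H + 0.5 * M * r * np.eye(d), -g)
    lo, hi = 0.0, 1.0
    while np.linalg.norm(h_of(hi)) > hi:   # grow bracket until ||h(r)|| <= r
        hi *= 2.0
    while hi - lo > tol:                   # bisect on the scalar secular equation
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(h_of(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return h_of(hi)
```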
We consider the class of incremental gradient methods for minimizing a sum of continuously different...
We focus on the problem of minimizing the sum of smooth component functions (where the sum is strong...
We survey incremental methods for minimizing a sum $\sum_{i=1}^{m} f_i(x)$ consisting of a large number of conv...
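For contrast with the Newton-type sketch above, the basic incremental gradient pass over the same finite sum is shown below; the diminishing stepsize is one common choice from this class of methods, not the only one the survey covers.

```python
import numpy as np

def incremental_gradient(grads, x0, n_cycles=100, alpha=0.1):
    """Cycle through the components i = 1..m, taking one gradient step
    per component.  A sketch of the basic method, not a specific variant."""
    x = x0.copy()
    for k in range(n_cycles):
        step = alpha / (k + 1)             # common diminishing stepsize choice
        for g in grads:
            x = x - step * g(x)
    return x
```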
A fundamental classification problem of data mining and machine learning is that of minimizing a stron...
Majorization-minimization algorithms consist of successively minimizing a sequence of uppe...
We study the convergence properties of a class of low memory methods for solving large-scale unconst...
In this paper, we study large-scale convex optimization algorithms based on th...
Majorization-minimization algorithms consist of successively minimizing a sequ...
In this paper, we study the iteration complexity of cubic regularization of Newton's method for solvin...
This paper studies an acceleration technique for incremental aggregated gradient (IAG) met...
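The IAG method referenced here stores the most recently evaluated gradient of each component and steps along their running aggregate. The sketch below shows the basic, unaccelerated update with illustrative parameter choices; the acceleration studied in the paper is not shown.

```python
import numpy as np

def iag(grads, x0, n_iters=1000, alpha=0.01):
    """Incremental aggregated gradient: keep the last-seen gradient of
    each component and step along the running sum.  Sketch only."""
    m = len(grads)
    x = x0.copy()
    table = [g(x) for g in grads]          # last-evaluated gradient per component
    agg = np.sum(table, axis=0)
    for k in range(n_iters):
        i = k % m                          # cyclic component selection
        new_g = grads[i](x)
        agg += new_g - table[i]            # refresh the aggregate in O(d)
        table[i] = new_g
        x = x - (alpha / m) * agg
    return x
```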
Global Convergence of a Class of Collinear Scaling Algorithms with Inexact Line Searches on Convex F...
Laboratory for Information and Decision Systems Report LIDS-P-2847. We consider the minimization of a ...