Coordinate descent methods usually minimize a cost function by updating a random decision variable (corresponding to one coordinate) at a time. Ideally, we would update the decision variable that yields the largest decrease in the cost function. However, finding this coordinate would require checking all of them, which would effectively negate the improvement in computational tractability that coordinate descent is intended to afford. To address this, we propose a new adaptive method for selecting a coordinate. First, we find a lower bound on the amount the cost function decreases when a coordinate is updated. We then use a multi-armed bandit algorithm to learn which coordinates result in the largest lower bound by interleaving this learnin...
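The idea in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's exact algorithm: it assumes a function whose coordinate-wise gradients have known Lipschitz constants `lip[i]`, uses the standard decrease lower bound g_i^2 / (2 L_i) as the bandit reward, and plugs it into a simple UCB rule. The toy quadratic, the function names, and the `explore` parameter are all illustrative choices, not from the source.

```python
import numpy as np

def bandit_coordinate_descent(grad, lip, x0, steps=200, explore=2.0):
    """Sketch: coordinate descent where a UCB-style bandit picks the coordinate.

    The reward for coordinate i is the classical lower bound on the decrease,
    g_i^2 / (2 * L_i), observed only for the coordinate actually updated.
    For clarity `grad` returns the full gradient; a faithful implementation
    would evaluate only the cheap per-coordinate derivative.
    """
    x = x0.astype(float).copy()
    n = x.size
    pulls = np.zeros(n)        # times each coordinate was chosen
    mean_reward = np.zeros(n)  # running mean of observed decrease bounds
    for t in range(1, steps + 1):
        if t <= n:
            i = t - 1  # round-robin warm-up: try every coordinate once
        else:
            # UCB score: exploit coordinates with large past decrease,
            # while still exploring rarely-tried ones
            ucb = mean_reward + explore * np.sqrt(np.log(t + 1) / pulls)
            i = int(np.argmax(ucb))
        g_i = grad(x)[i]
        x[i] -= g_i / lip[i]                # standard 1/L_i coordinate step
        reward = g_i ** 2 / (2.0 * lip[i])  # guaranteed decrease for L_i-smooth coords
        pulls[i] += 1
        mean_reward[i] += (reward - mean_reward[i]) / pulls[i]
    return x

# Toy quadratic f(x) = 0.5 * x^T D x with very uneven curvature per coordinate
D = np.array([100.0, 1.0, 1.0, 1.0])
grad = lambda x: D * x
x = bandit_coordinate_descent(grad, lip=D, x0=np.ones(4), steps=300)
```

On this separable quadratic the exact 1/L_i step zeroes each coordinate it touches, so the sketch converges quickly; the interesting regime in the abstract is the non-separable case, where the bandit must keep learning which coordinates still yield large decreases.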
Abstract. A perceptron is a linear threshold classifier that separates examples with a hyperplane. I...
Acceleration of first order methods is mainly obtained via inertial techniques...
This work looks at large-scale machine learning, with a particular focus on greedy methods. A recent...
Abstract Coordinate descent algorithms solve optimization problems by successively performing appro...
Abstract Accelerated coordinate descent is widely used in optimization due to its cheap per-iteratio...
We study the problem of minimizing the sum of a smooth convex function and a convex block-separable ...
We propose and analyze a new parallel coordinate descent method—‘NSync— in which at each iteration a...
The coordinate descent (CD) method is a classical optimization algorithm that has seen a revival of ...
As the number of samples and dimensionality of optimization problems related t...
Coordinate descent with random coordinate selection is the current state of the art for many large s...
The unprecedented rate at which data is being created and stored calls for scalable optimization te...
In this paper we propose new methods for solving huge-scale optimization problems. For problems of t...
We propose a new randomized coordinate descent method for minimizing the s...
In this work we show that randomized (block) coordinate descent methods can be accelerated by parall...