We develop and analyze an asynchronous algorithm for distributed convex optimization when the objective is the sum of smooth functions, each local to a worker, and a non-smooth function. Unlike many existing methods, our distributed algorithm is adjustable to various levels of communication cost, delays, machines' computational power, and functions' smoothness. A unique feature is that the stepsizes depend neither on the communication delays nor on the number of machines, which is highly desirable for scalability. We prove that the algorithm converges linearly in the strongly convex case, and provide convergence guarantees in the non-strongly convex case. The obtained rates are the same as for the vanilla proximal gradient algorithm...
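For reference, the composite problem described above, together with the vanilla proximal-gradient iteration whose rates serve as the benchmark, can be stated as follows; the notation (M workers, smooth local functions f_i, non-smooth function g, stepsize \gamma) is ours and is only a standard way of writing the setting, not taken from the paper.

\[
\min_{x \in \mathbb{R}^d} \ \frac{1}{M} \sum_{i=1}^{M} f_i(x) + g(x),
\qquad
x^{k+1} = \operatorname{prox}_{\gamma g}\!\left( x^{k} - \frac{\gamma}{M} \sum_{i=1}^{M} \nabla f_i(x^{k}) \right).
\]

Here each f_i is smooth and held by a single worker, while g is non-smooth but assumed to have an inexpensive proximal operator (e.g., an \ell_1 regularizer); the asynchronous algorithm of the abstract is reported to match the rates of this synchronous iteration, with stepsizes independent of the communication delays and of M.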
In many large-scale optimization problems arising in the context of machine learning the decision va...
One of the most widely used methods for solving large-scale stochastic optimiz...
Distributed learning aims at computing high-quality models by training over sc...
This thesis proposes and analyzes several first-order methods for convex optimization, designed for ...
We present a distributed proximal-gradient method for optimizing the average of convex functions, ea...
This work proposes a theoretical analysis of distributed optimization of conve...
In this paper we consider distributed optimization problems in which the cost function is separable,...
We describe several features of parallel or distributed asynchronous iterative algorithms such as un...
Aiming at solving large-scale optimization problems, this paper studies distributed optimization met...