We present a generic framework for parallel coordinate descent (CD) algorithms that includes, as special cases, the original sequential algorithms Cyclic CD and Stochastic CD, as well as the recent parallel Shotgun algorithm. We introduce two novel parallel algorithms that are also special cases—Thread-Greedy CD and Coloring-Based CD—and give performance measurements for an OpenMP implementation of these.
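The abstract above covers variants of coordinate descent, which minimizes an objective by repeatedly solving one-dimensional subproblems along single coordinates. As a point of reference for the sequential baseline (Cyclic CD), here is a minimal sketch for the least-squares objective min_x ||Ax - b||^2, with exact per-coordinate updates. This is an illustrative sketch only, not the paper's OpenMP implementation; the function name and interface are assumptions.

```python
import numpy as np

def cyclic_cd_least_squares(A, b, n_sweeps=100):
    """Cyclic coordinate descent sketch for min_x ||Ax - b||^2.

    Each sweep visits every coordinate in order and applies the exact
    one-dimensional minimizer along that coordinate. The residual
    r = b - Ax is maintained incrementally so each update costs O(m).
    """
    m, n = A.shape
    x = np.zeros(n)
    r = b.copy()                      # residual b - Ax for x = 0
    col_norms = (A ** 2).sum(axis=0)  # ||a_j||^2 for each column j
    for _ in range(n_sweeps):
        for j in range(n):
            if col_norms[j] == 0.0:
                continue              # coordinate has no effect
            # exact minimizer along coordinate j: step = a_j^T r / ||a_j||^2
            delta = A[:, j] @ r / col_norms[j]
            x[j] += delta
            r -= delta * A[:, j]      # keep residual consistent with new x
    return x
```

The parallel variants named in the abstract (Shotgun, Thread-Greedy, Coloring-Based) differ in how coordinate updates are scheduled across threads, not in the per-coordinate update itself.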
We propose two simple polynomial-time algorithms to find a positive solution to Ax = 0. Both algo...
Coordinate descent with random coordinate selection is the current state of the art for many large s...
This work addresses the problem of regularized linear least squares (RLS) with non-quadratic...
Large-scale ℓ1-regularized loss minimization problems arise in high-dimensional applications such as...
Coordinate descent algorithms solve optimization problems by successively performing appro...
© 2017 Elsevier B.V. We consider a large-scale minimization problem (not necessarily convex) with n...
Recent years have witnessed advances in parallel algorithms for large scale optimization problem...
In this work we show that randomized (block) coordinate descent methods can be accelerated by parall...
This work addresses the problem of regularized linear least squares (RLS) with non-quadratic separab...
We consider convex-concave saddle point problems with a separable structure and non-strongly convex ...
The coordinate descent (CD) method is a classical optimization algorithm that has seen a revival of ...
Large-scale optimization problems appear quite frequently in data science and machine learning appli...
In this paper we show how to accelerate randomized coordinate descent methods and achieve faster con...
We study the problem of minimizing the sum of a smooth convex function and a convex block-separable ...
This thesis focuses on coordinate update methods (CU), which are useful for solving problems involvi...