© 2016 IEEE. Recently, the primal-dual method of multipliers (PDMM) has been proposed to solve a convex optimization problem defined over a general graph. In this paper, we consider simplifying PDMM for a subclass of the convex optimization problems. This subclass includes the consensus problem as a special case. Using basic algebra, we show that the update expressions of PDMM can be simplified significantly. We then evaluate PDMM for training a support vector machine (SVM). The experimental results indicate that PDMM converges considerably faster than the alternating direction method of multipliers (ADMM).
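The consensus problem referred to above is the standard distributed-averaging setup: each node holds a local cost and all nodes must agree on a common variable. As a concrete illustration, the sketch below solves it with plain global-consensus ADMM (the baseline method, not the simplified PDMM updates derived in the paper); the quadratic local costs, the penalty parameter rho, and the iteration count are illustrative choices, not taken from the source.

```python
# Global-consensus ADMM on a toy averaging problem:
# minimize sum_i (x - a_i)^2 / 2  over a shared scalar x.
# The minimizer is the mean of the a_i.

def consensus_admm(a, rho=1.0, iters=200):
    n = len(a)
    x = [0.0] * n   # local primal variables
    u = [0.0] * n   # scaled dual variables
    z = 0.0         # global consensus variable
    for _ in range(iters):
        # Local x-update: closed form for the quadratic cost
        # argmin_x (x - a_i)^2/2 + (rho/2)(x - z + u_i)^2
        x = [(a[i] + rho * (z - u[i])) / (1.0 + rho) for i in range(n)]
        # Global z-update: average of x_i + u_i
        z = sum(x[i] + u[i] for i in range(n)) / n
        # Dual update: accumulate the consensus residual
        u = [u[i] + x[i] - z for i in range(n)]
    return z

print(consensus_admm([1.0, 2.0, 3.0, 4.0]))  # converges toward the mean, 2.5
```

For the simple quadratic costs used here the iteration converges linearly to the mean; the paper's point is that for problems of this consensus type, PDMM's update expressions simplify and it empirically converges faster than this ADMM baseline.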
In many machine learning problems such as the dual form of SVM, the objective function to be minimiz...
that CPA and variants are closely related to preconditioned versions of the popular alternating dire...
© 2015 IEEE. In this paper, we extend the bi-alternating direction method of multipliers (BiADMM) de...
Recently, the primal-dual method of multipliers (PDMM) has been proposed to solve a convex optimizat...
In this paper, we present a novel derivation of an existing algorithm for distributed optimization t...
We propose two algorithms based on the Primal-Dual Method of Multipliers (PDMM) to be used in distri...
© 2015 IEEE. We propose two algorithms based on the Primal-Dual Method of Multipliers (PDMM) to be u...
We describe how the powerful “Divide and Concur” algorithm for constraint satisfaction can be deri...
Many statistical learning problems can be posed as minimization of a sum of two convex functions, on...
Convex optimization is at the core of many of today's analysis tools for large datasets, and in par...
Many problems of recent interest in statistics and machine learning can be posed in the framework of...
© 2018 IEEE. Edge consensus computing is a framework to optimize a cost function when distributed no...
We present a primal-dual algorithmic framework to obtain approximate solutions to a prototypical con...
We provide Frank–Wolfe (≡ Conditional Gradients) method with a convergence analysis allowing to appr...
Dual decomposition has been successfully employed in a variety of distributed convex optimization pr...