The full approximation storage (FAS) scheme is a widely used multigrid method for nonlinear problems. In this paper, a new framework for designing and analyzing FAS-like schemes for convex optimization problems is developed. The new method, the fast subspace descent (FASD) scheme, which generalizes the classical FAS scheme, can be recast as an inexact version of nonlinear multigrid methods based on space decomposition and subspace correction. The local problem in each subspace can be simplified to be linear, and one gradient descent iteration (with an appropriate step size) is enough to ensure global linear (geometric) convergence of FASD for convex optimization problems.
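To make the subspace-correction structure concrete, here is a minimal sketch, assuming a non-overlapping decomposition of $\reals^n$ into coordinate blocks and a convex quadratic objective. The names (`fasd_sweep`, `blocks`) and the block step sizes $1/L_i$ are illustrative assumptions, not details from the paper, whose FASD scheme works on general multilevel decompositions with simplified linear local problems.

```python
import numpy as np

def fasd_sweep(grad, x, blocks, lipschitz):
    """One successive subspace-correction sweep: a single gradient
    step (step size 1/L_i) in each coordinate-block subspace V_i."""
    for idx, L_i in zip(blocks, lipschitz):
        g = grad(x)                       # gradient at the current iterate
        x[idx] -= (1.0 / L_i) * g[idx]    # one descent step restricted to V_i
    return x

# Example: f(x) = 0.5 x^T A x - b^T x with two coordinate blocks.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + np.eye(6)                   # symmetric positive definite
b = rng.standard_normal(6)
grad = lambda x: A @ x - b

blocks = [np.arange(0, 3), np.arange(3, 6)]
L = [np.linalg.norm(A[np.ix_(i, i)], 2) for i in blocks]  # block Lipschitz constants

x = np.zeros(6)
for _ in range(100):
    x = fasd_sweep(grad, x, blocks, L)
print(np.linalg.norm(A @ x - b))          # residual decays geometrically
```

Each sweep performs exactly one gradient step per subspace, which is the ingredient the abstract credits for the global linear rate.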
The standard assumption for proving linear convergence of first order methods for smooth convex opti...
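The assumption alluded to here is usually strong convexity. For reference, the textbook bound (a standard fact, stated for context rather than quoted from this paper) for an $L$-smooth, $\mu$-strongly convex $f$ under gradient descent with step size $1/L$ reads:

```latex
% Classical linear-rate bound: if f is L-smooth and mu-strongly convex
% and x_{k+1} = x_k - (1/L) \nabla f(x_k), then
\[
  f(x_{k+1}) - f^{\ast} \;\le\; \Bigl(1 - \frac{\mu}{L}\Bigr)
  \bigl(f(x_k) - f^{\ast}\bigr),
\]
% i.e. geometric decay of the objective gap with factor 1 - mu/L.
```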
In this study, we propose a sequential convex programming (SCP) method that uses an enhanced two-poi...
This paper gives some global and uniform convergence estimates for a class of subspace correction (b...
Convergence of a space decomposition method is proved for a class of convex programming problems. A ...
We present a line search multigrid method based on Nash’s MG/OPT multilevel optimization approach fo...
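As a rough illustration of the MG/OPT idea (a coarse-level correction used as a line-search direction), here is a two-level sketch for a quadratic model problem. The prolongation, smoother, and backtracking parameters are assumptions made for the example, not details from the cited work.

```python
import numpy as np

# Fine-level problem: f(x) = 0.5 x^T A x - b^T x (strongly convex quadratic).
n = 8
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
L = np.linalg.norm(A, 2)               # gradient Lipschitz constant

# Piecewise-constant prolongation (coarse dof i -> fine dofs 2i, 2i+1).
P = np.zeros((n, n // 2))
for i in range(n // 2):
    P[2 * i, i] = P[2 * i + 1, i] = 1.0
R = 0.5 * P.T                          # restriction, scaled transpose
AH = R @ A @ P                         # Galerkin coarse operator

def mgopt_step(x):
    xH = R @ x
    # Shift the coarse right-hand side so the coarse gradient matches the
    # restricted fine gradient at xH (first-order consistency).
    bH = AH @ xH - R @ grad(x)
    yH = np.linalg.solve(AH, bH)       # minimize the corrected coarse model
    d = P @ (yH - xH)                  # prolonged correction = search direction
    alpha, fx = 1.0, f(x)
    while f(x + alpha * d) >= fx and alpha > 1e-10:
        alpha *= 0.5                   # backtracking line search
    x = x + alpha * d
    return x - (1.0 / L) * grad(x)     # one fine-level smoothing step

x = np.zeros(n)
for _ in range(30):
    x = mgopt_step(x)
print(f(x) - f(np.linalg.solve(A, b)))  # objective gap to the optimum
```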
We implement and test a globally convergent sequential approximate optimization algorithm based on (...
This paper presents an acceleration of the optimal subgradient algorithm OSGA [30] for solving conve...
The rapid growth in data availability has led to modern large scale convex optimization problems ...
We consider optimization problems in which the goal is to find a $k$-dimensional subspace of $\reals^n$...
We consider the minimization of a differentiable Lipschitz gradient but non ne...
In this paper we introduce a new primal-dual technique for convergence analysis of gradient schemes ...