We present a line search multigrid method, based on Nash's MG/OPT multilevel optimization approach, for solving discretized versions of convex infinite-dimensional optimization problems. Global convergence is proved under fairly minimal requirements on the minimization method used at all grid levels. In particular, our convergence proof does not require that these minimization steps, the so-called "smoothing" steps (which we interpret in the context of optimization), be taken at each grid level; this contrasts with multigrid algorithms for PDEs, which fail to converge without such steps. Preliminary numerical experiments show that our method is promising.
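To make the abstract's ingredients concrete, the following is a minimal, hypothetical sketch of one two-level cycle of a line-search MG/OPT scheme, specialized to a convex quadratic f(x) = ½xᵀAx − bᵀx. The operator names (`R`, `P`, the Galerkin coarse matrix, gradient-descent smoothing, Armijo backtracking) are illustrative choices, not the paper's exact algorithm; the coarse model carries Nash's first-order correction term `v` so that its gradient matches the restricted fine gradient at the current iterate.

```python
import numpy as np

def mgopt_vcycle(A, b, x, R, P, n_smooth=2):
    """One hypothetical two-level MG/OPT cycle for f(x) = 0.5 x^T A x - b^T x.

    Sketch only: Galerkin coarse operator R A P, steepest-descent smoothing,
    exact coarse solve, and an Armijo backtracking line search along the
    prolonged coarse correction. Assumes A is symmetric positive definite
    and R is a positive multiple of P^T (so the correction is a descent
    direction).
    """
    grad = lambda y: A @ y - b
    f = lambda y: 0.5 * y @ A @ y - b @ y

    # Optional pre-smoothing: a few exact-step steepest-descent iterations.
    # (In the paper's framework, smoothing at every level is not required
    # for convergence.)
    for _ in range(n_smooth):
        g = grad(x)
        if g @ g < 1e-30:
            break
        x = x - (g @ g) / (g @ A @ g) * g

    # Coarse model with MG/OPT first-order coherence term v:
    # psi(y) = f_H(y) - v^T y, with v = grad f_H(x_H) - R grad f(x),
    # so that grad psi(x_H) = R grad f(x).
    AH, bH = R @ A @ P, R @ b
    xH = R @ x
    v = (AH @ xH - bH) - R @ grad(x)
    # Minimize the coarse model exactly: AH y = bH + v.
    yH = np.linalg.solve(AH, bH + v)

    # Prolong the coarse correction and take a line-search step along it.
    d = P @ (yH - xH)
    t, g = 1.0, grad(x)
    while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):  # Armijo condition
        t *= 0.5
        if t < 1e-12:
            break
    return x + t * d
```

Because the coarse model is first-order coherent, the prolonged correction is guaranteed to be a descent direction for the fine objective whenever R = c·Pᵀ with c > 0, which is what lets a standard Armijo line search drive the global convergence argument.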
This dissertation has investigated the use of multigrid methods in certain classes of optimization p...
The first-order condition of the constrained minimization problem leads to a saddle point problem. A...
Transforming smoothers are known to be a successful approach to the multigrid treatment of saddle-point ...
For the constrained minimization of convex or non-convex functionals on the basis of multilevel or d...
The well-known Conjugate Gradient (CG) method minimizes a strictly convex quadratic function for s...
This paper presents and analyzes a new multigrid framework to solve shape optimization problems gove...
A quadratically convergent line-search algorithm for piecewise smooth convex optimization based on a...
This paper gives some global and uniform convergence estimates for a class of subspace correction (b...
Several types of line search methods are documented in the literature and are well known for unconst...
The full approximation storage (FAS) scheme is a widely used multigrid method for nonlinear problems...
Inspired by multigrid methods for linear systems of equations, multilevel optimization methods have ...
This paper presents a new method for global optimization. We use exact quadratic regularization for ...
This article proposes large-scale convex optimization problems to be solved via saddle points of the...