We suggest simple implementable modifications of conditional gradient and gradient projection methods for smooth convex optimization problems in Hilbert spaces. Usually, the standard methods attain only weak convergence. We prove strong convergence of the new versions and establish complexity estimates similar to the convergence rates of the weakly convergent versions. Preliminary computational tests confirm the efficiency of the proposed modifications.
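For reference, here is a minimal sketch of the classical conditional gradient (Frank-Wolfe) iteration that the abstract builds on; the quadratic objective, the probability-simplex feasible set, and the open-loop 2/(k+2) step size are illustrative assumptions, not the modified method the paper proposes.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=200):
    """Classical conditional gradient (Frank-Wolfe) for min f over a compact convex set C."""
    x = x0.astype(float)
    for k in range(iters):
        g = grad(x)
        s = lmo(g)                # extreme point of C minimizing <g, s>
        gamma = 2.0 / (k + 2.0)   # classical open-loop step size
        x = x + gamma * (s - x)   # convex combination, so x stays feasible
    return x

# Illustrative instance: minimize ||x - b||^2 over the probability simplex.
b = np.array([0.1, 0.7, 0.2, 0.5])
def lmo(g):                       # simplex vertex with the smallest gradient entry
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s
print(frank_wolfe(lambda x: 2.0 * (x - b), lmo, np.full(4, 0.25)))
```

The linear minimization oracle is the only way the method touches the feasible set, which is why conditional gradient is attractive when projections onto C are expensive but linear minimization over C is cheap.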
Convergence of a projected stochastic gradient algorithm is demonstrated for convex objective functions ...
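A minimal sketch of a projected stochastic gradient iteration of the kind this abstract analyses; the Euclidean-ball constraint, the 1/sqrt(k) step size, and the least-squares objective with single-row sampling are illustrative choices, not the setting of the cited analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_ball(x, radius=1.0):
    """Euclidean projection onto {x : ||x|| <= radius} (illustrative feasible set)."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def projected_sgd(stoch_grad, project, x0, steps=2000, a=0.5):
    """Projected stochastic gradient: x <- P_C(x - t_k * g_k) with t_k = a / sqrt(k)."""
    x = project(x0.astype(float))
    for k in range(1, steps + 1):
        x = project(x - (a / np.sqrt(k)) * stoch_grad(x))
    return x

# Illustrative instance: mean squared error (1/m)||A x - b||^2, one sampled row per step.
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)
def stoch_grad(x):
    i = rng.integers(50)                  # uniform row sample -> unbiased gradient estimate
    return 2.0 * A[i] * (A[i] @ x - b[i])
print(projected_sgd(stoch_grad, project_ball, np.zeros(5)))
```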
We suggest simple modifications ...
The standard assumption for proving linear convergence of first order methods for smooth convex optimization ...
The gradient projection algorithm plays an important role in solving constrained convex minimization...
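A sketch of the basic gradient projection step x_{k+1} = P_C(x_k - (1/L) grad f(x_k)) referred to here; the nonnegativity constraint, the least-squares objective, and the fixed 1/L step are assumptions for illustration.

```python
import numpy as np

def gradient_projection(grad, project, x0, L, iters=500):
    """Gradient projection for min f(x) over a closed convex set C:
    x_{k+1} = P_C(x_k - (1/L) grad f(x_k)), with L a Lipschitz constant of grad f."""
    x = project(x0.astype(float))
    for _ in range(iters):
        x = project(x - grad(x) / L)
    return x

# Illustrative instance: minimize ||A x - b||^2 subject to x >= 0.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: 2.0 * A.T @ (A @ x - b)
L = 2.0 * np.linalg.norm(A.T @ A, 2)      # Lipschitz constant of the gradient
print(gradient_projection(grad, lambda x: np.maximum(x, 0.0), np.zeros(2), L))
```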
The convergence behavior of gradient methods for minimizing convex differentiable functions is one of ...
Let H be a real Hilbert space and C be a nonempty closed convex subset of H ...
Motivated by some applications in signal processing and machine learning, we consider two convex optimization ...
In this paper, we consider gradient methods for minimizing smooth convex functions, which employ the...
We provide Frank–Wolfe (≡ Conditional Gradients) method with a convergence analysis allowing to appr...
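Convergence analyses of Frank-Wolfe typically rest on the computable duality gap <grad f(x), x - s>, which upper-bounds f(x) - f* for convex f. The sketch below shows that certificate on an assumed simplex instance; it is not the cited paper's specific analysis.

```python
import numpy as np

def fw_gap(grad_x, x, lmo):
    """Frank-Wolfe gap g(x) = max_{s in C} <grad f(x), x - s>; for convex f, g(x) >= f(x) - f*."""
    s = lmo(grad_x)
    return grad_x @ (x - s)

# Illustrative check with f(x) = ||x - b||^2 over the probability simplex (f* = 0 at x = b).
b = np.array([0.2, 0.3, 0.5])
def lmo(g):
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s
x = np.array([1.0, 0.0, 0.0])
print(fw_gap(2.0 * (x - b), x, lmo))   # >= f(x) - f* = 0.98; zero only at an optimum
```

Because the gap is available at every iteration for free (it reuses the oracle output s), it doubles as a practical stopping criterion.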
Let C and Q be closed convex subsets of real Hilbert spaces H1 and H2, respectively, and let g:C→R be ...
In this paper, we first study in a Hilbertian framework the weak convergence of a general Gradient Projection ...