© 2019, Springer Science+Business Media, LLC, part of Springer Nature. We present a novel fully adaptive conditional gradient method with step-length regulation for solving pseudo-convex constrained optimization problems. We propose deterministic rules for regulating the step length along a normalized direction. These rules find the step length via finite procedures and guarantee strict relaxation of the objective function at each iteration. We prove that the sequence of function values at the iterates generated by the algorithm converges globally to the optimal value of the objective function at a sublinear rate.
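The abstract above describes a conditional gradient (Frank-Wolfe) method whose step length is found by a finite procedure that guarantees strict decrease at every iteration. A minimal sketch of that idea, using the classical Frank-Wolfe direction with an Armijo-type backtracking rule on a simplex-constrained quadratic, might look as follows. The objective, feasible set, and all parameter names here are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

# Illustrative sketch: conditional gradient (Frank-Wolfe) with a backtracking
# (Armijo-type) step-length rule -- a finite procedure that enforces strict
# decrease of f at each iteration. The quadratic objective and the simplex
# feasible set are hypothetical examples.

def f(x, A, b):
    """f(x) = 0.5 * x^T A x - b^T x (smooth convex objective)."""
    return 0.5 * x @ A @ x - b @ x

def grad_f(x, A, b):
    return A @ x - b

def frank_wolfe(A, b, n, iters=200, beta=0.5, sigma=1e-4):
    x = np.ones(n) / n                           # start at simplex barycenter
    for _ in range(iters):
        g = grad_f(x, A, b)
        s = np.zeros(n)
        s[np.argmin(g)] = 1.0                    # linear minimization over the simplex
        d = s - x                                # feasible descent direction
        gap = -g @ d                             # Frank-Wolfe gap (>= 0); stop if tiny
        if gap < 1e-10:
            break
        t = 1.0                                  # backtracking: halve t until the
        while f(x + t * d, A, b) > f(x, A, b) - sigma * t * gap:
            t *= beta                            # sufficient-decrease test holds
        x = x + t * d                            # convex step keeps x feasible
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M + np.eye(5)                          # positive definite
b = rng.standard_normal(5)
x = frank_wolfe(A, b, 5)
print(f(x, A, b))
```

Because the step is a convex combination of the current point and a vertex, feasibility is preserved automatically, and the backtracking loop terminates in finitely many halvings since `d` is a descent direction.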
Conditional Gradients (aka Frank-Wolfe algorithms) form a classical set of met...
We propose two novel conditional gradient-based methods for solving structured stochastic convex opt...
We propose a conditional gradient framework for a composite convex minimiza...
© 2018 Informa UK Limited, trading as Taylor & Francis Group. We suggest simple modificati...
© 2018, Allerton Press, Inc. We propose a simple rule for the step-size choice in the conditional gr...
Linear optimization is often algorithmically simpler than non-linear convex optimizat...
We provide new adaptive first-order methods for constrained convex optimization. Our main algorithms...
© 2017 Informa UK Limited, trading as Taylor & Francis Group. We suggest simple implementable modif...
In this paper we study the convex problem of optimizing the sum of a smooth fu...