This work studies minimization problems with zero-order noisy oracle information under the assumption that the objective function is highly smooth and possibly satisfies additional properties. We consider two kinds of zero-order projected gradient descent algorithms, which differ in the form of the gradient estimator. The first algorithm uses a gradient estimator based on randomization over the $\ell_2$ sphere due to Bach and Perchet (2016). We present an improved analysis of this algorithm on the class of highly smooth and strongly convex functions studied in the prior work, and we derive rates of convergence for two more general classes of non-convex functions. Namely, we consider highly smooth functions satisfying the Polyak-{\L}ojasiewicz condition and highly smooth functions with no additional property.
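As an illustration of the overall scheme described above (a minimal sketch, not code from the paper), the following Python snippet implements zero-order projected gradient descent with a two-point gradient estimator randomized over the unit $\ell_2$ sphere. The estimators analyzed for higher-order smoothness typically involve an additional kernel-weighted scalar perturbation, omitted here; the function names, the constant step size eta and the perturbation radius h are placeholders.

import numpy as np

def l2_sphere_gradient_estimate(f, x, h, rng):
    # Two-point estimate: g = (d / (2h)) * (f(x + h*zeta) - f(x - h*zeta)) * zeta,
    # with zeta drawn uniformly from the unit l2 sphere; f may return noisy values.
    d = x.shape[0]
    zeta = rng.standard_normal(d)
    zeta /= np.linalg.norm(zeta)
    return (d / (2.0 * h)) * (f(x + h * zeta) - f(x - h * zeta)) * zeta

def zo_projected_gradient_descent(f, project, x0, steps, eta, h, seed=0):
    # Zero-order projected gradient descent: query the noisy oracle twice per step,
    # take a gradient step with the estimate, and project back onto the feasible set.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = l2_sphere_gradient_estimate(f, x, h, rng)
        x = project(x - eta * g)
    return x

# Toy usage: noisy quadratic minimized over the unit Euclidean ball.
rng_noise = np.random.default_rng(1)
f = lambda x: 0.5 * float(x @ x) + rng_noise.normal(scale=1e-3)
project = lambda x: x / max(1.0, np.linalg.norm(x))
x_hat = zo_projected_gradient_descent(f, project, x0=np.ones(5), steps=2000, eta=0.05, h=0.05)

In analyses of this type, the step size and the perturbation radius are tuned jointly with the smoothness level, the dimension and the noise level, rather than kept constant as in this toy example.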
We consider unconstrained randomized optimization of smooth convex functions in the gradient-free se...
The Polyak-{\L}ojasiewicz (PL) condition [Polyak, 1963] is a weaker condition than strong convexity ...
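For reference (a standard statement, not quoted from that work), a differentiable function $f$ with minimal value $f^*$ satisfies the PL condition with parameter $\mu > 0$ if
\[
\frac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr) \qquad \text{for all } x .
\]
Every $\mu$-strongly convex function satisfies this inequality, while the converse fails: a standard example is $f(x) = x^2 + 3\sin^2 x$, which satisfies the PL condition but is not convex.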
We propose STARS, a randomized derivative-free algorithm for unconstrained optimization when the function ...
The minimization of convex functions which are only available through partial ...
We consider non-smooth saddle point optimization problems. To solve these problems, we propose a zer...
This work studies online zero-order optimization of convex and Lipschitz functions. We present a nov...
In this paper, we prove new complexity bounds for methods of convex optimization based only on compu...
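Complexity bounds of this kind are usually stated for random gradient-free oracles built from smoothed surrogates of the objective. The sketch below is an illustration in the same style as the snippet above, not code quoted from the paper, with mu a placeholder smoothing parameter.

import numpy as np

def gaussian_smoothing_gradient(f, x, mu, rng):
    # Forward-difference oracle: g = ((f(x + mu*u) - f(x)) / mu) * u with u ~ N(0, I).
    # Its expectation is the gradient of the smoothed surrogate f_mu(x) = E[f(x + mu*u)],
    # which approximates f when mu is small.
    u = rng.standard_normal(x.shape[0])
    return ((f(x + mu * u) - f(x)) / mu) * u

Substituting such an oracle for the true gradient in a standard first-order scheme is what makes the resulting complexity bounds depend explicitly on the dimension.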
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science ...
We consider unconstrained randomized optimization of smooth convex objective functions in the gradie...
The dissertation addresses the research topics of machine learning outlined below. We developed the ...
For bandit convex optimization we propose a model, where a gradient estimation oracle acts as an int...
Classical global convergence results for first-order methods rely on uniform smoothness and the {\L}ojasiewicz ...