The usual approach to developing and analyzing first-order methods for smooth convex optimization assumes that the gradient of the objective function is uniformly smooth with some Lipschitz constant $L$. However, in many settings the differentiable convex function $f(\cdot)$ is not uniformly smooth; for example, in D-optimal design, where $f(x) := -\ln\det(HXH^T)$ with $X := \mathrm{Diag}(x)$, or even in the univariate setting with $f(x) := -\ln(x) + x^2$. In this paper we develop a notion of "relative smoothness" and relative strong convexity that is determined relative to a user-specified "reference function" $h(\cdot)$ (which should be computationally tractable for algorithms), and we show that many differentiable convex functions are relatively smooth with respect to a corresponding...
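Since the entry above is cut off, a brief reminder of the standard definition it refers to may help; this is a sketch reconstructed from the quoted setup, not text from the abstract:

```latex
% f is L-smooth relative to a reference function h on Q if L*h - f is
% convex, or equivalently, in terms of the Bregman divergence
% D_h(y, x) := h(y) - h(x) - <grad h(x), y - x>:
\[
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D_h(y, x)
  \qquad \text{for all } x, y \in Q.
\]
% For twice-differentiable f and h this holds whenever
% \nabla^2 f(x) \preceq L\, \nabla^2 h(x) on Q.
```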
Counterexamples to some long-standing optimization problems in the smooth convex coercive setting are...
The primary concern of this thesis is to explore efficient first-order methods of computing approxim...
We introduce and analyze a new family of first-order optimization algorithms w...
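The abstract above is truncated, so as a generic illustration of the kind of Bregman-gradient (mirror-descent-type) step such families build on, here is a minimal sketch; the function names, the choice of negative-entropy reference function, and the toy problem are all assumptions, not the authors' algorithm:

```python
import numpy as np

def bregman_gradient_step(x, grad, step):
    """One Bregman (mirror-descent-type) gradient step on the positive
    orthant, using the negative entropy h(x) = sum(x * log(x)) as the
    reference function. Solving
        grad_h(x_new) = grad_h(x) - step * grad_f(x)
    in closed form yields the multiplicative update below."""
    return x * np.exp(-step * grad)

# Toy usage: minimize f(x) = 0.5 * ||x - b||^2 over x > 0.
b = np.array([1.0, 2.0, 0.5])
x = np.ones(3)
for _ in range(200):
    x = bregman_gradient_step(x, x - b, step=0.1)
print(x)  # approaches b componentwise
```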
In this thesis, we focus on convergence performance of first-order methods to compute an $\epsilon$-...
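For context (the abstract is truncated), the standard meaning of an $\epsilon$-solution, which we assume is the notion intended here:

```latex
% \bar{x} is an epsilon-(approximate) solution of min_{x in Q} f(x) if
\[
  f(\bar{x}) - \min_{x \in Q} f(x) \;\le\; \epsilon .
\]
```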
We discuss non-Euclidean deterministic and stochastic algorithms for optimization problems with stro...
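The truncated abstract gives no details; as a generic illustration of a non-Euclidean stochastic method, here is a sketch of stochastic mirror descent (exponentiated gradient) over the probability simplex. The entropy setup, names, and toy problem are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_mirror_descent_simplex(stoch_grad, x0, steps, n_iters):
    """Stochastic mirror descent over the probability simplex with the
    negative-entropy mirror map (exponentiated gradient).
    stoch_grad(x, rng) should return an unbiased gradient estimate."""
    x = x0.copy()
    for t in range(n_iters):
        g = stoch_grad(x, rng)
        x = x * np.exp(-steps[t] * g)   # gradient step in the dual (entropy) geometry
        x /= x.sum()                    # Bregman projection back onto the simplex
    return x

# Toy usage: minimize E[0.5 * ||x - (b + noise)||^2] over the simplex.
b = np.array([0.2, 0.5, 0.3])
sg = lambda x, rng: (x - b) + 0.1 * rng.standard_normal(3)
x = stochastic_mirror_descent_simplex(
    sg, np.ones(3) / 3,
    steps=[0.5 / np.sqrt(t + 1) for t in range(500)],
    n_iters=500)
print(x)  # close to b
```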
The analysis of gradient descent-type methods typically relies on the Lipschitz continuity of the ob...
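To see concretely why a global Lipschitz constant can fail to exist, consider the univariate example $f(x) = -\ln(x) + x^2$ quoted in the first entry; the reference function below is our choice for illustration, not taken from any of the abstracts:

```latex
% For f(x) = -ln(x) + x^2 on x > 0:
\[
  f''(x) = \frac{1}{x^2} + 2 \;\longrightarrow\; \infty
  \quad \text{as } x \to 0^{+},
\]
% so f' admits no global Lipschitz constant. Yet with the reference
% function h(x) = -ln(x) + x^2/2 one has h''(x) = 1/x^2 + 1, and
\[
  f''(x) = \frac{1}{x^2} + 2 \;\le\; 2\left(\frac{1}{x^2} + 1\right) = 2\,h''(x),
\]
% i.e. f is 2-smooth relative to h, so Bregman-type methods still apply.
```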
In this Master’s thesis, we study the role of convexification as it is used in unconstrained optim...
In this paper we propose a general algorithmic framework for first-order methods in optimization in ...
Based on a result by Taylor et al. (J Optim Theory Appl 178(2):455–476, 2018) on the attainable conv...
We propose a new and low per-iteration complexity first-order primal-dual optimization framework for...
Several innovative convex optimization concepts have recently been proposed, namely relative smoothness...