Traditionally, much of the research in the field of optimization algorithms has assumed that problem parameters are correctly specified. Recent efforts under the robust optimization framework have relaxed this assumption by allowing unknown parameters to vary in a prescribed uncertainty set and by subsequently solving for a worst-case solution. This dissertation considers a rather different approach in which the unknown or misspecified parameter is a solution to a suitably defined (stochastic) learning problem, based on having access to a set of samples. Practical approaches to resolving such a set of coupled problems have been either sequential schemes or direct variational formulations. In the case of the former, this entails the following steps: (...
Motivated by the numerical resolution of stochastic optimization problems subj...
In this paper, we study distributionally robust optimization approaches for a one-stage stochastic m...
Standard stochastic optimization methods are brittle, sensitive to stepsize choices and other algori...
Stochastic approximation (SA) methods, first proposed by Robbins and Monro in 1951 for root-findin...
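The Robbins-Monro scheme mentioned in this abstract can be sketched in a few lines. Everything numerical below is an illustrative assumption, not a detail from the abstract: the target map g, the noise model, and the classic diminishing step size a_k = 1/(k+1).

```python
import random

def robbins_monro(noisy_g, x0, steps=5000, seed=0):
    """Robbins-Monro stochastic approximation for root-finding:
    seek x* with g(x*) = 0 given only noisy evaluations of g.
    Uses the diminishing step size a_k = 1/(k+1), which satisfies
    the classic conditions sum a_k = inf and sum a_k^2 < inf."""
    rng = random.Random(seed)
    x = x0
    for k in range(steps):
        a_k = 1.0 / (k + 1)
        x = x - a_k * noisy_g(x, rng)   # move against the noisy evaluation
    return x

# Illustrative target (an assumption): g(x) = 2(x - 3), root at x* = 3,
# observed with additive Gaussian noise of standard deviation 0.5.
g = lambda x, rng: 2.0 * (x - 3.0) + rng.gauss(0.0, 0.5)
x_star = robbins_monro(g, x0=0.0)
```

Despite never seeing g exactly, the iterate drifts toward the root because the step sizes average out the noise while still summing to infinity.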
We consider the solution of a stochastic convex optimization problem E[f(x; θ*, ξ)] in x over a cl...
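One way such a coupled problem is commonly handled is to run stochastic-approximation updates on the learning variable θ and the decision variable x simultaneously. The sketch below is a minimal illustration under assumed quadratic objectives and Gaussian noise; only the symbols f, x, θ*, and ξ come from the abstract, and the specific losses, distributions, and step sizes are invented for the example.

```python
import random

def coupled_sa(steps=20000, seed=1):
    """Joint stochastic-approximation sketch for min_x E[f(x; theta*, xi)],
    where theta* itself solves a separate stochastic learning problem.
    Illustrative (assumed) choices:
      learning problem : min_theta E[(theta - eta)^2], eta ~ N(2, 1)   -> theta* = 2
      decision problem : min_x E[(x - theta*)^2 + xi * x], xi ~ N(0, 1) -> x* = theta*
    Both iterates share the diminishing step size gamma_k = 1/(k+1)."""
    rng = random.Random(seed)
    x, theta = 0.0, 0.0
    for k in range(steps):
        gamma = 1.0 / (k + 1)
        eta = rng.gauss(2.0, 1.0)
        xi = rng.gauss(0.0, 1.0)
        theta -= gamma * 2.0 * (theta - eta)     # SA step on the learning problem
        x -= gamma * (2.0 * (x - theta) + xi)    # SA step on the decision problem
    return x, theta

x_k, theta_k = coupled_sa()
```

The decision iterate uses the current estimate theta rather than the unknown theta*, so the two sequences converge jointly: as theta settles near 2, x tracks it.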
This thesis has two themes. In chapters 1 and 2 we investigate tractable approximations to specific ...
Uncertainty has a tremendous impact on decision making. The more connected we get, it seems, the mor...
In this thesis we study several machine learning problems that are all linked with the minimization ...
In this thesis we study iterative algorithms in order to solve constrained and unconstrained convex ...
We consider a Cartesian stochastic variational inequality problem with a monotone map. For this prob...