In this paper we derive high-probability lower and upper bounds on the excess risk of stochastic optimization of exponentially concave loss functions. Exponentially concave loss functions encompass several fundamental problems in machine learning, such as the squared loss in linear regression, the logistic loss in classification, and the negative logarithm loss in portfolio management. We establish an O(d log T / T) upper bound on the excess risk of the stochastic online Newton step algorithm, and an O(d/T) lower bound on the excess risk of any stochastic optimization method for the squared loss, showing that the upper bound is optimal up to a logarithmic factor. The analysis of the upper bound is based on recent advances in concentration inequalities...
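To make the algorithm behind the upper bound concrete, here is a minimal sketch of the stochastic online Newton step, applied to the squared loss mentioned above. It is illustrative only: the parameters gamma, eps, and radius are assumed values, the projection is a simplified Euclidean one rather than the exact projection in the norm induced by the curvature matrix A_t, and the final averaging of iterates follows the usual stochastic-to-batch conversion rather than any specific construction from this paper.

```python
import numpy as np

def online_newton_step(stream, d, gamma=0.5, eps=1.0, radius=1.0):
    """Sketch of Online Newton Step (ONS) for exp-concave losses.

    `stream` yields (x_t, y_t) pairs; we use the squared loss
    f_t(w) = (w @ x_t - y_t)**2, which is exp-concave on a bounded domain.
    Returns the averaged iterate, as used in stochastic excess-risk analyses.
    """
    w = np.zeros(d)
    A = eps * np.eye(d)   # curvature matrix A_t = eps*I + sum_s g_s g_s^T
    iterates = []
    for x, y in stream:
        g = 2.0 * (w @ x - y) * x        # gradient of the squared loss at w
        A += np.outer(g, g)              # rank-one curvature update
        w = w - (1.0 / gamma) * np.linalg.solve(A, g)  # Newton-like step
        # Simplified Euclidean projection onto the l2 ball; the exact ONS
        # projection is taken in the norm induced by A_t.
        norm = np.linalg.norm(w)
        if norm > radius:
            w *= radius / norm
        iterates.append(w.copy())
    return np.mean(iterates, axis=0)     # averaged iterate

# Toy usage: noisy linear data with d = 5 and T = 2000 samples.
rng = np.random.default_rng(0)
d, T = 5, 2000
w_star = rng.normal(size=d) / np.sqrt(d)
data = [(x, x @ w_star + 0.1 * rng.normal())
        for x in rng.normal(size=(T, d)) / np.sqrt(d)]
w_hat = online_newton_step(data, d)
print("estimation error:", np.linalg.norm(w_hat - w_star))
```

The rank-one updates of A_t are what give the method its logarithmic-regret behavior for exp-concave losses, and averaging the T iterates converts the online guarantee into an excess-risk bound of the O(d log T / T) form stated above.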