A common problem in statistics is estimating the minimizer of a convex function. When dealing with large samples taking values in high-dimensional spaces, stochastic gradient algorithms and their averaged versions are efficient candidates. Indeed, (1) they require little computational effort, (2) they do not need to store all the data, which is crucial when dealing with big data, and (3) they allow the estimates to be updated simply, which is important when data arrive sequentially. The aim of this work is to give asymptotic and non-asymptotic rates of convergence of stochastic gradient estimates, as well as of their averaged versions, when the function to be minimized is only locally strongly convex.
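The recursive estimates described above can be illustrated with a minimal sketch of stochastic gradient descent with Polyak–Ruppert averaging. The problem, step-size schedule, and constants below are illustrative assumptions, not the paper's specific setting: we estimate the minimizer of the locally strongly convex function h ↦ E[||X − h||²]/2 (i.e., the mean of X) from observations arriving one at a time, with steps γ_n = c·n^(−α).

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
theta_star = rng.normal(size=d)          # true minimizer (here, the mean of X)

n_iter = 10_000
c, alpha = 1.0, 0.66                     # illustrative step sizes gamma_n = c * n**(-alpha)

theta = np.zeros(d)                      # current SGD iterate
theta_bar = np.zeros(d)                  # Polyak-Ruppert average of the iterates

for n in range(1, n_iter + 1):
    x = theta_star + rng.normal(size=d)  # one new observation, processed then discarded
    grad = theta - x                     # unbiased stochastic gradient of E[||X - h||^2]/2
    theta -= c * n ** (-alpha) * grad    # SGD update
    theta_bar += (theta - theta_bar) / n # averaged version, updated online in O(d)

print(np.linalg.norm(theta_bar - theta_star))
```

Note that both recursions use O(d) memory and O(d) work per observation: no data are stored, and each new sample updates the estimates in a single pass, which is exactly why these methods suit large sequential samples.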