Stochastic gradient descent (SGD) type optimization schemes are fundamental ingredients in a large number of machine learning based algorithms. In particular, SGD type optimization schemes are frequently employed in applications involving natural language processing, object and face recognition, fraud detection, computational advertisement, and numerical approximations of partial differential equations. In mathematical convergence results for SGD type optimization schemes, two types of error criteria are usually studied in the scientific literature: the error in the strong sense and the error with respect to the objective function. In applications one is often not only interested in the size of the error with respect to the obj...
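As a concrete illustration of the two error criteria mentioned above, the following is a minimal sketch (not taken from any of the cited works) of plain SGD on a scalar quadratic toy problem; the objective f(θ) = E[(θ − X)²/2] with X ~ N(μ, 1), the step-size schedule γ_n = 1/n, and all numerical values are assumptions made purely for this example.

```python
# Minimal sketch (illustrative only): plain SGD on the toy objective
# f(theta) = E[(theta - X)^2 / 2] with X ~ N(mu, 1), whose unique minimizer
# is theta* = mu.  The two error criteria discussed above are
#   strong error:     |Theta_n - theta*|
#   objective error:  f(Theta_n) - f(theta*)
import numpy as np

rng = np.random.default_rng(0)
mu = 3.0          # theta* = mu for this toy problem (assumed value)
theta = 0.0       # initial value Theta_0
n_steps = 10_000

for n in range(1, n_steps + 1):
    x = rng.normal(mu, 1.0)      # draw one sample X_{n+1}
    grad_est = theta - x         # unbiased estimate of f'(theta) = theta - mu
    gamma = 1.0 / n              # decreasing learning rate gamma_n (assumed schedule)
    theta -= gamma * grad_est    # SGD update Theta_{n+1} = Theta_n - gamma_n * G

strong_error = abs(theta - mu)               # error in the strong sense
objective_error = 0.5 * (theta - mu) ** 2    # f(Theta_n) - f(theta*)
print(f"strong error: {strong_error:.4f}, objective error: {objective_error:.6f}")
```

For this toy objective the minimizer is θ* = μ, so the strong error |Θ_n − θ*| and the objective-function error f(Θ_n) − f(θ*) = (Θ_n − μ)²/2 can both be read off directly, which makes the distinction between the two criteria easy to see numerically.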
In this paper we analyze different schemes for obtaining gradient estimates when the underlying func...
Gradient-based algorithms are popular when solving unconstrained optimization problems. By exploitin...
Stochastic optimization algorithms have been growing rapidly in popularity over the last decade or t...
Stochastic gradient descent (SGD) optimization algorithms are key ingredients in a series of machine...
Stochastic gradient descent (SGD) is a simple and popular method to solve stochastic optimization pr...
The main aim of this paper is to provide an analysis of gradient descent (GD) algorithms with gradie...
Stochastic Gradient Descent (SGD) is one of the simplest and most popular stochastic optimization me...
We prove the convergence to minima and estimates on the rate of convergence for the stochastic gradi...
The gradient noise of Stochastic Gradient Descent (SGD) is considered to play a key role in its prop...
In this thesis we want to give a theoretical and practical introduction to stochastic gradient desce...
We develop the mathematical foundations of the stochastic modified equations (SME) framework for ana...
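For context, the first-order stochastic modified equation commonly associated with this framework (stated here as background, not quoted from the truncated abstract above) approximates SGD with learning rate η, loss f, and gradient-noise covariance Σ by an Itô SDE:

```latex
% Background sketch: the first-order SME commonly used to approximate SGD
% iterates \theta_k with learning rate \eta; W_t is a standard Brownian motion.
\[
  \mathrm{d}\Theta_t
    = -\nabla f(\Theta_t)\,\mathrm{d}t
      + \sqrt{\eta}\,\Sigma(\Theta_t)^{1/2}\,\mathrm{d}W_t ,
  \qquad \Theta_0 = \theta_0 .
\]
% The SGD iterate \theta_k is then compared with \Theta_{k\eta} in the weak sense.
```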
We study to what extent stochastic gradient descent (SGD) may be understood as a "conventional" lear...
The convergence of Stochastic Gradient Descent (SGD) us...
This thesis reports on experiments aimed at explaining why machine learning algorithms using the gre...