Published: March 16, 2018

When standard optimization methods fail to find a satisfactory solution for a parameter fitting problem, a tempting recourse is to adjust parameters manually. While tedious, this approach can be surprisingly powerful in terms of achieving optimal or near-optimal solutions. This paper outlines an optimization algorithm, Adaptive Stochastic Descent (ASD), designed to replicate the essential aspects of manual parameter fitting in an automated way. Specifically, ASD uses simple principles to form probabilistic assumptions about (a) which parameters have the greatest effect on the objective function, and (b) optimal step sizes for each parameter. We show that for a certain class of optimization problems (na...
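The abstract describes ASD only at a high level. As a rough illustration of the idea (not the authors' reference implementation), an adaptive, coordinate-wise stochastic descent loop might look like the sketch below; the roulette-wheel move selection, the grow/shrink factors, and the weight floor are all hypothetical simplifications:

```python
import random

def asd_minimize(f, x0, max_iters=500, grow=2.0, shrink=0.5, seed=0):
    """Sketch of an ASD-style loop (illustrative, not the paper's code).

    Each candidate move is a (parameter index, direction) pair. Moves that
    recently improved the objective get larger selection weights and larger
    step sizes; moves that failed get smaller ones.
    """
    rng = random.Random(seed)
    x = list(x0)
    moves = [(i, s) for i in range(len(x)) for s in (+1, -1)]
    step = {m: 1.0 for m in moves}   # per-move adaptive step size
    prob = {m: 1.0 for m in moves}   # unnormalized selection weights
    fx = f(x)
    for _ in range(max_iters):
        # Roulette-wheel selection: pick a move with probability
        # proportional to its current weight.
        r, acc = rng.random() * sum(prob.values()), 0.0
        for m in moves:
            acc += prob[m]
            if r <= acc:
                break
        i, s = m
        trial = list(x)
        trial[i] += s * step[m]
        ft = f(trial)
        if ft < fx:                       # success: accept, grow step and weight
            x, fx = trial, ft
            step[m] *= grow
            prob[m] *= grow
        else:                             # failure: shrink step and weight
            step[m] *= shrink
            prob[m] = max(prob[m] * shrink, 1e-6)
    return x, fx
```

For example, `asd_minimize(lambda v: sum(t * t for t in v), [3.0, -2.0])` drives the quadratic objective close to zero, with the step sizes shrinking automatically as the minimum is approached.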
Abstract (abridged): many of the present problems in automatic control, economic systems, and living o...
Optimization has been the workhorse of solving machine learning problems. However, the efficiency of...
The four algorithms used are Nelder-Mead nonlinear simplex, Levenberg-Marquardt gradient descent,...
We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective ...
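The Adam update rule referenced in this snippet is well documented (Kingma & Ba); a minimal single-parameter sketch, with hyperparameter names following the paper's conventions, is:

```python
def adam_step(theta, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step for a scalar parameter.

    m, v are the running first and second moment estimates of the gradient;
    t is the 1-based step count, used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (v_hat ** 0.5 + eps)
    return theta, m, v
```

Applied repeatedly to the gradient of a simple objective such as f(θ) = θ², the iterates converge toward the minimum at θ = 0, with the effective step size adapting to the gradient history.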
The goal of this paper is to debunk and dispel the magic behind black-box optimizers and stochastic ...
Stochastic optimization (SO) is extensively studied in various fields, such as control engineering, ...
Stochastic optimization algorithms have been growing rapidly in popularity over the last decade or t...
Many systems and processes, both natural and artificial, may be described by parameter-driven mathem...
Current machine learning practice requires solving huge-scale empirical risk minimization problems q...
This dissertation work combines two lines of work related to stochastic optimization, one focused on...
where the function f or its gradient ∇f is not directly accessible except through Monte Carlo estim...
This thesis is concerned with stochastic optimization methods. The pioneering work in the field is t...
Stochastic Gradient Descent (SGD) has played a crucial role in the success of modern machine learnin...