While the design of algorithms is traditionally a discrete endeavour, in recent years many advances have come from continuous perspectives. Typically, a continuous process, deterministic or randomized, is designed and shown to have desirable properties, such as approaching an optimal solution or a target distribution, and an algorithm is then derived by appropriate discretization. We will discuss examples of this for optimization (gradient descent, interior-point method) and sampling (Brownian motion, Hamiltonian Monte Carlo), with applications to learning. In some interesting and rather general settings, the current fastest methods have been obtained via this approach.
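To make the discretization step concrete, here is a minimal sketch (not from the abstract: the quadratic potential, step sizes, and iteration counts are illustrative assumptions). The same gradient field drives both processes: a forward-Euler step turns the gradient flow into gradient descent, and an Euler-Maruyama step turns the Langevin diffusion into a simple sampler for a density proportional to exp(-f).

```python
import numpy as np

# Illustrative potential f(x) = 0.5 * ||x||^2, so grad f(x) = x.
# (hypothetical choice for the sketch, not taken from the abstract)
def grad_f(x):
    return x

def gradient_descent(x0, eta=0.1, steps=100):
    """Forward-Euler discretization of the gradient flow dx/dt = -grad f(x):
    x_{k+1} = x_k - eta * grad f(x_k)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - eta * grad_f(x)
    return x

def unadjusted_langevin(x0, eta=0.01, steps=10_000, rng=None):
    """Euler-Maruyama discretization of the Langevin diffusion
    dX_t = -grad f(X_t) dt + sqrt(2) dB_t,
    whose stationary distribution is proportional to exp(-f)."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(steps):
        noise = rng.standard_normal(x.shape)
        x = x - eta * grad_f(x) + np.sqrt(2 * eta) * noise
        samples.append(x.copy())
    return np.array(samples)

if __name__ == "__main__":
    # Optimization: the discretized flow converges to the minimizer at 0.
    print(gradient_descent([3.0, -2.0]))
    # Sampling: after burn-in, iterates approximate N(0, I) up to discretization bias.
    s = unadjusted_langevin([3.0, -2.0])
    print(s[5000:].mean(axis=0), s[5000:].var(axis=0))
```

The step size eta plays the role of the discretization parameter: smaller eta tracks the continuous process more closely at the cost of more iterations, which is the trade-off behind deriving algorithms this way.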
Research into the dynamics of Genetic Algorithms (GAs) has led to the field of Estimation-of-Distribu...
Optimization is a fundamental tool in modern science. Numerous important tasks in biology, economy, ...
In this paper we are concerned with global optimization, which can be defined as the problem of find...
We propose new continuous-time formulations for first-order stochastic optimization algorithms such ...
The problem of learning from data is prevalent in the modern scientific age, and optimization provid...
The direct application of statistics to stochastic optimization based on iterated density estimation...
Optimization problems with continuous data appear in, e.g., robust machine learning, functional data...
Continuous optimization seems to be the ubiquitous formulation of an impressive number of different ...
Stochastic gradient descent is an optimisation method that combines classical gradient des...
Continuous optimization is never easy: the exact solution is always a luxury demand and ...
The problem of drawing samples from a discrete distribution can be converted into a discrete optimiz...
Slides of a talk given at Dortmund University, Dept. of Statistics, on March 11th, 2015. Invitati...
One method to solve expensive black-box optimization problems is to use a surrogate model that appro...
The possibility that a discrete process can be closely approximated by a continuous one, with the la...
A Complex System can be defined as a natural, artificial, social, or economic entity whose model inv...