How long should we run a stochastic global optimisation algorithm such as simulated annealing? How should we tune such an algorithm? This paper proposes an approach to studying these questions through successive approximation of a generic stochastic global optimisation algorithm by a sequence of stochastic processes, culminating in a backtracking adaptive search process. Our emerging understanding of backtracking adaptive search can thus be used to study the original algorithm. The first approximation, the averaged range process, has the same expected number of iterations to convergence as the original process.
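The abstract takes the expected number of iterations to convergence as the yardstick for an algorithm such as simulated annealing. As a concrete, purely illustrative reading of that measure, the following Python sketch runs a basic simulated annealing loop many times and averages the first iteration at which the best objective value reaches a target level; the objective function, neighbourhood move, cooling schedule, and target level are all assumptions for the example, not the paper's construction.

```python
import math
import random

def simulated_annealing(f, x0, neighbour, temps, target):
    """Run a basic simulated annealing loop and return the number of
    iterations until the best objective value first reaches `target`
    (or None if the schedule ends first)."""
    x, fx = x0, f(x0)
    best = fx
    for k, T in enumerate(temps, start=1):
        y = neighbour(x)
        fy = f(y)
        # Always accept improvements; accept uphill moves with the
        # usual Metropolis probability exp(-(fy - fx) / T).
        if fy <= fx or random.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
        best = min(best, fx)
        if best <= target:
            return k  # iterations to convergence for this run
    return None

# Illustrative use on a one-dimensional multimodal objective; the choices
# below (objective, neighbourhood, schedule, target) are assumptions.
f = lambda x: x * x + 10 * math.sin(3 * x)
neighbour = lambda x: x + random.uniform(-0.5, 0.5)
temps = [10.0 * 0.99 ** k for k in range(5000)]

runs = [simulated_annealing(f, random.uniform(-5, 5), neighbour, temps, -8.0)
        for _ in range(200)]
hits = [r for r in runs if r is not None]
print("estimated expected iterations to convergence:",
      sum(hits) / len(hits) if hits else float("nan"))
```

Averaging the hitting times over independent runs, as above, is the empirical counterpart of the expected number of iterations to convergence that the approximating processes in the paper are designed to preserve.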