Abstract. Passive algorithms for global optimization of a function choose observation points independently of past observed values. We study the average performance of two common passive algorithms under the assumption of a Brownian motion prior. The first algorithm chooses equally spaced observation points, while the second chooses the observation points independently and uniformly at random. The average convergence rate for both is O(n^{-1/2}), with the second algorithm approximately 82% as efficient as the first.
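As a rough illustration of the two passive strategies, the following sketch simulates a discretized Brownian motion path on [0, 1], observes it at n equally spaced points and at n independently, uniformly distributed points, and averages the gap between the true maximum (on the fine simulation grid) and the best observed value. The function and parameter names (brownian_path, passive_error, m, trials) are illustrative and not taken from the paper; the discretization is only an approximation of the continuous-time model, so the experiment indicates the qualitative behaviour (both gaps shrinking roughly like n^{-1/2}, with the uniform strategy somewhat less efficient) rather than the exact constants.

import numpy as np

def brownian_path(m, rng):
    """Simulate a standard Brownian motion on [0, 1] at m+1 equally spaced grid points."""
    dt = 1.0 / m
    increments = rng.normal(0.0, np.sqrt(dt), size=m)
    return np.concatenate(([0.0], np.cumsum(increments)))

def passive_error(n_obs, m=100_000, trials=200, seed=0):
    """Average gap between the path's maximum on the fine grid and the best
    value observed by each of the two passive strategies."""
    rng = np.random.default_rng(seed)
    gap_equal, gap_uniform = 0.0, 0.0
    for _ in range(trials):
        w = brownian_path(m, rng)
        true_max = w.max()
        # Strategy 1: n equally spaced observation points.
        idx_equal = np.linspace(0, m, n_obs).round().astype(int)
        # Strategy 2: n observation points drawn independently and uniformly at random.
        idx_uniform = rng.integers(0, m + 1, size=n_obs)
        gap_equal += true_max - w[idx_equal].max()
        gap_uniform += true_max - w[idx_uniform].max()
    return gap_equal / trials, gap_uniform / trials

if __name__ == "__main__":
    for n in (10, 40, 160):
        e_eq, e_un = passive_error(n)
        # Both average gaps should decrease roughly like n^{-1/2};
        # the uniform-random strategy's gap stays somewhat larger.
        print(f"n={n:4d}  equally spaced: {e_eq:.4f}  uniform random: {e_un:.4f}")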