In this paper, we propose multi-timescale, sequential algorithms for deterministic optimization that can find high-quality solutions. The algorithms fundamentally track the well-known derivative-free, model-based search methods in an efficient and resource-aware manner, with additional heuristics to accelerate the scheme. Our approaches exhibit competitive performance on a selection of global optimization benchmark problems.
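To fix ideas, the sketch below shows the generic derivative-free, model-based search loop that such schemes build on: sample candidates from a parameterized distribution, score them, and refit the distribution to the best samples. This is a minimal cross-entropy-style illustration on a toy benchmark, not the paper's multi-timescale algorithm; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    # Toy benchmark: global minimum 0 at the origin.
    return np.sum(x ** 2, axis=-1)

def model_based_search(obj, dim, iters=100, pop=200, elite_frac=0.1, seed=0):
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.full(dim, 2.0)        # sampling model: N(mean, diag(std^2))
    n_elite = max(1, int(elite_frac * pop))
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, dim))  # draw candidates from the model
        values = obj(samples)                             # evaluate the objective
        elite = samples[np.argsort(values)[:n_elite]]     # keep the best-scoring candidates
        mean = elite.mean(axis=0)                         # refit the model to the elites
        std = elite.std(axis=0) + 1e-8
    return mean, obj(mean)

if __name__ == "__main__":
    x_best, f_best = model_based_search(sphere, dim=10)
    print(f"best value found: {f_best:.3e}")
```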