In this article we study stochastic multistart methods for global optimization, which combine local search with random initialization, and their parallel implementations. It is shown that in a minimax sense the optimal restart distribution is uniform. We further establish the rate of decrease of the ensemble probability that the global minimum has not been found by the nth iteration. Turning to parallelization issues, we show that under independent identical processing (IIP), exponential speedup in the time to hit the goal bin normally results. Our numerical studies are in close agreement with these findings.
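To make the multistart setting above concrete, here is a minimal Python sketch (our own illustration, not the algorithm or experiments of the article; the objective function, goal level, step size, and search interval are arbitrary assumptions). It runs iterated-improvement local search from uniformly distributed restart points and treats an IIP-style ensemble as independent copies whose hitting time to the goal level is the minimum over copies:

import math
import random


def local_search(f, x, rng, step=0.1, iters=50, lo=-5.0, hi=5.0):
    # Iterated improvement (hill climbing): accept a random nearby point only if it lowers f.
    fx = f(x)
    for _ in range(iters):
        cand = min(hi, max(lo, x + rng.uniform(-step, step)))
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx


def multistart_hitting_time(f, goal, lo=-5.0, hi=5.0, max_restarts=1000, seed=None):
    # Uniform restart distribution; return the number of restarts until the goal level is hit.
    rng = random.Random(seed)
    best = math.inf
    for n in range(1, max_restarts + 1):
        x0 = rng.uniform(lo, hi)
        _, fx = local_search(f, x0, rng, lo=lo, hi=hi)
        best = min(best, fx)
        if best <= goal:
            return n
    return max_restarts


if __name__ == "__main__":
    # Illustrative multimodal objective; its global minimum lies near x = -0.3.
    f = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
    # IIP-style ensemble: independent copies; the ensemble hitting time is their minimum.
    times = [multistart_hitting_time(f, goal=0.35, seed=s) for s in range(8)]
    print("per-copy hitting times:", times)
    print("ensemble hitting time :", min(times))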
Thesis (Master's)--University of Washington, 2021. Black-box optimization is ubiquitous in machine lea...
On t.p. "d̳" is superscript. Cover title. Includes bibliographical references (p. 28-29). Research su...
The majority of stochastic optimization algorithms can be written in the general form $x_{t+1} = T...
In this paper we prove that for algorithms which proceed to the next state based on information ava...
We introduce the notion of expected hitting time to a goal as a measure of the convergence rate o...
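As a concrete illustration of this measure (a standard calculation, assumed here rather than quoted from the entry above): if each restart independently reaches the goal with probability $p$, the hitting time $N$ is geometric, so $P(N > n) = (1-p)^n$ and $E[N] = 1/p$; for $m$ independent copies the ensemble hitting time $N_{\min} = \min(N_1, \dots, N_m)$ is again geometric with $P(N_{\min} > n) = (1-p)^{mn}$ and $E[N_{\min}] = 1/(1-(1-p)^m) \approx 1/(mp)$ for small $p$, which is the source of the roughly linear speedup from independent parallel copies.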
In this paper, we establish some bounds for the probability that simulated annealing produc...
Two common questions when one uses a stochastic global optimization algorithm, e.g., simulated annea...
Most state-of-the-art optimization algorithms utilize restart to resample new initial solutions to a...
The optimization method employing iterated improvement with random restart (I2R2) is studied. Associa...
Multi-Modal Optimization (MMO) is ubiquitous in engineering, machine learnin...
We present some typical algorithms used for finding global minimum/maximum of a function defined on...
When a deterministic algorithm for finding the minimum of a function C on a set Ω is employed it ma...
Stochastic global optimization methods are methods for solving a global optimization problem incorp...