Many randomized heuristic derivative-free optimization methods share a framework that iteratively learns a model of promising search areas and samples solutions from that model. This paper studies a particular setting of this framework in which the model is a classification model that discriminates good solutions from bad ones. This setting admits a general theoretical characterization that reveals the factors critical to optimization. We also prove that optimization problems with local Lipschitz continuity can be solved in polynomial time under proper configurations of this framework. Guided by these critical factors, we propose the randomized coordinate shrinking classification algorithm to learn the model, forming the RACOS ...
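The learn-and-sample loop described in this abstract can be illustrated with a compact Python sketch. All parameter names here are illustrative assumptions, and the box-learning step is a simplified stand-in for the paper's randomized coordinate shrinking: the "classifier" is just the per-coordinate bounding box of the current good solutions.

```python
import random

def classification_based_opt(f, dim, bounds, budget=2000,
                             pop=20, positive_frac=0.2, explore=0.1):
    # Sketch of a classification-based optimization loop (RACOS-style):
    # label the best solutions "positive", learn a region covering them,
    # and sample new candidates mostly from that region.
    lo, hi = bounds
    sols = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    vals = [f(x) for x in sols]
    evals = pop
    best_x, best_v = min(zip(sols, vals), key=lambda p: p[1])
    while evals + pop <= budget:
        # classify: the top fraction are "good", the rest "bad"
        order = sorted(range(len(sols)), key=lambda i: vals[i])
        k = max(1, int(positive_frac * len(sols)))
        pos = [sols[i] for i in order[:k]]
        pos_vals = [vals[i] for i in order[:k]]
        # learned model: per-coordinate bounding box of the positives
        box_lo = [min(x[d] for x in pos) for d in range(dim)]
        box_hi = [max(x[d] for x in pos) for d in range(dim)]
        new_sols = []
        for _ in range(pop):
            if random.random() < explore:
                # occasional uniform sampling keeps global exploration alive
                x = [random.uniform(lo, hi) for _ in range(dim)]
            else:
                x = [random.uniform(box_lo[d], box_hi[d]) for d in range(dim)]
            new_sols.append(x)
        new_vals = [f(x) for x in new_sols]
        evals += pop
        # keep the positives (elites) alongside the fresh samples
        sols, vals = new_sols + pos, new_vals + pos_vals
        cand_x, cand_v = min(zip(sols, vals), key=lambda p: p[1])
        if cand_v < best_v:
            best_x, best_v = cand_x, cand_v
    return best_x, best_v
```

For example, minimizing the 2-D sphere function `lambda p: sum(t * t for t in p)` over `[-1, 1]^2` drives the learned box, and hence the samples, toward the origin within the evaluation budget.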
This work addresses the sequential optimization of an unknown and potentially nonconvex function ove...
We propose feasible descent methods for constrained minimization that do not make explicit use of ob...
We propose a random-subspace algorithmic framework for global optimization of Lipschitz-continuous o...
We consider unconstrained randomized optimization of smooth convex functions in the gradient-free se...
A structured version of derivative-free random pattern search optimization algorithms is intr...
Classification-based optimization is a recently developed framework for derivative-free optimization...
In data mining we come across many problems such as function optimization problem or parameter estim...
We present an algorithmic framework for unconstrained derivative-free optimization based on ...
The connection between the conditioning of a problem instance -- the sensitivity of a problem instan...
In this work, we propose a new globally convergent derivative-free algorithm for the minimization of...
Problem statement: The aim of data classification is to establish rules for the classification of so...
Nonconvex optimization problems have always been one focus in deep learning, in which many fast adap...
A novel derivative-free algorithm, called optimization by moving ridge functions (OMoRF), for uncons...
We consider the problem of unconstrained minimization of a smooth objective function in ℝn in a sett...
The dissertation addresses the research topics of machine learning outlined below. We developed the ...