We provide a comprehensive, effective, and efficient methodology for the design and experimental analysis of algorithms. We rely on modern statistical techniques for tuning and understanding algorithms from an experimental perspective. To this end, we make use of the sequential parameter optimization (SPO) method, which has been successfully applied as a tuning procedure to numerous heuristics for practical and theoretical optimization problems. Two case studies, which illustrate the applicability of SPO to algorithm tuning and model selection, are presented.
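To make the idea of sequential, model-based parameter tuning concrete, the following is a minimal sketch of an SPO-style loop: fit a surrogate model (here a Gaussian process) to the parameter settings evaluated so far, propose the next setting by maximizing expected improvement, run the algorithm once at that setting, and repeat. The objective run_algorithm, the single numeric parameter, its bounds, and the budget of 20 steps are illustrative assumptions for this sketch, not the authors' SPO implementation.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)

def run_algorithm(param):
    """Hypothetical expensive, noisy performance measure of the tuned
    algorithm for a single numeric parameter (to be minimized)."""
    return (param - 0.3) ** 2 + 0.05 * rng.normal()

bounds = (0.0, 1.0)

# Initial design: a handful of already evaluated parameter settings.
X = rng.uniform(*bounds, size=(5, 1))
y = np.array([run_algorithm(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-3, normalize_y=True)

for _ in range(20):                                  # sequential improvement steps
    gp.fit(X, y)                                     # refit surrogate to all observations
    cand = rng.uniform(*bounds, size=(200, 1))       # random candidate settings
    mu, sd = gp.predict(cand, return_std=True)
    best = y.min()
    # Expected improvement over the best observed performance (minimization).
    z = (best - mu) / np.maximum(sd, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    x_next = cand[np.argmax(ei)]
    y_next = run_algorithm(x_next[0])                # one expensive algorithm run
    X = np.vstack([X, x_next])
    y = np.append(y, y_next)

print("best parameter found:", X[np.argmin(y)][0], "value:", y.min())

In an actual SPO run, the initial design would typically be a space-filling design such as a Latin hypercube, and repeated runs per setting would be used to cope with noise; the sketch omits both for brevity.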
Several common general purpose optimization algorithms are compared for finding A- and D-optimal desi...
It is a common technique in global optimization with expensive black-box functions to learn a surrog...
Most real-world optimization problems behave stochastically. Evolutionary optimization algorithms ha...
Sequential parameter optimization is a heuristic that combines classical and modern statis...
Obviously, it is not a good idea to apply an optimization algorithm with wrongly specified...
There is a strong need for sound statistical analysis of simulation and optimization algorithms. Bas...
Tuning parameters is an important step for the application of metaheuristics to specific problem cla...
The focus of this thesis is on solving a sequence of optimization problems that change over time in ...
Sequential Parameter Optimization is a model-based optimization methodology, which includes several ...
This paper describes a sequential experimentation approach for efficiently screening and t...
Design of experiments is an established approach to parameter optimization of industrial processes. ...
Hyperparameter tuning is one of the most time-consuming parts in machine learning. Despite the e...
This Dagstuhl seminar brought together researchers from statistical ranking and selection; experimen...
Parameter tuning aims to find suitable parameter values for heuristic optimisation algorithms that a...