We present a heuristic meta-learning search method for finding a set of optimized algorithmic parameters for a range of machine learning algorithms. The method, wrapped progressive sampling, is a combination of classifier wrapping and progressive sampling of training data. A series of experiments on UCI benchmark data sets with nominal features, in which simple wrapping and wrapped progressive sampling are applied to five machine learning algorithms, shows little improvement for the algorithm that offers few parameter variations, but marked improvements for the algorithms offering many testable parameter combinations, with up to 32.2% error reduction for the winnow learning algorithm.
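To make the combination concrete, the following is a minimal sketch of the wrapped-progressive-sampling idea, assuming a scikit-learn-style estimator. The function name, the doubling schedule, the 80/20 validation split, and the keep-half pruning rule are illustrative assumptions, not the exact procedure of the method described above.

    import numpy as np
    from sklearn.base import clone
    from sklearn.model_selection import train_test_split

    def wrapped_progressive_sampling(estimator, candidate_params, X, y,
                                     start_size=500, growth=2.0,
                                     keep_frac=0.5, seed=0):
        # Illustrative sketch: score every candidate parameter setting on a
        # small random sample, keep the better half, grow the sample, repeat.
        rng = np.random.RandomState(seed)
        candidates = list(candidate_params)   # each item: a dict of parameters
        n = start_size
        while len(candidates) > 1 and n < len(X):
            idx = rng.choice(len(X), size=min(int(n), len(X)), replace=False)
            X_tr, X_val, y_tr, y_val = train_test_split(
                X[idx], y[idx], test_size=0.2, random_state=seed)
            scores = [clone(estimator).set_params(**p).fit(X_tr, y_tr)
                      .score(X_val, y_val) for p in candidates]
            order = np.argsort(scores)[::-1]  # best-scoring settings first
            n_keep = max(1, int(len(candidates) * keep_frac))
            candidates = [candidates[i] for i in order[:n_keep]]
            n *= growth                       # progressive step: grow the sample
        return candidates[0]                  # best surviving setting

In practice the candidate settings could be enumerated with something like sklearn.model_selection.ParameterGrid; the key property is that weak settings are discarded on cheap small samples, so only the survivors are ever trained on large ones.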
Abstract. As the size of databases increases, machine learning algorithms face more demanding re...
Abstract. The success of machine learning on a given task depends on, among other things, which lear...
One of the challenges in Machine Learning is to find a classifier and parameter settings that work well...
The framework for Similarity-Based Methods (SBMs) makes it possible to create many algorithms that differ in import...
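As an illustration of the SBM idea, one template with swappable similarity and weighting functions yields a whole family of classifiers; the sketch below is an illustrative rendering under that reading, not code from the cited framework.

    import numpy as np

    def sbm_classify(x, X_train, y_train, k=3,
                     similarity=lambda a, b: -np.linalg.norm(a - b),
                     weight=lambda s: 1.0):
        # Illustrative SBM-style template: nearest-neighbour voting in which
        # the similarity measure, the vote weight, and k are all swappable.
        sims = np.array([similarity(x, xi) for xi in X_train])
        nearest = np.argsort(sims)[-k:]   # indices of the k most similar examples
        votes = {}
        for i in nearest:
            label = y_train[i]
            votes[label] = votes.get(label, 0.0) + weight(sims[i])
        return max(votes, key=votes.get)  # label with the largest weighted vote

Swapping in, say, a cosine similarity or a similarity-proportional vote weight produces a different member of the family without changing the template.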
Hyperparameter optimization in machine learning is a critical task that aims to find the hyper-param...
We address the problem of finding the parameter settings that will result in optimal performance of ...
The majority of the algorithms used to solve hard optimization problems today are population metaheu...
Given a large data set and a classification learning algorithm, Progressive Sampling (PS) uses incre...
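A minimal sketch of the progressive sampling loop, assuming a geometric sampling schedule and a simple plateau test on a held-out set; the growth factor and tolerance are illustrative choices, not values from the cited work.

    import numpy as np
    from sklearn.base import clone

    def progressive_sampling(estimator, X, y, X_val, y_val,
                             start_size=100, growth=2.0, tol=1e-3, seed=0):
        # Illustrative sketch: train on geometrically growing random samples
        # and stop once held-out accuracy no longer improves by more than tol.
        # Assumes len(X) >= start_size.
        rng = np.random.RandomState(seed)
        n, prev_score, model = start_size, -np.inf, None
        while int(n) <= len(X):
            idx = rng.choice(len(X), size=int(n), replace=False)
            model = clone(estimator).fit(X[idx], y[idx])
            score = model.score(X_val, y_val)
            if score - prev_score < tol:  # accuracy curve has plateaued
                break
            prev_score, n = score, n * growth
        return model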
For machine learning algorithms, fine-tuning hyperparameters is a computational challenge due to the...
The proposed metaheuristic optimization algorithm based on the two-step Adams-Bashforth scheme (MOAB...
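For reference, the classical two-step Adams-Bashforth update that such a scheme is built on is

    y_{n+1} = y_n + (h/2) * (3 f(t_n, y_n) - f(t_{n-1}, y_{n-1}))

How the MOAB algorithm maps this integrator onto candidate-solution updates is not visible from the truncated snippet.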
The field of machine learning has seen explosive growth over the past decade, largely due to increas...
Machine learning algorithms have been used widely in various applications and areas. To fit a machin...
Identifying the best machine learning algorithm for a given problem continues to be an active area o...