As the amount of data we are able to collect keeps growing, fitting Machine Learning and statistical models has become more computationally demanding. Moreover, especially in industry applications, these models need to be trained quickly and efficiently, and updated frequently. With this increased complexity comes the need to make the creation and training of models as automated as possible. In the first part of this thesis, we develop two methods to adaptively tune the learning rate of iterative methods for optimization and sampling. While the two settings are substantially different, a similar underlying idea, based on detecting stationarity of the updates, can be used to gain informatio...
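The stationarity-detection idea mentioned in this abstract can be illustrated with a minimal sketch. The sketch below assumes a Pflug-style diagnostic that halves a constant SGD step size once the running sum of inner products of consecutive stochastic gradients turns negative; the function names, the toy objective, and the test itself are illustrative assumptions, not the procedure developed in the thesis.

    import numpy as np

    def sgd_with_stationarity_test(grad, x0, lr=0.1, n_steps=5000, shrink=0.5, seed=0):
        """Constant-step-size SGD whose step size is halved when a simple
        stationarity test on consecutive stochastic gradients fires.
        Illustrative sketch only."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        prev_g = None
        inner_sum = 0.0
        for _ in range(n_steps):
            g = grad(x, rng)          # stochastic gradient at the current iterate
            x = x - lr * g            # constant-step-size SGD update
            if prev_g is not None:
                inner_sum += float(g @ prev_g)
                # Once the iterates oscillate around a stationary point for the
                # current step size, consecutive stochastic gradients become
                # negatively correlated on average, so the running sum drifts
                # below zero: shrink the step size and restart the test.
                if inner_sum < 0:
                    lr *= shrink
                    inner_sum = 0.0
            prev_g = g
        return x, lr

    # Toy usage: noisy gradients of f(x) = ||x||^2 / 2 (hypothetical example).
    noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
    x_hat, final_lr = sgd_with_stationarity_test(noisy_grad, x0=np.ones(3))

The intuition behind the test is that, before stationarity, updates move consistently in a descent direction (positive correlation), whereas near a stationary point the noise dominates and consecutive gradients point in roughly opposite directions on average.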
This thesis explores one of the most fundamental questions in Machine Learning, namely, how should t...
In machine learning, active learning is becoming increasingly widely used, especially for type...
Iterative compiler optimization has been shown to outperform static approaches. This, however, is at...
Modern technological advances have prompted massive-scale data collection in many modern fields such ...
The tuning of learning algorithm parameters has become increasingly important in recent years...
In addition to high accuracy, robustness is becoming increasingly important for machine learning mod...
Object classification by learning from data is a vast area of statistics and machine learning. Withi...
Training machine learning models requires users to select many tuning parameters. For example, a pop...
Kernel-based active learning strategies were studied for the optimization of environmental monitorin...
Active learning refers to settings in which a machine learning algorithm (the learner) is able to s...
Most active learning methods avoid model selection by training models of one type (SVMs, boosted tre...
Since performance is not portable between platforms, engineers must fine-tune heuristics for each pr...
Traditional supervised machine learning algorithms are expected to have access to a large corpus of ...
Non-active adaptive sampling is a way of building machine learning models from a training...