In this thesis we discuss machine learning methods that perform automated variable selection for learning sparse predictive models. There are several reasons to promote sparsity in predictive models. By relying on a limited set of input variables, sparse models naturally counteract the overfitting problem ubiquitous in learning from finite sets of training points. They are also cheaper to use for prediction: they typically require fewer computational resources and, by depending on smaller sets of inputs, can reduce the costs of data collection and storage. Finally, sparse models can contribute to a better understanding of the phenomena under investigation, as they are easier to interpret than full models.
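As a minimal illustrative sketch (not a method from this thesis), L1-regularized least squares solved with a simple proximal-gradient loop shows how sparsity performs automated variable selection: the soft-thresholding step drives the coefficients of irrelevant inputs exactly to zero, so the surviving coordinates are the selected variables. All names, dimensions, and parameter values below are assumptions chosen for the example.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the L1 norm: shrinks each entry toward zero,
    # setting it exactly to zero when its magnitude is below t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # Proximal gradient descent (ISTA) for the Lasso objective
    #   (1/2n) * ||y - X w||^2 + lam * ||w||_1
    n, d = X.shape
    w = np.zeros(d)
    # Step size = 1 / Lipschitz constant of the smooth part.
    step = n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Synthetic data: 20 candidate variables, only the first 3 are relevant.
rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
true_w = np.zeros(d)
true_w[:3] = [2.0, -1.5, 1.0]
y = X @ true_w + 0.1 * rng.standard_normal(n)

w_hat = lasso_ista(X, y, lam=0.1)
selected = np.flatnonzero(w_hat)  # indices of the variables the model keeps
```

Here `selected` recovers the relevant inputs while the remaining 17 coefficients are exactly zero, which is the sense in which the sparse model both counteracts overfitting and remains interpretable.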