We present a flexible formulation for variable selection in multi-task regression to allow for discrepancies in the estimated sparsity patterns across the multiple tasks, while leveraging the common structure among them. Our approach is based on an intuitive decomposition of the regression coefficients into a product between a component that is common to all tasks and another component that captures task-specificity. This decomposition yields the Multi-level Lasso objective that can be solved efficiently via alternating optimization. The analysis of the "orthonormal design" case reveals some interesting insights on the nature of the shrinkage performed by our method, compared to that of related work. Theoretical guarantees are provid...
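To make the decomposition concrete, below is a minimal sketch of the multiplicative structure this abstract describes: each per-task coefficient is modeled as a shared factor times a task-specific factor, fitted by alternating two Lasso-type steps. The function name multilevel_lasso, the penalty weights lam_shared and lam_task, the update order, and the fixed iteration count are illustrative assumptions; this is not the paper's exact algorithm, only a plausible rendering of the idea using scikit-learn's Lasso solver.

```python
# Sketch, assuming beta[t] = theta * gamma[t] with theta shared across tasks
# (non-negative) and gamma[t] task-specific, both encouraged to be sparse.
import numpy as np
from sklearn.linear_model import Lasso

def multilevel_lasso(Xs, ys, lam_shared=0.1, lam_task=0.1, n_iter=20):
    """Xs, ys: lists of per-task design matrices / target vectors,
    all sharing the same number of features p."""
    p = Xs[0].shape[1]
    T = len(Xs)
    theta = np.ones(p)                       # shared component
    gammas = [np.zeros(p) for _ in range(T)] # task-specific components

    for _ in range(n_iter):
        # Step 1: fix theta; update each gamma[t] by a Lasso on the
        # column-rescaled design X_t @ diag(theta).
        for t in range(T):
            m = Lasso(alpha=lam_task, fit_intercept=False, max_iter=5000)
            m.fit(Xs[t] * theta, ys[t])
            gammas[t] = m.coef_

        # Step 2: fix the gammas; update the shared theta by a single
        # non-negative Lasso on the stacked, gamma-rescaled designs.
        Z = np.vstack([Xs[t] * gammas[t] for t in range(T)])
        y_all = np.concatenate(ys)
        m = Lasso(alpha=lam_shared, fit_intercept=False,
                  positive=True, max_iter=5000)
        m.fit(Z, y_all)
        theta = m.coef_

    # Effective per-task coefficients: a feature is selected for task t
    # only if both the shared and the task-specific factors are nonzero.
    return [theta * g for g in gammas]
```

The point of the product form is visible in the last line: zeroing a coordinate of the shared factor removes that feature from every task, while the task-specific factors let the remaining sparsity patterns differ across tasks.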
Motivated by diagnostic applications in the field of clinical microbiology, we intr...
We consider the problem of learning a structured multi-task regression, where the output consists of...
Binary logistic regression with a sparsity constraint on the solution plays a vital role in many hig...
We consider the problem of learning a sparse multi-task regression, where the structure in the outpu...
In high dimension, it is customary to consider Lasso-type estimators to enforc...
Multitask learning can be effective when features useful in one task are also useful for other tasks...
In high dimensional settings, sparse structures are crucial for efficiency, bo...
In more and more applications, a quantity of interest may depend on several covariates, with at leas...
Regression models are a form of supervised learning methods that are important for machine learning,...
We address the problem of joint feature selection across a group of related classification or regres...
This thesis considers the problem of feature selection when the number of predictors is larger than ...
The application of the lasso is espoused in high-dimensional settings where only a small number of t...
We propose a new sparse regression method called the component lasso, based on a simple idea. The me...
We develop a Smooth Lasso for sparse, high dimensional, contingency tables and compare its performan...