We consider a multivariate finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an ℓ1-oracle inequality satisfied by the Lasso estimator with respect to the Kullback-Leibler loss. This result extends the ℓ1-oracle inequality established by Meynet to the multivariate case. We focus on the Lasso for its ℓ1-regularization properties rather than as a variable selection procedure, as was done in Städler et al.
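The ℓ1-penalized least-squares criterion behind the Lasso can be minimized by coordinate descent with soft-thresholding. The sketch below is not the multivariate mixture-of-regressions estimator studied in the abstract, only a minimal plain-regression Lasso in a p > n setting; the function names and the penalty level `lam` are illustrative choices, not from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: the proximal map of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-column curvature terms
    r = y - X @ b                      # running residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]        # remove coordinate j's contribution
            rho = X[:, j] @ r / n      # partial correlation with residual
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

# Toy high-dimensional example (p = 100 > n = 50) with a 3-sparse truth.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 100))
beta = np.zeros(100)
beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta + 0.1 * rng.standard_normal(50)
b_hat = lasso_cd(X, y, lam=0.2)
```

The ℓ1 penalty shrinks the many irrelevant coefficients to (near) zero while retaining the few true signals, which is the regularization property the abstract emphasizes.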
We propose a self-tuning √Lasso method that simultaneously resolves three important practical problems...
Linear mixed models (LMMs) are suitable for clustered data and are common in biometrics, medicine, s...
The Lasso is a method for high-dimensional regression, which is now commonly used when the number of...
We consider a finite mixture of Gaussian regression model for high-dimensional...
Mixture of experts (MoE) has a well-principled finite mixture model...
During the last few years, a great deal of attention has been focused on the Lasso and the Dantzig selector in ...
We consider the linear regression problem. We propose the S-Lasso procedure to estimate the unknown ...
The Lasso has attracted the attention of many authors in recent years. While many efforts have been...
In this thesis, we study the approximation capabilities, model estimation and selection properties, ...
Finite mixture regression models are useful for modeling the relationship between a response and predictors...
In more and more applications, a quantity of interest may depend on several covariates, with at least...
We consider the linear regression model with Gaussian error. We estimate the unknown paramet...
Mixture models for regression are used to model the relationship between the response ...
In this paper we study post-penalized estimators which apply ordinary, unpenalized linear regression...