We consider a multivariate finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an ℓ1-oracle inequality satisfied by the Lasso estimator with respect to the Kullback–Leibler loss. This result extends to the multivariate case the ℓ1-oracle inequality established by Meynet [ESAIM: PS 17 (2013) 650–671]. We focus on the Lasso for its ℓ1-regularization properties rather than as a variable selection procedure.
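The ℓ1 penalty that these abstracts revolve around can be made concrete with a minimal sketch: coordinate-descent Lasso for plain univariate linear regression, not the mixture, multivariate, or Cox settings discussed above. The function names `lasso_cd` and `soft_threshold` are illustrative, and the 1/(2n) scaling of the quadratic loss is one common convention, not taken from any of the cited papers.

```python
def soft_threshold(rho, lam):
    """Soft-thresholding operator: the closed-form proximal step for the l1 penalty."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate-descent Lasso for min_b (1/2n)||y - Xb||^2 + lam * ||b||_1.

    X is a list of rows, y a list of responses; returns the coefficient list.
    """
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            rho, z = 0.0, 0.0
            for i in range(n):
                # Partial residual: subtract every feature's contribution except j's.
                r_ij = y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                rho += X[i][j] * r_ij
                z += X[i][j] ** 2
            # Shrink the single-coordinate least-squares update toward zero.
            b[j] = soft_threshold(rho / n, lam) / (z / n)
    return b

# Toy data: y depends only on the first covariate (y = 2 * x1).
X = [[1.0, 1.0], [2.0, 1.0], [3.0, 1.0], [4.0, 1.0]]
y = [2.0, 4.0, 6.0, 8.0]
print(lasso_cd(X, y, lam=0.0))   # unpenalized: recovers [2.0, 0.0]
print(lasso_cd(X, y, lam=20.0))  # heavy penalty: both coefficients shrink to 0
```

The soft-thresholding step is what produces exact zeros, which is why the Lasso doubles as a variable selection device; the abstract above deliberately emphasizes the regularization role instead.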
We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox...
In more and more applications, a quantity of interest may depend on several covariates, with at leas...
We consider the problem of estimating a sparse linear regression vector $\beta^*$ under a ga...
Mixture of experts (MoE) has a well-principled finite mixture mod...
We consider a finite mixture of Gaussian regression model for high-di...
We consider the linear regression model with Gaussian error. We estimate the unknown paramet...
We consider a high-dimensional regression model with a possible change-point due to a cova...
Finite mixture regression models are useful for modeling the relationship between a response and pred...
During the last few years, a great deal of attention has been focused on Lasso and Dantzig selector ...
Mixture models for regression are used to model the relationship between the response ...
© 2015 The Authors Journal of the Royal Statistical Society: Series B (Statistics in Society) Publis...
We present a Group Lasso procedure for generalized linear models (GLMs) and we study the properties ...
During the last few years, a great deal of attention has been focused on Lasso and Dantzig selector ...