In this thesis, we consider a class of regularization techniques, called thresholding, which assumes that a certain transform of the parameter vector is sparse, meaning it has only a few nonzero coordinates. The parsimony assumption is natural for high-dimensional data, where the number of features of the model can be dramatically larger than the sample size. These techniques are indexed by a nonnegative tuning parameter which governs the sparsity level of the estimate. We first introduce the quantile universal threshold, a tuning parameter selection methodology which follows the same paradigm across various domains. We then propose a new class of testing procedures in linear models, thresholding tests, which are based on thresholding estimators. Finall...
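The two classic thresholding rules underlying estimators of this kind can be sketched as follows; this is a minimal illustration, not the specific procedure of any abstract listed here, and the function names and the tuning parameter `lam` are chosen for exposition only.

```python
import numpy as np

def hard_threshold(x, lam):
    """Keep coordinates whose magnitude exceeds lam; zero out the rest."""
    return np.where(np.abs(x) > lam, x, 0.0)

def soft_threshold(x, lam):
    """Shrink each coordinate toward zero by lam, truncating at zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# A larger lam forces more coordinates to exactly zero, i.e. a sparser estimate.
x = np.array([3.0, -0.5, 1.2, -2.5, 0.1])
hard = hard_threshold(x, 1.0)   # small coordinates set to 0, survivors unchanged
soft = soft_threshold(x, 1.0)   # survivors additionally shrunk by lam
```

Hard thresholding performs pure selection, while soft thresholding both selects and shrinks; the tuning parameter `lam` plays the role of the nonnegative sparsity-governing parameter described above.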
We revisit the problem of designing an efficient binary classifier in a challenging high-dimensional...
Thresholding is a regularization method commonly used for covariance estimation (Bickel and Levina, ...
High dimensional regression benefits from sparsity promoting regularization...
We review a family of model selection techniques called thresholding that assume the vector of param...
Given n noisy samples with p dimensions, where n ≪ p, we show that the multi-step thresholding proce...
Researchers in many disciplines face the formidable task of analyzing massive amounts of high-dimens...
This dissertation addresses two challenging problems with respect to feature selection in ...
We consider a problem of recovering a high-dimensional vector µ observed in white noise, where the u...
High-dimensional data is ubiquitous nowadays in many areas. Over the last twenty to thirty ...
The use of M-estimators in generalized linear regression models in high dimensional settings require...
We study the distribution of hard-, soft-, and adaptive soft-thresholding estimators within a linear...
High-dimensional correlated data pose challenges in model selection and predictive learning. The pr...
This thesis considers estimation and statistical inference for high-dimensional models with constrain...
The use of regularization, or penalization, has become increasingly common in high-dimensional statis...