This paper deals with subspace estimation in the small sample size regime, where the number of samples is comparable in magnitude to the observation dimension. Traditional estimators, mostly based on the sample correlation matrix, are known to perform well as long as the number of available samples is much larger than the observation dimension; in the small sample size regime, however, their performance degrades. Recently, based on random matrix theory results, a new subspace estimator was introduced and shown to be consistent in the asymptotic regime where the number of samples and the observation dimension converge to infinity at the same rate. In practice, this estimator outperforms the traditional ones even for certain scenar...
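As a point of reference for the abstract above, the "traditional" estimator it mentions can be sketched as taking the dominant eigenvectors of the sample covariance matrix. The model, dimensions, and SNR below are illustrative assumptions, not the paper's setup, and this is not the random-matrix-theory estimator the paper proposes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (hypothetical parameters): a K-dimensional signal
# subspace observed in white noise; M is the observation dimension.
M, K = 20, 2
A = np.linalg.qr(rng.standard_normal((M, K)))[0]  # true orthonormal basis

def sample_subspace(N, snr=5.0):
    """Traditional estimator: K dominant eigenvectors of the
    sample covariance matrix built from N snapshots."""
    S = np.sqrt(snr) * A @ rng.standard_normal((K, N))  # signal part
    X = S + rng.standard_normal((M, N))                 # additive white noise
    R = X @ X.T / N                                     # sample covariance
    _, V = np.linalg.eigh(R)                            # ascending eigenvalues
    return V[:, -K:]                                    # top-K eigenvectors

def subspace_error(Ahat):
    """Frobenius distance between true and estimated projectors."""
    return np.linalg.norm(A @ A.T - Ahat @ Ahat.T)

# The estimate is accurate when N >> M but noticeably worse when N ~ M,
# which is exactly the small-sample regime discussed above.
err_small = np.mean([subspace_error(sample_subspace(N=20)) for _ in range(50)])
err_large = np.mean([subspace_error(sample_subspace(N=2000)) for _ in range(50)])
```

Averaging over repeated draws makes the small-sample degradation visible even in this toy model.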
Sufficient dimension reduction (SDR) methods target finding lower-dimensional representations of a m...
In this thesis we shall consider sample covariance matrices Sn in the case when the dimension of the...
<p>For <i>strong</i> contrast (Task vs. Control), model performance is shown for different subspace ...
We consider the problem of subspace estimation in situations where the number ...
This paper addresses subspace-based estimation and its purpose is to complement previously availabl...
We investigate the estimation efficiency of the central mean subspace in the framework of sufficient...
We study sparse principal components analysis in high dimensions, where p (the number of variables) ...
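To make the sparse-PCA setting above concrete, here is a hedged sketch of one simple heuristic (power iteration with soft-thresholding) on a spiked-covariance model with p larger than n. The model, spike strength, and threshold are illustrative assumptions, not the estimator studied in that work:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative spiked model: a sparse leading direction in p > n data.
p, n, s = 50, 30, 5
v_true = np.zeros(p)
v_true[:s] = 1.0 / np.sqrt(s)          # sparse leading direction
X = 3.0 * rng.standard_normal((n, 1)) @ v_true[None, :] \
    + rng.standard_normal((n, p))      # signal spike plus white noise
S = X.T @ X / n                        # p x p sample covariance

def soft(x, lam):
    """Soft-thresholding operator: shrink entries toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

v = np.linalg.eigh(S)[1][:, -1]        # start from the leading eigenvector
for _ in range(100):
    v = soft(S @ v, lam=0.5)           # power step, then shrink small entries
    nv = np.linalg.norm(v)
    if nv == 0:                        # guard: threshold removed everything
        break
    v /= nv

support = np.flatnonzero(np.abs(v) > 1e-6)  # estimated sparse support
```

The thresholding step is what distinguishes this from plain PCA: it zeroes out coordinates whose loadings are small, yielding an interpretable sparse direction.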
Dimension reduction is often an important step in the analysis of high-dimensio...
We consider learning the principal subspace of a large set of vectors from an extremely small number...
In this paper one of the main open questions in the area of subspace methods is answered partly. One...
In regression with a high-dimensional predictor vector, dimension reduction methods aim at replacing...
In linear dimension reduction for a p-variate random vector x, the general idea is to find an orthog...
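The general recipe alluded to above can be sketched with PCA as the projection choice (an illustrative assumption; the abstract's own criterion may differ): find a matrix W with orthonormal columns and replace x by its k coordinates in that basis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch: linear dimension reduction maps a p-variate x to
# k < p coordinates via a basis W with orthonormal columns.  Here W is
# chosen by PCA, one common (assumed) choice among many.
p, k, n = 10, 3, 500
X = rng.standard_normal((n, p)) @ rng.standard_normal((p, p))  # correlated x

Xc = X - X.mean(axis=0)                 # center the data
C = Xc.T @ Xc / (n - 1)                 # p x p sample covariance
_, V = np.linalg.eigh(C)                # ascending eigenvalues
W = V[:, -k:]                           # top-k eigenvectors as the basis
Z = Xc @ W                              # n x k reduced representation

P = W @ W.T                             # the induced orthogonal projection
```

Since W has orthonormal columns, P = W Wᵀ is symmetric and idempotent, i.e. a genuine orthogonal projection onto the retained subspace.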
The questions brought by high-dimensional data are interesting and challenging. Our study is targetin...
In this dissertation, we discuss the problem of robust linear subspace estimation using low-rank ...
<p>For <i>weak</i> contrast (TaskB vs. TaskA), model performance is shown for different subspace est...