We study the problem of estimating the leading eigenvectors of a high-dimensional population covariance matrix from independent Gaussian observations. We establish a lower bound on the minimax risk of estimators under the l2 loss, in the joint limit as the dimension and the sample size increase to infinity, under various models of sparsity for the population eigenvectors. The lower bound on the risk points to the existence of different regimes of sparsity of the eigenvectors. We also propose a new method for estimating the eigenvectors via a two-stage coordinate selection scheme.
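The abstract does not spell out the two-stage coordinate selection scheme, but the general pattern it alludes to can be sketched as follows: first select a candidate set of coordinates by thresholding the sample variances (diagonal thresholding is one common first-stage rule), then run ordinary PCA on the covariance submatrix restricted to those coordinates. The function below is a minimal illustration under those assumptions; the function name, the threshold rule, and the multiplier `alpha` are illustrative choices, not the paper's method.

```python
import numpy as np

def two_stage_sparse_pca(X, k=1, alpha=1.5):
    """Illustrative two-stage coordinate-selection estimator of k sparse
    leading eigenvectors. The diagonal-thresholding first stage and the
    parameter alpha are assumptions for this sketch.

    X : (n, p) array of centered observations.
    """
    n, p = X.shape
    S = X.T @ X / n                        # sample covariance
    sigma2 = np.median(np.diag(S))         # rough noise-level estimate
    # Stage 1: keep coordinates whose sample variance exceeds the noise
    # level by a sqrt(log(p)/n)-scaled margin.
    tau = sigma2 * (1.0 + alpha * np.sqrt(np.log(p) / n))
    selected = np.flatnonzero(np.diag(S) > tau)
    if selected.size == 0:                 # fall back to the largest variance
        selected = np.array([int(np.argmax(np.diag(S)))])
    # Stage 2: PCA on the selected submatrix, embedded back into R^p.
    sub = S[np.ix_(selected, selected)]
    _, vecs = np.linalg.eigh(sub)          # eigenvalues in ascending order
    U = np.zeros((p, k))
    U[selected, :] = vecs[:, ::-1][:, :k]  # leading eigenvectors first
    return U

# Usage: a single-spike model with a 10-sparse leading eigenvector.
rng = np.random.default_rng(0)
p, n, s = 200, 100, 10
v = np.zeros(p)
v[:s] = 1.0 / np.sqrt(s)
X = rng.standard_normal((n, 1)) * (3.0 * v) + rng.standard_normal((n, p))
u_hat = two_stage_sparse_pca(X, k=1)
```

Because the spike is supported on few coordinates, the second-stage PCA operates on a low-dimensional submatrix, which is what makes consistent estimation possible in regimes where full-sample PCA fails.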
In recent years, sparse principal component analysis has emerged as an extremely popular dimension r...
This paper considers a sparse spiked covariance matrix model in the high-dimensional setting and stu...
Consider the standard Gaussian linear regression model Y = X theta(0) + epsilon, where Y is an eleme...
We study sparse principal components analysis in high dimensions, where p (the number of variables) ...
In sparse principal component analysis we are given noisy observations of a low-rank matrix of dime...
Sparse Principal Component Analysis (PCA) methods are efficient tools to reduce the dimension (or nu...
Principal Component Analysis (PCA) is a well-known technique from multivariate statistics. It is fre...