We study sparse principal components analysis in high dimensions, where p (the number of variables) can be much larger than n (the number of observations), and analyze the problem of estimating the subspace spanned by the principal eigenvectors of the population covariance matrix. We prove optimal, non-asymptotic lower and upper bounds on the minimax subspace estimation error under two different, but related notions of ℓq subspace sparsity for 0 ≤ q ≤ 1. Our upper bounds apply to general classes of covariance matrices, and they show that ℓq constrained estimates can achieve optimal minimax rates without restrictive spiked covariance conditions.
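For reference, the quantities this abstract refers to are commonly formalized as follows; these are the standard conventions of the sparse principal subspace literature, not definitions quoted from the paper itself. If the d leading population eigenvectors are collected into an orthonormal matrix V ∈ ℝ^{p×d} and V̂ is an orthonormal estimate, the subspace estimation error is typically the squared projection (sin-theta) distance, and one common notion of ℓq subspace sparsity bounds the row norms of V:

\[
\operatorname{dist}^2(\widehat V, V) \;=\; \bigl\lVert \widehat V \widehat V^{\top} - V V^{\top} \bigr\rVert_F^2,
\qquad
\sum_{j=1}^{p} \lVert V_{j\cdot} \rVert_2^{\,q} \;\le\; R_q \quad (0 < q \le 1),
\]

with the limiting case q = 0 interpreted as a bound on the number of nonzero rows of V. Minimax rates over such ℓq balls are then typically stated in terms of n, p, R_q, and spectral quantities of the population covariance matrix, such as the gap between its d-th and (d+1)-th eigenvalues.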
This paper considers a sparse spiked covariance matrix model in the high-dimensional setting and stu...
In recent years, sparse principal component analysis has emerged as an extremely popular dimension r...
We study the problem of estimating the leading eigenvectors of a high-dimensional population covaria...
We perform a finite sample analysis of the detection levels for sparse principal components of a hig...
In sparse principal component analysis we are given noisy observations of a low-rank matrix of dime...
Sparse Principal Component Analysis (PCA) methods are efficient tools to reduce the dimension (or nu...