In this paper, we introduce a new error measure, the integrated reconstruction error (IRE), and show that minimizing IRE yields the principal eigenvectors (without rotational ambiguity) of the data covariance matrix. We then present iterative algorithms for IRE minimization that use the projection approximation. The proposed algorithm is referred to as the COnstrained Projection Approximation (COPA) algorithm, and its limiting case is called COPAL. Numerical experiments demonstrate that these algorithms successfully find the exact principal eigenvectors of the data covariance matrix.
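The abstract above states that the iterative algorithms converge to the exact principal eigenvectors of the data covariance matrix. The COPA/COPAL updates themselves are not spelled out here, so the following is only a generic stand-in: a minimal NumPy sketch that forms the sample covariance matrix, runs plain subspace (orthogonal) iteration toward the leading eigenvectors, and compares against a direct eigendecomposition as the reference answer. All variable names (`X`, `C`, `Q`, `k`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic correlated data: 200 samples in 5 dimensions.
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

# Center the data and form the sample covariance matrix.
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (len(Xc) - 1)

# Generic subspace (orthogonal) iteration toward the top-k eigenvectors.
# NOTE: this is NOT the COPA/COPAL update, which the abstract does not detail.
k = 2
Q = np.linalg.qr(rng.normal(size=(C.shape[0], k)))[0]
for _ in range(1000):
    Q, _ = np.linalg.qr(C @ Q)  # multiply by C, re-orthonormalize

# Reference: exact principal eigenvectors via symmetric eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(C)           # ascending eigenvalues
V = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # leading k eigenvectors
```

For a symmetric positive semidefinite matrix with a spectral gap, the columns of `Q` converge to individual eigenvectors (up to sign), not merely to a rotated basis of the principal subspace — which mirrors the abstract's point that IRE minimization removes the rotational ambiguity left by subspace-only criteria.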
Projection spectral analysis is investigated and refined in this paper, in order to unify principal ...
We study the problem of estimating the leading eigenvectors of a high-dimensional populatio...
This paper presents a non-asymptotic statistical analysis of Kernel-PCA with a focus different from ...
Compressive-projection principal component analysis reconstructs vectors from random projections by...
Principal component analysis is an important pattern recognition and dimensionality reduction tool i...
Observed data often belong to some specific intervals of values (for instance in case of percentages...
We introduce two new methods of deriving the classical PCA in the framework of minimizing t...
The Principal Component Analysis (PCA) is a famous technique from multivariate statistics. It is fre...
We introduce a new method for sparse principal component analysis, based on the aggregation of eigen...
This paper introduces a Projected Principal Component Analysis (Projected-PCA), which is based on th...
We consider learning the principal subspace of a large set of vectors from an extremely small number...
We study the problem of finding the dominant eigenvector of the sample covariance matrix, under add...
Robust principal component analysis (RPCA) is a well-studied problem whose goal is to decompose a ma...
We introduce an algorithm for producing simple approximate principal components directly from a vari...
We develop in this work a new dimension reduction method for high-dimensional settings. The proposed...