A novel approximate representation of non-Gaussian random vectors is introduced and validated, which can be viewed as a Compressed Principal Component Analysis (CPCA). This representation relies on the eigenvectors of the covariance matrix, obtained as in a Principal Component Analysis (PCA), but expresses the random vector as a linear combination of a random sample of N of these eigenvectors. In this model, the indices of these eigenvectors are independent discrete random variables with probabilities proportional to the corresponding eigenvalues, and the coefficients of the linear combination are zero-mean, unit-variance random variables. Under these conditions, it is first shown that the covariance matrix of this ...
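The abstract describes the construction concretely enough to sketch it in code. The following is a minimal numpy sketch of the sampling scheme as read from the abstract, not the paper's implementation: the eigenvector indices are drawn independently with probabilities proportional to the eigenvalues, the coefficients are taken here as i.i.d. standard normal variables (any zero-mean, unit-variance choice would do), and the scaling factor sqrt(trace(C)/N) is an assumption chosen so that the covariance of the representation reproduces the target covariance. The mean vector, the function name cpca_sample, and all parameter names are likewise illustrative.

import numpy as np

def cpca_sample(mean, cov, N, n_draws, rng=None):
    # Sketch of a CPCA-style sampler (assumptions noted in the text above):
    # draw N eigenvector indices with probabilities proportional to the
    # eigenvalues, combine them with zero-mean unit-variance coefficients,
    # and scale by sqrt(trace(cov) / N) so the covariance is reproduced.
    rng = np.random.default_rng() if rng is None else rng
    eigvals, eigvecs = np.linalg.eigh(cov)       # PCA eigenpairs of the covariance
    eigvals = np.clip(eigvals, 0.0, None)        # guard against negative round-off
    probs = eigvals / eigvals.sum()              # index probabilities proportional to eigenvalues
    scale = np.sqrt(eigvals.sum() / N)           # assumed normalization (see lead-in)
    idx = rng.choice(len(eigvals), size=(n_draws, N), p=probs)   # random eigenvector indices
    xi = rng.standard_normal((n_draws, N))                       # zero-mean, unit-variance coefficients
    # eigvecs[:, idx] has shape (dim, n_draws, N); contract over the N sampled terms
    return mean + scale * np.einsum('sn,dsn->sd', xi, eigvecs[:, idx])

# Quick empirical check that the sample covariance of the draws approaches the target:
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
cov = A @ A.T
samples = cpca_sample(np.zeros(5), cov, N=3, n_draws=200_000, rng=rng)
print(np.abs(np.cov(samples, rowvar=False) - cov).max())   # small for large n_draws

With these choices the covariance of the representation works out to (trace(C)/N) * N * sum_i (lambda_i / trace(C)) * phi_i phi_i^T = C, which is why the empirical covariance in the check converges to the target; whether this is the exact normalization used in the paper cannot be confirmed from the truncated abstract.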
We introduce a new method for sparse principal component analysis, based on the aggregation of eigen...
Sparse polynomial chaos expansions have recently emerged in uncertainty quantification ana...
In sparse principal component analysis we are given noisy observations of a low-rank matrix of di-me...
In this paper, we present a random matrix approach to recover sparse principal...
Robust Principal Component Analysis (PCA) (or robust subspace recovery) is a particularly important ...
Summarising a high dimensional data set with a low dimensional embedding is a standard approach for ...
Principal component analysis (PCA) is a widespread technique for data analysis that relies on the co...
Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data s...
The class of complex random vectors whose covariance matrix is linearly parameterized by ...
Let the kp-variate random vector X be partitioned into k subvectors Xi of dimension p each, ...
In this article, we present new ideas concerning Non-Gaussian Component Analysis (NGCA). We use the ...
Consider a Bernoulli-Gaussian complex n-vector whose components are Vi = XiBi, with Xi ∼ CN (0,Px) a...
Kernel Principal Component Analysis (Kernel PCA) is a useful technique to extract nonlinear structur...