In many areas of machine learning, it becomes necessary to find the eigenvector decompositions of large matrices. We discuss two methods for reducing the computational burden of spectral decompositions: the more venerable Nyström extension and a newly introduced algorithm based on random projections. Previous work has centered on the ability to reconstruct the original matrix. We argue that a more interesting and relevant comparison is their relative performance in clustering and classification tasks using the approximate eigenvectors as features. We demonstrate that performance is task specific and depends on the rank of the approximation.
In this thesis, we investigate how well we can reconstruct the best rank-k approximation of a large ...
Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding ...
Abstract. Spectral methods have received attention as powerful theoretical and practical approaches...
Spectral clustering is arguably one of the most important algorithms in data mining and machine inte...
The Nyström method is an efficient technique for the eigenvalue decomposition of large kernel matric...
The kernel (or similarity) matrix plays a key role in many machine learning algorithms such as kernel me...
Spectral clustering refers to a class of techniques which rely on the eigenstructure of a similarity...
Abstract. We propose and analyze a fast spectral clustering algorithm with computational complexity ...
Despite many empirical successes of spectral clustering methods-algorithms that cluster points using...
Spectral methods requiring the computation of eigenvalues and eigenvectors of a positive definite ma...
Spectral clustering refers to a family of well-known unsupervised learning alg...
Leveraging on recent random matrix advances in the performance analysis of ker...
This article introduces a random matrix framework for the analysis of clusteri...
In this thesis we develop a spectral approach to large kernel matrices, graphs and the Hessians of n...
© 2017 IEEE. Spectral methods refer to the problem of finding eigenvectors of an affinity matrix. De...