We develop an improved bound for the approximation error of the Nyström method under the assumption that there is a large eigengap in the spectrum of the kernel matrix. This is motivated by the empirical observation that the eigengap has a significant impact on the approximation error of the Nyström method. Our approach combines a concentration inequality for integral operators with the theory of matrix perturbation. Our analysis shows that when there is a large eigengap, the approximation error of the Nyström method improves from O(N/m^{1/4}) to O(N/m^{1/2}) when measured in the Frobenius norm, where N is the size of the kernel matrix and m is the number of sampled columns.
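To make the quantities in the bound concrete, here is a minimal sketch of the standard Nyström approximation and the Frobenius-norm error it controls. This is not the paper's experimental setup: the RBF kernel, the gamma parameter, uniform column sampling, and the helper names rbf_kernel and nystrom_approximation are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian/RBF kernel from pairwise squared distances (assumed kernel choice).
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_approximation(X, m, gamma=1.0, seed=None):
    """Approximate the N x N kernel matrix from m uniformly sampled columns."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    idx = rng.choice(N, size=m, replace=False)   # indices of sampled columns
    C = rbf_kernel(X, X[idx], gamma)             # N x m block of sampled columns
    W = C[idx, :]                                # m x m intersection block
    # Nystrom estimate K_hat = C W^+ C^T; pseudo-inverse for numerical safety.
    return C @ np.linalg.pinv(W) @ C.T

# Frobenius-norm approximation error ||K - K_hat||_F, the quantity the bound measures.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
K = rbf_kernel(X, X)
K_hat = nystrom_approximation(X, m=50, seed=1)
print(np.linalg.norm(K - K_hat, ord="fro"))
```

Under the stated result, increasing the number of sampled columns m should shrink this error at the faster O(N/m^{1/2}) rate when the kernel spectrum has a large eigengap.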