We present a framework for efficient extrapolation of reduced-rank approximations, graph kernels, and locally linear embeddings (LLE) to unseen data. We also present a principled method to combine many such kernels and then extrapolate them. Central to our method are a theorem for matrix approximation and an extension of the representer theorem that handles multiple joint regularization constraints. Experiments in protein classification demonstrate the feasibility of our approach.
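The abstract's method for extrapolating a reduced-rank kernel approximation to unseen points is not spelled out here, but a Nyström-style low-rank construction illustrates the idea: approximate the full kernel matrix from a small landmark set, then evaluate new points against only the landmarks. This is a minimal sketch under that assumption, not the paper's exact algorithm; the RBF kernel, landmark count, and data are illustrative choices.

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    # Gaussian RBF kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# Reduced-rank (Nystrom-style) approximation from 10 landmark points.
landmarks = X[:10]
W = rbf(landmarks, landmarks)           # 10 x 10 core kernel
C = rbf(X, landmarks)                   # 100 x 10 cross block
K_approx = C @ np.linalg.pinv(W) @ C.T  # rank-10 approximation of the 100 x 100 kernel

# Extrapolate to unseen data: only kernel values against the landmarks
# are needed, never the full kernel over old and new points combined.
X_new = rng.normal(size=(5, 3))
C_new = rbf(X_new, landmarks)
K_cross = C_new @ np.linalg.pinv(W) @ C.T  # 5 x 100 cross-kernel to training set
```

On the rows spanned by the landmarks the approximation is exact, which is the property that makes this kind of extrapolation cheap: new points cost one kernel evaluation per landmark rather than per training point.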
We propose hashing to facilitate efficient kernels. This generalizes previous work using sampling an...
In this paper, we propose a new nonlinear dimensionality reduction algorithm called regularized Ker...
Graphs are often used to describe and analyze the geometry and physicochemical compositio...
We develop and apply a novel framework which is designed to extract information in the form of a pos...
Positive definite kernels between labeled graphs have recently been proposed. They enable the applicat...
In kernel-based methods such as Regularization Networks, large datasets pose significant problems s...
Kernel functions are used in a variety of scientific settings to measure relationships or i...
We present an algorithm based on convex optimization for constructing kernels for semi-supervised l...
Kernel conditional random fields are introduced as a framework for discriminative modeling of graph-...
We present a kernel-based framework for pattern recognition, regression estimation, function approxi...
We present a unified framework to study graph kernels, special cases of which include the random wa...
We interpret several well-known algorithms for dimensionality reduction of manifolds as kernel metho...