We present a novel approach to learning a kernel-based regression function. It is based on the use of conical combinations of data-based parameterized kernels and on a new stochastic convex optimization procedure for which we establish convergence guarantees. The overall learning procedure has the nice properties that a) the learned conical combination is automatically tailored to the regression task at hand and b) the updates required by the optimization procedure are quite inexpensive. In order to shed light on the appositeness of our learning strategy, we present empirical results from experiments conducted on various benchmark datasets.
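The abstract does not spell out the algorithm, but the core object it names, a conical (nonnegative-weighted) combination of base kernels used for regression, can be sketched. Below is a minimal illustrative example using a fixed conical combination of Gaussian kernels inside kernel ridge regression; the bandwidths, weights, and regularization constant are hypothetical choices for illustration, and this is a plain closed-form solve, not the paper's stochastic optimization procedure.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def conical_kernel(X, Y, gammas, weights):
    """Conical combination of base kernels: nonnegative weights keep it a valid kernel."""
    assert np.all(np.asarray(weights) >= 0), "conical combination needs nonnegative weights"
    return sum(w * gaussian_kernel(X, Y, g) for g, w in zip(gammas, weights))

def fit_kernel_ridge(X, y, gammas, weights, lam=1e-3):
    """Solve (K + lam*I) alpha = y for the dual coefficients of kernel ridge regression."""
    K = conical_kernel(X, X, gammas, weights)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, X_test, alpha, gammas, weights):
    """Evaluate the learned function at test points via the combined kernel."""
    return conical_kernel(X_test, X_train, gammas, weights) @ alpha

# Toy 1-D regression problem (illustrative data, not from the paper's benchmarks).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(60)

gammas = [0.5, 2.0, 8.0]             # hypothetical base-kernel bandwidths
weights = np.array([0.2, 0.5, 0.3])  # hypothetical fixed nonnegative weights
alpha = fit_kernel_ridge(X, y, gammas, weights)
y_hat = predict(X, X, alpha, gammas, weights)
print("train MSE:", float(np.mean((y_hat - y) ** 2)))
```

In the approach described above, the weights would be learned jointly with the regression function rather than fixed in advance as they are here.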
Output kernel learning techniques make it possible to simultaneously learn a vector-valued function and a posit...
Kernel-based learning algorithms are well-known to poorly scale to large-scale applications. For suc...
New optimization models and algorithms for online learning with kernels (OLK) in regression are prop...
This paper studies the general problem of learning kernels based on a polynomial combination of base...
Kernel methods are a well-studied approach for addressing regression problems by implicitly mapping ...
Kernel selection is a central issue in kernel methods of machine learning. In this paper, we investi...
We propose a stochastic gradient descent algorithm for learning the gradient of a regression...
The paper studies convex stochastic optimization problems in a reproducing kernel Hilbert space (RKH...
We briefly describe the main ideas of statistical learning theory, support vector machines, and kern...
Previously, weighted kernel regression (WKR) for solving small samples problem has been reported. Th...
We follow a learning theory viewpoint to study a family of learning schemes for regression related t...