When dealing with kernel methods, one has to decide which kernel and which hyperparameter values to use. Resampling techniques can address this issue, but such procedures are time-consuming. The problem is particularly challenging for structured data, and for graphs in particular: several kernels for graph data have been proposed in the literature, but no clear relationship among them in terms of learning properties has been established. In these cases, exhaustive search seems to be the only reasonable approach. Recently, the global Rademacher complexity (RC) and the local Rademacher complexity (LRC), two powerful measures of the complexity of a hypothesis space, have been shown to be suitable for studying the properties of kernels. In particula...
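To make the RC-based selection criterion concrete, here is a minimal sketch (my own illustration, not code from any of the abstracts above) of a Monte Carlo estimate of the empirical Rademacher complexity of the radius-B ball in the RKHS induced by a kernel matrix K; such an estimate can be compared across candidate graph kernels without resampling-based model selection:

```python
import numpy as np

def empirical_rademacher(K, B=1.0, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of the
    ball of radius B in the RKHS of the Gram matrix K:
        R_hat = (B / n) * E_sigma[ sqrt(sigma^T K sigma) ],
    where sigma is a vector of i.i.d. uniform {-1, +1} (Rademacher) signs.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    # One row per draw: a Rademacher sign vector of length n.
    sigmas = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # Quadratic form sigma^T K sigma for every draw at once.
    quad = np.einsum('ij,jk,ik->i', sigmas, K, sigmas)
    return B / n * np.sqrt(np.clip(quad, 0.0, None)).mean()
```

As a sanity check, for the identity kernel every quadratic form equals n, so the estimate is exactly B * sqrt(n) / n; by Jensen's inequality the estimate is also bounded by (B / n) * sqrt(trace(K)).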
In this paper we develop a novel probabilistic generalization bound for regularized kernel learning...
In this paper we develop a novel generalization bound for the problem of learning the kernel. First, we sho...
We derive here new generalization bounds, based on Rademacher Complexity theory, for model selection...
Graph kernels are widely adopted in real-world applications that involve learning on graph data. Di...
Kernels for structures, including graphs, generally suffer from the diagonally dominant Gram matri...
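As a hedged illustration of the diagonal-dominance issue mentioned above (the function names and the specific remedy are my own choices for this sketch, not taken from the abstract), the snippet below measures how strongly a Gram matrix's diagonal dominates its off-diagonal entries and applies a simple diagonal shrinkage, clipping negative eigenvalues afterwards so the result remains a valid positive semidefinite kernel:

```python
import numpy as np

def dominance_ratio(K):
    """Mean diagonal entry divided by mean off-diagonal entry;
    values far above 1 indicate a diagonally dominant Gram matrix,
    i.e. every structure looks much more similar to itself than
    to any other structure."""
    n = K.shape[0]
    off_mean = (K.sum() - np.trace(K)) / (n * (n - 1))
    return (np.trace(K) / n) / off_mean

def shrink_diagonal(K, alpha):
    """Shrink the diagonal by a factor alpha in [0, 1), then clip
    negative eigenvalues so the result stays positive semidefinite."""
    Ks = K - alpha * np.diag(np.diag(K))
    w, V = np.linalg.eigh(Ks)
    return (V * np.clip(w, 0.0, None)) @ V.T
```

The eigenvalue clipping is the price of shrinking the diagonal naively: subtracting mass from the diagonal can push the matrix out of the PSD cone, and projecting back via `eigh` restores a usable kernel.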
We use the notion of local Rademacher complexity to design new algorithms for learning kernels. Our ...
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of com...