We study algorithms for online nonparametric regression that learn the directions along which the regression function is smoother. Our algorithm learns the Mahalanobis metric based on the gradient outer product matrix G of the regression function (automatically adapting to the effective rank of this matrix), while simultaneously bounding the regret, on the same data sequence, in terms of the spectrum of G. As a preliminary step in our analysis, we extend a nonparametric online learning algorithm by Hazan and Megiddo, enabling it to compete against functions whose Lipschitzness is measured with respect to an arbitrary Mahalanobis metric.
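The two central objects in the abstract can be illustrated concretely. The sketch below (a minimal illustration, not the paper's algorithm; all function and variable names are assumed) estimates the gradient outer product matrix G = E[∇f(x) ∇f(x)ᵀ] by averaging gradient outer products over sample points, then evaluates the Mahalanobis distance ρ_G(x, y) = √((x−y)ᵀ G (x−y)) induced by G. For a function that varies only along the first coordinate, G is rank one, so the metric collapses the direction in which f is flat:

```python
import numpy as np

def gradient_outer_product(grad_fn, xs):
    """Estimate G = (1/n) * sum_i grad f(x_i) grad f(x_i)^T from sample points xs."""
    grads = np.array([grad_fn(x) for x in xs])  # shape (n, d)
    return grads.T @ grads / len(xs)            # shape (d, d)

def mahalanobis(x, y, G):
    """Mahalanobis distance induced by a PSD matrix G: sqrt((x-y)^T G (x-y))."""
    d = x - y
    return float(np.sqrt(d @ G @ d))

# Toy regression function f(x) = sin(5*x1), constant along x2.
grad_fn = lambda x: np.array([5.0 * np.cos(5.0 * x[0]), 0.0])

rng = np.random.default_rng(0)
xs = rng.uniform(0.0, 1.0, size=(500, 2))
G = gradient_outer_product(grad_fn, xs)

# The metric ignores displacements along x2, where f is flat,
# but penalizes displacements along x1, where f varies.
d_flat = mahalanobis(np.zeros(2), np.array([0.0, 1.0]), G)
d_steep = mahalanobis(np.zeros(2), np.array([1.0, 0.0]), G)
```

Here the effective rank of G is 1, which is the kind of low-dimensional structure the abstract says the algorithm adapts to: under ρ_G, points differing only in the flat direction are treated as close, so the regression function is 1-Lipschitz-like with respect to a metric whose spectrum reflects where f actually varies.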