In this paper we analyze a random feature map based on a theory of invariance (I-theory) introduced in [1]. Specifically, a group-invariant signal signature is obtained through cumulative distributions of group-transformed random projections. Our analysis bridges invariant feature learning and kernel methods: we show that this feature map defines an expected Haar-integration kernel that is invariant to the specified group action. We show that this non-linear random feature map approximates the group-invariant kernel uniformly on a set of N points. Moreover, we show that it defines a function space that is dense in the equivalent invariant Reproducing Kernel Hilbert Space. Finally, we quantify error rates of the convergence of the e...
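The construction sketched in the abstract — an invariant signature built from cumulative distributions of group-transformed random projections — can be illustrated for a simple finite group. The sketch below is a minimal illustration, not the paper's implementation: it uses cyclic shifts as a stand-in for the generic group action, and the threshold grid and all names are assumptions for the example. For each random template, the signal's whole group orbit is projected onto the template and the empirical CDF of those projections is recorded; averaging over the orbit is a (Haar-)integration over the group, so the signature is shift-invariant by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def invariant_signature(x, W, thresholds):
    """Sketch of an I-theory-style invariant feature map.

    For each random template w (a column of W), project every cyclic
    shift of x onto w, then evaluate the empirical CDF of those
    projections at fixed thresholds. The cyclic-shift group is an
    illustrative stand-in for the generic group action in the paper.
    """
    d = len(x)
    # Orbit of x under the finite group of cyclic shifts.
    orbit = np.stack([np.roll(x, g) for g in range(d)])    # (|G|, d)
    proj = orbit @ W                                       # (|G|, m)
    # Empirical CDF over the orbit, one value per (template, threshold).
    cdf = (proj[:, :, None] <= thresholds[None, None, :]).mean(axis=0)
    return cdf.ravel()                                     # (m * n_bins,)

d, m = 16, 32
W = rng.standard_normal((d, m))          # random templates
thresholds = np.linspace(-3.0, 3.0, 10)  # assumed CDF evaluation grid

x = rng.standard_normal(d)
s1 = invariant_signature(x, W, thresholds)
s2 = invariant_signature(np.roll(x, 5), W, thresholds)  # shifted copy
assert np.allclose(s1, s2)  # signature is invariant to cyclic shifts
```

The inner product of two such signatures then acts as a random-feature approximation of a group-invariant kernel, in the spirit of the expected Haar-integration kernel the abstract describes.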
The success of deep convolutional architectures is often attributed in part to...
Due to spurious correlations, machine learning systems often fail to generalize to environments whos...
We propose a Gradient Boosting algorithm for learning an ensemble of kernel functions adapted to the...
We consider the prob...
Abstract. In many learning problems prior knowledge about pattern variations can be formalized and b...
Approximations based on random Fourier features have recently emerged as an efficient and formally c...
The present phase of Machine Learning is characterized by supervised learning algorithms relying on ...
Random Fourier features are a powerful framework to approximate shift invariant kernels with Monte C...
When solving data analysis problems it is important to integrate prior knowledge and/or structural ...
Kernel methods and neural networks are two important schemes in the supervised learning field. The t...
This paper jointly leverages two state-of-the-art learning strategies gradient...
We study the problem of learning from data representations that are invariant to transformations, an...
We show that the relevant information of a supervised learning problem is contained up to negligible...
A major paradigm for learning image representations in a self-supervised manner is to learn a model ...
With the goal of accelerating the training and testing complexity of nonlinear kernel methods, seve...