see also: 2020 Generalisation error in learning with random features and the hidden manifold model, Proc. 37th Int. Conf. on Machine Learning, vol 119, ed H Daumé III and A Singh, pp 3452–62.
We study generalised linear regression and classification for a synthetically generated dataset encompassing different problems of interest, such as learning with random features, neural networks in the lazy training regime, and the hidden manifold model. We consider the high-dimensional regime and, using the replica method from statistical physics, provide a closed-form expression for the asymptotic generalisation performance in these problems, valid in both the under- and over-parametrised regimes and for a broad choice of generalised li...
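As a rough illustration of the setup that abstract describes (inputs generated from a low-dimensional hidden manifold, a fixed random-features first layer trained only at the readout, ridge regression), here is a minimal numerical sketch. It is not the paper's code and does not implement the replica formula; all dimensions, the tanh/ReLU choices, the teacher, and the ridge penalty `lam` are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): random-features ridge regression on a
# hidden-manifold-style dataset, with the test error estimated numerically.
# All sizes, activations, and the penalty `lam` below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, latent, p, n, n_test, lam = 200, 30, 400, 600, 2000, 1e-2

# Hidden-manifold-style inputs: x = tanh(c F / sqrt(latent)) with low-dimensional latent c.
F = rng.standard_normal((latent, d))
teacher = rng.standard_normal(latent)

def make_data(m):
    c = rng.standard_normal((m, latent))
    x = np.tanh(c @ F / np.sqrt(latent))      # data living near a nonlinear manifold
    y = c @ teacher / np.sqrt(latent)         # labels depend only on the latent variables
    return x, y

X_train, y_train = make_data(n)
X_test, y_test = make_data(n_test)

# Random-features map: z = relu(x W / sqrt(d)) with fixed random first-layer weights W,
# i.e. a two-layer network in the lazy regime where only the readout is trained.
W = rng.standard_normal((d, p))
relu = lambda t: np.maximum(t, 0.0)
Z_train = relu(X_train @ W / np.sqrt(d))
Z_test = relu(X_test @ W / np.sqrt(d))

# Ridge regression on the random features (closed-form regularised least squares).
w = np.linalg.solve(Z_train.T @ Z_train + lam * np.eye(p), Z_train.T @ y_train)

train_err = np.mean((Z_train @ w - y_train) ** 2)
test_err = np.mean((Z_test @ w - y_test) ** 2)
print(f"train MSE = {train_err:.4f}, test MSE = {test_err:.4f}")
```

Keeping the first-layer weights W fixed and fitting only the readout is what places this sketch in the random-features/lazy-training regime the abstract refers to; the closed-form asymptotics in the paper predict the test error of exactly this kind of estimator as the dimensions grow proportionally.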
We prove a non-asymptotic distribution-independent lower bound for the expected mean squared general...
Regression models usually tend to recover a noisy signal in the form of a combination of regressors,...
We study the generalization properties of ridge regression with random features in the statistical l...
We study generalization properties of random features (RF) regression in high dimensions optimized b...
We prove a universality theorem for learning with random features. Our result shows that, in terms o...
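To make concrete what a universality statement of this kind typically asserts, here is a hedged numerical sketch (my own illustration, not the paper's construction): ridge regression on ReLU random features is compared against a "Gaussian equivalent" feature model with matched first and second moments, whose test errors are expected to coincide in the proportional high-dimensional limit. The linear teacher, noise level, and all sizes below are assumptions.

```python
# Rough illustration (assumption-laden, not the theorem's proof or code):
# compare ridge regression on relu(xW/sqrt(d)) random features with a Gaussian
# surrogate whose first and second moments match those of the relu features.
import numpy as np

rng = np.random.default_rng(1)
d, p, n, n_test, lam = 300, 450, 600, 4000, 1e-1

W = rng.standard_normal((d, p))
beta = rng.standard_normal(d) / np.sqrt(d)       # linear teacher (an assumption)

def data(m):
    X = rng.standard_normal((m, d))
    return X, X @ beta + 0.1 * rng.standard_normal(m)

def ridge_test_mse(Ztr, ytr, Zte, yte):
    w = np.linalg.solve(Ztr.T @ Ztr + lam * np.eye(Ztr.shape[1]), Ztr.T @ ytr)
    return np.mean((Zte @ w - yte) ** 2)

Xtr, ytr = data(n)
Xte, yte = data(n_test)

# Nonlinear random features.
relu = lambda t: np.maximum(t, 0.0)
err_rf = ridge_test_mse(relu(Xtr @ W / np.sqrt(d)), ytr,
                        relu(Xte @ W / np.sqrt(d)), yte)

# Gaussian-equivalent features: mu0 + mu1 * (xW/sqrt(d)) + mu_star * independent noise,
# with mu0, mu1, mu_star the matched moments of relu under a standard Gaussian input.
mu0, mu1 = 1 / np.sqrt(2 * np.pi), 0.5
mu_star = np.sqrt(0.5 - mu1**2 - mu0**2)

def gaussian_equiv(X):
    return mu0 + mu1 * (X @ W / np.sqrt(d)) + mu_star * rng.standard_normal((X.shape[0], p))

err_ge = ridge_test_mse(gaussian_equiv(Xtr), ytr, gaussian_equiv(Xte), yte)
print(f"random-features test MSE = {err_rf:.4f}, Gaussian-equivalent test MSE = {err_ge:.4f}")
```

At these moderate sizes the two errors should already be close, and universality results of this type say the agreement becomes exact as d, p, and n grow proportionally.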
The lack of crisp mathematical models that capture the structure of real-world data sets is a major ...
Machine learning models are typically configured by minimizing the training error over a given train...
Understanding how feature learning affects generalization is among the foremost goals of modern deep...
Modern machine learning models, particularly those used in deep networks, are characterized by massi...
In this work, we provide a characterization of the feature-learning process in two-layer ReLU networ...
Kernel methods and neural networks are two important schemes in the supervised learning field. The t...
We consider a commonly studied supervised clas...
From the sampling of data to the initialisation of parameters, randomness is ubiquitous in modern Ma...