We study generalised linear regression and classification for a synthetically generated dataset encompassing several problems of interest, such as learning with random features, neural networks in the lazy training regime, and the hidden manifold model. We consider the high-dimensional regime and, using the replica method from statistical physics, provide a closed-form expression for the asymptotic generalisation performance in these problems, valid in both the under- and over-parametrised regimes and for a broad choice of generalised linear model loss functions. In particular, we show how to obtain analytically the so-called double descent behaviour for logistic regression, with a peak at the interpolation threshold; we illustrate the s...
This manuscript considers the problem of learning a random Gaussian network function using a fully c...
Understanding the reasons for the success of deep neural networks trained using stochastic gradient-...
We study the generalization properties of ridge regression with random features in the statistical l...
see also: 2020 Generalisation error in learning with random features and the hidden manifold model, ...
The lack of crisp mathematical models that capture the structure of real-world data sets is a major ...
We prove a universality theorem for learning with random features. Our result shows that, in terms o...
We study generalization properties of random features (RF) regression in high dimensions optimized b...
Modern machine learning models, particularly those used in deep networks, are characterized by massi...
Machine learning models are typically configured by minimizing the training error over a given train...
In this work, we provide a characterization of the feature-learning process in two-layer ReLU networ...
Understanding how feature learning affects generalization is among the foremost goals of modern deep...
11 pages + 45 pages Supplementary Material / 5 figures. We consider a commonly studied supervised clas...
Regression models usually recover a noisy signal in the form of a combination of regressors,...
Kernel methods and neural networks are two important schemes in the supervised learning field. The t...
We compute precise asymptotic expressions for the learning curves of least squares random feature (R...
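Several of the abstracts above concern random-features regression. As a concrete illustration of that model class, here is a minimal sketch of random-features ridge regression; the dimensions, nonlinearity, and teacher function below are illustrative assumptions, not taken from any specific paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d, p, n = 50, 200, 300   # input dimension, number of random features, samples
lam = 1e-3               # ridge regularisation strength

# A fixed random projection F and a nonlinearity define the feature map.
F = rng.standard_normal((p, d)) / np.sqrt(d)
phi = lambda x: np.maximum(x, 0.0)   # ReLU features

# Synthetic "teacher": a noisy linear function of the inputs.
w_star = rng.standard_normal(d) / np.sqrt(d)
X = rng.standard_normal((n, d))
y = X @ w_star + 0.1 * rng.standard_normal(n)

# Ridge regression on the random features Z = phi(X F^T).
Z = phi(X @ F.T)
w = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ y)

# Generalisation error estimated on fresh test data.
X_test = rng.standard_normal((2000, d))
y_test = X_test @ w_star
mse = np.mean((phi(X_test @ F.T) @ w - y_test) ** 2)
print(f"test MSE: {mse:.3f}")
```

Sweeping the ratio p/n in a loop over this sketch is the standard way to trace out the learning curves (including double descent near p ≈ n) that the papers above characterise analytically.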