A general analysis of the limiting distribution of neural network functions is performed, with emphasis on non-Gaussian limits. We show that with i.i.d. symmetric stable output weights, and more generally with weights drawn from the normal domain of attraction of a stable variable, the neural functions converge in distribution to stable processes. Conditions are also investigated under which Gaussian limits do occur when the weights are independent but not identically distributed. Some particularly tractable classes of stable distributions are examined, as is the possibility of learning with such processes.
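As a rough numerical illustration of the first claim, the following is a minimal sketch (not the paper's construction) of a one-hidden-layer tanh network whose output weights are i.i.d. symmetric α-stable; the n^(-1/α) scaling and the use of scipy.stats.levy_stable are illustrative assumptions. For α < 2 the sampled outputs show the heavy tails expected of a stable, rather than Gaussian, limit.

```python
import numpy as np
from scipy.stats import levy_stable

# Illustrative sketch: f_n(x) = n**(-1/alpha) * sum_j v_j * tanh(w_j . x + b_j),
# with v_j i.i.d. symmetric alpha-stable. As the width n grows, f_n(x) should
# converge in distribution to a symmetric alpha-stable variable (alpha = 2
# recovers the familiar Gaussian limit).

rng = np.random.default_rng(0)
alpha = 1.5    # stability index of the output weights
n = 5000       # hidden-layer width
d = 3          # input dimension
x = np.ones(d)

def network_output(n, alpha, x, rng):
    W = rng.normal(size=(n, d))            # input weights
    b = rng.normal(size=n)                 # biases
    v = levy_stable.rvs(alpha, 0.0, size=n,
                        random_state=rng)  # symmetric stable output weights
    return n ** (-1.0 / alpha) * np.sum(v * np.tanh(W @ x + b))

samples = np.array([network_output(n, alpha, x, rng) for _ in range(2000)])
# Heavy tails: a noticeable fraction of samples far exceeds the typical scale.
print("fraction of |f_n(x)| beyond 5x the median absolute value:",
      np.mean(np.abs(samples) > 5 * np.median(np.abs(samples))))
```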