We investigate the local spectral statistics of the loss surface Hessians of artificial neural networks, where we discover agreement with Gaussian Orthogonal Ensemble statistics across several network architectures and datasets. These results shed new light on the applicability of Random Matrix Theory to modelling neural networks and suggest a role for it in the study of loss surfaces in deep learning.
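The abstract does not say which local statistics were computed, so the sketch below is only an illustration of the kind of diagnostic involved, not the paper's code. It samples a Gaussian Orthogonal Ensemble matrix in NumPy and computes consecutive level-spacing ratios, whose mean is known to be roughly 0.5307 for GOE and roughly 0.386 for uncorrelated (Poisson) levels; for a network, the same statistic would be applied to the eigenvalues of the loss Hessian.

```python
import numpy as np

rng = np.random.default_rng(0)

def goe_matrix(n, rng):
    """Sample an n x n symmetric matrix from the Gaussian Orthogonal Ensemble."""
    a = rng.normal(size=(n, n))
    return (a + a.T) / np.sqrt(2 * n)

def spacing_ratios(eigs):
    """Consecutive spacing ratios r_i = min(s_i, s_{i+1}) / max(s_i, s_{i+1})."""
    s = np.diff(np.sort(eigs))
    return np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])

eigs = np.linalg.eigvalsh(goe_matrix(2000, rng))
print("mean spacing ratio:", spacing_ratios(eigs).mean())
# GOE (Wigner-Dyson) levels give a mean ratio of about 0.5307;
# uncorrelated (Poisson) levels give about 0.386.
# For an actual network, `eigs` would instead be loss-Hessian eigenvalues,
# e.g. obtained via automatic differentiation or a Lanczos routine (not shown).
```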
This article proposes an original approach to the performance understanding of...
In this work, we investigate the asymptotic spectral density of the random fea...
Several machine learning problems such as latent variable model learning and c...
This paper considers several aspects of random matrix universality in deep neural networks. Motivate...
Neural networks have been used successfully in a variety of fields, which has led to a great deal of...
This article studies the Gram random matrix model G = (1/T) Σ^T Σ, Σ = σ(WX), c... (a numerical sketch of this model appears after these summaries)
This book presents a unified theory of random matrices for applications in machine learning, offerin...
We study the distribution of singular values of products of random matrices pertinent to the analysis...
This paper shows that deep learning (DL) representations of data produced by g...
This article provides a theoretical analysis of the asymptotic performance of ...
Loss landscape analysis is extremely useful for a deeper understanding of the generalization ability...
This manuscript considers the problem of learning a random Gaussian network function using a fully c...
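One of the summaries above refers to the Gram random matrix model G = (1/T) Σ^T Σ with Σ = σ(WX). The following is a purely illustrative sketch of that model's empirical spectrum; the dimensions, the Gaussian draws of W and X, and the ReLU choice of σ are assumptions made here for the example, not taken from the cited article.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed sizes for the illustration: n hidden units, p input dimensions, T samples.
n, p, T = 512, 256, 1024

W = rng.normal(size=(n, p)) / np.sqrt(p)   # random weight matrix (assumed Gaussian)
X = rng.normal(size=(p, T))                # data matrix (assumed Gaussian)
Sigma = np.maximum(W @ X, 0.0)             # Sigma = sigma(WX), with sigma = ReLU here
G = Sigma.T @ Sigma / T                    # Gram matrix G = (1/T) Sigma^T Sigma

# Empirical spectral density of G, summarised by a few quantiles.
eigs = np.linalg.eigvalsh(G)
print("eigenvalues: min %.4f  median %.4f  max %.4f"
      % (eigs.min(), np.median(eigs), eigs.max()))
```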