Exciting new work on generalization bounds for neural networks (NN) given by Bartlett et al. (2017); Neyshabur et al. (2018) closely depends on two parameter-dependent quantities: the Lipschitz constant upper bound and the stable rank (a softer version of rank). Even though these bounds typically have minimal practical utility, they raise the question of whether controlling such quantities together could improve the generalization behaviour of NNs in practice. To this end, we propose stable rank normalization (SRN), a novel, provably optimal, and computationally efficient weight-normalization scheme which minimizes the stable rank of a linear operator. Surprisingly, we find that SRN, despite being non-convex, can be shown to have a unique ...
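For reference, the stable rank mentioned in this abstract is the squared Frobenius norm divided by the squared spectral norm of a matrix; it is always at most the rank and is robust to small singular values. The sketch below is a minimal, illustrative NumPy computation of that quantity for a weight matrix; it is not the SRN algorithm from the paper, and the matrix shape is an arbitrary example.

```python
import numpy as np

def stable_rank(W: np.ndarray) -> float:
    """Stable rank = ||W||_F^2 / ||W||_2^2 (squared Frobenius over squared spectral norm)."""
    fro_sq = np.sum(W ** 2)                 # squared Frobenius norm
    spec = np.linalg.norm(W, ord=2)         # spectral norm (largest singular value)
    return float(fro_sq / spec ** 2)

# Toy example on a random weight matrix (hypothetical sizes, illustration only)
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))
print(f"rank = {np.linalg.matrix_rank(W)}, stable rank = {stable_rank(W):.2f}")
```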
Various normalization layers have been proposed to help the training of neural networks. Group Norma...
Achieving efficient and robust multi-channel data learning is a challenging task in data science. By...
We propose a novel low-rank initialization framework for training low-rank deep neural networks -- n...
Exciting new work on the generalization bounds for neural networks (NN) given by Neyshabur et al., ...
Randomly initialized neural networks are known to become harder to train with ...
This paper presents a margin-based multiclass generalization bound for neural networks that scales w...
Optimization is the key component of deep learning. Increasing depth, which is vital for reaching a...
In recent years, a variety of normalization methods have been proposed to help training neural netwo...
Effective regularisation of neural networks is essential to combat overfitting due to the large numb...
The success of deep neural networks is in part due to the use of normalization layers. Normalization...
How to train deep neural networks (DNNs) to generalize well is a central concern in deep learning, e...
Despite the success of Lipschitz regularization in stabilizing GAN training, the exact reason of its...
Modern neural networks are over-parametrized. In particular, each rectified li...
Since their invention, generative adversarial networks (GANs) have become a popular approach for lea...
We present a novel way of obtaining PAC-style bounds on the generalization error of learning algorit...