We propose an extremely simple approach to regularizing a single deterministic neural network to obtain improved accuracy and reliable uncertainty estimates. On top of the cross-entropy loss, our approach adds an entropy-maximization regularizer on the predictive distribution in the regions of the embedding space between the class clusters. This is achieved by synthetically generating between-cluster samples via convex combinations of pairs of images from different classes and maximizing the predictive entropy on these samples. Such data-dependent regularization guides maximum likelihood estimation toward a solution that (1) maps out-of-distribution samples to high-entropy regions (creating an entropy barrier); and (2) is mo...
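A minimal NumPy sketch of the idea described above (function names, the weight `beta`, and the formulation are illustrative assumptions, not the authors' implementation): between-cluster samples are convex combinations of inputs drawn from two different classes, and the training loss subtracts their predictive entropy from the standard cross-entropy, so minimizing the total loss maximizes entropy on those synthetic samples.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropy(p, eps=1e-12):
    # Shannon entropy of each row of a probability matrix.
    return -(p * np.log(p + eps)).sum(axis=-1)

def between_cluster_batch(x_a, x_b, rng):
    # Convex combination of paired samples from two different classes,
    # one random mixing coefficient per pair.
    lam = rng.uniform(0.0, 1.0, size=(x_a.shape[0], 1))
    return lam * x_a + (1.0 - lam) * x_b

def regularized_loss(logits_real, y_onehot, logits_mix, beta=0.5):
    # Cross-entropy on real samples ...
    ce = -(y_onehot * np.log(softmax(logits_real) + 1e-12)).sum(axis=-1).mean()
    # ... minus the predictive entropy on between-cluster samples, so that
    # minimizing the loss pushes those predictions toward high entropy.
    ent = entropy(softmax(logits_mix)).mean()
    return ce - beta * ent
```

Because the entropy term enters with a negative sign, a network that is confidently wrong on the synthetic between-cluster samples incurs a higher loss than one that is maximally uncertain there, which is the "entropy barrier" behavior the abstract describes.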
When physical sensors are involved, such as image sensors, the uncertainty ove...
Although deep learning models have achieved state-of-the-art performance on a number of vision tasks...
Deep learning techniques have made neural networks the leading option for solving some computat...
The inaccuracy of neural network models on inputs that do not stem from the distribution underlying ...
Is it possible to train several classifiers to perform meaningful crowd-sourcing to produce a better...
Mutual Information (MI) has been widely used as a loss regularizer for training neural networks. Thi...
A common question regarding the application of neural networks is whether the predictions of the mod...
While machine learning is traditionally a resource intensive task, embedded systems, autonomous navi...
Adversarial examples easily mislead vision systems based on deep neural networks (DNNs) trained with...
Deep neural networks have amply demonstrated their prowess but estimating the reliability of their p...
This paper proposes a fast and scalable method for uncertainty quantification of machine learning mo...
Regularization of neural networks can alleviate overfitting in the training phase. Current regulariz...
Studies on generalization performance of machine learning algorithms under the...
Training probability-density estimating neural networks with the expectation-maximization (EM) algor...
Deep neural language models like GPT-2 are undoubtedly strong at text generation, but often require ...