Deep neural networks (DNNs) are powerful learning models, yet their results are not always reliable: modern DNNs are usually uncalibrated, and their epistemic uncertainty cannot be characterized. In this work, we propose a new technique for easily quantifying the epistemic uncertainty of data. The method consists of mixing the predictions of an ensemble of DNNs trained to classify one class versus all the other classes (OVA) with the predictions of a standard DNN trained for All-vs-All (AVA) classification. On the one hand, the adjustment the AVA DNN provides to the scores of the base classifiers allows for finer-grained inter-class separation. On the other hand, the two types of classifiers...
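The score-mixing idea sketched in the abstract can be illustrated with a short example. The Python snippet below is a minimal sketch under stated assumptions, not the paper's exact rule: it assumes the OVA heads are read through sigmoids, the AVA network through a softmax, and that the two views are mixed by an elementwise product; the helpers combined_scores and predict_with_uncertainty are hypothetical names introduced here for illustration.

```python
import numpy as np

# Minimal sketch of an OVA/AVA score mix (hypothetical mixing rule,
# not necessarily the combination used in the paper).

def combined_scores(ova_logits, ava_logits):
    """Mix One-vs-All and All-vs-All scores into one per-class score.

    ova_logits: (n_samples, n_classes) raw outputs of the OVA binary
                heads, one head per class ("this class vs the rest").
    ava_logits: (n_samples, n_classes) raw outputs of the standard
                AVA multiclass network.
    """
    # Each OVA head is read through a sigmoid: "does this input look
    # like my class at all?"  Low values on every head signal epistemic
    # uncertainty (the input resembles none of the training classes).
    ova_scores = 1.0 / (1.0 + np.exp(-ova_logits))

    # The AVA network is read through a softmax; it refines the
    # inter-class separation among the classes the OVA heads accept.
    shifted = ava_logits - ava_logits.max(axis=1, keepdims=True)
    ava_probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

    # Assumed mixing rule: elementwise product, so a class only gets a
    # high score when both views agree.
    return ova_scores * ava_probs


def predict_with_uncertainty(ova_logits, ava_logits):
    scores = combined_scores(ova_logits, ava_logits)
    prediction = scores.argmax(axis=1)
    # The top combined score serves as confidence; its complement is a
    # simple epistemic-uncertainty proxy.
    uncertainty = 1.0 - scores.max(axis=1)
    return prediction, uncertainty


# Toy usage: 2 inputs, 3 classes.  The first input is clearly class 0;
# the second is rejected by every OVA head, so its uncertainty is high.
ova = np.array([[4.0, -2.0, -3.0], [-3.0, -2.5, -3.0]])
ava = np.array([[3.0, 0.5, -1.0], [0.2, 0.1, 0.0]])
print(predict_with_uncertainty(ova, ava))
```

With a product rule of this kind, an input that no OVA head accepts keeps a low score for every class even when the AVA softmax is confidently peaked, which is the behaviour one would expect from an epistemic-uncertainty signal.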
The inaccuracy of neural network models on inputs that do not stem from the distribution underlying ...
This paper presents a new approach to the problem of multiclass classification. The proposed approac...
Uncertainty estimation is essential to make neural networks trustworthy in real-world applications. ...
We present an approach to quantifying both aleatoric and epistemic uncertainty for deep neural netwo...
Traditional deep neural networks (NNs) have significantly contributed to the state-of-the-art perfor...
Deep neural networks (DNNs) are known to produce incorrect predictions with very high confidence on ...
The work presented in this thesis addresses the problem of Out-of-Distribution (OOD) detection in de...
During training, the weights of a Deep Neural Network (DNN) are optimized from...
The One Versus Rest (OVR) method of building classifiers has fallen out of favor with machine learni...
As AI models are increasingly deployed in critical applications, ensuring the consistent performance...
Detecting out-of-distribution (OOD) samples is critical for the deployment of deep neural networks (...
This paper proposes a fast and scalable method for uncertainty quantification of machine learning mo...
The ability to detect Out-of-Distribution (OOD) data is important in safety-critical applications of...
The 16th European Conference on Computer Vision (ECCV 2020), Online Conference, 23-28 August 2020. Dee...