Recent works show that the data distribution in a network's latent space is useful for estimating classification uncertainty and detecting Out-of-Distribution (OOD) samples. To obtain a well-regularized latent space that is conducive to uncertainty estimation, existing methods introduce significant changes to model architectures and training procedures. In this paper, we present a lightweight, fast, and high-performance regularization method for Mahalanobis distance (MD)-based uncertainty prediction that requires minimal changes to the network's architecture. To derive Gaussian latent representations favourable for MD calculation, we introduce a self-supervised representation learning method that separates in-class representations into ...
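To make the MD-based scoring that the above builds on concrete, here is a minimal sketch of the standard class-conditional Gaussian setup (an illustration only, not the regularization method proposed in the abstract): per-class means and a shared covariance are fitted on latent features, and a sample is scored by its minimum squared Mahalanobis distance to the class means. The function and variable names (fit_gaussians, score_md, features, labels) are placeholders introduced here.

```python
# Minimal sketch of Mahalanobis distance (MD)-based uncertainty scoring on
# latent features, following the common class-conditional Gaussian setup.
# All names (features, labels, score_md) are illustrative placeholders.
import numpy as np

def fit_gaussians(features, labels):
    """Fit per-class means and a shared (tied) covariance on latent features."""
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    centered = np.concatenate(
        [features[labels == c] - means[c] for c in classes], axis=0
    )
    cov = np.cov(centered, rowvar=False)
    precision = np.linalg.pinv(cov)  # pseudo-inverse for numerical safety
    return means, precision

def score_md(x, means, precision):
    """Uncertainty/OOD score: minimum squared MD to any class mean."""
    dists = [float((x - mu) @ precision @ (x - mu)) for mu in means.values()]
    return min(dists)  # larger => farther from every class => more likely OOD

# Usage with random stand-in features (in practice, penultimate-layer activations):
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 16))
labs = rng.integers(0, 4, size=200)
means, precision = fit_gaussians(feats, labs)
print(score_md(feats[0], means, precision))
```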
The focus in deep learning research has been mostly to push the limits of prediction accuracy. Howev...
Deep Learning (DL) has achieved state-of-the-art performance across a broad spectrum of tasks. Fr...
The Gaussian process [1] and its deep-structured variants, such as deep Gaussian processes [2] and convo...
An important advantage of Gaussian processes is the ability to directly estimate classific...
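For reference, the standard GP regression predictive equations that make this direct uncertainty estimate explicit are given below (a textbook statement in generic notation, not the formulation of the cited work): for training inputs $X = \{\mathbf{x}_i\}_{i=1}^n$ with targets $\mathbf{y}$, kernel $k$, observation-noise variance $\sigma_n^2$, and a test input $\mathbf{x}_*$,

\[
\mu_* = \mathbf{k}_*^\top \left(K + \sigma_n^2 I\right)^{-1} \mathbf{y},
\qquad
\sigma_*^2 = k(\mathbf{x}_*, \mathbf{x}_*) - \mathbf{k}_*^\top \left(K + \sigma_n^2 I\right)^{-1} \mathbf{k}_*,
\]

where $K_{ij} = k(\mathbf{x}_i, \mathbf{x}_j)$ and $(\mathbf{k}_*)_i = k(\mathbf{x}_i, \mathbf{x}_*)$; the predictive variance $\sigma_*^2$ serves as the per-input uncertainty estimate.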
Deep neural networks are often ignorant about what they do not know and overconfident when they make...
Uncertainty estimation (UE) techniques -- such as the Gaussian process (GP), Bayesian neural network...
We propose a simple method that combines neural networks and Gaussian processes. The proposed method...
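One generic way such a combination is often instantiated (shown here purely as an illustrative sketch, not the specific method of this abstract) is to use a network as a feature extractor and place a GP regressor on its latent features. The sketch below substitutes a fixed random tanh layer for the trained network and uses a standard RBF kernel; all names are hypothetical.

```python
# Illustrative sketch (not the paper's method): GP regression on top of
# neural-network-style latent features, with a fixed random projection
# standing in for the feature extractor and an RBF kernel for the GP.
import numpy as np

rng = np.random.default_rng(0)

def feature_extractor(x, W):
    """Stand-in 'network': one random linear layer with tanh nonlinearity."""
    return np.tanh(x @ W)

def rbf_kernel(A, B, lengthscale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(Ftrain, y, Ftest, noise=1e-2):
    """Standard GP regression predictive mean and variance on features."""
    K = rbf_kernel(Ftrain, Ftrain) + noise * np.eye(len(Ftrain))
    Ks = rbf_kernel(Ftrain, Ftest)
    Kss = rbf_kernel(Ftest, Ftest)
    Kinv = np.linalg.inv(K)
    mean = Ks.T @ Kinv @ y
    var = np.diag(Kss - Ks.T @ Kinv @ Ks)
    return mean, var

# Toy usage: 1-D inputs lifted to 8-D features, then GP on the features.
X = np.linspace(-3, 3, 40)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
W = rng.normal(size=(1, 8))
F = feature_extractor(X, W)
Xtest = np.array([[0.0], [5.0]])  # one in-range and one out-of-range input
mean, var = gp_predict(F, y, feature_extractor(Xtest, W))
print(mean, var)  # predictive mean and variance for the two test points
```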
This paper presents a probabilistic framework to obtain both reliable and fast uncertainty estimates...
Deep learning models such as convolutional neural networks have brought advances in computer vision ...
A common question regarding the application of neural networks is whether the predictions of the mod...
Estimating how uncertain an AI system is in its predictions is important to improve the safety of su...
As Deep Learning continues to yield successful applications in Computer Vision, the ability to quant...
Deep learning (DL), which involves powerful black box predictors, has achieved a remarkable performa...