Uncertainty quantification (UQ) is important for assessing and enhancing the reliability of machine learning models. In deep learning, uncertainty arises not only from the data but also from the training procedure, which often injects substantial noise and bias. This hinders the attainment of statistical guarantees and, moreover, imposes computational challenges on UQ due to the need for repeated network retraining. Building on recent neural tangent kernel theory, we create statistically guaranteed schemes to \emph{characterize}, and \emph{remove}, the uncertainty of over-parameterized neural networks in a principled manner and with very low computational effort. In particular, our approach, based on what we call a procedural-noise-correcting (PNC) predictor...
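As a minimal, self-contained sketch (not the paper's PNC implementation) of the kernel view that NTK theory provides: a sufficiently wide network trained by gradient descent behaves approximately like kernel regression with the tangent kernel $K(x, x') = \langle \nabla_\theta f(x), \nabla_\theta f(x') \rangle$, so a predictive mean and a GP-style variance at test points can be characterized through $K$ rather than by repeated retraining. All data, layer widths, and the jitter value below are illustrative placeholders.
\begin{verbatim}
import torch

def make_mlp(d_in, width=256):
    # Illustrative over-parameterized network (architecture is a placeholder)
    return torch.nn.Sequential(
        torch.nn.Linear(d_in, width), torch.nn.ReLU(),
        torch.nn.Linear(width, 1),
    )

def empirical_ntk(model, x1, x2):
    # Gram matrix K[i, j] = <grad_theta f(x1_i), grad_theta f(x2_j)>
    params = [p for p in model.parameters() if p.requires_grad]
    def jac(x):
        rows = []
        for xi in x:
            out = model(xi.unsqueeze(0)).squeeze()
            g = torch.autograd.grad(out, params)
            rows.append(torch.cat([gi.reshape(-1) for gi in g]))
        return torch.stack(rows)               # shape (n, n_params)
    return jac(x1) @ jac(x2).T

torch.manual_seed(0)
x_train, y_train = torch.randn(20, 3), torch.randn(20, 1)   # toy data
x_test = torch.randn(5, 3)

model = make_mlp(3)
K_tt = empirical_ntk(model, x_train, x_train)  # train/train kernel
K_st = empirical_ntk(model, x_test, x_train)   # test/train kernel
K_ss = empirical_ntk(model, x_test, x_test)    # test/test kernel

jitter = 1e-3 * torch.eye(len(x_train))        # small ridge term for stability
alpha = torch.linalg.solve(K_tt + jitter, y_train)
mean = K_st @ alpha                            # kernel-regression predictive mean
cov = K_ss - K_st @ torch.linalg.solve(K_tt + jitter, K_st.T)
std = cov.diagonal().clamp_min(0.0).sqrt()     # per-test-point uncertainty estimate
\end{verbatim}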
Deep neural networks (NNs) have become ubiquitous and achieved state-of-the-art results in a wide va...
Deep neural networks (DNNs) have made great strides in pushing the state-of-the-art in several chall...
In this paper we attempt to build upon past work on Interval Neural Networks, and provide a robust w...
Well-calibrated predictive uncertainty of neural networks—essentially making them know when they do ...
Uncertainty quantification (UQ) for predictions generated by neural networks (NNs) is of vital impor...
This paper proposes a fast and scalable method for uncertainty quantification of machine learning mo...
Neural networks predictions are unreliable when the input sample is out of the training distribution...
We are interested in estimating the uncertainties of deep neural networks, which play an important r...
Accurate uncertainty quantification is necessary to enhance the reliability of deep learning models ...
Considering uncertainty estimation of modern neural networks (NNs) is one of the most important ste...
Whereas the ability of deep networks to produce useful predictions on many kinds of data has been am...
Despite the popularity of Convolutional Neural Networks (CNN), the problem of uncertainty quantifica...
The focus in deep learning research has been mostly to push the limits of prediction accuracy. Howev...
Uncertainty quantification in automated image analysis is highly desired in many applications. Typic...
While machine learning is traditionally a resource intensive task, embedded systems, autonomous navi...