We propose a scalable Gaussian process model for regression by applying a deep neural network as the feature-mapping function. We first pre-train the deep neural network with a stacked denoising auto-encoder in an unsupervised way. Then, we perform a Bayesian linear regression on the top layer of the pre-trained deep network. The resulting model, Deep-Neural-Network-based Gaussian Process (DNN-GP), can learn a much more meaningful representation of the data by the finite-dimensional but deep-layered feature-mapping function. Unlike standard Gaussian processes, our model scales well with the size of the training set due to the avoidance of kernel matrix inversion. Moreover, we present a mixture of DNN-GPs to further improve the regression pe...
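The core idea above — Bayesian linear regression on a network's top-layer features instead of inverting an n×n kernel matrix — can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: random weights stand in for the pre-trained stacked-denoising-autoencoder network, and all function names are hypothetical. The key point is that the posterior only requires inverting a d×d matrix (d = feature dimension), not an n×n one.

```python
import numpy as np

def dnn_features(X, W1, W2):
    # Stand-in for the pre-trained DNN's top-layer feature map phi(x).
    # In the paper this network would be pre-trained as a stacked
    # denoising auto-encoder; here the weights are random.
    return np.tanh(np.tanh(X @ W1) @ W2)

def blr_fit(Phi, y, alpha=1.0, beta=10.0):
    # Bayesian linear regression on features Phi (n x d):
    # prior w ~ N(0, alpha^{-1} I), noise precision beta.
    # Posterior w ~ N(m, S) costs O(n d^2 + d^3), never O(n^3).
    d = Phi.shape[1]
    S = np.linalg.inv(alpha * np.eye(d) + beta * Phi.T @ Phi)  # d x d only
    m = beta * S @ Phi.T @ y
    return m, S

def blr_predict(Phi_star, m, S, beta=10.0):
    # Predictive mean and variance for new feature rows Phi_star.
    mean = Phi_star @ m
    var = 1.0 / beta + np.sum((Phi_star @ S) * Phi_star, axis=1)
    return mean, var

# Toy usage: 1-D regression with a 1 -> 32 -> 16 feature network.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
W1 = rng.standard_normal((1, 32))
W2 = rng.standard_normal((32, 16))
Phi = dnn_features(X, W1, W2)
m, S = blr_fit(Phi, y)
mean, var = blr_predict(Phi, m, S)
```

Because S is the inverse of a positive-definite matrix, the predictive variance is always at least the noise floor 1/beta, giving calibrated uncertainty at a cost linear in the number of training points.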
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Gaussian process (GP) models are widely used to perform Bayesian nonlinear regression and classifica...
Recent years have witnessed an increasing interest in the correspondence between infinitely wide net...
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (G...
Many modern machine learning methods, including deep neural networks, utilize a discrete sequence of...
stitute two of the most important foci of modern machine learning research. In this preliminary work...
In this paper we introduce deep Gaussian process (GP) models. Deep GPs are a deep belief network ba...
We propose a simple method that combines neural networks and Gaussian processes. The proposed method...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
Choosing appropriate architectures and regularization strategies for deep networks is crucial to go...
We propose a practical and scalable Gaussian process model for large-scale nonlinear probabilistic r...