At present, there is no consensus on the most effective way to establish feature relevance for Gaussian process models. The most common heuristic, Automatic Relevance Determination (ARD), has several downsides, and many alternative methods incur unacceptable computational costs. Existing methods based on sensitivity analysis of the posterior predictive distribution are promising but biased, leaving room for improvement. This paper proposes Feature Collapsing, a novel method for performing GP feature relevance determination that is effective, consistent, unbiased, and computationally inexpensive compared to existing algorithms.
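To make the ARD heuristic mentioned above concrete, here is a minimal sketch using scikit-learn: an anisotropic RBF kernel learns one lengthscale per input dimension, and relevance is conventionally read off as the inverse lengthscale (irrelevant features are pushed toward large lengthscales). The synthetic data and all variable names are illustrative assumptions, not part of any paper quoted here.

```python
# Hedged sketch of ARD-style feature relevance for a GP regressor.
# Assumption: scikit-learn's GaussianProcessRegressor with a per-dimension
# (anisotropic) RBF kernel; smaller learned lengthscale => more relevant.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
# Feature 0 matters a lot, feature 1 weakly, feature 2 is pure noise.
y = np.sin(3 * X[:, 0]) + 0.1 * X[:, 1] + rng.normal(scale=0.05, size=200)

kernel = 1.0 * RBF(length_scale=np.ones(3), length_scale_bounds=(1e-2, 1e3))
gp = GaussianProcessRegressor(
    kernel=kernel, alpha=0.05, normalize_y=True, n_restarts_optimizer=2
)
gp.fit(X, y)

# kernel_ is the fitted kernel (ConstantKernel * RBF); k2 is the RBF part.
lengthscales = gp.kernel_.k2.length_scale
relevance = 1.0 / lengthscales  # larger => more relevant under ARD
print(relevance)
```

The downside flagged in the abstract is visible even here: the lengthscale-to-relevance mapping is a heuristic with no calibrated scale, and the marginal-likelihood optimization can land in local optima, which is part of the motivation for sensitivity-based alternatives.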
In many applications, like function approximation, pattern recognition, time series prediction, and ...
We provide a new unifying view, including all existing proper probabilistic sparse approximations fo...
Gaussian Process Preference Learning (GPPL) is considered to be the state-of-the-art algorithm for l...
Variable selection for Gaussian process models is often done using automatic relevance determination...
A Gaussian process (GP) is a stochastic process that has been studied for a long time and gained wide ...
Gaussian process (GP) methods have been widely studied recently, especially for large-scale systems ...
We improve Gaussian processes (GP) classification by reorganizing the (non-stationary and anisotropi...
While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performan...
Gaussian Process Priors. It estimates class membership posterior probability employing variational a...
The paper presents an algorithm to rank features in “small number of samples, large dimensionality” ...
We describe a feature selection method that can be applied directly to models that are linear with r...
Gaussian processes have proved to be useful and powerful constructs for the purposes of regression. ...
In this report, we discuss the application and usage of Gaussian Process in Classification and Regre...
The main topic of this thesis is Gaussian processes for machine learning, more precisely the select...