Much research has gone into scaling up classical machine learning algorithms such as Gaussian Processes (GPs), but the curse of dimensionality still remains. While many supervised dimensionality reduction algorithms have been proposed in the literature, few of them can scale up to large datasets. Furthermore, the majority of dimensionality reduction techniques are tailored for classification problems, which leaves regression tasks unexplored. The contributions of this thesis are threefold. First, we extend classical active subspace (AS) theory to a non-linear counterpart. Secondly, we introduce a scalable non-linear supervised principal component analysis (SPCA) algorithm. Thirdly, we propose a novel class of supervised dimensionality reduction...
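As a concrete reference point for the active-subspace (AS) construction mentioned in this abstract, the following is a minimal sketch, assuming gradients of the response are available, of the classical linear AS recipe: estimate C = E[grad f(x) grad f(x)^T] by Monte Carlo and keep its leading eigenvectors. The function names and the toy test function are illustrative assumptions, not the thesis' own implementation.

```python
import numpy as np

def active_subspace(grads, k):
    """Leading k eigenvectors of C = E[grad f grad f^T], estimated from gradient samples.

    grads : (n_samples, dim) array of gradient evaluations
    k     : dimension of the active subspace
    """
    C = grads.T @ grads / grads.shape[0]          # Monte Carlo estimate of the gradient outer-product matrix
    eigvals, eigvecs = np.linalg.eigh(C)          # symmetric eigendecomposition (ascending eigenvalues)
    order = np.argsort(eigvals)[::-1]             # reorder to descending
    return eigvals[order], eigvecs[:, order[:k]]  # spectrum and active directions

# Toy example: f(x) = sin(a^T x) is a ridge function, so its active subspace is span{a}.
rng = np.random.default_rng(0)
a = np.array([1.0, 2.0, 0.0, 0.0])
X = rng.standard_normal((2000, 4))
grads = np.cos(X @ a)[:, None] * a               # analytic gradient of sin(a^T x)
eigvals, W = active_subspace(grads, k=1)
print(eigvals)                                   # one dominant eigenvalue, the rest near zero
print(W[:, 0])                                   # aligned (up to sign) with a / ||a||
```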
We study data-adaptive dimensionality reduction in the context of supervised learning in g...
Manifold learning has gained great attention in recent years for addressing the problem of dimensionali...
We propose a novel dimensionality reduction approach based on the gradient of the regression functio...
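When only input-output samples are available, the gradient of the regression function has to be estimated. One common instantiation of this idea (an assumption here, not necessarily the cited paper's estimator) fits a local linear model around each sample and then eigendecomposes the averaged gradient outer product:

```python
import numpy as np

def local_linear_gradients(X, y, n_neighbors=20):
    """Estimate grad f(x_i) by fitting y ~ b0 + g^T (x - x_i) on the n_neighbors nearest points."""
    n, d = X.shape
    grads = np.zeros((n, d))
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dists)[:n_neighbors]                        # nearest neighbours of x_i
        A = np.hstack([np.ones((n_neighbors, 1)), X[idx] - X[i]])    # design matrix [1, x - x_i]
        coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
        grads[i] = coef[1:]                                          # slope of the local affine fit
    return grads

def gradient_outer_product_subspace(X, y, k, n_neighbors=20):
    """Leading k eigenvectors of the averaged estimated gradient outer product."""
    G = local_linear_gradients(X, y, n_neighbors)
    C = G.T @ G / len(G)
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:k]]

# Toy regression whose response depends only on x1 + x2.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))
y = np.tanh(X[:, 0] + X[:, 1]) + 0.01 * rng.standard_normal(500)
W = gradient_outer_product_subspace(X, y, k=1)
print(W[:, 0])   # concentrated on the first two coordinates
```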
The Gaussian process latent variable model (GP-LVM) has been identified to be an effective probabili...
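For background, here is a compact numpy/scipy sketch of the core GP-LVM idea: treat the latent coordinates X as free parameters and maximise the GP marginal likelihood of the observed data Y. The kernel, initialisation, and optimiser are illustrative assumptions rather than the cited model's exact setup.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gplvm_neg_log_lik(x_flat, Y, q, noise=0.1):
    """Negative GP marginal log-likelihood of Y with latent inputs X (flattened)."""
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, Y)                   # L^{-1} Y, so trace(K^{-1} Y Y^T) = ||alpha||_F^2
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * (D * logdet + np.sum(alpha**2) + N * D * np.log(2 * np.pi))

def fit_gplvm(Y, q=2, n_iter=200):
    """Optimise the latent coordinates; initialised with PCA, as is common for GP-LVMs."""
    Yc = Y - Y.mean(axis=0)
    _, _, Vt = np.linalg.svd(Yc, full_matrices=False)
    X0 = Yc @ Vt[:q].T                              # PCA initialisation of the latent space
    res = minimize(gplvm_neg_log_lik, X0.ravel(), args=(Yc, q),
                   method="L-BFGS-B", options={"maxiter": n_iter})
    return res.x.reshape(len(Y), q)

# Tiny demo (gradients are approximated numerically here, so keep N small).
rng = np.random.default_rng(2)
Y = rng.standard_normal((30, 10))
X_latent = fit_gplvm(Y, q=2)
print(X_latent.shape)   # (30, 2)
```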
A fundamental task in machine learning is modeling the relationship between different observation s...
Off-the-shelf Gaussian Process (GP) covariance functions encode smoothness assumptions on the struct...
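One common way to move beyond such off-the-shelf covariances, sketched here under the assumption of a linear embedding, is to evaluate an RBF kernel on a projection Rx, so that the resulting GP varies only along a low-dimensional subspace. The projection below is fixed by hand purely for illustration; in this line of work it would be learned or marginalised.

```python
import numpy as np

def embedded_rbf_kernel(X1, X2, R, variance=1.0, lengthscale=1.0):
    """RBF covariance on projected inputs: k(x, x') = variance * exp(-||R x - R x'||^2 / (2 l^2)).

    Because the kernel depends on x only through R x, GP samples are constant
    along directions in the null space of R (a ridge / low-rank structure).
    """
    Z1, Z2 = X1 @ R.T, X2 @ R.T                     # project into the low-dimensional subspace
    d2 = (np.sum(Z1**2, axis=1)[:, None]
          + np.sum(Z2**2, axis=1)[None, :]
          - 2.0 * Z1 @ Z2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Illustration: in 10-D ambient space, the covariance only "sees" a 2-D projection.
rng = np.random.default_rng(3)
R = rng.standard_normal((2, 10))                    # 2 x 10 projection (fixed here, normally learned)
X = rng.standard_normal((5, 10))
print(embedded_rbf_kernel(X, X, R).shape)           # (5, 5)
```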
We propose an active learning method for discovering low-dimensional structure in high-dimensional G...
Dimensionality reduction is the transformation of data from a high-dimensional space into a low-dime...
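As a baseline instance of such a transformation, a minimal (unsupervised) PCA-by-SVD sketch; the toy data are an illustrative assumption:

```python
import numpy as np

def pca(X, k):
    """Project X onto its top-k principal components via the SVD of the centred data."""
    Xc = X - X.mean(axis=0)                 # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]            # low-dimensional coordinates and component directions

rng = np.random.default_rng(4)
X = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 20))   # 2-D intrinsic structure in 20-D
Z, components = pca(X, k=2)
print(Z.shape, components.shape)            # (100, 2) (2, 20)
```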
Learning a low-dimensional manifold from highly nonlinear, high-dimensional data has become incr...
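A small scikit-learn example of nonlinear manifold learning in this spirit, using Isomap on a Swiss roll; the estimator and parameter values are illustrative choices, not the cited method.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# 3-D Swiss roll whose intrinsic structure is a 2-D sheet.
X, t = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

# Unroll it with a geodesic-distance-preserving embedding.
embedding = Isomap(n_neighbors=10, n_components=2)
Z = embedding.fit_transform(X)
print(Z.shape)   # (1000, 2)
```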
Gaussian process (GP) models are widely used to perform Bayesian nonlinear regression and classifica...
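For reference, a minimal numpy sketch of exact GP regression with a fixed RBF kernel, the building block behind the GP models discussed here; the hyperparameter values are arbitrary illustrative choices.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, X_star, noise=1e-2):
    """Exact GP posterior mean and variance at test inputs X_star."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)                            # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    K_s = rbf(X, X_star)
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf(X_star, X_star)) - np.sum(v**2, axis=0)
    return mean, var

# 1-D toy regression.
rng = np.random.default_rng(5)
X = rng.uniform(-3, 3, size=(25, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(25)
X_star = np.linspace(-3, 3, 100)[:, None]
mean, var = gp_posterior(X, y, X_star)
print(mean.shape, var.shape)   # (100,) (100,)
```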
We are interested in using the goal of making predictions to influence dimensionality reduction proc...
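One standard way to let the prediction target shape the reduction is an HSIC-style supervised PCA, which projects onto the leading eigenvectors of X^T H L H X for a kernel L on the targets. The linear target kernel below is an illustrative assumption, and this linear sketch is background rather than the method proposed in any of the works above.

```python
import numpy as np

def supervised_pca(X, y, k):
    """HSIC-style supervised PCA: leading eigenvectors of X^T H L H X, with L a kernel on the targets.

    X : (n, d) inputs, y : (n,) targets, k : number of supervised components.
    """
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    L = np.outer(y, y)                            # linear kernel on the targets (illustrative choice)
    Q = X.T @ H @ L @ H @ X                       # d x d matrix; top eigenvectors align with y-relevant directions
    eigvals, eigvecs = np.linalg.eigh(Q)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:k]]

# Toy example: y depends only on the first coordinate of a 10-D input.
rng = np.random.default_rng(6)
X = rng.standard_normal((300, 10))
y = 2.0 * X[:, 0] + 0.05 * rng.standard_normal(300)
U = supervised_pca(X, y, k=1)
print(U[:, 0])   # dominated by the first coordinate
```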
Ridge functions have recently emerged as a powerful set of ideas for subspace-based dimension reduct...
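A ridge function has the form f(x) = g(A^T x) for a tall, skinny matrix A, so it is constant along directions orthogonal to the columns of A. The tiny check below makes that defining property explicit; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Ridge structure: f depends on a 10-D input only through a 2-D projection A^T x.
A = np.linalg.qr(rng.standard_normal((10, 2)))[0]        # orthonormal 10 x 2 basis of the ridge subspace
g = lambda z: np.sin(z[0]) + z[1]**2                     # arbitrary link function on the 2-D coordinates
f = lambda x: g(A.T @ x)

x = rng.standard_normal(10)
v = rng.standard_normal(10)
v -= A @ (A.T @ v)                                       # project v onto the orthogonal complement of span(A)
print(np.isclose(f(x), f(x + v)))                        # True: moving orthogonally to A leaves f unchanged
```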