When a series of problems are related, representations derived from learning earlier tasks may be useful in solving later problems. In this paper we propose a novel approach to transfer learning with low-dimensional, non-linear latent spaces. We show how such representations can be jointly learned across multiple tasks in a Gaussian process framework. When transferred to new tasks with relatively few training examples, learning can be faster and/or more accurate. Experiments on digit recognition and newsgroup classification tasks show significantly improved performance compared to baselines that use either a representation derived from a semi-supervised learning approach or a discriminative approach trained only on the target data.
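The pipeline below is only a minimal sketch of this transfer setting, not the paper's method: a non-linear latent space is fit on pooled source-task inputs (with kernel PCA standing in for the jointly learned GP latent space), and a Gaussian process classifier is then trained on the latent coordinates of a small labelled target set. All arrays, dimensions, and kernel parameters are synthetic placeholders chosen for illustration.

```python
# Minimal sketch of the transfer setting described above, NOT the paper's
# joint GP-based latent space learning. KernelPCA stands in for the shared
# non-linear latent space; a GP classifier is then fit on the latent
# coordinates of a small target-task training set. Data are synthetic.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Pooled high-dimensional inputs from several related source tasks.
X_source = rng.normal(size=(500, 64))

# Learn a low-dimensional, non-linear latent representation from the source tasks.
latent = KernelPCA(n_components=5, kernel="rbf", gamma=0.05).fit(X_source)

# A new target task with very few labelled examples.
X_target = rng.normal(size=(20, 64))
y_target = np.array([0, 1] * 10)

# Transfer: map the target data into the shared latent space, then fit a GP
# classifier on the low-dimensional coordinates instead of the raw inputs.
Z_target = latent.transform(X_target)
clf = GaussianProcessClassifier(kernel=RBF(length_scale=1.0)).fit(Z_target, y_target)

# Predict on new target-task inputs through the same latent mapping.
X_new = rng.normal(size=(5, 64))
print(clf.predict(latent.transform(X_new)))
```

With so few target labels, the hope is that the low-dimensional representation learned from the source tasks makes the target classifier easier to fit than one trained on the raw 64-dimensional inputs.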
In this paper, we propose a novel stochastic framework for unsupervised manifold learning. The laten...
In this paper we investigate multi-task learning in the context of Gaussian Processes (GP). We prop...
Gaussian process (GP) is a stochastic process that has been studied for a long time and gained wide ...
When a series of problems are related, representations derived from learning earlier tasks may be use...
Transfer Learning is an emerging framework for learning from data that aims at intelligently transf...
Transfer learning aims at reusing the knowledge in some source tasks to improve the learning of a ta...
Supervised learning is difficult with high dimensional input spaces and very small training sets, but...
Often in machine learning, data are collected as a combination of multiple conditions, e.g., the voi...
Given several related learning tasks, we propose a nonparametric Bayesian learning model that captu...
We consider the problem of multi-task learning, that is, learning multiple related functions. Our ap...
We introduce a novel Gaussian process based Bayesian model for asymmetric transfer learning. We ado...
We discuss a general method to learn data representations from multiple tasks. We provide a justific...
We introduce a variational inference framework for training the Gaussian process latent variable mod...
Gaussian process (GP) models are widely used to perform Bayesian nonlinear regression and classifica...
The multi-output Gaussian process ($\mathcal{MGP}$) is based on the assumption that outputs share co...
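Several of the multi-task and multi-output GP entries above rest on the idea that related outputs share covariance structure. One standard way to encode this is the intrinsic coregionalization model, sketched below in plain NumPy as a generic illustration; the task matrix, kernel, and all parameter values are assumptions for this sketch and are not taken from any of the listed papers.

```python
# Generic sketch of a multi-output GP with a shared covariance structure
# (intrinsic coregionalization model). Notation and parameter values are
# assumptions for illustration, not any single paper's model.
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    """Squared-exponential kernel on the inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def icm_kernel(X1, t1, X2, t2, B, lengthscale=1.0):
    """ICM covariance: K((x,t),(x',t')) = B[t,t'] * k(x,x')."""
    return B[np.ix_(t1, t2)] * rbf(X1, X2, lengthscale)

rng = np.random.default_rng(0)

# Two related tasks observed at different inputs.
X = rng.uniform(-3, 3, size=(30, 1))
tasks = rng.integers(0, 2, size=30)            # task index per observation
y = np.sin(X[:, 0]) + 0.3 * tasks + 0.05 * rng.normal(size=30)

# Task (coregionalization) matrix B = W W^T + diag(kappa), shared across inputs.
W = np.array([[1.0], [0.9]])
B = W @ W.T + np.diag([0.05, 0.05])

# Standard GP regression posterior mean under the joint ICM covariance.
noise = 0.05
K = icm_kernel(X, tasks, X, tasks, B) + noise * np.eye(len(X))
alpha = np.linalg.solve(K, y)

# Predict task 1 at new inputs, borrowing strength from task 0 through B.
X_new = np.linspace(-3, 3, 5)[:, None]
t_new = np.ones(5, dtype=int)
mean = icm_kernel(X_new, t_new, X, tasks, B) @ alpha
print(mean)
```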