Part 2: Machine Learning

The mixture of Gaussian processes (MGP) is a powerful generative model widely used in machine learning and data mining. However, when learning this generative model on a given dataset, the probability density function (pdf) of the input must be set in advance; in general it is taken to be a Gaussian distribution. For some real-world data, such as time series, this setting or assumption is neither reasonable nor effective. In this paper, we propose a specialized pdf for the input of the MGP model: a piecewise-defined continuous function with three parts, in which the middle part takes the form of a uniform distribution while the two side parts take the form of Gaussian distributions...
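The piecewise pdf described above (a uniform plateau with Gaussian-shaped tails) can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's exact construction: the function name `piecewise_pdf`, the interval `[a, b]`, and the shared tail width `sigma` are all hypothetical; the heights are chosen only so that the density is continuous at the joins and integrates to one.

```python
import numpy as np

def piecewise_pdf(x, a, b, sigma):
    """Hypothetical piecewise-defined continuous pdf: uniform on [a, b],
    Gaussian-shaped tails outside, matched in height at the joins.

    The plateau height h satisfies h * ((b - a) + sigma * sqrt(2*pi)) = 1,
    since each half-Gaussian tail contributes mass h * sigma * sqrt(pi/2).
    """
    x = np.asarray(x, dtype=float)
    h = 1.0 / ((b - a) + sigma * np.sqrt(2.0 * np.pi))
    left = h * np.exp(-((x - a) ** 2) / (2.0 * sigma ** 2))   # Gaussian-shaped left tail
    right = h * np.exp(-((x - b) ** 2) / (2.0 * sigma ** 2))  # Gaussian-shaped right tail
    return np.where(x < a, left, np.where(x > b, right, h))   # uniform plateau in between

# Sanity check: the density should integrate to ~1 over a wide grid.
a, b, sigma = -1.0, 2.0, 0.5
xs = np.linspace(a - 8 * sigma, b + 8 * sigma, 20001)
fs = piecewise_pdf(xs, a, b, sigma)
mass = float(np.sum(0.5 * (fs[1:] + fs[:-1]) * np.diff(xs)))  # trapezoid rule
```

By construction the density is continuous at `a` and `b` (both the tail and the plateau equal `h` there), which is the property the abstract emphasizes for the middle/side decomposition.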
We compare two regularization methods which can be used to improve the generalization capabilities ...
Mixture of experts (MoE) models are a class of artificial neural networks that...
Gaussian processes (GPs) constitute one of the most important Bayesian machine learning approaches, ...
We give a basic introduction to Gaussian Process regression models. We focus on understanding the ro...
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kern...
The mixture of Gaussian processes (MGP) is a powerful and fast-developing machine learning framework....
The mixture of Gaussian processes (MGP) is a powerful statistical learning model for regression and ...
The Mixture of Gaussian Processes (MGP) is a powerful statistical learning framework in machine lear...
The mixture of Gaussian processes (MGP) is an important probabilistic model which is often applied t...
Mixtures of experts probabilistically divide the input space into regions, where the assumptions of ...
The mixture of Gaussian processes (MGP) is a powerful framework for machine learning. However, its p...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
We present an extension to the Mixture of Experts (ME) model, where the individual experts are Gauss...
We present the Gaussian Process Density Sampler (GPDS), an exchangeable generative model for use in ...
The mixture of Gaussian processes (MGP) is a powerful and widely used model in machine learning. Howe...