This paper presents a novel methodology for inferring the parameters of probabilistic models whose output noise follows a Student-t distribution. The method extends earlier work on models that are linear in their parameters to nonlinear multi-layer perceptrons (MLPs). We use an EM algorithm combined with a variational approximation, the evidence procedure, and an optimisation algorithm. The technique is evaluated on two regression applications: a synthetic dataset and gas forward contract prices from the UK energy market. The results show that forecasting accuracy is significantly improved by using Student-t noise models.
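To illustrate the core idea, here is a minimal sketch of EM for Student-t noise in the simpler linear-in-parameters setting that the paper builds on. It treats the t distribution as a Gaussian scale mixture with latent per-observation precisions, which leads to iteratively reweighted least squares; the function name, fixed degrees of freedom `nu`, and synthetic data are illustrative assumptions, not the paper's exact procedure (which additionally uses a variational approximation and the evidence procedure for MLPs).

```python
import numpy as np

def student_t_em_regression(X, y, nu=4.0, n_iter=50):
    """EM for linear regression with Student-t output noise (sketch).

    The t distribution is written as a Gaussian scale mixture: each
    observation i has a latent precision u_i ~ Gamma(nu/2, nu/2).
    E-step: E[u_i | r_i] = (nu + 1) / (nu + r_i^2 / sigma^2),
    so large residuals get small weights (robustness to outliers).
    M-step: weighted least squares for theta and sigma^2.
    """
    theta = np.linalg.lstsq(X, y, rcond=None)[0]       # OLS initialisation
    sigma2 = np.mean((y - X @ theta) ** 2)
    for _ in range(n_iter):
        r = y - X @ theta
        w = (nu + 1.0) / (nu + r ** 2 / sigma2)        # E-step: expected precisions
        Xw = X * w[:, None]
        theta = np.linalg.solve(X.T @ Xw, Xw.T @ y)    # M-step: weighted LS
        sigma2 = np.mean(w * (y - X @ theta) ** 2)
    return theta, sigma2

# Illustrative use: data with a few gross outliers
np.random.seed(0)
X = np.column_stack([np.ones(200), np.linspace(-3, 3, 200)])
y = X @ np.array([1.0, 2.0]) + 0.3 * np.random.randn(200)
y[::20] += 8.0                                         # contaminate 5% of points
theta, sigma2 = student_t_em_regression(X, y)
```

Because the E-step downweights observations with large residuals, the fitted parameters stay close to the clean-data values even with the contamination, which is the robustness property motivating Student-t noise models over Gaussian ones.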