Abstract: A Gaussian version of the iterative proportional fitting procedure (IPF-P) was applied by Speed and Kiiveri to solve the likelihood equations in graphical Gaussian models. The calculation of the maximum likelihood estimates can be seen as the problem of finding a Gaussian distribution with prescribed Gaussian marginals. We extend the Gaussian version of the IPF-P so that additionally given conditionals of Gaussian type are taken into account. The convergence of both proposed procedures, called conditional iterative proportional fitting procedures (CIPF-P), is proved.
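For orientation, the marginal-fitting step underlying the Speed and Kiiveri procedure can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the paper's implementation: the function name gaussian_ipf, the choice of starting from the identity concentration matrix, and the convergence check are assumptions of this example, and it covers only the classical marginal-matching update, not the conditional (CIPF-P) extension described in the abstract.

```python
import numpy as np

def gaussian_ipf(S, cliques, n_iter=100, tol=1e-8):
    """Sketch of Gaussian iterative proportional fitting.

    Cycles through the cliques of a graphical Gaussian model and adjusts the
    concentration matrix K so that the fitted marginal covariance on each
    clique C matches the corresponding block S_CC of the sample covariance S.
    """
    p = S.shape[0]
    K = np.eye(p)  # assumed starting point: identity concentration matrix
    for _ in range(n_iter):
        K_prev = K.copy()
        for C in cliques:
            B = [j for j in range(p) if j not in C]  # complement of the clique
            S_CC = S[np.ix_(C, C)]
            if B:
                K_CB = K[np.ix_(C, B)]
                K_BB = K[np.ix_(B, B)]
                # Marginal-fitting step: force the fitted marginal covariance
                # on C to equal S_CC while leaving the conditional distribution
                # of the remaining variables given C unchanged.
                K[np.ix_(C, C)] = np.linalg.inv(S_CC) + K_CB @ np.linalg.solve(K_BB, K_CB.T)
            else:
                K[np.ix_(C, C)] = np.linalg.inv(S_CC)
        if np.max(np.abs(K - K_prev)) < tol:
            break
    return K  # fitted covariance is np.linalg.inv(K)
```

Each update touches only the K_CC block, so the conditional law of the remaining variables given the clique is preserved while the clique marginal is matched to S_CC; cycling through the cliques in this way is the iteration whose convergence is the subject of the abstract above.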
We introduce a treatment of parametric estimation in which optimality of an estimator is measured in...
We present two new methods for inference in Gaussian process (GP) models with general nonlinear like...
We derive the Expectation Propagation algorithm updates for approximating the posterior distribution...
In this article, we propose localized implementations of the iterative proportional scaling ...
In this paper we give a proof of convergence of the iterative proportional fitting procedures (IPFP)...
In Gaussian graphical models, the likelihood equations must typically be solved iteratively, for exa...
The conditional Gauss–Hermite filter (CGHF) utilizes a decomposition of the filter density p(y1, y2...
We consider estimation of the covariance matrix of a random vector under the constraint that certain...
Probabilistic constraints represent a major model of stochastic optimization. A possible approach fo...
This thesis contributes to the field of Gaussian Graphical Models by exploring either numerically or...
Given a joint probability distribution, one can generally find its marginal components. However, it...
We consider the problem of learning a conditional Gaussian graphical model in the presence of latent...