Many machine learning problems deal with the estimation of conditional probabilities $p(y \mid x)$ from data $(x_1,y_1),\ldots,(x_n,y_n)$. This includes classification, regression and density estimation. Given a prior for $p(y \mid x)$, the maximum a-posteriori method estimates $p(y \mid x)$ as the most likely probability given the data. This principle can be formulated rigorously using the Cameron-Martin theory of stochastic processes and allows a variational characterisation of the estimator. The resulting nonlinear Galerkin equations are solved numerically. Convexity and total positivity lead to existence, uniqueness and error bounds. For machine learning problems dealing with large numbers of features we suggest the use of sparse grid approximations.
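As a sketch of the variational characterisation (the notation and the Gaussian-prior assumption here are ours, not quoted from the abstract): if the prior on the unknown $u$ parametrising $p(y \mid x)$ is a Gaussian measure with Cameron-Martin space $H$, the maximum a-posteriori estimate can be characterised as the minimiser of a penalised negative log-likelihood,
$$\hat u \;=\; \operatorname*{arg\,min}_{u \in H}\; \Big( -\sum_{i=1}^{n} \log p(y_i \mid x_i; u) \;+\; \tfrac{1}{2}\,\|u\|_H^2 \Big),$$
and restricting this minimisation to a finite-dimensional subspace $V_h \subset H$ yields the nonlinear Galerkin equations mentioned above.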