We propose a novel approach for nonlinear regression using a two-layer neural network (NN) model structure with sparsity-favoring hierarchical priors on the network weights. We present an expectation propagation (EP) approach for approximate integration over the posterior distribution of the weights, the hierarchical scale parameters of the priors, and the residual scale. Using a factorized posterior approximation, we derive a computationally efficient algorithm, whose complexity scales similarly to an ensemble of independent sparse linear models. The approach enables flexible definition of weight priors with different sparseness properties such as independent Laplace priors with a common scale parameter or Gaussian automatic relevance dete...
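For concreteness, a minimal NumPy sketch of the kind of model this abstract describes: a two-layer network with a common-scale Laplace prior on the weights and a Gaussian likelihood. The tanh hidden units, layer sizes, and fixed scales are illustrative assumptions; the truncated abstract does not specify them, and the paper integrates over such quantities rather than fixing them.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_layer_nn(x, W1, w2):
    """Two-layer NN regressor: one tanh hidden layer, linear output."""
    return np.tanh(x @ W1) @ w2

# Hypothetical sizes; the truncated abstract does not specify them.
n, d, h = 50, 3, 8
x = rng.normal(size=(n, d))

# Sparsity-favoring Laplace prior with a common scale on all weights.
scale = 1.0
W1 = rng.laplace(scale=scale, size=(d, h))
w2 = rng.laplace(scale=scale, size=h)

sigma = 0.1                                   # residual scale
y = two_layer_nn(x, W1, w2) + rng.normal(scale=sigma, size=n)

def log_joint(W1, w2, sigma):
    """Unnormalized log posterior: Laplace priors plus Gaussian likelihood."""
    log_prior = -(np.abs(W1).sum() + np.abs(w2).sum()) / scale
    resid = y - two_layer_nn(x, W1, w2)
    log_lik = -0.5 * np.sum(resid**2) / sigma**2 - n * np.log(sigma)
    return log_prior + log_lik

print(log_joint(W1, w2, sigma))
```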
A sparsity-promoting prior combined with Bayesian inference is an effective approach to solving sparse li...
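As a worked baseline for what a sparsity-promoting prior buys in the linear case: the MAP estimate under an i.i.d. Laplace prior is the lasso, which a few lines of ISTA can compute. This is a generic sketch, not the method of the abstract; the regularization weight lam and the toy data are assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def laplace_map(X, y, lam, n_iter=500):
    """MAP estimate under an i.i.d. Laplace prior (equivalently, the lasso),
    computed with ISTA: a gradient step followed by soft-thresholding."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - grad / L, lam / L)
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]              # sparse ground truth
y = X @ w_true + 0.1 * rng.normal(size=100)
print(np.round(laplace_map(X, y, lam=5.0), 2))
```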
We discuss the expectation propagation (EP) algorithm for approximate Bayesian inference using a fac...
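A self-contained illustration of the EP recipe itself (cavity, moment matching, site update), here on Minka's classic one-dimensional clutter problem rather than whatever model the truncated abstract goes on to treat; all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "clutter" model (Minka, 2001): y_i ~ (1-w) N(theta, 1) + w N(0, s_clutter),
# with prior theta ~ N(0, v0).  EP keeps one Gaussian site per observation.
w_clutter, s_clutter, v0 = 0.25, 10.0, 100.0
theta_true, n = 2.0, 200
is_clutter = rng.random(n) < w_clutter
y = np.where(is_clutter,
             rng.normal(0.0, np.sqrt(s_clutter), size=n),
             rng.normal(theta_true, 1.0, size=n))

def norm_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

tau = np.zeros(n)   # site precisions
nu = np.zeros(n)    # site precision-times-mean

for _ in range(20):                              # EP sweeps
    for i in range(n):
        tau_cav = 1.0 / v0 + tau.sum() - tau[i]  # cavity: posterior minus site i
        nu_cav = nu.sum() - nu[i]
        if tau_cav <= 0:                         # skip an ill-defined cavity
            continue
        v, m = 1.0 / tau_cav, nu_cav / tau_cav
        # Moments of the tilted distribution (closed form for this mixture).
        z_sig = (1 - w_clutter) * norm_pdf(y[i], m, v + 1.0)
        z_clu = w_clutter * norm_pdf(y[i], 0.0, s_clutter)
        r = z_sig / (z_sig + z_clu)              # responsibility of the signal part
        d = v * (y[i] - m) / (v + 1.0)
        mean_t = m + r * d
        var_t = v - r * v**2 / (v + 1.0) + r * (1 - r) * d**2
        # New site = tilted moments minus cavity, in natural parameters.
        tau[i] = 1.0 / var_t - tau_cav
        nu[i] = mean_t / var_t - nu_cav

tau_post = 1.0 / v0 + tau.sum()
print("EP posterior mean:", nu.sum() / tau_post)  # should sit near theta_true
```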
We describe a hierarchical, generative model that can be viewed as a nonlinear generalization of fac...
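For reference, the linear factor-analysis generative model that such a hierarchy generalizes, with a tanh map added as one illustrative (assumed, not the paper's) nonlinear variant:

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear factor analysis: x = W z + eps,  z ~ N(0, I_k),  eps ~ N(0, diag(psi)).
k, d, n = 2, 5, 1000
W = rng.normal(size=(d, k))
psi = 0.1 * np.ones(d)                        # diagonal noise variances

z = rng.normal(size=(n, k))                   # latent factors
noise = rng.normal(scale=np.sqrt(psi), size=(n, d))
x_linear = z @ W.T + noise

# A nonlinear generalization: push the latent code through a nonlinearity
# before adding noise (tanh is an assumed, illustrative choice).
x_nonlinear = np.tanh(z @ W.T) + noise
```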
Encoding domain knowledge into the prior over the high-d...
We investigate deep Bayesian neural networks with Gaussian priors on the weights and ReLU-like nonli...
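A quick way to see what such a prior over functions looks like is to sample from it; the sketch below draws functions from a deep ReLU network with fan-in-scaled Gaussian weights (the 1/sqrt(fan_in) scaling, depth, and width are conventional assumptions, not details from the truncated abstract).

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_prior_fn(x, depth=3, width=512):
    """Draw one random function from a deep ReLU net with Gaussian weights."""
    h = x[:, None]                            # scalar inputs as a column
    fan_in = 1
    for _ in range(depth):
        # He-style 1/sqrt(fan_in) prior scaling keeps activations well behaved.
        W = rng.normal(size=(fan_in, width)) * np.sqrt(2.0 / fan_in)
        h = np.maximum(h @ W, 0.0)            # ReLU nonlinearity
        fan_in = width
    w_out = rng.normal(size=(fan_in, 1)) / np.sqrt(fan_in)
    return (h @ w_out).ravel()

x = np.linspace(-3.0, 3.0, 200)
draws = np.stack([sample_prior_fn(x) for _ in range(5)])
print(draws.shape)                            # 5 prior function draws on the grid
```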
The paper deals with learning the probability distributions of observed data using artificial neural networ...
An expectation propagation (EP) algorithm is proposed for approximate inference in linear regression...
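For orientation, the conjugate special case: with a Gaussian prior the linear-regression posterior is exact and closed form, and EP is needed precisely when the prior factors are non-Gaussian (e.g. sparse), each then replaced by an approximating Gaussian site. A minimal sketch of the exact baseline, with assumed noise and prior variances:

```python
import numpy as np

def gaussian_linreg_posterior(X, y, sigma2=0.1, tau2=1.0):
    """Exact posterior of w for y = X w + N(0, sigma2 I), w ~ N(0, tau2 I)."""
    d = X.shape[1]
    prec = X.T @ X / sigma2 + np.eye(d) / tau2    # posterior precision
    cov = np.linalg.inv(prec)
    mean = cov @ (X.T @ y) / sigma2
    return mean, cov

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 4))
w_true = np.array([1.0, 0.0, -2.0, 0.0])
y = X @ w_true + np.sqrt(0.1) * rng.normal(size=100)
mean, cov = gaussian_linreg_posterior(X, y)
print(np.round(mean, 2))
```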
Gaussian processes are now widely used to perform key machine learning tasks such as nonlinear regre...
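A compact sketch of the core GP-regression computation; the RBF kernel, noise level, and one-dimensional inputs are illustrative choices.

```python
import numpy as np

def rbf_kernel(a, b, ell=1.0, amp=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    return amp * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def gp_posterior(x_train, y_train, x_test, noise=0.1):
    """Posterior mean and pointwise variance of GP regression."""
    K = rbf_kernel(x_train, x_train) + noise**2 * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = np.diag(rbf_kernel(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T))
    return mean, var

rng = np.random.default_rng(5)
x_train = rng.uniform(-3, 3, size=30)
y_train = np.sin(x_train) + 0.1 * rng.normal(size=30)
x_test = np.linspace(-3, 3, 100)
mean, var = gp_posterior(x_train, y_train, x_test)
print(mean.shape, var.shape)
```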
We present a framework for efficient, accurate approximate Bayesian inference in generalized linear ...
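One standard instance of approximate inference in a GLM is the Laplace approximation for Bayesian logistic regression, sketched below; this is a generic textbook method, not necessarily the framework the truncated abstract proposes.

```python
import numpy as np

def laplace_logistic(X, y, tau2=1.0, n_iter=25):
    """Laplace approximation for Bayesian logistic regression.

    Newton iterations locate the MAP weights; the posterior is then
    approximated by a Gaussian whose covariance is the inverse negative
    Hessian of the log posterior at the mode.
    """
    d = X.shape[1]
    w = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))          # Bernoulli means
        grad = X.T @ (y - p) - w / tau2           # gradient of log posterior
        H = -(X.T * (p * (1 - p))) @ X - np.eye(d) / tau2   # Hessian
        w = w - np.linalg.solve(H, grad)          # Newton step
    cov = np.linalg.inv(-H)
    return w, cov

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) + 0.5 * rng.normal(size=200) > 0).astype(float)
w_map, cov = laplace_logistic(X, y)
print(np.round(w_map, 2))
```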
Sparsifying deep neural networks is of paramount interest in many areas, espec...
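For context, the simplest sparsification baseline against which such work is usually measured: global magnitude pruning (a generic baseline, not the method of the abstract).

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries, keeping the top (1 - sparsity)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

W = np.random.default_rng(7).normal(size=(64, 64))
W_sparse = magnitude_prune(W, sparsity=0.9)
print(f"nonzero fraction: {np.count_nonzero(W_sparse) / W_sparse.size:.2f}")
```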
In recent years, Neural Networks (NNs) have become a popular data-analytic tool in Statistics, Compu...
Deep neural networks have bested notable benchmarks across computer vision, reinforcement learning, ...