Multilayer perceptrons (MLPs), or neural networks, are popular models for nonlinear regression and classification tasks. As regressors, MLPs model the conditional distribution of the output variables Y given the input variables X. However, this predictive distribution is typically assumed to be unimodal (e.g. Gaussian). For structured prediction tasks, the mapping from inputs to outputs is often one-to-many, so the conditional distribution should be multimodal. By using stochastic hidden variables rather than deterministic ones, Sigmoid Belief Nets (SBNs) can induce a rich multimodal distribution in the output space. However, previously proposed learning algorithms for SBNs are inefficient and unsuitable for modeling real-valued data. In this paper, we p...
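To make the contrast in that abstract concrete, here is a minimal numpy sketch of the core idea: a single layer of stochastic binary hidden units turns the predictive distribution p(y | x) into a mixture, and hence potentially multimodal. This is not code from any of the papers above; the layer sizes, random weights, and noise scale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Toy stochastic feedforward regressor: one layer of stochastic
# binary hidden units between a scalar input x and a Gaussian
# output. All weights are random, purely for illustration.
H = 8
W1 = rng.normal(size=(H, 1)); b1 = rng.normal(size=H)   # input -> hidden
W2 = rng.normal(size=(1, H)); b2 = rng.normal(size=1)   # hidden -> output mean

def sample_y(x, n_samples=1000, noise_std=0.05):
    """Draw samples from the p(y | x) induced by binary hidden units."""
    probs = sigmoid(W1 @ np.atleast_1d(x) + b1)             # Bernoulli means, shape (H,)
    h = (rng.random((n_samples, H)) < probs).astype(float)  # h ~ Bernoulli(probs)
    mu = h @ W2.T + b2                                      # per-sample output mean
    return mu + noise_std * rng.normal(size=mu.shape)

# Each hidden configuration contributes its own Gaussian component,
# so p(y | x) is a mixture of up to 2**H Gaussians: generally multimodal.
ys = sample_y(0.5)
print(ys.min(), ys.max())
```

A deterministic MLP with a Gaussian output collapses this mixture to a single mode; sampling the hidden units is what lets one x map to many plausible y values.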
This paper proposes a backpropagation-based feedforward neural network for learning probability dist...
We introduce a novel training principle for probabilistic models that is an alternative to maximum ...
Leveraging advances in variational inference, we propose to enhance recurrent neural networks with l...
Stochastic binary hidden units in a multi-layer perceptron (MLP) network give at least three potenti...
The work reported here began with the desire to find a network architecture that shared...
We view perceptual tasks such as vision and speech recognition as inference problems where the goal ...
Predicting conditional probability densities with neural networks requires complex (at least two-hid...
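For context on what "predicting conditional probability densities with neural networks" means in practice, here is a minimal sketch in the style of a mixture density network (Bishop, 1994): the network outputs the parameters of a mixture over y rather than a point estimate. The two-component mixture, layer sizes, and random weights are illustrative assumptions, not the architecture this abstract studies.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# A network whose output is a full conditional density p(y | x):
# one hidden layer maps x to the weights, means, and log-stddevs
# of a K-component Gaussian mixture over y. Sizes are arbitrary.
K = 2
Wh = rng.normal(size=(16, 1)); bh = np.zeros(16)
Wo = rng.normal(size=(3 * K, 16)); bo = np.zeros(3 * K)

def conditional_density(y, x):
    """Evaluate the predicted p(y | x) at a scalar y."""
    h = np.tanh(Wh @ np.atleast_1d(x) + bh)
    out = Wo @ h + bo
    pi = softmax(out[:K])            # mixing weights, sum to 1
    mu = out[K:2 * K]                # component means
    sigma = np.exp(out[2 * K:])      # component stddevs, kept positive
    comps = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(pi @ comps)

print(conditional_density(0.0, 0.5))
```

Because the density can have several components, such a network can represent the one-to-many mappings discussed above, at the cost of a more complex output layer.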
Sigmoid type belief networks, a class of probabilistic neural networks, provide a natural framework ...
Feedforward neural networks applied to time series prediction are usually trained to predict the nex...
This paper addresses the duality between the deterministic feed-forward neural networks (FF-NNs) and...
Highly expressive directed latent variable models, such as sigmoid belief networks, are difficult ...