We generalize recent theoretical work on the minimal number of layers of narrow deep belief networks that can approximate any probability distribution on the states of their visible units arbitrarily well. We relax the setting of binary units (Sutskever and Hinton, 2008; Le Roux and Bengio, 2008, 2010; Montúfar and Ay, 2011) to units with arbitrary finite state spaces, and the vanishing approximation error to an arbitrary approximation error tolerance. For example, we show that a q-ary deep belief network with L ≥ 2 + (q^⌈m−δ⌉ − 1)/(q − 1) layers of width n ≤ m + log_q(m) + 1 for some m ∈ ℕ can approximate any probability distribution on {0, 1, …, q−1}^n without exceeding a Kullback-Leibler divergence of δ. Our analysis covers discrete restricted Boltzmann machines...
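A minimal sketch, not taken from the paper, of how the stated bounds can be evaluated numerically; the helper names min_layers and max_width and the example values of q, m, and delta are ours, and the formulas assume the layer bound L ≥ 2 + (q^⌈m−δ⌉ − 1)/(q − 1) and width bound n ≤ m + log_q(m) + 1 as reconstructed in the abstract above.

import math

def min_layers(q: int, m: int, delta: float) -> int:
    # Smallest integer L satisfying L >= 2 + (q^ceil(m - delta) - 1) / (q - 1).
    return 2 + math.ceil((q ** math.ceil(m - delta) - 1) / (q - 1))

def max_width(q: int, m: int) -> int:
    # Largest integer n satisfying n <= m + log_q(m) + 1.
    return math.floor(m + math.log(m, q) + 1)

# Illustration with binary units (q = 2), m = 4, tolerance delta = 0.1:
# per the stated bound, a binary deep belief network of width n = 7 with
# L >= 17 layers approximates any distribution on {0,1}^7 within KL divergence 0.1.
print(min_layers(2, 4, 0.1), max_width(2, 4))  # prints: 17 7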
Many models have been proposed to capture the statistical regularities in natural image patches. Th...
Approximating the inference probability Pr[X = x | E = e] in any sense, even for a single evidence nod...
This paper introduces and investigates Depth-bounded Belief functions, a logic-based representation ...
We improve recently published results about resources of Restricted Boltzmann Machines (RBM) and De...
We study the mixtures of factorizing probability distributions represented as visible marginal dist...
Deep Belief Networks (DBN’s) are generative models that contain many layers of hidden variables. Ef...
We present explicit classes of probability distributions that can be learned by Restricted Boltzmann...
We review recent results about the maximal values of the Kullback-Leibler information dive...
The first part of this thesis develops fundamental limits of deep neural network learning by charact...
In this study, we provide a direct comparison of the Stochastic Maximum Likelihood algorithm and Con...
Probabilistic inference and maximum a posteriori (MAP) explanation are two important and rel...