We present a method to bound the partition function of a Boltzmann machine neural network with any odd-order polynomial. This is a direct extension of the mean-field bound, which is first order. We show that the third-order bound is strictly better than mean field. Additionally, we derive a third-order bound for the likelihood of sigmoid belief networks. Numerical experiments indicate that an error reduction by a factor of two is easily reached in the region where expansion-based approximations are useful.
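For reference, the first-order (mean-field) bound that the higher-order bounds extend is the standard variational lower bound on the log partition function obtained from a fully factorized distribution. The following is a minimal Python sketch of that first-order bound for a small Boltzmann machine, assuming binary units s_i in {0, 1}, a symmetric coupling matrix W with zero diagonal, and biases theta; the function names, unit convention, and brute-force comparison are illustrative and not taken from the paper.

import numpy as np
from itertools import product

def exact_log_z(W, theta):
    # Brute-force log partition function; only feasible for very small n.
    n = len(theta)
    energies = []
    for s in product([0.0, 1.0], repeat=n):
        s = np.array(s)
        energies.append(0.5 * s @ W @ s + theta @ s)
    energies = np.array(energies)
    m = energies.max()
    return m + np.log(np.exp(energies - m).sum())   # log-sum-exp

def mean_field_log_z_bound(W, theta, iters=200):
    # First-order (mean-field) lower bound: log Z >= <energy>_q + H(q)
    # for a factorized q with marginals m_i.
    n = len(theta)
    m = np.full(n, 0.5)
    for _ in range(iters):
        # standard mean-field fixed point: m_i = sigmoid(theta_i + sum_j W_ij m_j)
        m = 1.0 / (1.0 + np.exp(-(theta + W @ m)))
    entropy = -np.sum(m * np.log(m) + (1.0 - m) * np.log(1.0 - m))
    return 0.5 * m @ W @ m + theta @ m + entropy

rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.5, size=(n, n))
W = np.triu(W, 1)
W = W + W.T                                         # symmetric couplings, zero diagonal
theta = rng.normal(size=n)

print("exact log Z      :", exact_log_z(W, theta))
print("mean-field bound :", mean_field_log_z_bound(W, theta))

Because the variational bound holds for any marginals m in [0, 1]^n, the returned value is a valid lower bound even if the fixed-point iteration has not fully converged.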