We present a new statistical learning paradigm for Boltzmann machines based on a new inference principle we have proposed: the latent maximum entropy principle (LME). LME is different both from Jaynes' maximum entropy principle and from standard maximum likelihood estimation. We demonstrate the LME principle by deriving new algorithms for Boltzmann machine parameter estimation, and show how a robust and rapidly convergent new variant of the EM algorithm can be developed. Our experiments show that estimation based on LME generally yields better results than maximum likelihood estimation when inferring models from small amounts of data.
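For context on what the abstract's LME estimator is being compared against, the standard maximum-likelihood baseline for a Boltzmann machine follows the gradient ⟨s_i s_j⟩_data − ⟨s_i s_j⟩_model. The sketch below is a minimal illustration of that baseline (not the paper's LME algorithm): a small, fully visible machine with ±1 units, where the model expectation is computed by exhaustive enumeration, which is only feasible for a handful of units. The function name and training-loop constants are illustrative assumptions, not from the paper.

```python
import itertools
import numpy as np

def boltzmann_ml_gradient(W, data):
    """Exact maximum-likelihood gradient for a small, fully visible
    Boltzmann machine with +/-1 units, symmetric weights W, zero diagonal.
    Returns <s_i s_j>_data - <s_i s_j>_model (the standard ML learning rule;
    exhaustive enumeration is an illustrative choice, tractable only for small n)."""
    n = W.shape[0]
    # Empirical second moments from the data
    data_corr = data.T @ data / len(data)
    # Model second moments via brute-force enumeration of all 2^n states
    states = np.array(list(itertools.product([-1, 1], repeat=n)))
    energies = -0.5 * np.einsum('ki,ij,kj->k', states, W, states)
    p = np.exp(-energies)
    p /= p.sum()
    model_corr = np.einsum('k,ki,kj->ij', p, states, states)
    return data_corr - model_corr

# Gradient-ascent training loop (hypothetical toy setup)
rng = np.random.default_rng(0)
n = 4
W = np.zeros((n, n))
data = rng.choice([-1, 1], size=(50, n))
for _ in range(100):
    g = boltzmann_ml_gradient(W, data)
    np.fill_diagonal(g, 0.0)   # keep zero self-connections
    W += 0.1 * (g + g.T) / 2   # keep W symmetric
```

At the zero-weight initialization the model's units are independent and uniform, so the gradient's diagonal vanishes (both moment matrices have unit diagonal) and only the off-diagonal correlations drive learning.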
We propose a statistical mechanical framework for the modeling of discrete time series. Maximum like...
In this study, we provide a direct comparison of the Stochastic Maximum Likelihood algorithm and Con...
Maximum pseudo-likelihood estimation (MPLE) is an attractive method for training fully visible Boltz...
We present a new approach to estimating mixture models based on a new inference principle we have ...
We present an extension to Jaynes’ maximum entropy principle that incorporates latent variables. The...
In this paper we formulate the Expectation Maximization (EM) algorithm for Boltzmann Machines and we...
This paper presents a new approach to estimating mixture models based on a recent inference principl...
Entropy is a central concept in physics and has deep connections with Information theory, which is o...
Boltzmann learning underlies an artificial neural network model known as the Boltzmann machine that ...
We propose a framework for learning hidden-variable models by optimizing entropies, in which entropy...
Efficient approximation lies at the heart of large-scale machine learning problems. In this paper, w...
Optimisation problems typically involve finding the ground state (i.e. the minimum energy configurat...