This Article is brought to you for free and open access by The Ohio Center of Excellence in Knowledge-Enabled Computing (Kno.e.sis) at COR
We propose a new family of latent variable models called max-margin min-entrop...
In Gaussian mixture modeling, it is crucial to select the number of Gaussians or mixture model for a...
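The abstract above concerns selecting the number of Gaussians in a mixture model. One standard selection criterion (not necessarily the one that paper proposes) is the Bayesian information criterion, BIC = k·ln(n) − 2·ln(L̂). A minimal stdlib-only sketch; the log-likelihood values below are made up purely for illustration:

```python
import math

def bic(log_likelihood: float, n_params: int, n_samples: int) -> float:
    """Bayesian information criterion: lower is better."""
    return n_params * math.log(n_samples) - 2.0 * log_likelihood

def gmm_param_count(k: int, d: int) -> int:
    """Free parameters of a k-component GMM in d dimensions with full
    covariances: k weights (minus one sum-to-one constraint), k means,
    k symmetric covariance matrices."""
    return k * (1 + d + d * (d + 1) // 2) - 1

n_samples, d = 500, 2
# Illustrative (hypothetical) maximized log-likelihoods for k = 1..4.
loglik = {1: -2100.0, 2: -1900.0, 3: -1880.0, 4: -1875.0}
scores = {k: bic(ll, gmm_param_count(k, d), n_samples) for k, ll in loglik.items()}
best_k = min(scores, key=scores.get)  # k with the lowest BIC
```

With these numbers, k = 4 improves the fit only slightly over k = 3, so the parameter penalty tips the choice to the smaller model.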
We are interested in distributions which are derived as a maximum entropy distribution given a set of...
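On a finite support, the maximum-entropy distribution subject to a mean constraint has the exponential (Gibbs) form p_i ∝ exp(λ·x_i), with λ chosen so the constraint holds. A stdlib-only sketch using bisection on λ, with the classic loaded-die example (support {1..6}, mean constrained to 4.5); this is a generic illustration, not the construction from the abstract above:

```python
import math

def maxent_with_mean(xs, target_mean, lo=-50.0, hi=50.0, iters=100):
    """Max-entropy distribution over finite support xs with E[X] = target_mean.
    The solution has Gibbs form p_i proportional to exp(lam * x_i); since the
    resulting mean is increasing in lam, we can solve for lam by bisection."""
    def mean_at(lam):
        ws = [math.exp(lam * x) for x in xs]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, xs)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_at(mid) < target_mean:
            lo = mid  # mean too small: need larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    ws = [math.exp(lam * x) for x in xs]
    z = sum(ws)
    return [w / z for w in ws]

# Jaynes' die example: which distribution over faces 1..6 has mean 4.5
# while assuming nothing else? The max-entropy answer tilts exponentially
# toward the larger faces.
p = maxent_with_mean(list(range(1, 7)), 4.5)
```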
This paper presents a new approach to estimating mixture models based on a recent inference principl...
We present a new approach to estimating mixture models based on a new inference principle we have ...
Maximum entropy (MaxEnt) framework has been studied extensively in supervised learning. Here, the go...
We present an extension to Jaynes’ maximum entropy principle that incorporates latent variables. The...
We present a new statistical learning paradigm for Boltzmann machines based on a new inference pri...
This note is completely expository, and contains a whirlwind abridged introduction to the topic of m...
Derived from regularization theory, an adaptive entropy regularized likelihood (ERL) learning algori...
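Entropy-regularized likelihood (ERL) learning adds an entropy term on the mixing proportions to the likelihood objective; penalizing high entropy favors sparse mixtures, so redundant components can be driven toward zero weight. A generic sketch of such an objective (the exact functional and sign conventions of the ERL algorithm above may differ; `gamma` is a hypothetical regularization weight):

```python
import math

def entropy(ps):
    """Shannon entropy (nats) of a discrete distribution, skipping zeros."""
    return -sum(p * math.log(p) for p in ps if p > 0)

def erl_objective(log_likelihood, mix_weights, gamma):
    """Entropy-regularized likelihood objective, to be minimized:
    negative log-likelihood plus gamma times the entropy of the mixing
    proportions. Lower-entropy (sparser) mixtures incur a smaller penalty."""
    return -log_likelihood + gamma * entropy(mix_weights)
```

For equal log-likelihoods, a near-degenerate mixture like [0.99, 0.01] scores better than the uniform [0.5, 0.5], which is the mechanism that lets such schemes prune excess components.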
Nowadays, we observe a rapid growth of complex data in all formats due to the technological developm...
We consider the semi-supervised learning problem, where a decision rule is to be learned from labele...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...