We show that, given data from a mixture of k well-separated spherical Gaussians in ℜ^d, a simple two-round variant of EM will, with high probability, learn the parameters of the Gaussians to near-optimal precision, if the dimension is high (d >> ln k). We relate this to previous theoretical and empirical work on the EM algorithm.
It is well-known that the EM algorithm generally converges to a local maximum likelihood estimate. H...
Given data drawn from a mixture of multivariate Gaussians, a basic problem is to accurately estimate...
<p>While several papers have investigated computationally and statistically efficient methods for le...
We show that a simple spectral algorithm for learning a mixture of k spherical Gaussians in ...
Presented on March 6, 2017 at 11:00 a.m. in the Klaus Advanced Computing Building, Room 1116E. Consta...
"Expectation-Maximization" (EM) algorithm and gradient-based approaches for maximum likelihood le...
The EM algorithm for Gaussian mixture models often gets caught in local maxima of the like...
Mixtures of Gaussian (or normal) distributions arise in a variety of application areas. Many heurist...
Maximum likelihood through the EM algorithm is widely used to estimate the parameters in hidden stru...
For the Gaussian mixture learning, the expectation-maximization (EM) algorithm as well as its modifi...
This paper studies the sample complexity of learning the $k$ unknown centers of a balanced Gaussian ...
FSMEM, or free split/merge expectation maximization, is a modification of the SMEM algorithm presented...
Cluster analysis faces two problems in high dimensions: first, the "curse of dimensionality" that ...
Mixtures of Gaussian (or normal) distributions arise in a variety of application areas. Many techniq...
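The abstracts above revolve around EM for mixtures of spherical Gaussians. As a minimal illustration of the common algorithmic core they discuss, the sketch below implements plain EM for k spherical components N(mu_j, sigma_j^2 I). It is not the exact algorithm from any listed paper (in particular, not the two-round variant); the function name `em_spherical` and the optional `mu0` initialization parameter are choices made here for illustration.

```python
import numpy as np

def em_spherical(X, k, n_iter=2, seed=0, mu0=None):
    """Illustrative EM for a mixture of k spherical Gaussians.

    A minimal sketch under standard assumptions: each component is
    N(mu_j, var_j * I) with mixing weight w_j. Runs a fixed, small
    number of iterations rather than testing for convergence.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize: given centers, or centers drawn from the data;
    # unit variances; uniform mixing weights.
    mu = np.array(mu0, dtype=float) if mu0 is not None \
        else X[rng.choice(n, size=k, replace=False)]
    var = np.ones(k)
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: posterior responsibilities r[i, j] ∝ w_j * N(x_i; mu_j, var_j I).
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)  # (n, k)
        log_p = np.log(w) - 0.5 * d * np.log(2 * np.pi * var) - sq / (2 * var)
        log_p -= log_p.max(axis=1, keepdims=True)  # stabilize before exp
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted means, per-component spherical
        # variances, and mixing weights.
        nk = r.sum(axis=0)
        mu = (r.T @ X) / nk[:, None]
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        var = (r * sq).sum(axis=0) / (d * nk)
        w = nk / n
    return mu, var, w
```

With well-separated clusters and one initial center per cluster, a handful of iterations is typically enough for the means to settle near the true centers, which is the regime the separation conditions in these abstracts are about.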