The Expectation-Maximization (EM) algorithm is a method for learning the parameters of probabilistic graphical models when there is hidden or missing data. The goal of EM is to estimate the set of parameters that maximizes the likelihood of the observed data. In spite of its success in practice, the EM algorithm has several limitations, including slow convergence, computational complexity, and an inability to escape local maxima. Using multiple random starting points is a popular approach to mitigating the local maxima problem, but it is time-consuming. This work seeks to improve the understanding and performance of the EM algorithm. We combine evolutionary algorithms, which make use of stochastic search, with the multiple random starting points approach.
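As a rough illustration of the multiple random starting points strategy mentioned above, the sketch below runs a basic EM procedure for a two-component one-dimensional Gaussian mixture from several random initializations and keeps the solution with the highest log-likelihood. The model, data, and function names are hypothetical choices for illustration and are not taken from this work.

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=100, rng=None, tol=1e-6):
    """Basic EM for a 1-D Gaussian mixture; returns parameters and log-likelihood."""
    rng = np.random.default_rng() if rng is None else rng
    # Random initialization: uniform weights, means drawn from the data, data variance.
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        total = dens.sum(axis=1, keepdims=True)
        resp = dens / total
        ll = np.log(total).sum()
        # M-step: re-estimate weights, means, and variances from the responsibilities.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        if abs(ll - ll_old) < tol:  # stop once the likelihood plateaus
            break
        ll_old = ll
    return (w, mu, var), ll

def em_with_restarts(x, n_restarts=10, seed=0):
    """Run EM from several random starts and keep the best local maximum found."""
    rng = np.random.default_rng(seed)
    best_params, best_ll = None, -np.inf
    for _ in range(n_restarts):
        params, ll = em_gmm_1d(x, rng=rng)
        if ll > best_ll:
            best_params, best_ll = params, ll
    return best_params, best_ll

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Synthetic data from two Gaussians, so the mixture has multiple local maxima.
    data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
    params, ll = em_with_restarts(data)
    print("best log-likelihood:", ll)
    print("weights, means, variances:", params)
```

Each restart may converge to a different local maximum of the likelihood; keeping only the best run is what makes the approach effective but also what makes it time-consuming, since most runs are discarded.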