Most machine learning algorithms need to handle large data sets, which often leads to limitations on processing time and memory. The Expectation-Maximization (EM) algorithm is one such algorithm; it is used to train one of the most commonly used parametric statistical models, the Gaussian Mixture Model (GMM). All steps of the algorithm are potentially parallelizable, since they iterate over the entire data set. In this study, we propose a parallel implementation of EM for training GMMs using CUDA. Experiments are performed with a UCI dataset, and the results show a speedup of 7× compared to the sequential version. We have also modified the code to provide better global memory access and shared memory usage. We...
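To make the E- and M-steps referred to above concrete, the following is a minimal sketch of EM for a one-dimensional GMM in NumPy. It is an illustration of the algorithm's structure, not the CUDA implementation the abstract describes; the quantile-based initialization and the fixed iteration count are simplifying assumptions for the example.

```python
import numpy as np

def em_gmm(x, k, n_iter=50):
    """Fit a 1-D Gaussian mixture with k components via EM.

    Returns (weights, means, variances). Initialization uses data
    quantiles, a simple deterministic choice for this sketch.
    """
    n = len(x)
    w = np.full(k, 1.0 / k)                       # mixing weights
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread initial means
    var = np.full(k, np.var(x))                   # shared initial variance

    for _ in range(n_iter):
        # E-step: responsibility r[i, j] = P(component j | x_i),
        # computed from the weighted Gaussian density of each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights, means, and variances from
        # the responsibilities (the per-point sums are what a
        # parallel implementation would distribute over the data).
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

Both steps reduce over the whole data set, which is why each EM iteration parallelizes naturally: the per-point density evaluations and the per-point partial sums are independent across data points.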
Abstract—This paper describes a local and distributed expectation maximization algorithm for learni...
This document presents a study of Gaussian mixture models. Specifically, it carries out...
Training machine learning (ML) algorithms is a computationally intensive process, which is frequentl...
The central focus of this work is the Gaussian Mixture Model (GMM), a machine learning model widely ...
An important challenge in the field of unsupervised learning is not only the development of algorith...
An important challenge in the field of unsupervised learning is not only the development of algorith...
This work proposes an exponential computation with low-computational complexity and applies this tec...
Gaussian Mixture Model (GMM) statistics are required for maximum likelihood training as well as for ...
Finite mixture models have been widely used for the modelling and analysis of data from heterogeneou...
Gaussian Mixture Models (GMMs) are widely used in many applications such as data mining, signal proc...
In recent years, model selection methods have seen significant advancement, but improvements have te...
The Expectation–Maximization (EM) algorithm is a popular tool in a wide variety of statistical setti...
A challenge for statistical learning is to deal with large data sets, e.g. in data mining. The train...
Abstract—In this paper, we consider simple and fast approaches to initialize the Expectation-Maximi...
Composed of several hundreds of processors, the Graphics Processing Unit (GPU) has become a very int...