Please use this identifier to cite or link to this item: http://ri.ufs.br/jspui/handle/riufs/1764
Document type: Article
Title: Parallel implementation of Expectation-Maximisation algorithm for the training of Gaussian Mixture Models
Authors: Araújo, Gabriel Ferreira
Macedo, Hendrik Teixeira
Chella, Marco Túlio
Estombelo Montesco, Carlos Alberto
Medeiros, Marcus Vinícius Oliveira
Publication date: Jul-2014
Abstract: Most machine learning algorithms need to handle large data sets. This often imposes limitations on processing time and memory. Expectation-Maximization (EM) is one such algorithm; it is used to train one of the most commonly used parametric statistical models, the Gaussian Mixture Model (GMM). All steps of the algorithm are potentially parallelizable, since they iterate over the entire data set. In this study, we propose a parallel implementation of EM for training GMMs using CUDA. Experiments are performed with a UCI dataset, and results show a speedup of 7 compared to the sequential version. We also modified the code to improve global memory access and shared memory usage. We obtained up to 56.4% achieved occupancy, regardless of the number of Gaussians considered in the experiments.
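The record does not reproduce the authors' code or kernel layout. As a rough illustration of the per-data-point parallelism the abstract describes, the following is a minimal, hypothetical CUDA sketch of the E-step of EM for a diagonal-covariance GMM, with one thread computing the responsibilities of one data point. All names (estep_kernel, N, D, K), the toy data, and the diagonal-covariance assumption are illustrative only, not the paper's implementation.

#include <cstdio>
#include <cmath>
#include <cuda_runtime.h>

#define N 8   // number of data points (toy size)
#define D 2   // dimensionality of each point
#define K 3   // number of Gaussian components

// One thread per data point: compute E-step responsibilities for a
// diagonal-covariance GMM. Densities are evaluated in log space and
// then normalized so each row of resp sums to 1.
__global__ void estep_kernel(const float *x,      // N*D data matrix
                             const float *mean,   // K*D component means
                             const float *var,    // K*D diagonal variances
                             const float *weight, // K mixing weights
                             float *resp)         // N*K responsibilities (output)
{
    const float TWO_PI = 6.2831853f;
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= N) return;

    float p[K];
    float sum = 0.0f;
    for (int k = 0; k < K; ++k) {
        float logp = logf(weight[k]);   // log of the mixing weight
        for (int d = 0; d < D; ++d) {
            float diff = x[i * D + d] - mean[k * D + d];
            logp -= 0.5f * (logf(TWO_PI * var[k * D + d])
                            + diff * diff / var[k * D + d]);
        }
        p[k] = expf(logp);
        sum += p[k];
    }
    for (int k = 0; k < K; ++k)
        resp[i * K + k] = p[k] / sum;   // normalized responsibility
}

int main(void)
{
    float h_x[N * D], h_mean[K * D], h_var[K * D], h_weight[K], h_resp[N * K];

    // Toy data and parameters, only to make the sketch runnable.
    for (int i = 0; i < N * D; ++i) h_x[i] = (float)(i % 5);
    for (int k = 0; k < K; ++k) {
        h_weight[k] = 1.0f / K;
        for (int d = 0; d < D; ++d) {
            h_mean[k * D + d] = (float)k;
            h_var[k * D + d]  = 1.0f;
        }
    }

    float *d_x, *d_mean, *d_var, *d_weight, *d_resp;
    cudaMalloc((void **)&d_x, sizeof(h_x));
    cudaMalloc((void **)&d_mean, sizeof(h_mean));
    cudaMalloc((void **)&d_var, sizeof(h_var));
    cudaMalloc((void **)&d_weight, sizeof(h_weight));
    cudaMalloc((void **)&d_resp, sizeof(h_resp));
    cudaMemcpy(d_x, h_x, sizeof(h_x), cudaMemcpyHostToDevice);
    cudaMemcpy(d_mean, h_mean, sizeof(h_mean), cudaMemcpyHostToDevice);
    cudaMemcpy(d_var, h_var, sizeof(h_var), cudaMemcpyHostToDevice);
    cudaMemcpy(d_weight, h_weight, sizeof(h_weight), cudaMemcpyHostToDevice);

    estep_kernel<<<(N + 127) / 128, 128>>>(d_x, d_mean, d_var, d_weight, d_resp);
    cudaMemcpy(h_resp, d_resp, sizeof(h_resp), cudaMemcpyDeviceToHost);

    for (int i = 0; i < N; ++i) {
        printf("point %d:", i);
        for (int k = 0; k < K; ++k) printf(" %.3f", h_resp[i * K + k]);
        printf("\n");
    }

    cudaFree(d_x); cudaFree(d_mean); cudaFree(d_var);
    cudaFree(d_weight); cudaFree(d_resp);
    return 0;
}

In this sketch each thread reads the shared component parameters and writes only its own contiguous row of the responsibility matrix, the kind of access pattern that the memory optimizations mentioned in the abstract target; the M-step reductions discussed in the paper are not shown here.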
Keywords: Expectation-Maximization (EM)
Gaussian Mixture Models (GMM)
CUDA
Modelo de misturas gaussianas
ISSN: 1552-6607
Publisher: Science Publications
Citation: ARAÚJO, G. F. et al. Parallel implementation of Expectation-Maximisation algorithm for the training of Gaussian Mixture Models. Journal of Computer Science, v. 10, n. 10, Jul. 2014. Available at: <http://thescipub.com/abstract/10.3844/jcssp.2014.2124.2134>. Accessed: 16 May 2016.
License: Creative Commons Attribution License
URI: https://ri.ufs.br/handle/riufs/1764
Appears in collections: DCOMP - Artigos de periódicos

Files in this item:
File: ExpectationMaximisationAlgorithm.pdf (226.02 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.