Efficient Training of LDA on a GPU by Mean-for-Mode Estimation

Jean-Baptiste Tristan, Joseph Tassarotti, Guy Steele
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:59-68, 2015.

Abstract

We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler — and unlike an uncollapsed Gibbs sampler — it has good statistical performance, and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like an uncollapsed Gibbs sampler — and unlike a collapsed Gibbs sampler — it is embarrassingly parallel, and can use approximate counters.
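The abstract describes the algorithm only at a high level. As a rough illustration, the sketch below shows a partially uncollapsed Gibbs sweep for LDA in which the Dirichlet parameters theta and phi are set to their posterior means instead of being sampled, which is one plausible reading of the Mean-for-Mode idea. The function name, hyperparameter values, dense count matrices, and sequential NumPy loops are all assumptions made for illustration; they are not the authors' GPU implementation, which additionally exploits sparsity and approximate counters.

# Illustrative sketch only: a partially uncollapsed Gibbs sweep for LDA in
# which the per-document topic distribution theta and per-topic word
# distribution phi are replaced by their Dirichlet posterior means rather
# than sampled. This is an assumed reading of "Mean-for-Mode", not the
# authors' code.
import numpy as np

def gibbs_sweep(docs, z, K, V, alpha=0.1, beta=0.01, rng=np.random):
    D = len(docs)
    # Count statistics implied by the current topic assignments z.
    ndk = np.zeros((D, K))   # topic counts per document
    nkw = np.zeros((K, V))   # word counts per topic
    for d, words in enumerate(docs):
        for i, w in enumerate(words):
            k = z[d][i]
            ndk[d, k] += 1
            nkw[k, w] += 1

    # "Mean" step (assumed): plug in the Dirichlet posterior means for
    # theta and phi instead of drawing them, keeping the sampler
    # uncollapsed while avoiding the extra variance of sampling.
    theta = (ndk + alpha) / (ndk.sum(axis=1, keepdims=True) + K * alpha)
    phi = (nkw + beta) / (nkw.sum(axis=1, keepdims=True) + V * beta)

    # Resample every topic assignment. Given theta and phi, the tokens are
    # conditionally independent, so in a GPU setting this loop would be
    # parallelized across tokens.
    for d, words in enumerate(docs):
        for i, w in enumerate(words):
            p = theta[d] * phi[:, w]
            z[d][i] = rng.choice(K, p=p / p.sum())
    return z

In this form, once theta and phi are fixed, every token's topic can be resampled independently of the others, which is the property that makes the uncollapsed family embarrassingly parallel, unlike a collapsed sampler whose counts change after each draw.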

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-tristan15,
  title     = {Efficient Training of LDA on a GPU by Mean-for-Mode Estimation},
  author    = {Tristan, Jean-Baptiste and Tassarotti, Joseph and Steele, Guy},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {59--68},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/tristan15.pdf},
  url       = {https://proceedings.mlr.press/v37/tristan15.html},
  abstract  = {We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler — and unlike an uncollapsed Gibbs sampler — it has good statistical performance, and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like an uncollapsed Gibbs sampler — and unlike a collapsed Gibbs sampler — it is embarrassingly parallel, and can use approximate counters.}
}
Endnote
%0 Conference Paper
%T Efficient Training of LDA on a GPU by Mean-for-Mode Estimation
%A Jean-Baptiste Tristan
%A Joseph Tassarotti
%A Guy Steele
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-tristan15
%I PMLR
%P 59--68
%U https://proceedings.mlr.press/v37/tristan15.html
%V 37
%X We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler — and unlike an uncollapsed Gibbs sampler — it has good statistical performance, and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like an uncollapsed Gibbs sampler — and unlike a collapsed Gibbs sampler — it is embarrassingly parallel, and can use approximate counters.
RIS
TY - CPAPER
TI - Efficient Training of LDA on a GPU by Mean-for-Mode Estimation
AU - Jean-Baptiste Tristan
AU - Joseph Tassarotti
AU - Guy Steele
BT - Proceedings of the 32nd International Conference on Machine Learning
DA - 2015/06/01
ED - Francis Bach
ED - David Blei
ID - pmlr-v37-tristan15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 37
SP - 59
EP - 68
L1 - http://proceedings.mlr.press/v37/tristan15.pdf
UR - https://proceedings.mlr.press/v37/tristan15.html
AB - We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler — and unlike an uncollapsed Gibbs sampler — it has good statistical performance, and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like an uncollapsed Gibbs sampler — and unlike a collapsed Gibbs sampler — it is embarrassingly parallel, and can use approximate counters.
ER -
APA
Tristan, J., Tassarotti, J. & Steele, G. (2015). Efficient Training of LDA on a GPU by Mean-for-Mode Estimation. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:59-68. Available from https://proceedings.mlr.press/v37/tristan15.html.

Related Material

Download PDF: http://proceedings.mlr.press/v37/tristan15.pdf