Thurstonian Boltzmann Machines: Learning from Multiple Inequalities

Truyen Tran, Dinh Phung, Svetha Venkatesh
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):46-54, 2013.

Abstract

We introduce Thurstonian Boltzmann Machines (TBM), a unified architecture that can naturally incorporate a wide range of data inputs at the same time. Our motivation rests in the Thurstonian view that many discrete data types can be considered as being generated from a subset of underlying latent continuous variables, and in the observation that each realisation of a discrete type imposes certain inequalities on those variables. Thus learning and inference in TBM reduce to making sense of a set of inequalities. Our proposed TBM naturally supports the following types: Gaussian, intervals, censored, binary, categorical, multicategorical, ordinal, (in)complete rank with and without ties. We demonstrate the versatility and capacity of the proposed model on three applications of very different natures: handwritten digit recognition, collaborative filtering and complex social survey analysis.
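The core idea above, that each discrete observation imposes inequality constraints on latent continuous variables, can be illustrated with a small sketch. This is a hypothetical illustration, not the authors' code: the function names (`binary_bounds`, `ordinal_bounds`, `ranking_constraints`) and the threshold parameterisation are assumptions, chosen only to show how binary, ordinal and rank data each translate into bounds or pairwise inequalities on latent "utilities".

```python
import numpy as np

def binary_bounds(x):
    """A binary observation constrains one latent utility u:
    x = 1 implies u > 0, while x = 0 implies u <= 0."""
    return (0.0, np.inf) if x == 1 else (-np.inf, 0.0)

def ordinal_bounds(level, thresholds):
    """An ordinal level k constrains u to the interval
    (theta_{k-1}, theta_k], with theta_0 = -inf and theta_K = +inf.
    `thresholds` holds the finite cut-points theta_1..theta_{K-1}."""
    t = [-np.inf] + list(thresholds) + [np.inf]
    return (t[level], t[level + 1])

def ranking_constraints(ranking):
    """A complete ranking (best item first) imposes pairwise
    inequalities u_{r_1} > u_{r_2} > ... between latent utilities."""
    return [(ranking[i], ranking[i + 1]) for i in range(len(ranking) - 1)]

print(binary_bounds(1))                      # (0.0, inf)
print(ordinal_bounds(2, [-1.0, 0.5, 2.0]))   # (0.5, 2.0)
print(ranking_constraints([2, 0, 1]))        # [(2, 0), (0, 1)]
```

Under this view, every data type in the list above (Gaussian, censored, categorical, ranks with ties, and so on) reduces to a set of such interval or pairwise constraints, which is what makes a single unified learning and inference procedure possible.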

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-tran13,
  title     = {Thurstonian {B}oltzmann Machines: Learning from Multiple Inequalities},
  author    = {Tran, Truyen and Phung, Dinh and Venkatesh, Svetha},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {46--54},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/tran13.pdf},
  url       = {https://proceedings.mlr.press/v28/tran13.html},
  abstract  = {We introduce Thurstonian Boltzmann Machines (TBM), a unified architecture that can naturally incorporate a wide range of data inputs at the same time. Our motivation rests in the Thurstonian view that many discrete data types can be considered as being generated from a subset of underlying latent continuous variables, and in the observation that each realisation of a discrete type imposes certain inequalities on those variables. Thus learning and inference in TBM reduce to making sense of a set of inequalities. Our proposed TBM naturally supports the following types: Gaussian, intervals, censored, binary, categorical, multicategorical, ordinal, (in)complete rank with and without ties. We demonstrate the versatility and capacity of the proposed model on three applications of very different natures: handwritten digit recognition, collaborative filtering and complex social survey analysis.}
}
Endnote
%0 Conference Paper
%T Thurstonian Boltzmann Machines: Learning from Multiple Inequalities
%A Truyen Tran
%A Dinh Phung
%A Svetha Venkatesh
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-tran13
%I PMLR
%P 46--54
%U https://proceedings.mlr.press/v28/tran13.html
%V 28
%N 2
%X We introduce Thurstonian Boltzmann Machines (TBM), a unified architecture that can naturally incorporate a wide range of data inputs at the same time. Our motivation rests in the Thurstonian view that many discrete data types can be considered as being generated from a subset of underlying latent continuous variables, and in the observation that each realisation of a discrete type imposes certain inequalities on those variables. Thus learning and inference in TBM reduce to making sense of a set of inequalities. Our proposed TBM naturally supports the following types: Gaussian, intervals, censored, binary, categorical, multicategorical, ordinal, (in)complete rank with and without ties. We demonstrate the versatility and capacity of the proposed model on three applications of very different natures: handwritten digit recognition, collaborative filtering and complex social survey analysis.
RIS
TY - CPAPER
TI - Thurstonian Boltzmann Machines: Learning from Multiple Inequalities
AU - Truyen Tran
AU - Dinh Phung
AU - Svetha Venkatesh
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-tran13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 2
SP - 46
EP - 54
L1 - http://proceedings.mlr.press/v28/tran13.pdf
UR - https://proceedings.mlr.press/v28/tran13.html
AB - We introduce Thurstonian Boltzmann Machines (TBM), a unified architecture that can naturally incorporate a wide range of data inputs at the same time. Our motivation rests in the Thurstonian view that many discrete data types can be considered as being generated from a subset of underlying latent continuous variables, and in the observation that each realisation of a discrete type imposes certain inequalities on those variables. Thus learning and inference in TBM reduce to making sense of a set of inequalities. Our proposed TBM naturally supports the following types: Gaussian, intervals, censored, binary, categorical, multicategorical, ordinal, (in)complete rank with and without ties. We demonstrate the versatility and capacity of the proposed model on three applications of very different natures: handwritten digit recognition, collaborative filtering and complex social survey analysis.
ER -
APA
Tran, T., Phung, D. & Venkatesh, S. (2013). Thurstonian Boltzmann Machines: Learning from Multiple Inequalities. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(2):46-54. Available from https://proceedings.mlr.press/v28/tran13.html.