Stochastic Neural Networks with Monotonic Activation Functions

Siamak Ravanbakhsh, Barnabas Poczos, Jeff Schneider, Dale Schuurmans, Russell Greiner
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:809-818, 2016.

Abstract

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, which we call the exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
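The core idea summarized above, turning a deterministic activation f into a stochastic unit using only Gaussian noise, amounts to sampling the unit's output from a Normal whose mean is f(η) and whose variance is f'(η), where η is the pre-activation. A minimal NumPy sketch of this sampling rule, assuming softplus as the monotonic activation (the function names here are illustrative, not from the paper's code):

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x))
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def softplus_grad(x):
    # Derivative of softplus is the logistic sigmoid
    return 1.0 / (1.0 + np.exp(-x))

def sample_stochastic_unit(eta, f, f_grad, rng):
    """Draw y ~ N(f(eta), f'(eta)): the Gaussian (Laplace) approximation
    of a stochastic unit built from a smooth monotonic activation f.
    Monotonicity guarantees f'(eta) >= 0, so it is a valid variance."""
    mean = f(eta)
    var = f_grad(eta)
    return rng.normal(mean, np.sqrt(var))

rng = np.random.default_rng(0)
eta = np.full(100_000, 1.5)
samples = sample_stochastic_unit(eta, softplus, softplus_grad, rng)
print(samples.mean())  # close to softplus(1.5)
print(samples.var())   # close to softplus_grad(1.5)
```

Because only a Gaussian draw is needed, the same routine works for any smooth monotonic non-linearity by swapping in that function and its derivative.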

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-ravanbakhsh16,
  title = {Stochastic Neural Networks with Monotonic Activation Functions},
  author = {Ravanbakhsh, Siamak and Poczos, Barnabas and Schneider, Jeff and Schuurmans, Dale and Greiner, Russell},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages = {809--818},
  year = {2016},
  editor = {Gretton, Arthur and Robert, Christian C.},
  volume = {51},
  series = {Proceedings of Machine Learning Research},
  address = {Cadiz, Spain},
  month = {09--11 May},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v51/ravanbakhsh16.pdf},
  url = {https://proceedings.mlr.press/v51/ravanbakhsh16.html},
  abstract = {We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, that we call exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.}
}
Endnote
%0 Conference Paper
%T Stochastic Neural Networks with Monotonic Activation Functions
%A Siamak Ravanbakhsh
%A Barnabas Poczos
%A Jeff Schneider
%A Dale Schuurmans
%A Russell Greiner
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-ravanbakhsh16
%I PMLR
%P 809--818
%U https://proceedings.mlr.press/v51/ravanbakhsh16.html
%V 51
%X We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, that we call exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
RIS
TY - CPAPER
TI - Stochastic Neural Networks with Monotonic Activation Functions
AU - Siamak Ravanbakhsh
AU - Barnabas Poczos
AU - Jeff Schneider
AU - Dale Schuurmans
AU - Russell Greiner
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-ravanbakhsh16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 809
EP - 818
L1 - http://proceedings.mlr.press/v51/ravanbakhsh16.pdf
UR - https://proceedings.mlr.press/v51/ravanbakhsh16.html
AB - We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, that we call exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
ER -
APA
Ravanbakhsh, S., Poczos, B., Schneider, J., Schuurmans, D. & Greiner, R. (2016). Stochastic Neural Networks with Monotonic Activation Functions. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:809-818. Available from https://proceedings.mlr.press/v51/ravanbakhsh16.html.