Regularization of Neural Networks using DropConnect

Li Wan, Matthew Zeiler, Sixin Zhang, Yann Le Cun, Rob Fergus
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1058-1066, 2013.

Abstract

We introduce DropConnect, a generalization of DropOut, for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks can be obtained by aggregating multiple DropConnect-trained models.
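The core distinction in the abstract — Dropout masks activations, DropConnect masks individual weights — can be sketched in a few lines of NumPy. This is an illustrative sketch of the training-time forward pass only, not the paper's reference implementation; the function names, the ReLU nonlinearity, and the `keep_prob` parameter are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect_forward(x, W, b, keep_prob=0.5):
    """DropConnect: zero a random subset of *weights*, so each unit
    receives input through a random subset of connections."""
    mask = rng.random(W.shape) < keep_prob      # Bernoulli mask over the weight matrix
    return np.maximum(0.0, x @ (W * mask) + b)  # ReLU over the masked linear map

def dropout_forward(x, W, b, keep_prob=0.5):
    """Dropout, for contrast: zero a random subset of *activations*."""
    h = np.maximum(0.0, x @ W + b)
    mask = rng.random(h.shape) < keep_prob      # Bernoulli mask over the activations
    return h * mask
```

A fresh mask is drawn for every training example (or minibatch); at inference time the paper aggregates over masks rather than using a single deterministic pass, which this sketch does not show.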

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-wan13, title = {Regularization of Neural Networks using DropConnect}, author = {Wan, Li and Zeiler, Matthew and Zhang, Sixin and Le Cun, Yann and Fergus, Rob}, booktitle = {Proceedings of the 30th International Conference on Machine Learning}, pages = {1058--1066}, year = {2013}, editor = {Dasgupta, Sanjoy and McAllester, David}, volume = {28}, number = {3}, series = {Proceedings of Machine Learning Research}, address = {Atlanta, Georgia, USA}, month = {17--19 Jun}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v28/wan13.pdf}, url = {https://proceedings.mlr.press/v28/wan13.html}, abstract = {We introduce DropConnect, a generalization of DropOut, for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks can be obtained by aggregating multiple DropConnect-trained models.} }
Endnote
%0 Conference Paper %T Regularization of Neural Networks using DropConnect %A Li Wan %A Matthew Zeiler %A Sixin Zhang %A Yann Le Cun %A Rob Fergus %B Proceedings of the 30th International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2013 %E Sanjoy Dasgupta %E David McAllester %F pmlr-v28-wan13 %I PMLR %P 1058--1066 %U https://proceedings.mlr.press/v28/wan13.html %V 28 %N 3 %X We introduce DropConnect, a generalization of DropOut, for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks can be obtained by aggregating multiple DropConnect-trained models.
RIS
TY - CPAPER TI - Regularization of Neural Networks using DropConnect AU - Li Wan AU - Matthew Zeiler AU - Sixin Zhang AU - Yann Le Cun AU - Rob Fergus BT - Proceedings of the 30th International Conference on Machine Learning DA - 2013/05/26 ED - Sanjoy Dasgupta ED - David McAllester ID - pmlr-v28-wan13 PB - PMLR DP - Proceedings of Machine Learning Research VL - 28 IS - 3 SP - 1058 EP - 1066 L1 - http://proceedings.mlr.press/v28/wan13.pdf UR - https://proceedings.mlr.press/v28/wan13.html AB - We introduce DropConnect, a generalization of DropOut, for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks can be obtained by aggregating multiple DropConnect-trained models. ER -
APA
Wan, L., Zeiler, M., Zhang, S., Le Cun, Y. &amp; Fergus, R. (2013). Regularization of Neural Networks using DropConnect. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):1058-1066. Available from https://proceedings.mlr.press/v28/wan13.html.