Analysis of Deep Neural Networks with Extended Data Jacobian Matrix

Shengjie Wang, Abdel-rahman Mohamed, Rich Caruana, Jeff Bilmes, Matthai Philipose, Matthew Richardson, Krzysztof Geras, Gregor Urban, Ozlem Aslan
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:718-726, 2016.

Abstract

Deep neural networks have achieved great success on various machine learning tasks; however, many fundamental questions remain open. In this paper, we tackle the problem of quantifying the quality of the learned weights of different networks, possibly with different architectures, going beyond the final classification error as the only metric. We introduce the Extended Data Jacobian Matrix to help analyze properties of networks of various structures, finding that the spectrum of the Extended Data Jacobian Matrix is a strong discriminating factor for networks of different structures and performance. Based on this observation, we propose a novel regularization method that improves network performance comparably to dropout, which in turn verifies the observation.
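For readers curious how such a spectrum might be computed in practice, below is a minimal sketch (not the authors' code): it stacks the per-example Jacobians dy/dx of a network's outputs with respect to its inputs into one tall matrix and takes the singular values of that stacked matrix. The use of PyTorch, the toy ReLU network, and the helper name edjm_spectrum are illustrative assumptions, not details from the paper.

# Minimal sketch, assuming a small feed-forward ReLU network in PyTorch.
# The helper name `edjm_spectrum` and the toy model are hypothetical.
import torch
import torch.nn as nn

def edjm_spectrum(model, inputs):
    """Stack per-example Jacobians dy/dx into one tall matrix
    (an "extended" data Jacobian) and return its singular values."""
    jacobians = []
    for x in inputs:
        # J has shape (1, output_dim, 1, input_dim) for a single example.
        J = torch.autograd.functional.jacobian(model, x.unsqueeze(0))
        jacobians.append(J.reshape(-1, x.numel()))   # (output_dim, input_dim)
    edjm = torch.cat(jacobians, dim=0)               # (n * output_dim, input_dim)
    return torch.linalg.svdvals(edjm)

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 10))
data = torch.randn(8, 16)
print(edjm_spectrum(model, data))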

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-wanga16,
  title     = {Analysis of Deep Neural Networks with Extended Data Jacobian Matrix},
  author    = {Wang, Shengjie and Mohamed, Abdel-rahman and Caruana, Rich and Bilmes, Jeff and Philipose, Matthai and Richardson, Matthew and Geras, Krzysztof and Urban, Gregor and Aslan, Ozlem},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {718--726},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/wanga16.pdf},
  url       = {https://proceedings.mlr.press/v48/wanga16.html},
  abstract  = {Deep neural networks have achieved great success on various machine learning tasks; however, many fundamental questions remain open. In this paper, we tackle the problem of quantifying the quality of the learned weights of different networks, possibly with different architectures, going beyond the final classification error as the only metric. We introduce the \emph{Extended Data Jacobian Matrix} to help analyze properties of networks of various structures, finding that the spectrum of the Extended Data Jacobian Matrix is a strong discriminating factor for networks of different structures and performance. Based on this observation, we propose a novel regularization method that improves network performance comparably to dropout, which in turn verifies the observation.}
}
Endnote
%0 Conference Paper
%T Analysis of Deep Neural Networks with Extended Data Jacobian Matrix
%A Shengjie Wang
%A Abdel-rahman Mohamed
%A Rich Caruana
%A Jeff Bilmes
%A Matthai Philipose
%A Matthew Richardson
%A Krzysztof Geras
%A Gregor Urban
%A Ozlem Aslan
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-wanga16
%I PMLR
%P 718--726
%U https://proceedings.mlr.press/v48/wanga16.html
%V 48
%X Deep neural networks have achieved great success on various machine learning tasks; however, many fundamental questions remain open. In this paper, we tackle the problem of quantifying the quality of the learned weights of different networks, possibly with different architectures, going beyond the final classification error as the only metric. We introduce the Extended Data Jacobian Matrix to help analyze properties of networks of various structures, finding that the spectrum of the Extended Data Jacobian Matrix is a strong discriminating factor for networks of different structures and performance. Based on this observation, we propose a novel regularization method that improves network performance comparably to dropout, which in turn verifies the observation.
RIS
TY - CPAPER
TI - Analysis of Deep Neural Networks with Extended Data Jacobian Matrix
AU - Shengjie Wang
AU - Abdel-rahman Mohamed
AU - Rich Caruana
AU - Jeff Bilmes
AU - Matthai Philipose
AU - Matthew Richardson
AU - Krzysztof Geras
AU - Gregor Urban
AU - Ozlem Aslan
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-wanga16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 718
EP - 726
L1 - http://proceedings.mlr.press/v48/wanga16.pdf
UR - https://proceedings.mlr.press/v48/wanga16.html
AB - Deep neural networks have achieved great success on various machine learning tasks; however, many fundamental questions remain open. In this paper, we tackle the problem of quantifying the quality of the learned weights of different networks, possibly with different architectures, going beyond the final classification error as the only metric. We introduce the Extended Data Jacobian Matrix to help analyze properties of networks of various structures, finding that the spectrum of the Extended Data Jacobian Matrix is a strong discriminating factor for networks of different structures and performance. Based on this observation, we propose a novel regularization method that improves network performance comparably to dropout, which in turn verifies the observation.
ER -
APA
Wang, S., Mohamed, A., Caruana, R., Bilmes, J., Philipose, M., Richardson, M., Geras, K., Urban, G. & Aslan, O. (2016). Analysis of Deep Neural Networks with Extended Data Jacobian Matrix. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:718-726. Available from https://proceedings.mlr.press/v48/wanga16.html.
