Gradient Descent Only Converges to Minimizers

Jason D. Lee, Max Simchowitz, Michael I. Jordan, Benjamin Recht
29th Annual Conference on Learning Theory, PMLR 49:1246-1257, 2016.

Abstract

We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
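
The result can be seen numerically. Below is a minimal sketch (not from the paper, and the test function is my own choice): gradient descent on f(x, y) = (x^2 - 1)^2 + y^2, which has local minima at (+1, 0) and (-1, 0) and a strict saddle at the origin. With random initialization the iterates end up at a minimizer; only a measure-zero set of starting points, here the y-axis (the saddle's stable manifold), leads to the saddle.

import numpy as np

# Example objective (not from the paper): f(x, y) = (x^2 - 1)^2 + y^2.
# Critical points: local minima at (+1, 0) and (-1, 0), and a strict
# saddle at (0, 0), where the Hessian diag(-4, 2) has a negative eigenvalue.
def grad(p):
    x, y = p
    return np.array([4.0 * x * (x**2 - 1.0), 2.0 * y])

def gradient_descent(p0, step=0.05, iters=2000):
    # Fixed small step size; on the bounded region the iterates visit,
    # this plays the role of the paper's step < 1/L condition.
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        p = p - step * grad(p)
    return p

rng = np.random.default_rng(0)

# Random initialization: every trial lands at a minimizer (+-1, 0),
# consistent with the paper's almost-sure convergence claim.
for _ in range(5):
    p0 = rng.uniform(-2.0, 2.0, size=2)
    print(p0, "->", gradient_descent(p0))

# The exceptional set has measure zero: initializing exactly on the
# saddle's stable manifold (the y-axis) converges to the saddle (0, 0).
print(gradient_descent([0.0, 1.5]), "<- saddle, reached from a measure-zero set")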

Cite this Paper


BibTeX
@InProceedings{pmlr-v49-lee16,
  title     = {Gradient Descent Only Converges to Minimizers},
  author    = {Lee, Jason D. and Simchowitz, Max and Jordan, Michael I. and Recht, Benjamin},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {1246--1257},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/lee16.pdf},
  url       = {https://proceedings.mlr.press/v49/lee16.html},
  abstract  = {We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.}
}
Endnote
%0 Conference Paper
%T Gradient Descent Only Converges to Minimizers
%A Jason D. Lee
%A Max Simchowitz
%A Michael I. Jordan
%A Benjamin Recht
%B 29th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2016
%E Vitaly Feldman
%E Alexander Rakhlin
%E Ohad Shamir
%F pmlr-v49-lee16
%I PMLR
%P 1246--1257
%U https://proceedings.mlr.press/v49/lee16.html
%V 49
%X We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
RIS
TY - CPAPER
TI - Gradient Descent Only Converges to Minimizers
AU - Jason D. Lee
AU - Max Simchowitz
AU - Michael I. Jordan
AU - Benjamin Recht
BT - 29th Annual Conference on Learning Theory
DA - 2016/06/06
ED - Vitaly Feldman
ED - Alexander Rakhlin
ED - Ohad Shamir
ID - pmlr-v49-lee16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 49
SP - 1246
EP - 1257
L1 - http://proceedings.mlr.press/v49/lee16.pdf
UR - https://proceedings.mlr.press/v49/lee16.html
AB - We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
ER -
APA
Lee, J.D., Simchowitz, M., Jordan, M.I. & Recht, B. (2016). Gradient Descent Only Converges to Minimizers. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:1246-1257. Available from https://proceedings.mlr.press/v49/lee16.html.
