The Geometry of Losses

Robert C. Williamson
Proceedings of The 27th Conference on Learning Theory, PMLR 35:1078-1108, 2014.

Abstract

Loss functions are central to machine learning because they are the means by which the quality of a prediction is evaluated. Any loss that is not proper, or cannot be transformed to be proper via a link function, is inadmissible. All admissible losses for n-class problems can be obtained in terms of a convex body in \mathbb{R}^n. We show this explicitly and show how some existing results simplify when viewed from this perspective. This allows the development of a rich algebra of losses induced by binary operations on convex bodies (that return a convex body). Furthermore, it allows us to define an “inverse loss” which provides a universal “substitution function” for the Aggregating Algorithm. In doing so we show a formal connection between proper losses and norms.
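To make the abstract's central claim concrete: a proper loss's conditional Bayes risk is the infimum of a linear functional over its superprediction set (a convex body), i.e. essentially a support function of that body. The sketch below is our illustration, not code from the paper; it assumes only NumPy, and all names are ours. It verifies numerically, for binary log loss, that minimizing the inner product of the outcome distribution with points of the superprediction set recovers the known Bayes risk (the Shannon entropy).

    import numpy as np

    # Binary log loss: a prediction q in (0,1) incurs the loss point
    # (-log q, -log(1-q)) in R^2, one coordinate per outcome. The
    # superprediction set S is everything on or above this curve.
    qs = np.linspace(1e-6, 1 - 1e-6, 100001)
    loss_points = np.stack([-np.log(qs), -np.log(1.0 - qs)], axis=1)

    def bayes_risk_via_body(p):
        # inf over S of <(p, 1-p), s>. A linear functional attains its
        # infimum on the lower boundary of S, i.e. on the loss curve
        # itself, so minimizing over sampled curve points suffices.
        return np.min(loss_points @ np.array([p, 1.0 - p]))

    def shannon_entropy(p):
        # Closed-form conditional Bayes risk of log loss.
        return -p * np.log(p) - (1.0 - p) * np.log(1.0 - p)

    for p in (0.1, 0.3, 0.5, 0.8):
        print(f"p={p}: via body {bayes_risk_via_body(p):.6f}, "
              f"entropy {shannon_entropy(p):.6f}")

The two values agree to within the discretization error of qs, and the minimizing point corresponds to q = p, which is exactly properness. Swapping in another loss curve, e.g. squared loss with points ((1-q)^2, q^2) and Bayes risk p(1-p), works the same way.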

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-williamson14,
  title     = {The Geometry of Losses},
  author    = {Williamson, Robert C.},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {1078--1108},
  year      = {2014},
  editor    = {Balcan, Maria Florina and Feldman, Vitaly and Szepesvári, Csaba},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/williamson14.pdf},
  url       = {https://proceedings.mlr.press/v35/williamson14.html},
  abstract  = {Loss functions are central to machine learning because they are the means by which the quality of a prediction is evaluated. Any loss that is not proper, or cannot be transformed to be proper via a link function, is inadmissible. All admissible losses for n-class problems can be obtained in terms of a convex body in $\mathbb{R}^n$. We show this explicitly and show how some existing results simplify when viewed from this perspective. This allows the development of a rich algebra of losses induced by binary operations on convex bodies (that return a convex body). Furthermore, it allows us to define an ``inverse loss'' which provides a universal ``substitution function'' for the Aggregating Algorithm. In doing so we show a formal connection between proper losses and norms.}
}
Endnote
%0 Conference Paper
%T The Geometry of Losses
%A Robert C. Williamson
%B Proceedings of The 27th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2014
%E Maria Florina Balcan
%E Vitaly Feldman
%E Csaba Szepesvári
%F pmlr-v35-williamson14
%I PMLR
%P 1078--1108
%U https://proceedings.mlr.press/v35/williamson14.html
%V 35
%X Loss functions are central to machine learning because they are the means by which the quality of a prediction is evaluated. Any loss that is not proper, or cannot be transformed to be proper via a link function, is inadmissible. All admissible losses for n-class problems can be obtained in terms of a convex body in \mathbb{R}^n. We show this explicitly and show how some existing results simplify when viewed from this perspective. This allows the development of a rich algebra of losses induced by binary operations on convex bodies (that return a convex body). Furthermore, it allows us to define an “inverse loss” which provides a universal “substitution function” for the Aggregating Algorithm. In doing so we show a formal connection between proper losses and norms.
RIS
TY  - CPAPER
TI  - The Geometry of Losses
AU  - Robert C. Williamson
BT  - Proceedings of The 27th Conference on Learning Theory
DA  - 2014/05/29
ED  - Maria Florina Balcan
ED  - Vitaly Feldman
ED  - Csaba Szepesvári
ID  - pmlr-v35-williamson14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 35
SP  - 1078
EP  - 1108
L1  - http://proceedings.mlr.press/v35/williamson14.pdf
UR  - https://proceedings.mlr.press/v35/williamson14.html
AB  - Loss functions are central to machine learning because they are the means by which the quality of a prediction is evaluated. Any loss that is not proper, or cannot be transformed to be proper via a link function, is inadmissible. All admissible losses for n-class problems can be obtained in terms of a convex body in \mathbb{R}^n. We show this explicitly and show how some existing results simplify when viewed from this perspective. This allows the development of a rich algebra of losses induced by binary operations on convex bodies (that return a convex body). Furthermore, it allows us to define an “inverse loss” which provides a universal “substitution function” for the Aggregating Algorithm. In doing so we show a formal connection between proper losses and norms.
ER  -
APA
Williamson, R.C. (2014). The Geometry of Losses. Proceedings of The 27th Conference on Learning Theory, in Proceedings of Machine Learning Research 35:1078-1108. Available from https://proceedings.mlr.press/v35/williamson14.html.
