On Partitioning Rules for Bipartite Ranking

Stephan Clemencon, Nicolas Vayatis
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:97-104, 2009.

Abstract

The purpose of this paper is to investigate the properties of partitioning scoring rules in the bipartite ranking setup. We focus on ranking rules based on scoring functions. General sufficient conditions for the AUC consistency of scoring functions that are constant on cells of a partition of the feature space are provided. Rate bounds are obtained for cubic histogram scoring rules under mild smoothness assumptions on the regression function. In this setup, it is shown how to penalize the empirical AUC criterion in order to select a scoring rule nearly as good as the one that can be built when the degree of smoothness of the regression function is known.
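
To make the objects in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' code) of a cubic histogram scoring rule together with the empirical AUC criterion used to assess it: the feature space [0, 1)^d is partitioned into cubes of side h, each cube is scored by the empirical proportion of positive labels it contains (an estimate of the regression function eta(x) = P(Y = 1 | X = x)), and the empirical AUC counts the fraction of (positive, negative) pairs ranked in the right order. The bandwidth h, the function names, and the toy data are illustrative assumptions, not part of the paper.

# Sketch of a cubic histogram scoring rule and the empirical AUC (assumptions noted above).
import numpy as np

def cubic_histogram_score(X_train, y_train, X_test, h=0.25):
    """Piecewise-constant scoring rule on cubic cells of side h in [0, 1)^d."""
    def cell_index(X):
        # Map each point to the multi-index of its cube, encoded as a tuple.
        return [tuple(idx) for idx in np.floor(X / h).astype(int)]

    # Empirical count of positives and total points per occupied cell.
    pos, tot = {}, {}
    for key, y in zip(cell_index(X_train), y_train):
        tot[key] = tot.get(key, 0) + 1
        pos[key] = pos.get(key, 0) + int(y == 1)

    # Score a point by its cell's positive rate; empty cells get 0 by convention.
    return np.array([pos.get(key, 0) / tot[key] if key in tot else 0.0
                     for key in cell_index(X_test)])

def empirical_auc(scores, labels):
    """Fraction of (positive, negative) pairs correctly ordered; ties count 1/2."""
    s_pos = scores[labels == 1]
    s_neg = scores[labels == 0]
    diff = s_pos[:, None] - s_neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 2000, 2
    X = rng.uniform(size=(n, d))
    eta = X.mean(axis=1)                      # a smooth toy regression function
    y = (rng.uniform(size=n) < eta).astype(int)
    X_tr, y_tr, X_te, y_te = X[:1000], y[:1000], X[1000:], y[1000:]
    s = cubic_histogram_score(X_tr, y_tr, X_te, h=0.25)
    print(f"empirical AUC on held-out data: {empirical_auc(s, y_te):.3f}")

In the paper, penalizing the empirical AUC criterion is what allows the cell size to be chosen adaptively when the smoothness of the regression function is unknown; the fixed h = 0.25 above merely stands in for that data-driven choice.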

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-clemencon09a,
  title     = {On Partitioning Rules for Bipartite Ranking},
  author    = {Clemencon, Stephan and Vayatis, Nicolas},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {97--104},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/clemencon09a/clemencon09a.pdf},
  url       = {https://proceedings.mlr.press/v5/clemencon09a.html},
  abstract  = {The purpose of this paper is to investigate the properties of partitioning scoring rules in the bipartite ranking setup. We focus on ranking rules based on scoring functions. General sufficient conditions for the AUC consistency of scoring functions that are constant on cells of a partition of the feature space are provided. Rate bounds are obtained for cubic histogram scoring rules under mild smoothness assumptions on the regression function. In this setup, it is shown how to penalize the empirical AUC criterion in order to select a scoring rule nearly as good as the one that can be built when the degree of smoothness of the regression function is known.}
}
Endnote
%0 Conference Paper
%T On Partitioning Rules for Bipartite Ranking
%A Stephan Clemencon
%A Nicolas Vayatis
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-clemencon09a
%I PMLR
%P 97--104
%U https://proceedings.mlr.press/v5/clemencon09a.html
%V 5
%X The purpose of this paper is to investigate the properties of partitioning scoring rules in the bipartite ranking setup. We focus on ranking rules based on scoring functions. General sufficient conditions for the AUC consistency of scoring functions that are constant on cells of a partition of the feature space are provided. Rate bounds are obtained for cubic histogram scoring rules under mild smoothness assumptions on the regression function. In this setup, it is shown how to penalize the empirical AUC criterion in order to select a scoring rule nearly as good as the one that can be built when the degree of smoothness of the regression function is known.
RIS
TY - CPAPER
TI - On Partitioning Rules for Bipartite Ranking
AU - Stephan Clemencon
AU - Nicolas Vayatis
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-clemencon09a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 5
SP - 97
EP - 104
L1 - http://proceedings.mlr.press/v5/clemencon09a/clemencon09a.pdf
UR - https://proceedings.mlr.press/v5/clemencon09a.html
AB - The purpose of this paper is to investigate the properties of partitioning scoring rules in the bipartite ranking setup. We focus on ranking rules based on scoring functions. General sufficient conditions for the AUC consistency of scoring functions that are constant on cells of a partition of the feature space are provided. Rate bounds are obtained for cubic histogram scoring rules under mild smoothness assumptions on the regression function. In this setup, it is shown how to penalize the empirical AUC criterion in order to select a scoring rule nearly as good as the one that can be built when the degree of smoothness of the regression function is known.
ER -
APA
Clemencon, S. & Vayatis, N. (2009). On Partitioning Rules for Bipartite Ranking. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:97-104. Available from https://proceedings.mlr.press/v5/clemencon09a.html.
