A Primal-Dual Convergence Analysis of Boosting

Matus Telgarsky; 13(20):561–606, 2012.

Abstract

Boosting combines weak learners into a predictor with low empirical risk. Its dual constructs a high entropy distribution upon which weak learners and training labels are uncorrelated. This manuscript studies this primal-dual relationship under a broad family of losses, including the exponential loss of AdaBoost and the logistic loss, revealing:
• Weak learnability aids the whole loss family: for any ε > 0, O(ln(1/ε)) iterations suffice to produce a predictor with empirical risk ε-close to the infimum;
• The circumstances granting the existence of an empirical risk minimizer may be characterized in terms of the primal and dual problems, yielding a new proof of the known rate O(ln(1/ε));
• Arbitrary instances may be decomposed into the above two, granting rate O(1/ε), with a matching lower bound provided for the logistic loss.
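To make the primal-dual relationship concrete, here is a schematic instance for the exponential loss of AdaBoost, written in generic notation rather than the paper's own (the margin matrix A, the weights λ, and the distribution q are illustrative names, and normalization details are glossed over): minimizing the empirical exponential loss is dual to finding the maximum-entropy distribution over the training examples under which every weak learner is uncorrelated with the labels.

\[
\text{(primal)}\qquad \inf_{\lambda \in \mathbb{R}^n}\ \sum_{i=1}^m \exp\!\big(-(A\lambda)_i\big),
\qquad A_{ij} := y_i\, h_j(x_i),
\]
\[
\text{(dual, schematically)}\qquad \sup_{q \in \Delta_m}\ \Big\{\, -\sum_{i=1}^m q_i \ln q_i \ :\ \sum_{i=1}^m q_i\, y_i\, h_j(x_i) = 0 \ \text{ for all } j \,\Big\}.
\]

Under the weak learnability assumption of the first bullet, no distribution on the training examples decorrelates all weak learners from the labels, so the dual constraint set above is empty and the primal infimum is zero; the second bullet covers the regime where a primal minimizer exists, and the third combines the two.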

© JMLR 2012.