SpeedBoost: Anytime Prediction with Uniform Near-Optimality

Alex Grubb, Drew Bagnell
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:458-466, 2012.

Abstract

We present SpeedBoost, a natural extension of functional gradient descent, for learning anytime predictors that automatically trade computation time for predictive accuracy by selecting from a set of simpler candidate predictors. These anytime predictors not only generate approximate predictions rapidly, but are capable of using extra resources at prediction time, when available, to improve performance. We also demonstrate how our framework can be used to select weak predictors which target certain subsets of the data, allowing for efficient use of computational resources on difficult examples. Finally, we show that variants of the SpeedBoost algorithm produce predictors which are provably competitive with any possible sequence of weak predictors with the same total complexity.
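The abstract describes the method only at a high level. As a rough illustration of the kind of cost-aware, greedy functional-gradient loop it suggests, the following Python sketch fits a boosted ensemble by picking, at each round, the candidate weak predictor with the largest loss reduction per unit of (assumed) evaluation cost. The squared-loss setup, scikit-learn regression trees, hand-assigned costs, and the budget value are illustrative assumptions, not details taken from the paper.

# Hedged sketch of a cost-greedy anytime boosting loop in the spirit of
# SpeedBoost. Not the authors' code: weak learners, costs, loss, and budget
# below are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

# Candidate weak predictors with assumed evaluation costs: deeper trees
# reduce the loss more per round but cost more at prediction time.
candidates = [(depth, float(2 ** depth)) for depth in (1, 2, 4)]

F = np.zeros_like(y)       # current ensemble prediction
ensemble = []              # list of (weight, fitted weak predictor)
budget, spent = 30.0, 0.0  # total prediction-time budget (arbitrary units)

while spent < budget:
    residual = y - F       # negative functional gradient of squared loss
    best = None
    for depth, cost in candidates:
        h = DecisionTreeRegressor(max_depth=depth).fit(X, residual)
        pred = h.predict(X)
        # closed-form line search for the step size under squared loss
        alpha = pred @ residual / (pred @ pred + 1e-12)
        gain = np.mean((y - F) ** 2) - np.mean((y - F - alpha * pred) ** 2)
        score = gain / cost  # greedy criterion: improvement per unit cost
        if best is None or score > best[0]:
            best = (score, alpha, h, pred, cost)
    score, alpha, h, pred, cost = best
    if spent + cost > budget or score <= 0:
        break
    F += alpha * pred
    ensemble.append((alpha, h))
    spent += cost
    print(f"cost so far {spent:5.1f}  training MSE {np.mean((y - F) ** 2):.4f}")

Because the ensemble is an ordered sum of weighted weak predictors, prediction can be interrupted after any prefix of the sequence and still return a usable estimate, which is the anytime behaviour the abstract refers to.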

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-grubb12,
  title     = {SpeedBoost: Anytime Prediction with Uniform Near-Optimality},
  author    = {Grubb, Alex and Bagnell, Drew},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {458--466},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/grubb12/grubb12.pdf},
  url       = {https://proceedings.mlr.press/v22/grubb12.html}
}
APA
Grubb, A. & Bagnell, D. (2012). SpeedBoost: Anytime Prediction with Uniform Near-Optimality. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:458-466. Available from https://proceedings.mlr.press/v22/grubb12.html.