
Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization

Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, Ameet Talwalkar; 18(185):1−52, 2018.

Abstract

Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While recent approaches use Bayesian optimization to adaptively select configurations, we focus on speeding up random search through adaptive resource allocation and early-stopping. We formulate hyperparameter optimization as a pure-exploration non-stochastic infinite-armed bandit problem where a predefined resource like iterations, data samples, or features is allocated to randomly sampled configurations. We introduce a novel algorithm, Hyperband, for this framework and analyze its theoretical properties, providing several desirable guarantees. Furthermore, we compare Hyperband with popular Bayesian optimization methods on a suite of hyperparameter optimization problems. We observe that Hyperband can provide over an order-of-magnitude speedup over our competitor set on a variety of deep-learning and kernel-based learning problems.
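To make the adaptive-allocation idea concrete: Hyperband runs Successive Halving inside an outer loop over several "brackets", each trading off the number of sampled configurations against the budget each one receives. The following is a minimal illustrative sketch of that loop, not the authors' reference implementation; get_random_config and run_then_return_val_loss are hypothetical user-supplied callables, and the bookkeeping is simplified.

    import math

    def hyperband(get_random_config, run_then_return_val_loss, R=81, eta=3):
        """Sketch of Hyperband (Li et al., 2018).

        get_random_config() -> a randomly sampled hyperparameter configuration
        run_then_return_val_loss(config, r) -> validation loss after training
            config with r units of resource (e.g., iterations or data samples)
        R   : maximum resource allocated to any single configuration
        eta : downsampling rate for Successive Halving (default 3)
        """
        s_max = int(math.log(R, eta) + 1e-9)   # number of brackets minus one
        B = (s_max + 1) * R                    # approximate budget per bracket
        best_config, best_loss = None, float("inf")

        # Outer loop: one Successive Halving bracket per value of s,
        # from aggressive early-stopping (s = s_max) to plain random
        # search on the full budget (s = 0).
        for s in range(s_max, -1, -1):
            n = int(math.ceil(B / R * eta ** s / (s + 1)))  # initial configs
            r = R * eta ** (-s)                             # initial resource
            configs = [get_random_config() for _ in range(n)]

            # Successive Halving: evaluate, keep roughly the top 1/eta of
            # the configurations, and give the survivors eta times more
            # resource in the next round.
            for i in range(s + 1):
                n_i = math.floor(n * eta ** (-i))
                r_i = r * eta ** i
                losses = [run_then_return_val_loss(c, r_i) for c in configs]
                order = sorted(range(len(configs)), key=lambda j: losses[j])
                if losses[order[0]] < best_loss:
                    best_loss = losses[order[0]]
                    best_config = configs[order[0]]
                configs = [configs[j] for j in order[:max(1, n_i // eta)]]

        return best_config, best_loss

With the defaults R = 81 and eta = 3, this runs five brackets (s = 4 down to 0), ranging from many configurations evaluated on a small budget to a handful trained on the full budget, which is how the algorithm hedges between aggressive early-stopping and unaccelerated random search.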

© JMLR 2018.