Activized Learning: Transforming Passive to Active with Improved Label Complexity
Steve Hanneke; 13(49):1469–1587, 2012.
Abstract
We study the theoretical advantages of active learning over passive learning. Specifically, we prove that, in noise-free classifier learning for VC classes, any passive learning algorithm can be transformed into an active learning algorithm with asymptotically strictly superior label complexity for all nontrivial target functions and distributions. We further provide a general characterization of the magnitudes of these improvements in terms of a novel generalization of the disagreement coefficient. We also extend these results to active learning in the presence of label noise, and find that even under broad classes of noise distributions, we can typically guarantee strict improvements over the known results for passive learning.
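To make the abstract's central construction concrete, the sketch below illustrates, in a toy form, the disagreement-based idea behind transforming a passive learner into an active one: labels are requested only for points on which the current version space still disagrees, while labels in the agreement region are inferred for free and the resulting fully labeled sample is handed to the passive learner. This is not the paper's Meta-Algorithm; it is a minimal, finite-class, noise-free illustration, and all names (simple_activizer, query_label, erm) are chosen for exposition only.

```python
from typing import Callable, List, Sequence, Tuple

Hypothesis = Callable[[float], int]          # maps a point to a label in {-1, +1}
PassiveLearner = Callable[[Sequence[Tuple[float, int]]], Hypothesis]


def simple_activizer(
    passive_learn: PassiveLearner,
    hypothesis_class: List[Hypothesis],
    unlabeled_pool: Sequence[float],
    query_label: Callable[[float], int],
) -> Hypothesis:
    """Feed a passive learner a fully labeled sample while querying labels
    only for points in the region of disagreement of the version space."""
    version_space = list(hypothesis_class)
    labeled_sample: List[Tuple[float, int]] = []

    for x in unlabeled_pool:
        predictions = {h(x) for h in version_space}
        if len(predictions) > 1:
            # Disagreement region: pay for a label query.
            y = query_label(x)
            # Realizable (noise-free) case: discard inconsistent hypotheses.
            version_space = [h for h in version_space if h(x) == y]
        else:
            # All surviving hypotheses agree; infer the label for free.
            y = predictions.pop()
        labeled_sample.append((x, y))

    # The passive learner sees every point labeled, but only the
    # disagreement-region points actually consumed label queries.
    return passive_learn(labeled_sample)


if __name__ == "__main__":
    # Toy example: threshold classifiers on [0, 1], noise-free target at 0.3.
    import random

    thresholds = [i / 100 for i in range(101)]
    hypotheses: List[Hypothesis] = [
        (lambda x, t=t: 1 if x >= t else -1) for t in thresholds
    ]

    def target(x: float) -> int:
        return 1 if x >= 0.3 else -1

    def erm(sample: Sequence[Tuple[float, int]]) -> Hypothesis:
        # A trivial passive learner: empirical risk minimization over the class.
        return min(hypotheses, key=lambda h: sum(h(x) != y for x, y in sample))

    pool = sorted(random.random() for _ in range(200))
    h_hat = simple_activizer(erm, hypotheses, pool, target)
```

In this toy setting the number of queries grows only with the number of pool points that fall in the shrinking disagreement region, which is the kind of saving the paper quantifies, in far greater generality, through its generalization of the disagreement coefficient.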