PLAL: Cluster-based active learning

Ruth Urner, Sharon Wulff, Shai Ben-David
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:376-397, 2013.

Abstract

We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.
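The abstract only sketches the procedure at a high level. As an illustration of the general cluster-based active-labeling idea it describes (not the authors' PLAL algorithm itself), the following minimal Python sketch clusters an unlabeled pool, queries a few labels per cluster, and propagates a label to an entire cluster only when the queried labels agree. The oracle interface, the cluster count k, and the per-cluster query budget q are hypothetical choices for illustration, not values from the paper.

# Illustrative sketch of cluster-based active labeling (hypothetical, not PLAL).
import numpy as np
from sklearn.cluster import KMeans

def cluster_active_label(X, oracle, k=10, q=3, seed=0):
    """Label every row of X, calling oracle(i) for as few indices i as possible."""
    rng = np.random.default_rng(seed)
    # Cluster the unlabeled pool.
    clusters = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    labels = np.empty(len(X), dtype=object)
    n_queries = 0
    for c in range(k):
        idx = np.flatnonzero(clusters == c)
        # Query a small random sample of labels inside the cluster.
        probe = rng.choice(idx, size=min(q, len(idx)), replace=False)
        probe_labels = [oracle(int(i)) for i in probe]
        n_queries += len(probe)
        labels[probe] = probe_labels
        if len(set(probe_labels)) == 1:
            # Queried labels agree: propagate the label to the whole cluster.
            labels[idx] = probe_labels[0]
        else:
            # Mixed cluster: fall back to querying the remaining points.
            rest = np.setdiff1d(idx, probe)
            labels[rest] = [oracle(int(i)) for i in rest]
            n_queries += len(rest)
    return labels, n_queries

# Toy usage with a hypothetical oracle over a synthetic two-blob pool.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
y_true = np.array([0] * 100 + [1] * 100)
pred, n_queries = cluster_active_label(X, oracle=lambda i: y_true[i], k=8)
print(f"queried {n_queries} of {len(X)} labels")

Under a clusterability-style assumption, which is the intuition behind Probabilistic Lipschitzness, most clusters are label-homogeneous, so most points are never queried individually; that is the source of the label savings the abstract refers to.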

Cite this Paper


BibTeX
@InProceedings{pmlr-v30-Urner13,
  title     = {PLAL: Cluster-based active learning},
  author    = {Urner, Ruth and Wulff, Sharon and Ben-David, Shai},
  booktitle = {Proceedings of the 26th Annual Conference on Learning Theory},
  pages     = {376--397},
  year      = {2013},
  editor    = {Shalev-Shwartz, Shai and Steinwart, Ingo},
  volume    = {30},
  series    = {Proceedings of Machine Learning Research},
  address   = {Princeton, NJ, USA},
  month     = {12--14 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v30/Urner13.pdf},
  url       = {https://proceedings.mlr.press/v30/Urner13.html},
  abstract  = {We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.}
}
Endnote
%0 Conference Paper
%T PLAL: Cluster-based active learning
%A Ruth Urner
%A Sharon Wulff
%A Shai Ben-David
%B Proceedings of the 26th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2013
%E Shai Shalev-Shwartz
%E Ingo Steinwart
%F pmlr-v30-Urner13
%I PMLR
%P 376--397
%U https://proceedings.mlr.press/v30/Urner13.html
%V 30
%X We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.
RIS
TY - CPAPER
TI - PLAL: Cluster-based active learning
AU - Ruth Urner
AU - Sharon Wulff
AU - Shai Ben-David
BT - Proceedings of the 26th Annual Conference on Learning Theory
DA - 2013/06/13
ED - Shai Shalev-Shwartz
ED - Ingo Steinwart
ID - pmlr-v30-Urner13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 30
SP - 376
EP - 397
L1 - http://proceedings.mlr.press/v30/Urner13.pdf
UR - https://proceedings.mlr.press/v30/Urner13.html
AB - We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for “activising” passive, sample-based learners. The procedure takes an unlabeled sample, queries the labels of some of its members, and outputs a full labeling of that sample. Assuming the data satisfies “Probabilistic Lipschitzness”, a notion of clusterability, we show that for several common learning paradigms, applying our procedure as a preprocessing leads to provable label complexity reductions (over any “passive” learning algorithm, under the same data assumptions). Our labeling procedure is simple and easy to implement. We complement our theoretical findings with experimental validations.
ER -
APA
Urner, R., Wulff, S. & Ben-David, S. (2013). PLAL: Cluster-based active learning. Proceedings of the 26th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 30:376-397. Available from https://proceedings.mlr.press/v30/Urner13.html.

Related Material

Download PDF: http://proceedings.mlr.press/v30/Urner13.pdf