Adaptive Task Assignment for Crowdsourced Classification

Chien-Ju Ho, Shahin Jabbari, Jennifer Wortman Vaughan
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):534-542, 2013.

Abstract

Crowdsourcing markets have gained popularity as a tool for inexpensively collecting data from diverse populations of workers. Classification tasks, in which workers provide labels (such as “offensive” or “not offensive”) for instances (such as websites), are among the most common tasks posted, but due to a mix of human error and the overwhelming prevalence of spam, the labels collected are often noisy. This problem is typically addressed by collecting labels for each instance from multiple workers and combining them in a clever way. However, the question of how to choose which tasks to assign to each worker is often overlooked. We investigate the problem of task assignment and label inference for heterogeneous classification tasks. By applying online primal-dual techniques, we derive a provably near-optimal adaptive assignment algorithm. We show that adaptively assigning workers to tasks can lead to more accurate predictions at a lower cost when the available workers are diverse.
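
The two ingredients the abstract describes, choosing which instance each arriving worker should label and combining the resulting noisy labels, can be illustrated with a minimal sketch. The code below is not the paper's primal-dual algorithm: it assumes per-worker accuracies are known in advance and uses a simple heuristic, routing each worker to the currently most uncertain instance and then aggregating labels by an accuracy-weighted log-odds vote. All names and parameters here are illustrative assumptions, not taken from the paper.

    # Illustrative sketch only -- NOT the paper's primal-dual algorithm.
    # Assumes binary labels and known per-worker accuracies.
    import math
    import random
    from collections import defaultdict

    def aggregate(votes):
        """Weighted log-odds vote; votes is a list of (label in {0,1}, accuracy)."""
        score = 0.0
        for label, acc in votes:
            acc = min(max(acc, 1e-6), 1 - 1e-6)   # keep the log-odds finite
            w = math.log(acc / (1 - acc))          # worker's log-odds weight
            score += w if label == 1 else -w
        return (1 if score > 0 else 0), score

    def assign(votes_by_instance, instances):
        """Send the worker to the instance whose score is closest to zero (most uncertain)."""
        return min(instances, key=lambda i: abs(aggregate(votes_by_instance[i])[1]))

    # Toy simulation under hypothetical ground truth and a diverse worker pool.
    random.seed(0)
    truth = {i: random.randint(0, 1) for i in range(5)}
    workers = [0.9, 0.9, 0.6, 0.55, 0.85] * 6      # workers arrive online
    votes_by_instance = defaultdict(list)

    for acc in workers:
        i = assign(votes_by_instance, list(truth))
        observed = truth[i] if random.random() < acc else 1 - truth[i]
        votes_by_instance[i].append((observed, acc))

    predictions = {i: aggregate(votes_by_instance[i])[0] for i in truth}
    print("accuracy:", sum(predictions[i] == truth[i] for i in truth) / len(truth))

Running the sketch prints the fraction of instances whose aggregated label matches the simulated ground truth; the adaptive assignment step concentrates the diverse pool's effort on instances whose current vote is still close to a tie, which is the intuition the paper makes precise and near-optimal via online primal-dual techniques.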

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-ho13,
  title     = {Adaptive Task Assignment for Crowdsourced Classification},
  author    = {Ho, Chien-Ju and Jabbari, Shahin and Vaughan, Jennifer Wortman},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {534--542},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/ho13.pdf},
  url       = {https://proceedings.mlr.press/v28/ho13.html}
}
Endnote
%0 Conference Paper
%T Adaptive Task Assignment for Crowdsourced Classification
%A Chien-Ju Ho
%A Shahin Jabbari
%A Jennifer Wortman Vaughan
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-ho13
%I PMLR
%P 534--542
%U https://proceedings.mlr.press/v28/ho13.html
%V 28
%N 1
RIS
TY - CPAPER
TI - Adaptive Task Assignment for Crowdsourced Classification
AU - Chien-Ju Ho
AU - Shahin Jabbari
AU - Jennifer Wortman Vaughan
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/02/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-ho13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 1
SP - 534
EP - 542
L1 - http://proceedings.mlr.press/v28/ho13.pdf
UR - https://proceedings.mlr.press/v28/ho13.html
ER -
APA
Ho, C., Jabbari, S. & Vaughan, J.W. (2013). Adaptive Task Assignment for Crowdsourced Classification. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(1):534-542. Available from https://proceedings.mlr.press/v28/ho13.html.
