Learning and Testing Junta Distributions

Maryam Aliakbarpour, Eric Blais, Ronitt Rubinfeld
29th Annual Conference on Learning Theory, PMLR 49:19-46, 2016.

Abstract

We consider the problem of learning distributions in the presence of irrelevant features. This problem is formalized by introducing a new notion of \emph{k}-junta distributions. Informally, a distribution $\mathcal{D}$ over the domain $\mathcal{X}^n$ is a \emph{k}-junta distribution with respect to another distribution $\mathcal{U}$ over the same domain if there is a set $J \subseteq [n]$ of size $|J| \le k$ that captures the difference between $\mathcal{D}$ and $\mathcal{U}$. We show that it is possible to learn $k$-junta distributions with respect to the uniform distribution over the Boolean hypercube $\{0,1\}^n$ in time $\mathrm{poly}(n^k, 1/\varepsilon)$. This result is obtained via a new Fourier-based learning algorithm inspired by the Low-Degree Algorithm of Linial, Mansour, and Nisan (1993). We also consider the problem of testing whether an unknown distribution is a $k$-junta distribution with respect to the uniform distribution. We give a nearly-optimal algorithm for this task. Both the analysis of the algorithm and the lower bound showing its optimality are obtained by establishing connections between the problem of testing junta distributions and testing uniformity of weighted collections of distributions.
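To make the Fourier-based approach concrete, here is a minimal illustrative sketch (not the authors' actual algorithm) of the underlying idea: for a distribution $\mathcal{D}$ over $\{0,1\}^n$, one can estimate the low-degree Fourier coefficients $\mathbb{E}_{x \sim \mathcal{D}}[\chi_S(x)]$ for all non-empty sets $|S| \le k$ from samples, and collect the coordinates that appear in any large coefficient; for a $k$-junta distribution with respect to the uniform distribution, only sets contained in the junta $J$ can have non-negligible coefficients. The function names, the threshold parameter, and the toy data below are all assumptions made for illustration.

```python
from itertools import combinations
import random

def chi(S, x):
    """Parity character chi_S(x) = (-1)^(sum of x_i for i in S) on {0,1}^n."""
    return -1 if sum(x[i] for i in S) % 2 else 1

def estimate_junta(samples, n, k, threshold=0.1):
    """Estimate E_{x~D}[chi_S(x)] for every non-empty S with |S| <= k,
    and return the coordinates appearing in any coefficient whose
    magnitude exceeds the (illustrative) threshold."""
    relevant = set()
    for size in range(1, k + 1):
        for S in combinations(range(n), size):
            est = sum(chi(S, x) for x in samples) / len(samples)
            if abs(est) > threshold:
                relevant.update(S)
    return relevant

# Toy example: D over {0,1}^4 where bit 0 is biased toward 1
# and bits 1-3 are uniform, so J = {0} is a 1-junta.
random.seed(0)
samples = [[1 if random.random() < (0.9 if i == 0 else 0.5) else 0
            for i in range(4)] for _ in range(5000)]
print(estimate_junta(samples, n=4, k=1))  # → {0}
```

Note that enumerating all sets of size at most $k$ already accounts for the $n^k$ factor in the running time; the paper's contribution lies in the learning and testing guarantees built on top of such coefficient estimates, not in this brute-force enumeration itself.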

Cite this Paper


BibTeX
@InProceedings{pmlr-v49-aliakbarpour16,
  title     = {Learning and Testing Junta Distributions},
  author    = {Aliakbarpour, Maryam and Blais, Eric and Rubinfeld, Ronitt},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {19--46},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/aliakbarpour16.pdf},
  url       = {https://proceedings.mlr.press/v49/aliakbarpour16.html},
  abstract  = {We consider the problem of learning distributions in the presence of irrelevant features. This problem is formalized by introducing a new notion of \emph{k}-junta distributions. Informally, a distribution $\mathcal{D}$ over the domain $\mathcal{X}^n$ is a \emph{k}-junta distribution with respect to another distribution $\mathcal{U}$ over the same domain if there is a set $J \subseteq [n]$ of size $|J| \le k$ that captures the difference between $\mathcal{D}$ and $\mathcal{U}$. We show that it is possible to learn $k$-junta distributions with respect to the uniform distribution over the Boolean hypercube $\{0,1\}^n$ in time $\mathrm{poly}(n^k, 1/\varepsilon)$. This result is obtained via a new Fourier-based learning algorithm inspired by the Low-Degree Algorithm of Linial, Mansour, and Nisan (1993). We also consider the problem of testing whether an unknown distribution is a $k$-junta distribution with respect to the uniform distribution. We give a nearly-optimal algorithm for this task. Both the analysis of the algorithm and the lower bound showing its optimality are obtained by establishing connections between the problem of testing junta distributions and testing uniformity of weighted collections of distributions.}
}
APA
Aliakbarpour, M., Blais, E., & Rubinfeld, R. (2016). Learning and Testing Junta Distributions. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:19-46. Available from https://proceedings.mlr.press/v49/aliakbarpour16.html.
