Complexity Theoretic Limitations on Learning DNF’s

Amit Daniely, Shai Shalev-Shwartz
29th Annual Conference on Learning Theory, PMLR 49:815-830, 2016.

Abstract

Using the recently developed framework of Daniely, Linial and Shalev-Shwartz, we show that under a natural assumption on the complexity of random K-SAT, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various learning problems, including intersections of logarithmically many halfspaces, agnostically learning conjunctions, as well as virtually all (distribution free) learning problems that were previously shown hard (under various complexity assumptions).

Cite this Paper


BibTeX
@InProceedings{pmlr-v49-daniely16,
  title     = {Complexity Theoretic Limitations on Learning DNF's},
  author    = {Daniely, Amit and Shalev-Shwartz, Shai},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {815--830},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/daniely16.pdf},
  url       = {https://proceedings.mlr.press/v49/daniely16.html},
  abstract  = {Using the recently developed framework of Daniely, Linial and Shalev-Shwartz, we show that under a natural assumption on the complexity of random K-SAT, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various learning problems, including intersections of logarithmically many halfspaces, agnostically learning conjunctions, as well as virtually all (distribution free) learning problems that were previously shown hard (under various complexity assumptions).}
}
APA
Daniely, A. & Shalev-Shwartz, S.. (2016). Complexity Theoretic Limitations on Learning DNF’s. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:815-830 Available from https://proceedings.mlr.press/v49/daniely16.html.
