Fast Algorithms for Segmented Regression

Jayadev Acharya, Ilias Diakonikolas, Jerry Li, Ludwig Schmidt
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2878-2886, 2016.

Abstract

We study the fixed design segmented regression problem: Given noisy samples from a piecewise linear function f, we want to recover f up to a desired accuracy in mean-squared error. Previous rigorous approaches for this problem rely on dynamic programming (DP) and, while sample-efficient, have running time quadratic in the sample size. As our main contribution, we provide new sample near-linear time algorithms for the problem that, while not being minimax optimal, achieve a significantly better sample-time tradeoff on large datasets compared to the DP approach. Our experimental evaluation shows that, compared with the DP approach, our algorithms provide a convergence rate that is only off by a factor of 2 to 4, while achieving speedups of three orders of magnitude.
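The quadratic-time DP baseline the abstract refers to can be illustrated by the classic segmented least squares dynamic program: precompute the least-squares error of fitting a single line to every contiguous block of points, then choose breakpoints that minimize total error plus a per-segment penalty. The sketch below is a generic textbook version, not code from the paper; the function names and the `penalty` parameter are illustrative assumptions.

```python
import numpy as np

def segment_error(x, y, i, j):
    """Least-squares error of fitting one line to points i..j (inclusive)."""
    xs, ys = x[i:j + 1], y[i:j + 1]
    if len(xs) < 2:
        return 0.0
    # Fit y = a*x + b by ordinary least squares.
    A = np.vstack([xs, np.ones_like(xs)]).T
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return float(np.sum((ys - A @ coef) ** 2))

def segmented_regression_dp(x, y, penalty):
    """O(n^2) DP (given precomputed errors): opt[j] is the best cost
    for the first j points; back[j] is where the last segment starts."""
    n = len(x)
    err = [[segment_error(x, y, i, j) for j in range(n)] for i in range(n)]
    opt = [0.0] * (n + 1)
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        best, arg = float("inf"), 0
        for i in range(j):
            cost = opt[i] + err[i][j - 1] + penalty
            if cost < best:
                best, arg = cost, i
        opt[j], back[j] = best, arg
    # Recover the segment boundaries by walking the backpointers.
    segments, j = [], n
    while j > 0:
        i = back[j]
        segments.append((i, j - 1))
        j = i
    return opt[n], segments[::-1]
```

For noiseless two-piece data (e.g. an increasing line followed by a decreasing one), the DP recovers exactly two segments, each fit with zero error, at total cost `2 * penalty`. The quadratic cost of the outer double loop is what the paper's near-linear time algorithms avoid.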

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-acharya16,
  title     = {Fast Algorithms for Segmented Regression},
  author    = {Acharya, Jayadev and Diakonikolas, Ilias and Li, Jerry and Schmidt, Ludwig},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {2878--2886},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/acharya16.pdf},
  url       = {https://proceedings.mlr.press/v48/acharya16.html},
  abstract  = {We study the fixed design segmented regression problem: Given noisy samples from a piecewise linear function f, we want to recover f up to a desired accuracy in mean-squared error. Previous rigorous approaches for this problem rely on dynamic programming (DP) and, while sample efficient, have running time quadratic in the sample size. As our main contribution, we provide new sample near-linear time algorithms for the problem that - while not being minimax optimal - achieve a significantly better sample-time tradeoff on large datasets compared to the DP approach. Our experimental evaluation shows that, compared with the DP approach, our algorithms provide a convergence rate that is only off by a factor of 2 to 4, while achieving speedups of three orders of magnitude.}
}
Endnote
%0 Conference Paper
%T Fast Algorithms for Segmented Regression
%A Jayadev Acharya
%A Ilias Diakonikolas
%A Jerry Li
%A Ludwig Schmidt
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-acharya16
%I PMLR
%P 2878--2886
%U https://proceedings.mlr.press/v48/acharya16.html
%V 48
%X We study the fixed design segmented regression problem: Given noisy samples from a piecewise linear function f, we want to recover f up to a desired accuracy in mean-squared error. Previous rigorous approaches for this problem rely on dynamic programming (DP) and, while sample efficient, have running time quadratic in the sample size. As our main contribution, we provide new sample near-linear time algorithms for the problem that - while not being minimax optimal - achieve a significantly better sample-time tradeoff on large datasets compared to the DP approach. Our experimental evaluation shows that, compared with the DP approach, our algorithms provide a convergence rate that is only off by a factor of 2 to 4, while achieving speedups of three orders of magnitude.
RIS
TY - CPAPER
TI - Fast Algorithms for Segmented Regression
AU - Jayadev Acharya
AU - Ilias Diakonikolas
AU - Jerry Li
AU - Ludwig Schmidt
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-acharya16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 2878
EP - 2886
L1 - http://proceedings.mlr.press/v48/acharya16.pdf
UR - https://proceedings.mlr.press/v48/acharya16.html
AB - We study the fixed design segmented regression problem: Given noisy samples from a piecewise linear function f, we want to recover f up to a desired accuracy in mean-squared error. Previous rigorous approaches for this problem rely on dynamic programming (DP) and, while sample efficient, have running time quadratic in the sample size. As our main contribution, we provide new sample near-linear time algorithms for the problem that - while not being minimax optimal - achieve a significantly better sample-time tradeoff on large datasets compared to the DP approach. Our experimental evaluation shows that, compared with the DP approach, our algorithms provide a convergence rate that is only off by a factor of 2 to 4, while achieving speedups of three orders of magnitude.
ER -
APA
Acharya, J., Diakonikolas, I., Li, J. & Schmidt, L. (2016). Fast Algorithms for Segmented Regression. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2878-2886. Available from https://proceedings.mlr.press/v48/acharya16.html.