Regression for sets of polynomial equations

Franz Kiraly, Paul Von Buenau, Jan Muller, Duncan Blythe, Frank Meinecke, Klaus-Robert Muller
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:628-637, 2012.

Abstract

We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.
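To make the abstract's central notion concrete, here is a minimal sketch of how a learning problem can be "described by a system of polynomial equations", using the standard covariance-based formulation of SSA; the notation below (epoch covariances Σ_i, projection P) is our own assumption for illustration and is not taken from the paper. If Σ_1, …, Σ_m are covariance matrices estimated on m epochs of the data and P is the unknown d × D projection onto the stationary sources, then requiring the projected covariances to agree across epochs gives

% Hedged sketch (illustrative notation, not the paper's): SSA stationarity
% conditions written as a set of polynomial equations in the unknown projection P.
\[
  P\,(\Sigma_i - \Sigma_1)\,P^{\top} = 0, \qquad i = 2, \dots, m .
\]
% Entrywise, each condition is a quadratic polynomial in the entries of P:
\[
  \sum_{k=1}^{D} \sum_{l=1}^{D} P_{ak}\,\bigl(\Sigma_i - \Sigma_1\bigr)_{kl}\,P_{bl} = 0,
  \qquad 1 \le a \le b \le d .
\]

Ideal regression, as the abstract states, approximates such a set of polynomial equations by a set of a prescribed type without resorting to numerical optimization; the algebraic machinery for doing so is developed in the paper itself.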

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-kiraly12,
  title     = {Regression for sets of polynomial equations},
  author    = {Kiraly, Franz and Buenau, Paul Von and Muller, Jan and Blythe, Duncan and Meinecke, Frank and Muller, Klaus-Robert},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {628--637},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/kiraly12/kiraly12.pdf},
  url       = {https://proceedings.mlr.press/v22/kiraly12.html},
  abstract  = {We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.}
}
Endnote
%0 Conference Paper
%T Regression for sets of polynomial equations
%A Franz Kiraly
%A Paul Von Buenau
%A Jan Muller
%A Duncan Blythe
%A Frank Meinecke
%A Klaus-Robert Muller
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-kiraly12
%I PMLR
%P 628--637
%U https://proceedings.mlr.press/v22/kiraly12.html
%V 22
%X We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.
RIS
TY - CPAPER
TI - Regression for sets of polynomial equations
AU - Franz Kiraly
AU - Paul Von Buenau
AU - Jan Muller
AU - Duncan Blythe
AU - Frank Meinecke
AU - Klaus-Robert Muller
BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA - 2012/03/21
ED - Neil D. Lawrence
ED - Mark Girolami
ID - pmlr-v22-kiraly12
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 22
SP - 628
EP - 637
L1 - http://proceedings.mlr.press/v22/kiraly12/kiraly12.pdf
UR - https://proceedings.mlr.press/v22/kiraly12.html
AB - We propose a method called ideal regression for approximating an arbitrary system of polynomial equations by a system of a particular type. Using techniques from approximate computational algebraic geometry, we show how we can solve ideal regression directly without resorting to numerical optimization. Ideal regression is useful whenever the solution to a learning problem can be described by a system of polynomial equations. As an example, we demonstrate how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA. We then compare this estimator in simulations with previous optimization-based approaches for SSA.
ER -
APA
Kiraly, F., Buenau, P.V., Muller, J., Blythe, D., Meinecke, F. & Muller, K. (2012). Regression for sets of polynomial equations. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:628-637. Available from https://proceedings.mlr.press/v22/kiraly12.html.
