Lifted Linear Programming

Martin Mladenov, Babak Ahmadi, Kristian Kersting
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:788-797, 2012.

Abstract

Lifted inference approaches have rendered large, previously intractable probabilistic inference problems quickly solvable by handling whole sets of indistinguishable objects together. Triggered by this success, we show that another important AI technique is liftable, too: linear programming. Intuitively, given a linear program (LP), we employ a lifted variant of Gaussian belief propagation (GaBP) to solve the systems of linear equations arising when running an interior-point method to solve the LP. However, this naive solution cannot make use of standard solvers for linear equations and must construct the lifted network anew in each iteration of the interior-point method, an operation that can itself be quite costly. To address both issues, we show how to read off an equivalent LP from the lifted GaBP computations that can be solved using any off-the-shelf LP solver. We prove the correctness of this compilation approach, including a lifted duality theorem, and experimentally demonstrate that it can greatly reduce the cost of solving LPs.
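
The central idea, grouping indistinguishable variables and constraints so that a much smaller but equivalent LP can be handed to any off-the-shelf solver, can be sketched compactly. The sketch below is not the paper's lifted-GaBP compilation; it is a minimal illustration of the same end effect, assuming the related view of lifting via color refinement (equitable partitions) of the constraint matrix, with SciPy's linprog standing in for the off-the-shelf solver. The names color_refine and lift_and_solve and the toy data are illustrative choices, not artifacts of the paper.

import numpy as np
from scipy.optimize import linprog


def _relabel(signatures):
    # Map arbitrary hashable signatures to consecutive integer color ids.
    ids = {}
    return [ids.setdefault(s, len(ids)) for s in signatures]


def _classes(colors):
    # Group indices that carry the same color.
    groups = {}
    for idx, col in enumerate(colors):
        groups.setdefault(col, []).append(idx)
    return list(groups.values())


def color_refine(A, b, c):
    # Refine constraint (row) and variable (column) colors on the weighted
    # bipartite graph of A until the joint partition is equitable.
    m, n = A.shape
    row_col = _relabel([round(float(b[i]), 9) for i in range(m)])
    col_col = _relabel([round(float(c[j]), 9) for j in range(n)])
    while True:
        new_row = _relabel([
            (row_col[i], tuple(sorted((col_col[j], round(float(A[i, j]), 9))
                                      for j in range(n) if A[i, j] != 0)))
            for i in range(m)])
        new_col = _relabel([
            (col_col[j], tuple(sorted((row_col[i], round(float(A[i, j]), 9))
                                      for i in range(m) if A[i, j] != 0)))
            for j in range(n)])
        if len(set(new_row)) == len(set(row_col)) and \
           len(set(new_col)) == len(set(col_col)):
            break  # no class was split, so the partition is stable
        row_col, col_col = new_row, new_col
    return _classes(row_col), _classes(col_col)


def lift_and_solve(A, b, c):
    # Solve  min c^T x  s.t.  A x <= b, x >= 0  via its reduced ("lifted") LP.
    A, b, c = np.asarray(A, float), np.asarray(b, float), np.asarray(c, float)
    row_classes, col_classes = color_refine(A, b, c)
    # One constraint per row class (any representative row works, by
    # equitability) and one variable per column class (coefficients summed).
    A_red = np.array([[A[rows[0], cols].sum() for cols in col_classes]
                      for rows in row_classes])
    b_red = np.array([b[rows[0]] for rows in row_classes])
    c_red = np.array([len(cols) * c[cols[0]] for cols in col_classes])
    res = linprog(c_red, A_ub=A_red, b_ub=b_red)  # default bounds give x >= 0
    # Ground the solution: every variable takes the value of its class.
    x = np.empty(A.shape[1])
    for y, cols in zip(res.x, col_classes):
        x[cols] = y
    return x, res.fun


if __name__ == "__main__":
    # Toy LP whose variables fall into two interchangeable groups; the reduced
    # LP has 2 variables and 1 constraint instead of 4 and 2.
    A = [[1, 1, 2, 2],
         [1, 1, 2, 2]]
    b = [4, 4]
    c = [-1, -1, -3, -3]
    x, val = lift_and_solve(A, b, c)
    print("grounded solution:", x, "objective:", val)  # expect objective -6

On the toy problem the reduced LP is solved over one class variable per group of interchangeable columns and one constraint per group of interchangeable rows, and the grounded solution assigns every original variable the value of its class, which preserves feasibility and the objective because the partition induces a fractional automorphism of the LP.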

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-mladenov12,
  title     = {Lifted Linear Programming},
  author    = {Mladenov, Martin and Ahmadi, Babak and Kersting, Kristian},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {788--797},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/mladenov12/mladenov12.pdf},
  url       = {https://proceedings.mlr.press/v22/mladenov12.html}
}
APA
Mladenov, M., Ahmadi, B. & Kersting, K. (2012). Lifted Linear Programming. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:788-797. Available from https://proceedings.mlr.press/v22/mladenov12.html.