Predictive Entropy Search for Multi-objective Bayesian Optimization

Daniel Hernandez-Lobato, Jose Hernandez-Lobato, Amar Shah, Ryan Adams
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1492-1501, 2016.

Abstract

We present PESMO, a Bayesian method for identifying the Pareto set of multi-objective optimization problems, when the functions are expensive to evaluate. PESMO chooses the evaluation points to maximally reduce the entropy of the posterior distribution over the Pareto set. The PESMO acquisition function is decomposed as a sum of objective-specific acquisition functions, which makes it possible to use the algorithm in decoupled scenarios in which the objectives can be evaluated separately and perhaps with different costs. This decoupling capability is useful to identify difficult objectives that require more evaluations. PESMO also offers gains in efficiency, as its cost scales linearly with the number of objectives, in comparison to the exponential cost of other methods. We compare PESMO with other methods on synthetic and real-world problems. The results show that PESMO produces better recommendations with a smaller number of evaluations, and that a decoupled evaluation can lead to improvements in performance, particularly when the number of objectives is large.
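
As a brief sketch of the criterion described above (our notation; the paper gives the full derivation and approximations), PESMO picks the next evaluation location by maximizing the expected reduction in entropy of the posterior over the Pareto set \mathcal{X}^\star, which by the symmetry of mutual information can be rewritten in terms of the predictive distribution of the objective values \mathbf{y} = (y_1, \ldots, y_K) at \mathbf{x}:

\[
\alpha(\mathbf{x}) = H\big[p(\mathcal{X}^\star \mid \mathcal{D})\big] - \mathbb{E}_{\mathbf{y}}\Big[H\big[p(\mathcal{X}^\star \mid \mathcal{D} \cup \{(\mathbf{x},\mathbf{y})\})\big]\Big]
= H\big[p(\mathbf{y} \mid \mathcal{D},\mathbf{x})\big] - \mathbb{E}_{\mathcal{X}^\star}\Big[H\big[p(\mathbf{y} \mid \mathcal{D},\mathbf{x},\mathcal{X}^\star)\big]\Big].
\]

Approximating the expectation with M Monte Carlo samples of the Pareto set and factorizing the conditioned predictive distribution across objectives gives the objective-wise decomposition mentioned in the abstract,

\[
\alpha(\mathbf{x}) \approx \sum_{k=1}^{K} \alpha_k(\mathbf{x}),
\qquad
\alpha_k(\mathbf{x}) = H\big[p(y_k \mid \mathcal{D},\mathbf{x})\big] - \frac{1}{M}\sum_{m=1}^{M} H\big[p(y_k \mid \mathcal{D},\mathbf{x},\mathcal{X}^\star_{(m)})\big],
\]

so each objective contributes its own acquisition term \alpha_k, which is what enables decoupled evaluations and the linear scaling in the number of objectives.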

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-hernandez-lobatoa16,
  title     = {Predictive Entropy Search for Multi-objective Bayesian Optimization},
  author    = {Hernandez-Lobato, Daniel and Hernandez-Lobato, Jose and Shah, Amar and Adams, Ryan},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1492--1501},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/hernandez-lobatoa16.pdf},
  url       = {https://proceedings.mlr.press/v48/hernandez-lobatoa16.html}
}
APA
Hernandez-Lobato, D., Hernandez-Lobato, J., Shah, A. & Adams, R. (2016). Predictive Entropy Search for Multi-objective Bayesian Optimization. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1492-1501. Available from https://proceedings.mlr.press/v48/hernandez-lobatoa16.html.
