
A machine learning approach to algorithm selection for \(\mathcal{NP}\)-hard optimization problems: a case study on the MPE problem. (English) Zbl 1145.68043
Summary: Given one instance of an \(\mathcal{NP}\)-hard optimization problem, can we tell in advance whether it is exactly solvable or not? If it is not, can we predict which approximate algorithm is the best to solve it? Since the behavior of most approximate, randomized, and heuristic search algorithms for \(\mathcal{NP}\)-hard problems is usually difficult to characterize analytically, researchers have turned to experimental methods to answer these questions. In this paper we present a machine learning-based approach to address them. Models induced from algorithmic performance data capture how algorithmic performance depends on easy-to-compute characteristics of a problem instance. Using these models, we can estimate whether an input instance is exactly solvable or not, and when it is classified as not exactly solvable, we can select the best approximate algorithm for it from a list of candidates. We use the MPE (most probable explanation) problem in probabilistic inference as a case study to validate the proposed methodology. Our experimental results show that the machine learning-based algorithm selection system can integrate both exact and inexact algorithms and provides better overall performance than any single candidate algorithm.
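
To make the two-stage selection idea concrete, the following minimal Python sketch trains one classifier to predict whether an exact solver is feasible for an instance and a second classifier to choose among approximate solvers when it is not. The instance features (node count, arc count, domain size, average in-degree), the decision-tree learners, and the solver labels are illustrative assumptions only; they do not reproduce the paper's actual feature set, models, or candidate algorithms.

```python
# Minimal sketch of learning-based algorithm selection for MPE.
# Feature names, solver labels, and the choice of decision trees are
# hypothetical; the paper's own features and learners may differ.
from dataclasses import dataclass
from sklearn.tree import DecisionTreeClassifier


@dataclass
class InstanceFeatures:
    num_nodes: int          # number of variables in the Bayesian network
    num_edges: int          # number of arcs
    max_domain_size: int    # largest variable domain
    avg_in_degree: float    # mean number of parents per node

    def as_vector(self):
        return [self.num_nodes, self.num_edges,
                self.max_domain_size, self.avg_in_degree]


class MPEAlgorithmSelector:
    """Two-stage selector: first decide whether exact inference is feasible,
    then, if not, pick the most promising approximate solver."""

    def __init__(self):
        self.exact_feasibility_model = DecisionTreeClassifier()
        self.approx_choice_model = DecisionTreeClassifier()

    def fit(self, features, exactly_solvable, best_approx_algorithm):
        X = [f.as_vector() for f in features]
        self.exact_feasibility_model.fit(X, exactly_solvable)
        # Train the second stage only on instances the exact solver cannot handle.
        X_hard = [x for x, ok in zip(X, exactly_solvable) if not ok]
        y_hard = [a for a, ok in zip(best_approx_algorithm, exactly_solvable) if not ok]
        self.approx_choice_model.fit(X_hard, y_hard)
        return self

    def select(self, f: InstanceFeatures) -> str:
        x = [f.as_vector()]
        if self.exact_feasibility_model.predict(x)[0]:
            return "exact"  # e.g. clique-tree propagation
        return self.approx_choice_model.predict(x)[0]  # e.g. "local_search"
```

At prediction time, only the cheap instance features are computed; the selector then dispatches the instance either to the exact solver or to the approximate algorithm the second-stage model ranks as best, which is the portfolio behavior the summary describes.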

MSC:
68T05 Learning and adaptive systems in artificial intelligence
68Q25 Analysis of algorithms and problem complexity
68W01 General topics in the theory of algorithms
References:
[1] Abdelbar, A. M., & Hedetniemi, S. M. (1998). Approximating MAPs for belief networks is \(\mathcal{NP}\) -hard and other theorems. Artificial Intelligence, 102, 21–38. · Zbl 0909.68077 · doi:10.1016/S0004-3702(98)00043-5
[2] Breese, J. S., & Horvitz, E. (1990). Ideal reformulation of belief networks. In UAI90 (pp. 129–144).
[3] Cooper, G., & Herskovits, E. (1992). A Bayesian method for the induction of probabilistic networks from data. Machine Learning, 9(4), 309–347. · Zbl 0766.68109
[4] Fink, E. (1998). How to solve it automatically: selection among problem-solving methods. In Proceedings of the fourth international conference on artificial intelligence planning systems (pp. 128–136).
[5] Fung, R., & Chang, K. C. (1989). Weighting and integrating evidence for stochastic simulation in Bayesian networks. Uncertainty in Artificial Intelligence, 5, 209–219.
[6] Gent, I. P., & Walsh, T. (1993). An empirical analysis of search in GSAT. Journal of Artificial Intelligence Research, 1, 47–59. · Zbl 0900.68178
[7] Glover, F., & Laguna, M. (1997). Tabu search. Boston: Kluwer Academic. · Zbl 0930.90083
[8] Gomes, C. P., & Selman, B. (1997). Algorithm portfolio design: theory vs. practice. In UAI97 (pp. 190–197).
[9] Guo, H. (2003). Algorithm selection for sorting and probabilistic inference: a machine learning-based approach. PhD thesis, Kansas State University.
[10] Guo, H., Boddhireddy, P., & Hsu, W. (2004). Using Ant algorithm to solve MPE. In The 17th Australian joint conference on artificial intelligence, Dec. 2004, Cairns, Australia.
[11] Hooker, J. (1994). Needed: an empirical science of algorithms. Operations Research, 42, 201–212. · Zbl 0805.90119 · doi:10.1287/opre.42.2.201
[12] Hoos, H., & Stützle, T. (1998). Evaluating Las Vegas algorithms–pitfalls and remedies. In UAI98.
[13] Hoos, H., & Stützle, T. (2000). Local search algorithms for SAT: an empirical evaluation. Journal of Automated Reasoning, 24(4), 421–481. · Zbl 0961.68039 · doi:10.1023/A:1006350622830
[14] Horvitz, E. (1990). Computation and action under bounded resources. PhD thesis, Stanford University.
[15] Horvitz, E., Ruan, Y., Kautz, H., Selman, B., & Chickering, D. M. (2001). A Bayesian approach to tackling hard computational problems. In UAI01 (pp. 235–244). · Zbl 0991.68564
[16] Houstis, E. N., Catlin, A. C., Rice, J. R., Verykios, V. S., Ramakrishnan, N., & Houstis, C. (2000). PYTHIA-II: a knowledge/database system for managing performance data and recommending scientific software. ACM Transactions on Mathematical Software, 26(2), 227–253. · Zbl 1063.68664 · doi:10.1145/353474.353475
[17] Hutter, F. (2005). Stochastic local search for solving the most probable explanation problem in Bayesian networks. M.S. thesis, Intellectics Group, Darmstadt University of Technology.
[18] Ide, J. S., & Cozman, F. G. (2002). Random generation of Bayesian networks. In Brazilian symposium on artificial intelligence, Pernambuco, Brazil. · Zbl 1031.68561
[19] Jensen, F. V., Olesen, K. G., & Andersen, S. K. (1990). An algebra of Bayesian belief universes for knowledge-based systems. Networks, 20, 637–659. · Zbl 0719.68081 · doi:10.1002/net.3230200509
[20] Jitnah, N., & Nicholson, A. E. (1998). Belief network algorithms: a study of performance based on domain characterization. In Learning and reasoning with complex representations (Vol. 1359, pp. 169–188). New York: Springer.
[21] Johnson, D. S. (2002). A theoretician’s guide to the experimental analysis of algorithms. In M. H. Goldwasser, D. S. Johnson & C. C. McGeoch (Eds.), Data structures, near neighbor searches, and methodology: fifth and sixth DIMACS implementation challenges (pp. 215–250).
[22] Kask, K., & Dechter, R. (1999). Stochastic local search for Bayesian networks. In Workshop on AI and statistics 99 (pp. 113–122).
[23] Kirkpatrick, S., Gelatt, C. D., & Vecchi, M. P. (1983). Optimization by simulated annealing. Science, 220(4598), 671–680. · Zbl 1225.90162 · doi:10.1126/science.220.4598.671
[24] Lagoudakis, M., & Littman, M. (2001). Learning to select branching rules in the DPLL procedure for satisfiability. Electronic notes in discrete mathematics (ENDM): Vol. 9. LICS 2001 workshop on theory and applications of satisfiability testing (SAT 2001), Boston, MA, June 2001. · Zbl 0990.90543
[25] Lagoudakis, M., Littman, M., & Parr, R. (2001). Selecting the right algorithm. In Proceedings of the 2001 AAAI fall symposium series: using uncertainty within computation, Cape Cod, MA, November 2001.
[26] Lauritzen, S. L., & Spiegelhalter, D. J. (1988). Local computations with probabilities on graphical structures and their application to expert systems (with discussion). Journal of the Royal Statistical Society Series B, 50, 157–224. · Zbl 0684.68106
[27] Leyton-Brown, K., Nudelman, E., & Shoham, Y. (2002). Learning the empirical hardness of optimization problems: the case of combinatorial auctions. In Constraint programming 2002 (CP-02). · Zbl 1325.68110
[28] Leyton-Brown, K., Nudelman, E., Andrew, G., McFadden, J., & Shoham, Y. (2003a). A portfolio approach to algorithm selection. In IJCAI.
[29] Leyton-Brown, K., Nudelman, E., Andrew, G., McFadden, J., & Shoham, Y. (2003b). Boosting as a metaphor for algorithm design. Preprint.
[30] Littman, M. (1999). Initial experiments in stochastic search for Bayesian networks. In Proceedings of the sixteenth national conference on artificial intelligence (pp. 667–672).
[31] Lobjois, L., & Lemaître, M. (1998). Branch and bound algorithm selection by performance prediction. In Proceedings of the fifteenth national/tenth conference on AI/innovative applications of AI (pp. 353–358).
[32] Lucks, M., & Gladwell, I. (1992). Automated selection of mathematical software. ACM Transactions on Mathematical Software, 18(1), 11–34. · Zbl 0892.65038 · doi:10.1145/128745.128747
[33] Mannila, H. (1985). Instance complexity for sorting and NP-complete problems. PhD thesis, Department of Computer Science, University of Helsinki. · Zbl 0556.68031
[34] McGeoch, C. C. (1986). Experimental analysis of algorithms. PhD thesis, Carnegie-Mellon University. · Zbl 0616.10027
[35] Mengshoel, O. J. (1999). Efficient Bayesian network inference: genetic algorithms, stochastic local search, and abstraction. PhD thesis, Computer Science Department, University of Illinois at Urbana-Champaign.
[36] Moret, B. M. E. (2002). Towards a discipline of experimental algorithmics. In Data structures, near neighbor searches, and methodology: fifth and sixth DIMACS implementation challenges. DIMACS monographs (Vol. 59, pp. 197–213).
[37] Orponen, P., Ko, K., Schöning, U., & Watanabe, O. (1994). Instance complexity. Journal of the ACM, 41(1), 96–121. · Zbl 0807.68035 · doi:10.1145/174644.174648
[38] Park, J. D. (2002). Using weighted MAX-SAT engines to solve MPE. In Proceedings of the 18th national conference on artificial intelligence (AAAI) (pp. 682–687).
[39] Pearl, J. (1988). Probabilistic reasoning in intelligent systems: networks of plausible inference. San Mateo: Morgan Kaufmann. · Zbl 0746.68089
[40] Ramakrishnan, N., & Valdés-Pérez, R. E. (2000). Note on generalization in experimental algorithmics. ACM Transactions on Mathematical Software, 26(4), 568–580. · Zbl 01935107 · doi:10.1145/365723.365734
[41] Rardin, R. L., & Uzsoy, R. (2001). Experimental evaluation of heuristic optimization algorithms: a tutorial. Journal of Heuristics, 7(3), 261–304. · Zbl 0972.68634 · doi:10.1023/A:1011319115230
[42] Rice, J. R. (1976). The algorithm selection problem. In M. V. Zelkowitz (Ed.), Advances in computers (Vol. 15, pp. 65–118).
[43] Ruan, Y., Kautz, H., & Horvitz, E. (2004). The backdoor key: a path to understanding problem hardness. In Nineteenth national conference on artificial intelligence, San Jose, CA, 2004.
[44] Russell, S., & Norvig, P. (2003). Artificial intelligence: a modern approach. Englewood Cliffs: Prentice-Hall. · Zbl 0835.68093
[45] Sanders, P. (2002). Presenting data from experiments in algorithmics. In Experimental algorithmics: from algorithm design to robust and efficient software (pp. 181–196). New York: Springer. · Zbl 1026.68816
[46] Santos, E. (1991). On the generation of alternative explanations with implications for belief revision. In UAI 91 (pp. 339–347).
[47] Santos, E., Shimony, S. E., & Williams, E. (1995). On a distributed anytime architecture for probabilistic reasoning (Technical report AFIT/EN/TR94-06). Department of Electrical and Computer Engineering, Air Force Institute of Technology.
[48] Shafer, G., & Shenoy, P. (1990). Probability propagation. Annals of Mathematics and Artificial Intelligence, 2, 327–352. · Zbl 0875.68676 · doi:10.1007/BF01531015
[49] Shimony, S. E., & Charniak, E. (1999). A new algorithm for finding MAP assignments to belief networks. In UAI 99 (pp. 185–193). · Zbl 0742.68076
[50] Shimony, S. E., & Domshlak, C. (2003). Complexity of probabilistic reasoning in directed-path singly connected Bayes networks. Artificial Intelligence, 151, 213–225. · Zbl 1082.68837 · doi:10.1016/S0004-3702(03)00110-3
[51] Witten, I. H., & Frank, E. (1999). Data mining: practical machine learning tools and techniques with Java implementations. Los Altos: Morgan Kaufmann.
[52] Zilberstein, S. (1993). Operational rationality through compilation of anytime algorithms. PhD thesis, University of California at Berkeley.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.