Abstract
Relational Learning (RL) has attracted interest as a way to bridge the gap between efficient attribute-value learners and the growing number of applications whose data are stored in multi-relational databases. However, current systems rely on general-purpose problem solvers that do not scale up well. This contrasts with the past decade of success in the combinatorics communities, where studies of random problems in the phase transition framework made it possible to evaluate and develop better specialised algorithms, able to solve real-world applications with up to millions of variables. A number of studies along these lines have been carried out in RL, such as the analysis of the phase transition of an NP-complete sub-problem, the subsumption test, but none has directly studied the phase transition of RL itself. As RL, in general, is \(\Sigma_2\)-hard, we propose a first random problem generator which exhibits the phase transition of its decision version, beyond NP. We study the learning cost of several learners on inherently easy and hard instances, and conclude on the expected benefits of this new benchmarking tool for RL.
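The phase transition framework the abstract refers to is easiest to see in the classic random 3-SAT setting (not the paper's own generator, which targets a \(\Sigma_2\)-hard decision problem): as the clause-to-variable ratio grows, the probability that a random instance is satisfiable drops sharply near a critical ratio (asymptotically around 4.27), and search cost peaks in that region. A minimal sketch, using a brute-force satisfiability check for small instances:

```python
# Illustrative sketch of the random 3-SAT phase transition (NOT the paper's
# generator): solubility falls abruptly as the clause/variable ratio crosses
# a critical region, where the hardest instances concentrate.
import random
from itertools import product

def random_3sat(n_vars, n_clauses, rng):
    """A clause picks 3 distinct variables, each negated with probability 1/2.
    Literals are signed integers: v means variable v true, -v means false."""
    return [tuple(v if rng.random() < 0.5 else -v
                  for v in rng.sample(range(1, n_vars + 1), 3))
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Brute-force check over all 2^n assignments -- fine for small n."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

def sat_fraction(n_vars, ratio, trials=40, seed=0):
    """Empirical probability of satisfiability at a given clause/variable ratio."""
    rng = random.Random(seed)
    hits = sum(satisfiable(n_vars, random_3sat(n_vars, int(ratio * n_vars), rng))
               for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    # Under-constrained instances are almost all SAT, over-constrained ones
    # almost all UNSAT; the transition lies in between.
    for ratio in (2.0, 4.3, 7.0):
        print(f"ratio {ratio}: P(sat) ~ {sat_fraction(12, ratio):.2f}")
```

The same methodology (a parameterised random generator plus an order parameter such as the constrainedness ratio) is what the paper transfers to relational learning, where the underlying decision problem sits above NP in the polynomial hierarchy.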
© 2009 Springer-Verlag Berlin Heidelberg
Alphonse, E., Osmani, A. (2009). Empirical Study of Relational Learning Algorithms in the Phase Transition Framework. In: Buntine, W., Grobelnik, M., Mladenić, D., Shawe-Taylor, J. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2009. Lecture Notes in Computer Science(), vol 5781. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04180-8_21
Print ISBN: 978-3-642-04179-2
Online ISBN: 978-3-642-04180-8
eBook Packages: Computer Science (R0)