Abstract
Probabilistic conditionals are a powerful means of representing commonsense and expert knowledge. By viewing probabilistic conditionals as an institution, we obtain a formalization of probabilistic conditionals as a logical system. Using the framework of institutions, we phrase a general representation problem that is closely related to the selection of preferred models. The problem of discovering probabilistic conditionals from data can be seen as an instance of the inverse representation problem, thereby considering knowledge discovery as an operation inverse to inductive knowledge representation. These concepts are illustrated using the well-known probabilistic principle of maximum entropy for which we sketch an approach to solve the inverse representation problem.
The research reported here was partially supported by the DFG – Deutsche Forschungsgemeinschaft within the Condor-project under grant BE 1700/5-1.
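To make the maximum-entropy representation concrete, the following is a minimal sketch (not the authors' implementation) of computing the MaxEnt distribution over two binary variables under a single probabilistic conditional (b|a)[x], i.e. the constraint P(b|a) = x. The function name `maxent_given_conditional` and the bisection on the Lagrange multiplier are illustrative assumptions; the standard Lagrangian form p(w) ∝ exp(λ·c(w)) is used, with feature c(w) = 1−x on worlds verifying the conditional, −x on worlds falsifying it, and 0 on worlds where it does not apply.

```python
import math

# Possible worlds over two binary variables a, b, as (a, b) truth values:
# index 0 = a∧b, 1 = a∧¬b, 2 = ¬a∧b, 3 = ¬a∧¬b.
WORLDS = [(1, 1), (1, 0), (0, 1), (0, 0)]

def maxent_given_conditional(x, tol=1e-12):
    """MaxEnt distribution under the single conditional (b|a)[x],
    i.e. P(b|a) = x, found by bisection on the Lagrange multiplier."""
    def c(w):
        a, b = w
        if a and b:
            return 1.0 - x   # world verifies the conditional
        if a and not b:
            return -x        # world falsifies the conditional
        return 0.0           # conditional not applicable

    def dist(lam):
        weights = [math.exp(lam * c(w)) for w in WORLDS]
        z = sum(weights)
        return [wgt / z for wgt in weights]

    def cond_b_given_a(p):
        return p[0] / (p[0] + p[1])  # P(a∧b) / P(a)

    # P(b|a) increases monotonically in lam, so bisection converges.
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if cond_b_given_a(dist(mid)) < x:
            lo = mid
        else:
            hi = mid
    return dist((lo + hi) / 2)
```

For example, with x = 0.9 the resulting distribution satisfies P(b|a) = 0.9 while leaving the worlds not touched by the conditional (¬a∧b and ¬a∧¬b) equally probable, reflecting the informational indifference of the MaxEnt principle outside the scope of the given conditionals.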
© 2005 Springer-Verlag Berlin Heidelberg
Beierle, C., Kern-Isberner, G. (2005). Footprints of Conditionals. In: Hutter, D., Stephan, W. (eds) Mechanizing Mathematical Reasoning. Lecture Notes in Computer Science(), vol 2605. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-32254-2_6
DOI: https://doi.org/10.1007/978-3-540-32254-2_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25051-7
Online ISBN: 978-3-540-32254-2