
A logically sound method for uncertain reasoning with quantified conditionals

  • Gabriele Kern-Isberner
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1244)

Abstract

Conditionals play a central part in knowledge representation and reasoning. By describing relationships between antecedents and consequents as "if-then" sentences, their range of expressiveness covers commonsense knowledge as well as scientific statements. In this paper, we present the principles of maximum entropy and of minimum cross-entropy (ME-principles) as a logically sound and practicable method for representing and reasoning with quantified conditionals. We first clarify the meaning of these principles by sketching a characterization from a purely conditional-logical point of view. In the second part of the paper, we apply the techniques presented to derive ME-deduction schemes and illustrate them with examples.
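The ME-approach sketched above can be made concrete with a small computation. Among all probability distributions P over the elementary events (possible worlds) that satisfy a set of quantified conditionals (B|A)[x], i.e. P(B|A) = x, the principle of maximum entropy selects the one maximizing H(P) = -Σ_ω P(ω) log P(ω); minimum cross-entropy generalizes this to a non-uniform prior distribution. The following Python sketch illustrates the idea only: the toy knowledge base, all names, and the use of scipy's generic SLSQP solver are assumptions made for this illustration, not the paper's own deduction schemes.

import itertools

import numpy as np
from scipy.optimize import minimize

# Propositional variables and the 2^3 elementary events (possible worlds).
VARS = ("penguin", "bird", "flies")
WORLDS = list(itertools.product([0, 1], repeat=len(VARS)))

def holds(world, **literals):
    """True iff the world satisfies the conjunction of the given literals."""
    return all(world[VARS.index(v)] == val for v, val in literals.items())

# Quantified conditionals (B|A)[x]: "if A then B, with probability x".
CONDITIONALS = [
    (dict(bird=1), dict(flies=1), 0.90),     # (flies | bird)[0.9]
    (dict(penguin=1), dict(bird=1), 1.00),   # (bird  | penguin)[1.0]
    (dict(penguin=1), dict(flies=1), 0.01),  # (flies | penguin)[0.01]
]

def neg_entropy(p):
    """Negative entropy; minimizing it maximizes H(P)."""
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

# Each conditional (B|A)[x] is the linear constraint P(A and B) - x * P(A) = 0.
constraints = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0}]
for ante, cons, x in CONDITIONALS:
    a = np.array([holds(w, **ante) for w in WORLDS], dtype=float)
    ab = np.array([holds(w, **ante, **cons) for w in WORLDS], dtype=float)
    constraints.append({"type": "eq",
                        "fun": lambda p, a=a, ab=ab, x=x: p @ ab - x * (p @ a)})

p0 = np.full(len(WORLDS), 1.0 / len(WORLDS))  # uniform starting point
res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=[(0.0, 1.0)] * len(WORLDS), constraints=constraints)
p_me = res.x  # the maximum-entropy distribution P*

def prob(**literals):
    """Marginal probability of a conjunction of literals under P*."""
    return sum(pi for pi, w in zip(p_me, WORLDS) if holds(w, **literals))

# ME-deduction: query a conditional probability the knowledge base
# does not state explicitly.
print("P*(flies | bird)             =", prob(bird=1, flies=1) / prob(bird=1))
print("P*(flies | bird, no penguin) =",
      prob(bird=1, penguin=0, flies=1) / prob(bird=1, penguin=0))

Run under these assumptions, the derived query P*(flies | bird, no penguin) should come out slightly above 0.9: since penguins are (almost never flying) birds, the ME-distribution compensates by letting non-penguin birds fly a little more often than birds in general.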

Keywords

Maximum Entropy, Elementary Event, Propositional Variable, Logical Consistency, Probabilistic Conditional



Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Gabriele Kern-Isberner
    Fachbereich Informatik, FernUniversität Hagen, Hagen, Germany
