
From Bayesian Inference to Logical Bayesian Inference

A New Mathematical Frame for Semantic Communication and Machine Learning

  • Conference paper
  • In: Intelligence Science II (ICIS 2018)
  • Part of the book series: IFIP Advances in Information and Communication Technology (IFIPAICT, volume 539)

Abstract

Bayesian Inference (BI) uses the Bayesian posterior as its inference tool, whereas Logical Bayesian Inference (LBI) uses the truth function or membership function. LBI is proposed because BI is not compatible with the classical Bayes' prediction and, since it does not use logical probability, cannot express semantic meaning. In LBI, statistical probability and logical probability are strictly distinguished, used at the same time, and linked by the third kind of Bayes' Theorem. The Shannon channel consists of a set of transition probability functions, whereas the semantic channel consists of a set of truth functions. When a sample is large enough, we can derive the semantic channel directly from the Shannon channel. Otherwise, we can construct truth functions with parameters and optimize them by the Maximum Semantic Information (MSI) criterion. The MSI criterion is equivalent to the Maximum Likelihood (ML) criterion and compatible with the Regularized Least Squares (RLS) criterion. By matching the two channels with each other, we obtain the Channels' Matching (CM) algorithm. This algorithm can improve multi-label classification, maximum likelihood estimation (including the classification of unseen instances), and mixture models. In comparison with BI, LBI (1) uses the prior P(X) of X instead of the prior of Y or θ and therefore fits cases where the source P(X) changes, (2) can be used to solve for the denotations of labels, and (3) is more compatible with the classical Bayes' prediction and the likelihood method. LBI also provides a confirmation measure between −1 and 1 for induction.
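To make the relationship between the two channels concrete, the following is a minimal numerical sketch (not the paper's code). It assumes, for illustration only, that each label's truth function is obtained by rescaling that label's transition probability function so that its maximum over X equals 1, and that the semantic Bayes' prediction is P(x|θj) ∝ P(x)T(θj|x); the function names and the toy counts are hypothetical, and the paper's exact formulas may differ.

    # Minimal sketch: Shannon channel -> semantic channel (truth functions),
    # then a Bayes-style prediction combining the prior P(X) with a truth
    # function. Illustrative only; not the paper's implementation.
    import numpy as np

    def shannon_channel(counts):
        """P(y_j | x_i): each row of the count table normalized to sum to 1."""
        return counts / counts.sum(axis=1, keepdims=True)

    def truth_functions(channel):
        """Semantic channel: each column (one label's transition probability
        function) rescaled so that its maximum over x equals 1."""
        return channel / channel.max(axis=0, keepdims=True)

    def semantic_bayes_prediction(prior_x, truth_j):
        """P(x | theta_j) proportional to P(x) * T(theta_j | x)."""
        unnorm = prior_x * truth_j
        return unnorm / unnorm.sum()

    # Toy sample: 4 instances (e.g. age groups), 2 labels (e.g. "young", "adult").
    counts = np.array([[90.0, 10.0],
                       [60.0, 40.0],
                       [20.0, 80.0],
                       [ 5.0, 95.0]])
    prior_x = counts.sum(axis=1) / counts.sum()   # P(X) estimated from the sample
    channel = shannon_channel(counts)             # Shannon channel P(y|x)
    T = truth_functions(channel)                  # semantic channel T(theta|x)

    print(semantic_bayes_prediction(prior_x, T[:, 1]))    # prediction under the sample prior
    new_prior = np.array([0.4, 0.3, 0.2, 0.1])            # the source P(X) has changed
    print(semantic_bayes_prediction(new_prior, T[:, 1]))  # same truth function, new prior

Because the truth function is a rescaled transition probability function rather than a re-estimated posterior, the same T(θj|x) can be reused when the source P(X) changes, which is what advantage (1) above refers to; constructing parametric truth functions, the MSI criterion, and the iterative CM algorithm are developed in the paper itself.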

The original version of this chapter was revised: An error in Equation (31) has been corrected. The correction to this chapter is available at https://doi.org/10.1007/978-3-030-01313-4_51


Change history

  • 14 December 2018

    The original version of this chapter contained a mistake. There was an error in Equation (31). The original chapter has been corrected.

Notes

  1. For the strict convergence proof, see http://survivor99.com/lcg/CM/CM4MM.html.


Author information

Correspondence to Chenguang Lu.


Copyright information

© 2018 IFIP International Federation for Information Processing

About this paper


Cite this paper

Lu, C. (2018). From Bayesian Inference to Logical Bayesian Inference. In: Shi, Z., Pennartz, C., Huang, T. (eds) Intelligence Science II. ICIS 2018. IFIP Advances in Information and Communication Technology, vol 539. Springer, Cham. https://doi.org/10.1007/978-3-030-01313-4_2

  • DOI: https://doi.org/10.1007/978-3-030-01313-4_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01312-7

  • Online ISBN: 978-3-030-01313-4

  • eBook Packages: Computer Science, Computer Science (R0)
