Abstract
Bayesian Inference (BI) uses the Bayes' posterior as its inference tool, whereas Logical Bayesian Inference (LBI) uses the truth function or membership function. LBI is proposed because BI is not compatible with the classical Bayes' prediction and, since it does not use logical probability, cannot express semantic meaning. In LBI, statistical probability and logical probability are strictly distinguished, used at the same time, and linked by the third kind of Bayes' Theorem. The Shannon channel consists of a set of transition probability functions, whereas the semantic channel consists of a set of truth functions. When a sample is large enough, we can derive the semantic channel directly from the Shannon channel; otherwise, we can construct truth functions with parameters and optimize them by the Maximum Semantic Information (MSI) criterion. The MSI criterion is equivalent to the Maximum Likelihood (ML) criterion and compatible with the Regularized Least Squares (RLS) criterion. By letting the two channels match each other, we obtain the Channels' Matching (CM) algorithm, which can improve multi-label classification, maximum likelihood estimation (including the classification of unseen instances), and mixture models. In comparison with BI, LBI (1) uses the prior P(X) of X instead of that of Y or θ and hence fits cases where the source P(X) changes, (2) can be used to solve the denotations of labels, and (3) is more compatible with the classical Bayes' prediction and the likelihood method. LBI also provides a confirmation measure between −1 and 1 for induction.
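To make these relations concrete, here is a minimal numerical sketch, assuming a discrete instance variable X with four values and two labels y_1, y_2. It illustrates the large-sample rule stated above: each truth function is obtained by normalizing the corresponding transition probability function P(y_j|x) by its maximum, after which the semantic Bayes' prediction P(x|θ_j) follows from the prior P(X). All numbers and variable names are invented for illustration; this is a reading aid, not the chapter's own code.

```python
import numpy as np

# Toy setup (invented numbers): 4 instance values x, 2 labels y_j.
P_x = np.array([0.1, 0.2, 0.3, 0.4])      # source prior P(x)

# Shannon channel: one row per label, each row is a transition
# probability function P(y_j|x) over the 4 instance values.
P_y_given_x = np.array([
    [0.9, 0.6, 0.2, 0.1],                  # P(y_1|x)
    [0.1, 0.4, 0.8, 0.9],                  # P(y_2|x)
])

# Large-sample case: derive the semantic channel (truth functions)
# from the Shannon channel by normalizing each transition probability
# function by its maximum, so every truth function peaks at 1.
T = P_y_given_x / P_y_given_x.max(axis=1, keepdims=True)

# Semantic Bayes' prediction: P(x|theta_j) is proportional to
# P(x) * T(theta_j|x), normalized over x.
P_x_given_theta = P_x * T
P_x_given_theta /= P_x_given_theta.sum(axis=1, keepdims=True)

print("Truth functions T(theta_j|x):\n", T)
print("Semantic predictions P(x|theta_j):\n", P_x_given_theta)
```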
Notes
1. For the strict convergence proof, see http://survivor99.com/lcg/CM/CM4MM.html.
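For readers who want to see the shape of the CM iteration for mixture models that this convergence proof concerns, below is a heavily hedged, EM-style sketch for a one-dimensional Gaussian mixture. It is not the author's code: one step forms the Shannon channel P(y_j|x) from the current components and mixture proportions, and the other re-fits P(y_j) and the component parameters by weighted maximum likelihood (equivalent, per the abstract, to the MSI criterion). All function names, variable names, and data are assumptions for illustration.

```python
import numpy as np

def cm_mixture_sketch(x, mu, sigma, p_y, n_iter=50):
    """EM-style sketch of the Channels' Matching idea for a 1-D
    Gaussian mixture. Hypothetical illustration, not the paper's code."""
    for _ in range(n_iter):
        # Component likelihoods P(x|theta_j) at each observed x.
        comp = np.exp(-0.5 * ((x[None, :] - mu[:, None]) / sigma[:, None]) ** 2)
        comp /= sigma[:, None] * np.sqrt(2 * np.pi)

        # Match the Shannon channel to the semantic channel:
        # form the posterior P(y_j|x) from components and proportions.
        joint = p_y[:, None] * comp
        post = joint / joint.sum(axis=0, keepdims=True)

        # Match back: re-fit P(y_j) and component parameters by
        # weighted maximum likelihood (the ML/MSI criterion).
        w = post.sum(axis=1)
        p_y = w / w.sum()
        mu = (post @ x) / w
        sigma = np.sqrt((post @ (x ** 2)) / w - mu ** 2)
    return mu, sigma, p_y

# Usage with invented toy data: a 2-component mixture.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 0.5, 200)])
mu, sigma, p_y = cm_mixture_sketch(x, np.array([-1.0, 1.0]),
                                   np.array([1.0, 1.0]), np.array([0.5, 0.5]))
print(mu, sigma, p_y)
```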
Copyright information
© 2018 IFIP International Federation for Information Processing
About this paper
Cite this paper
Lu, C. (2018). From Bayesian Inference to Logical Bayesian Inference. In: Shi, Z., Pennartz, C., Huang, T. (eds.) Intelligence Science II. ICIS 2018. IFIP Advances in Information and Communication Technology, vol. 539. Springer, Cham. https://doi.org/10.1007/978-3-030-01313-4_2
DOI: https://doi.org/10.1007/978-3-030-01313-4_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-01312-7
Online ISBN: 978-3-030-01313-4