The Semantic Information Method Compatible with Shannon, Popper, Fisher, and Zadeh’s Thoughts

Conference paper

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 872)

Abstract

Popper’s and Fisher’s hypothesis-testing thoughts are very important. However, Shannon’s information theory does not consider hypothesis testing. The combination of information theory and the likelihood method is attracting more and more researchers’ attention, especially for solving Maximum Mutual Information (MMI) and Maximum Likelihood (ML). This paper introduces how we combine Shannon’s information theory, the likelihood method, and fuzzy set theory to obtain the Semantic Information Method (SIM) for better optimizing hypothesis testing. First, we use the membership functions of fuzzy sets proposed by Zadeh as the truth functions of hypotheses; then we use the truth functions to produce likelihood functions and bring such likelihood functions into the Kullback-Leibler and Shannon information formulas to obtain the semantic information formulas. Conversely, the semantic information measure may be used to optimize the membership functions. The maximum semantic information criterion is equivalent to the ML criterion; however, it is compatible with Bayesian prediction and hence can be used in cases where the prior probability distribution changes. By letting the semantic channel and the Shannon channel match each other and iterate, we can achieve MMI and ML for tests, estimations, and mixture models. This iterative algorithm is called the Channels’ Matching (CM) algorithm. Theoretical analyses and several examples show that the CM algorithm converges quickly, has a clear reason for its convergence, and has wide potential applications. Further studies of the SIM related to factor space and information value are discussed.
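The pipeline the abstract describes can be illustrated on a small discrete example: a Zadeh membership function serves as the truth function T(θ|x), the logical probability is T(θ) = Σx P(x)T(θ|x), a likelihood function is produced from the truth function by semantic Bayes prediction, and the semantic Kullback-Leibler information is Σx P(x|y) log[T(θ|x)/T(θ)]. The sketch below is one reading of the abstract under these formulas; the variable names, the prior, and the Gaussian-shaped membership function are illustrative assumptions, not the paper’s code.

```python
import numpy as np

# Illustrative sketch of the Semantic Information Method (SIM) on a
# discrete universe. The prior and the Gaussian-shaped membership
# function are assumptions for demonstration only.

x = np.arange(5)                                 # discrete instances
prior = np.array([0.1, 0.2, 0.4, 0.2, 0.1])      # source distribution P(x)

# Zadeh membership function used as the truth function T(theta|x)
truth = np.exp(-((x - 3.0) ** 2) / 2.0)

# Logical probability of the hypothesis: T(theta) = sum_x P(x) T(theta|x)
logical_prob = float(np.sum(prior * truth))

# Likelihood produced from the truth function by semantic Bayes prediction:
# P(x|theta) = P(x) T(theta|x) / T(theta)
likelihood = prior * truth / logical_prob

def semantic_kl_info(sampling_dist):
    """Semantic Kullback-Leibler information (in bits):
    I(X; theta|y) = sum_x P(x|y) * log2[T(theta|x) / T(theta)]."""
    return float(np.sum(sampling_dist * np.log2(truth / logical_prob)))

# When the sampling distribution matches the semantic prediction, the
# semantic information is positive; measured against the bare prior it
# cannot exceed zero (by Jensen's inequality): a hypothesis conveys
# information only when the evidence concentrates where it is true.
info_matched = semantic_kl_info(likelihood)
info_prior = semantic_kl_info(prior)
print(f"matched: {info_matched:.4f} bits, prior: {info_prior:.4f} bits")
```

Optimizing the membership function to maximize this measure, and alternating that step with re-deriving the Shannon channel, is the matching-and-iterating idea behind the CM algorithm.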


Notes

  1. More examples and the Excel files demonstrating the iterative processes can be found at http://survivor99.com/lcg/CM.html.

References

  1. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27(3), 379–429, 623–656 (1948)
  2. Popper, K.: Conjectures and Refutations. Routledge, London/New York (1963/2005)
  3. Fisher, R.A.: On the mathematical foundations of theoretical statistics. Philos. Trans. Roy. Soc. 222, 309–368 (1922)
  4. Davidson, D.: Truth and meaning. Synthese 17, 304–323 (1967)
  5. Zadeh, L.A.: Fuzzy sets. Inf. Control 8(3), 338–353 (1965)
  6. Kok, M., Dahlin, J., Schön, T.B., Wills, A.: Newton-based maximum likelihood estimation in nonlinear state space models. IFAC-PapersOnLine 48, 398–403 (2015)
  7. Dempster, A.P., Laird, N.M., Rubin, D.B.: Maximum likelihood from incomplete data via the EM algorithm. J. R. Stat. Soc. Ser. B 39(1), 1–38 (1977)
  8. Barron, A., Roos, T., Watanabe, K.: Bayesian properties of normalized maximum likelihood and its fast computation. In: IEEE International Symposium on Information Theory, pp. 1667–1671 (2014)
  9. Lu, C.: B-fuzzy set algebra and a generalized cross-information equation. Fuzzy Syst. Math. (in Chinese) 5(1), 76–80 (1991)
  10. Lu, C.: A Generalized Information Theory (in Chinese). China Science and Technology University Press, Hefei (1993)
  11. Lu, C.: Meanings of generalized entropy and generalized mutual information for coding. J. China Inst. Commun. (in Chinese) 15(6), 37–44 (1994)
  12. Lu, C.: A generalization of Shannon’s information theory. Int. J. Gen. Syst. 28(6), 453–490 (1999)
  13. Dubois, D., Prade, H.: Fuzzy sets and probability: misunderstandings, bridges and gaps. In: Second IEEE International Conference on Fuzzy Systems, 28 March–1 April 1993
  14. Bar-Hillel, Y., Carnap, R.: An outline of a theory of semantic information. Technical report No. 247, Research Laboratory of Electronics, MIT (1952)
  15. Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Stat. 22(1), 79–86 (1951)
  16. Akaike, H.: A new look at the statistical model identification. IEEE Trans. Autom. Control 19(6), 716–723 (1974)
  17. Cover, T.M., Thomas, J.A.: Elements of Information Theory, 2nd edn. Wiley, New York (2006)
  18. Wang, P.Z.: Fuzzy Sets and Random Sets Shadow (in Chinese). Beijing Normal University Press, Beijing (1985)
  19. Thornbury, J.R., Fryback, D.G., Edwards, W.: Likelihood ratios as a measure of the diagnostic usefulness of excretory urogram information. Radiology 114(3), 561–565 (1975)
  20. Lu, C.: The semantic information method for maximum mutual information and maximum likelihood of tests, estimations, and mixture models. arXiv:1706.07918, 24 June 2017. https://arxiv.org/abs/1706.07918
  21. Lu, C.: Channels’ matching algorithm for mixture models. In: Proceedings of the International Conference on Intelligence Science, Shanghai, pp. 25–28, October 2017
  22. Wu, C.F.J.: On the convergence properties of the EM algorithm. Ann. Stat. 11(1), 95–103 (1983)
  23. Neal, R., Hinton, G.: A view of the EM algorithm that justifies incremental, sparse, and other variants. In: Jordan, M.I. (ed.) Learning in Graphical Models, pp. 355–368. MIT Press, Cambridge (1998)
  24. Wang, P.Z.: Factor space and data science. J. Liaoning Tech. Univ. 34(2), 273–280 (2015)
  25. Lu, C.: Entropy Theory of Portfolio and Information Value (in Chinese). China Science and Technology University Press, Hefei (1997)


Acknowledgement

The author thanks Professor Peizhuang Wang for his long-term support. Without his recent encouragement, the author would not have continued the research that led to the channels’ matching algorithm.

Author information

Correspondence to Chenguang Lu.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Lu, C. (2019). The Semantic Information Method Compatible with Shannon, Popper, Fisher, and Zadeh’s Thoughts. In: Cao, BY., Zhong, YB. (eds) Fuzzy Sets and Operations Research. ICFIE 2017. Advances in Intelligent Systems and Computing, vol 872. Springer, Cham. https://doi.org/10.1007/978-3-030-02777-3_19
