
Learning Structure of Bayesian Networks by Using Possibilistic Upper Entropy


Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 315))

Abstract

The most common way to learn the structure of a Bayesian network is to use a score function together with an optimization process. When no prior knowledge about the structure is available, score functions based on information theory are used to balance the entropy of the conditional probability tables against the complexity of the network. This complexity clearly has a strong impact on the uncertainty of the estimation of the conditional distributions. However, the complexity is estimated independently of the computation of the entropy, and thus the uncertainty of the estimation is not faithfully handled. In this paper we propose a new entropy function, a “possibilistic upper entropy”, which relies on the entropy of a possibility distribution that encodes an upper bound of the estimated frequencies. Since the structure of the network directly determines how many pieces of data are available for each probability estimate, the possibilistic upper entropy is particularly well suited to learning the structure of the network. We also show that the possibilistic upper entropy yields an incremental algorithm for the online learning of Bayesian networks.
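The chapter's exact definitions are behind the preview, so the Python sketch below is only a rough, non-authoritative illustration of the idea summarised in the abstract: for each parent configuration, build a possibility distribution that upper-bounds the estimated frequencies and score the family by an entropy of that distribution, so that configurations with little data automatically look more uncertain. The count-dependent widening, the cumulative probability-to-possibility transformation, the entropy stand-in and the helper names (upper_possibility, possibilistic_upper_entropy, family_score) are all assumptions made for this sketch, not the authors' formulas.

# Illustrative sketch only (not the chapter's definitions): an entropy-based
# family score in which the per-configuration entropy is computed from a
# possibility distribution dominating the frequency estimates, so sparsely
# observed parent configurations are automatically penalised.
import numpy as np
from itertools import product


def upper_possibility(counts, z=1.0):
    """Hypothetical upper bound: smooth the frequencies with a count-dependent
    margin (Agresti-Coull-style centre), then apply the classical cumulative
    probability-to-possibility transformation (ties handled approximately)."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    if n == 0:
        return np.ones_like(counts)                # no data: total ignorance
    p = (counts + z ** 2 / 2.0) / (n + z ** 2)     # smoothed frequency estimate
    order = np.argsort(p)                          # ascending probabilities
    pi = np.empty_like(p)
    cum = 0.0
    for idx in order:                              # pi(x) = sum of p(y) with p(y) <= p(x)
        cum += p[idx]
        pi[idx] = cum
    return pi / pi.max()                           # normalise so that max(pi) = 1


def possibilistic_upper_entropy(counts):
    """Stand-in entropy of the possibility distribution: Shannon entropy of
    pi after renormalisation.  A flat (ignorant) pi gives the largest value."""
    pi = upper_possibility(counts)
    q = pi / pi.sum()
    return float(-np.sum(q * np.log2(q + 1e-12)))


def family_score(data, var, parents, card):
    """Score one family (var | parents) on a list of dict-valued records.
    Each parent configuration slices the data; small slices yield flatter
    possibility distributions and hence a higher (worse) entropy, which plays
    the role of an implicit complexity penalty."""
    score = 0.0
    for config in product(*(range(card[p]) for p in parents)):
        rows = [r for r in data
                if all(r[p] == v for p, v in zip(parents, config))]
        counts = [sum(1 for r in rows if r[var] == x) for x in range(card[var])]
        score += (len(rows) / max(len(data), 1)) * possibilistic_upper_entropy(counts)
    return score


if __name__ == "__main__":
    card = {"A": 2, "B": 2, "C": 2}
    rng = np.random.default_rng(0)
    # B is a noisy copy of A; C is independent noise.
    data = [{"A": int(a),
             "B": int(a) ^ int(rng.random() < 0.1),
             "C": int(rng.integers(2))}
            for a in rng.integers(2, size=200)]
    print(family_score(data, "B", ["A"], card), family_score(data, "B", ["C"], card))

In this toy example the score of B given its actual parent A comes out lower than the score of B given the unrelated variable C, which is the behaviour one would expect from a structure score that folds estimation uncertainty into the entropy term itself.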




Author information

Correspondence to Mathieu Serrurier.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Serrurier, M., Prade, H. (2015). Learning Structure of Bayesian Networks by Using Possibilistic Upper Entropy. In: Grzegorzewski, P., Gagolewski, M., Hryniewicz, O., Gil, M. (eds) Strengthening Links Between Data Analysis and Soft Computing. Advances in Intelligent Systems and Computing, vol 315. Springer, Cham. https://doi.org/10.1007/978-3-319-10765-3_11

  • DOI: https://doi.org/10.1007/978-3-319-10765-3_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-10764-6

  • Online ISBN: 978-3-319-10765-3

  • eBook Packages: Engineering (R0)
