
Compression of General Bayesian Net CPTs

  • Conference paper
  • First Online:

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9673)

Abstract

Non-Impeding Noisy-AND (NIN-AND) Tree (NAT) models offer a highly expressive approximate representation for significantly reducing the space requirements of Bayesian Nets (BNs). They can also significantly improve the efficiency of BN inference, as shown for binary NAT models. To enable these advantages for general BNs, this work makes advances on three technical challenges. We overcome the limitation of well-defined Pairwise Causal Interaction (PCI) bits and present a flexible PCI pattern extraction from general target Conditional Probability Tables (CPTs). We extend parameter estimation for binary NAT models to constrained gradient descent for compressing target CPTs into multi-valued NAT models. The effectiveness of the compression is demonstrated experimentally. A novel framework is also developed for PCI pattern extraction when persistent leaky causes exist.
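As a rough illustration of the parameter-estimation idea only, the sketch below fits a leaky noisy-OR model (a simple special case of a NAT model, used here purely because its closed form is widely known) to a binary target CPT by projected gradient descent, clipping the parameters back into the unit interval after every step. This is not the paper's multi-valued NAT algorithm; the surrogate model, the squared-error loss, and all function names are assumptions introduced for illustration.

```python
# Illustrative sketch only, not the paper's algorithm: compress a binary target
# CPT into a leaky noisy-OR model (a special case of a NAT model) by projected
# (constrained) gradient descent.  All names here are hypothetical.
import itertools
import numpy as np

def noisy_or(probs, leak, config):
    """P(e+ | config) under a leaky noisy-OR with per-cause parameters `probs`."""
    inactive = np.prod([(1.0 - p) for p, on in zip(probs, config) if on])
    return 1.0 - (1.0 - leak) * inactive

def fit_noisy_or(target, n_causes, lr=0.05, steps=2000, eps=1e-4):
    """Fit noisy-OR parameters to a target CPT given as a dict mapping each
    parent configuration (tuple of 0/1) to P(e+ | configuration), minimizing
    squared error and projecting parameters into [eps, 1-eps] after each step."""
    rng = np.random.default_rng(0)
    probs = rng.uniform(0.2, 0.8, size=n_causes)
    leak = 0.05
    configs = list(itertools.product([0, 1], repeat=n_causes))
    for _ in range(steps):
        g_p = np.zeros(n_causes)
        g_leak = 0.0
        for x in configs:
            q = noisy_or(probs, leak, x)
            err = q - target[x]
            inactive = np.prod([(1.0 - p) for p, on in zip(probs, x) if on])
            g_leak += 2.0 * err * inactive
            for j, on in enumerate(x):
                if on:
                    rest = inactive / (1.0 - probs[j])
                    g_p[j] += 2.0 * err * (1.0 - leak) * rest
        # Projection step: keep every parameter a valid probability.
        probs = np.clip(probs - lr * g_p, eps, 1.0 - eps)
        leak = float(np.clip(leak - lr * g_leak, eps, 1.0 - eps))
    return probs, leak

# Hypothetical two-cause target CPT: parent configuration -> P(e+ | configuration).
target = {(0, 0): 0.05, (1, 0): 0.60, (0, 1): 0.70, (1, 1): 0.88}
probs, leak = fit_noisy_or(target, n_causes=2)
```

The `np.clip` projection is what makes the descent constrained: every parameter remains a valid probability throughout, loosely analogous to the constraints the paper imposes when estimating multi-valued NAT parameters.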



Acknowledgement

Financial support from an NSERC Discovery Grant is acknowledged. We thank the anonymous reviewers. We apologize for not moving explanations of figures and tables from the text into captions, as doing so does not appear feasible to us.

Author information


Corresponding author

Correspondence to Yang Xiang.



Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Xiang, Y., Jiang, Q. (2016). Compression of General Bayesian Net CPTs. In: Khoury, R., Drummond, C. (eds.) Advances in Artificial Intelligence. Canadian AI 2016. Lecture Notes in Computer Science (LNAI), vol. 9673. Springer, Cham. https://doi.org/10.1007/978-3-319-34111-8_35


  • DOI: https://doi.org/10.1007/978-3-319-34111-8_35

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-34110-1

  • Online ISBN: 978-3-319-34111-8

  • eBook Packages: Computer Science, Computer Science (R0)
