An Optimal Approach to Mining Boolean Functions from Noisy Data

  • Conference paper
Intelligent Data Engineering and Automated Learning (IDEAL 2003)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2690)

Abstract

Data mining of binary sequences is a well-studied problem that is often used as a proof of concept in computational learning theory. The inference task in this paper, a specialized version of the segmentation problem, is the estimation of a predefined Boolean function on the real interval [0,1] from a noisy random sample. The framework for this problem was introduced by Kearns et al. (1997) in an earlier empirical evaluation of model selection methods. This paper presents an optimal approach to mining Boolean functions from noisy data samples based on the Minimum Message Length (MML) principle. In a thorough empirical evaluation over varying levels of noise, the MML method performs best among well-known model selection methods based on Guaranteed Risk Minimization, the Minimum Description Length (MDL) principle, and Cross Validation.
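As a rough illustration of the two-part coding idea behind MML model selection for a segmentation task like this, the sketch below scores a candidate number of intervals d by the length of a message that first states the model (which cut points are used) and then encodes the training errors the model leaves unexplained. The particular costing scheme here (a uniform code over cut-point choices plus a simple binomial code for the error set) is an assumption made for illustration, not the exact construction used in the paper.

```python
import math

def message_length(n, k, d):
    """Simplified two-part message length (in nats) for a d-interval
    Boolean model on n sample points that misclassifies k of them.

    Part 1 states the model: d interval boundaries chosen from n
    candidate cut points, costing log C(n, d).
    Part 2 states the data given the model: the count k, then which
    k of the n labels are errors, costing log(n+1) + log C(n, k).
    """
    def log_comb(a, b):
        # log of the binomial coefficient C(a, b), via log-gamma
        return math.lgamma(a + 1) - math.lgamma(b + 1) - math.lgamma(a - b + 1)
    model_cost = log_comb(n, d)
    data_cost = math.log(n + 1) + log_comb(n, k)
    return model_cost + data_cost

def select_d(n, errors_by_d):
    """Pick the interval count whose best fit minimises total
    message length. errors_by_d maps d -> training errors of the
    best d-interval fit (assumed computed elsewhere)."""
    return min(errors_by_d, key=lambda d: message_length(n, errors_by_d[d], d))
```

For example, with n = 100 samples, a 1-interval fit leaving 30 errors is costed higher than a 5-interval fit leaving 5 errors, while a 50-interval fit leaving 2 errors is penalised for its model cost, so the sketch prefers d = 5. The trade-off is what distinguishes MML-style selection from simply minimising training error.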

References

  1. Kearns, M., Mansour, Y., Ng, A.Y., Ron, D.: An Experimental and Theoretical Comparison of Model Selection Methods. Machine Learning 27, 7–50 (1997)

  2. Wallace, C.S., Boulton, D.M.: An information measure for classification. Computer Journal 11(2), 185–194 (1968)

  3. Wallace, C.S., Freeman, P.R.: Estimation and Inference by Compact Coding. Journal of the Royal Statistical Society B 49, 240–252 (1987)

  4. Rissanen, J.: Stochastic Complexity and Modeling. Annals of Statistics 14, 1080–1100 (1986)

  5. Vapnik, V.: Statistical Learning Theory. Springer, New York (1995)

  6. Viswanathan, M., Wallace, C.S.: A Note on the Comparison of Polynomial Selection Methods. Artificial Intelligence and Statistics 99, 169–177 (1999)

  7. Wallace, C.S., Dowe, D.L.: Minimum Message Length and Kolmogorov Complexity. Computer Journal (to appear)

  8. Stone, M.: Cross-validatory Choice and Assessment of Statistical Predictions. Journal of the Royal Statistical Society B 36, 111–147 (1974)

Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Viswanathan, M., Wallace, C. (2003). An Optimal Approach to Mining Boolean Functions from Noisy Data. In: Liu, J., Cheung, Y.M., Yin, H. (eds) Intelligent Data Engineering and Automated Learning. IDEAL 2003. Lecture Notes in Computer Science, vol 2690. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-45080-1_96

  • DOI: https://doi.org/10.1007/978-3-540-45080-1_96

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40550-4

  • Online ISBN: 978-3-540-45080-1

  • eBook Packages: Springer Book Archive
