The Continuous-Function Attribute Class in Decision Tree Induction

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1532)

Included in the following conference series: Discovery Science (DS 1998)

Abstract

The automatic extraction of knowledge from data gathered from a dynamic system is an important task, because continuous measurement acquisition provides an increasing amount of numerical data. On an abstract level, these data can generally be modeled as continuous functions over time. In this article we present an approach to handling continuous-function attributes efficiently in decision tree induction when the entropy minimization heuristic is applied.
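
As an illustration of the entropy minimization heuristic mentioned above (a minimal sketch, not taken from the paper, with all function and variable names invented for the example), the following Python fragment picks a cut point for a single continuous attribute by minimizing the weighted class entropy of the resulting binary partition:

```python
# Minimal sketch of entropy-minimizing cut point selection for one
# continuous attribute (illustrative only; not the paper's algorithm).
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a collection of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Threshold between adjacent sorted values that minimizes the
    weighted entropy of the two partitions it induces."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_t, best_e = None, float("inf")
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no cut point between identical attribute values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        e = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if e < best_e:
            best_e = e
            best_t = (pairs[i - 1][0] + pairs[i][0]) / 2
    return best_t, best_e

# Toy example: one continuous attribute, two classes.
values = [0.5, 1.2, 1.9, 2.4, 3.1, 3.8]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_cut(values, labels))  # cut near 2.15, weighted entropy 0
```

The naive search examines every boundary between adjacent attribute values, so any preprocessing that shrinks the set of relevant boundaries can reduce the cost of the induction step.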

It is shown how time series based on continuous functions can be preprocessed for use in decision tree induction. A proof is given that a piecewise linear approximation of the individual time series, or of the underlying continuous functions, can improve the efficiency of the induction task.
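
The following is a loose sketch of such preprocessing, assuming a simple greedy segmentation with a fixed error tolerance (the paper's own approximation scheme may differ). It approximates a sampled time series by a piecewise linear curve and returns the breakpoint indices:

```python
# Illustrative greedy piecewise linear approximation of a time series
# (an assumption for this sketch, not the paper's method): segments grow
# until the maximum deviation from the connecting chord exceeds `eps`.

def piecewise_linear(times, values, eps=0.1):
    """Return indices of breakpoints of a piecewise linear approximation."""
    breakpoints = [0]
    start = 0
    for end in range(2, len(values)):
        t0, v0 = times[start], values[start]
        t1, v1 = times[end], values[end]
        slope = (v1 - v0) / (t1 - t0)
        # maximum vertical deviation of intermediate points from the chord
        err = max(abs(values[k] - (v0 + slope * (times[k] - t0)))
                  for k in range(start + 1, end))
        if err > eps:
            breakpoints.append(end - 1)
            start = end - 1
    breakpoints.append(len(values) - 1)
    return breakpoints

# Toy example: a noisy ramp followed by a plateau.
ts = list(range(10))
xs = [0.0, 1.0, 2.1, 2.9, 4.0, 4.1, 4.0, 4.1, 3.9, 4.0]
print(piecewise_linear(ts, xs, eps=0.3))  # e.g. [0, 4, 9]
```

Only the breakpoints and the segments between them then need to be carried into the induction step, which is one way a piecewise linear approximation can make the split search cheaper.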

Research supported by FNK-Forschungsförderung, University of Bremen.






Copyright information

© 1998 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Boronowsky, M. (1998). The Continuous-Function Attribute Class in Decision Tree Induction. In: Arikawa, S., Motoda, H. (eds) Discovery Science. DS 1998. Lecture Notes in Computer Science, vol 1532. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-49292-5_24


  • DOI: https://doi.org/10.1007/3-540-49292-5_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-65390-5

  • Online ISBN: 978-3-540-49292-4

