
A Nearest Hyperrectangle Monotonic Learning Method

  • Conference paper
Hybrid Artificial Intelligent Systems (HAIS 2016)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 9648))


Abstract

Real prediction learning problems exist whose class attribute is represented by ordinal values that should increase with some of the explanatory attributes; these are known as classification problems with monotonicity constraints. In this contribution, our goal is to formalize the nearest hyperrectangle learning approach so that it can manage monotonicity constraints. The idea behind it is to retain objects in \(\mathbb {R}^n\), which can be single points, hyperrectangles, or rules, in a combined model. The approach is evaluated through an experimental analysis involving a wide range of monotonic data sets. The reported results, verified by nonparametric statistical tests, show that our approach is highly competitive with well-known techniques for monotonic classification.
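The nearest-hyperrectangle idea described in the abstract (in the spirit of Salzberg's NGE family) can be sketched as follows. This is a minimal illustration with hypothetical names, not the paper's implementation, and it omits the monotonicity handling that is the paper's contribution: the distance from a query point to a hyperrectangle is zero along any axis where the point falls inside the bounds, and the predicted class is that of the nearest retained object.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Hyperrectangle:
    lower: List[float]  # per-dimension lower bounds
    upper: List[float]  # per-dimension upper bounds
    label: int          # ordinal class label


def rect_distance(x: List[float], rect: Hyperrectangle) -> float:
    """Euclidean distance from point x to the nearest face of rect.

    Along each axis, the contribution is zero if x lies inside the
    rectangle's bounds; otherwise it is the gap to the closest bound.
    """
    d2 = 0.0
    for xi, lo, hi in zip(x, rect.lower, rect.upper):
        if xi < lo:
            d2 += (lo - xi) ** 2
        elif xi > hi:
            d2 += (xi - hi) ** 2
    return d2 ** 0.5


def classify(x: List[float], rects: List[Hyperrectangle]) -> int:
    """Predict the label of the nearest hyperrectangle to x."""
    return min(rects, key=lambda r: rect_distance(x, r)).label
```

A single training point is stored as a degenerate hyperrectangle with `lower == upper`, so points and generalized rules live in the same combined model, as the abstract describes.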


Notes

  1. Australian, Automobile, Bands, Cleveland, Dermatology, Glass, Heart, Hepatitis, Housevotes, Ionosphere, Iris, Mammographic, Newthyroid, Pima, Saheart, Segment, Sonar, Titanic, Vowel, Wine, Wisconsin.

  2. ERA, ESL, LEV, SWD.

  3. Auto-mpg, Bostonhousing, MachineCPU.


Acknowledgments

This work was partially supported by the Spanish Ministry of Science and Technology under project TIN2014-57251-P and the Andalusian Research Plans P11-TIC-7765, P10-TIC-6858.

Author information

Correspondence to Salvador García.


Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

García, J., Cano, J.R., García, S. (2016). A Nearest Hyperrectangle Monotonic Learning Method. In: Martínez-Álvarez, F., Troncoso, A., Quintián, H., Corchado, E. (eds) Hybrid Artificial Intelligent Systems. HAIS 2016. Lecture Notes in Computer Science, vol 9648. Springer, Cham. https://doi.org/10.1007/978-3-319-32034-2_26

  • DOI: https://doi.org/10.1007/978-3-319-32034-2_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-32033-5

  • Online ISBN: 978-3-319-32034-2

  • eBook Packages: Computer Science (R0)
