Abstract
Many real prediction learning problems have a class attribute represented by ordinal values that should increase with some of the explanatory attributes. These are known as classification problems with monotonicity constraints. In this contribution, our goal is to formalize the nearest hyperrectangle learning approach to handle monotonicity constraints. The idea behind it is to retain objects in \(\mathbb {R}^n\), which can be single points, hyperrectangles, or rules, in a combined model. The approach is validated by an experimental analysis involving a wide range of monotonic data sets. The reported results, verified by nonparametric statistical tests, show that our approach is highly competitive with well-known techniques for monotonic classification.
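The generic nearest-hyperrectangle idea underlying this family of methods can be sketched as follows. This is a minimal illustration of the classic scheme (each exemplar is an axis-aligned box with a class label; a query point gets the label of the closest box, with distance zero inside a box and the clipped Euclidean distance outside), not the paper's monotonic extension; the class and function names are hypothetical.

```python
import numpy as np

class Hyperrectangle:
    """An axis-aligned box exemplar [lower, upper] with a class label.

    A degenerate box with lower == upper is a single stored point.
    """
    def __init__(self, lower, upper, label):
        self.lower = np.asarray(lower, dtype=float)
        self.upper = np.asarray(upper, dtype=float)
        self.label = label

    def distance(self, x):
        # Clip the query point onto the box; the gap to the clipped
        # point is 0 inside the box, otherwise the shortest Euclidean
        # distance from x to the box surface.
        x = np.asarray(x, dtype=float)
        nearest = np.clip(x, self.lower, self.upper)
        return float(np.linalg.norm(x - nearest))

def classify(boxes, x):
    """Assign x the label of the nearest hyperrectangle."""
    return min(boxes, key=lambda b: b.distance(x)).label

boxes = [
    Hyperrectangle([0, 0], [1, 1], label=0),
    Hyperrectangle([2, 2], [3, 3], label=1),
]
print(classify(boxes, [0.5, 0.5]))  # inside the first box -> 0
print(classify(boxes, [2.6, 2.9]))  # inside the second box -> 1
```

A monotonic variant would additionally constrain how boxes of different class labels may dominate each other along the monotone attributes; the sketch above shows only the distance-based prediction step.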
Notes
1. Australian, Automobile, Bands, Cleveland, Dermatology, Glass, Heart, Hepatitis, Housevotes, Ionosphere, Iris, Mammographic, Newthyroid, Pima, Saheart, Segment, Sonar, Titanic, Vowel, Wine, Wisconsin.
2. ERA, ESL, LEV, SWD.
3. Auto-mpg, Bostonhousing, MachineCPU.
Acknowledgments
This work was partially supported by the Spanish Ministry of Science and Technology under project TIN2014-57251-P and the Andalusian Research Plans P11-TIC-7765, P10-TIC-6858.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
García, J., Cano, JR., García, S. (2016). A Nearest Hyperrectangle Monotonic Learning Method. In: Martínez-Álvarez, F., Troncoso, A., Quintián, H., Corchado, E. (eds) Hybrid Artificial Intelligent Systems. HAIS 2016. Lecture Notes in Computer Science(), vol 9648. Springer, Cham. https://doi.org/10.1007/978-3-319-32034-2_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-32033-5
Online ISBN: 978-3-319-32034-2
eBook Packages: Computer Science (R0)