Abstract
This paper describes a machine learning method called Regression by Selecting Best Feature Projections (RSBFP). In the training phase, RSBFP projects the training data onto each feature dimension and estimates the predictive power of each feature by constructing simple linear regression lines: one per continuous feature, and one per distinct category of each categorical feature. This distinction is needed because the predictive power of a continuous feature is constant across its range, whereas it varies for each distinct value of a categorical feature. The regression lines are then sorted by their predictive power. In the querying phase, the best regression line, and hence the best feature projection, is selected to make predictions.
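The training and querying phases described above can be sketched in Python as follows. This is a minimal illustration under our own assumptions, not the authors' implementation: the class and function names are ours, training RMSE stands in for the paper's predictive-power measure, and the per-category "regression lines" are modeled as constant (mean-target) predictors.

```python
import numpy as np

def fit_simple_lr(x, y):
    """Least-squares fit of y ~ a + b*x on one feature projection."""
    b, a = np.polyfit(x, y, 1)  # slope, intercept
    rmse = np.sqrt(np.mean((a + b * x - y) ** 2))
    return a, b, rmse

class RSBFP:
    """Sketch of Regression by Selecting Best Feature Projections."""

    def fit(self, X, y, categorical=()):
        # Each model: (error, feature index, category value or None, intercept, slope)
        self.models = []
        for j in range(X.shape[1]):
            col = X[:, j]
            if j in categorical:
                # One constant predictor per category value: its predictive
                # power is estimated separately for each distinct value.
                for v in np.unique(col):
                    mask = col == v
                    mean = y[mask].mean()
                    rmse = np.sqrt(np.mean((y[mask] - mean) ** 2))
                    self.models.append((rmse, j, v, mean, 0.0))
            else:
                # One simple linear regression line per continuous feature.
                a, b, rmse = fit_simple_lr(col, y)
                self.models.append((rmse, j, None, a, b))
        # Sort projections so the most predictive (lowest error) comes first.
        self.models.sort(key=lambda m: m[0])
        return self

    def predict(self, x):
        # Use the best-ranked projection that is applicable to this query.
        for _, j, v, a, b in self.models:
            if v is None:
                return a + b * x[j]
            if x[j] == v:
                return a  # per-category constant prediction
        return None
```

On a toy dataset with a single continuous feature, the best (and only) projection is its regression line, so a query is answered by evaluating that line at the query's feature value.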
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
Cite this paper
Aydin, T., Güvenir, H.A. (2001). An Eager Regression Method Based on Best Feature Projections. In: Monostori, L., Váncza, J., Ali, M. (eds) Engineering of Intelligent Systems. IEA/AIE 2001. Lecture Notes in Computer Science(), vol 2070. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45517-5_25
Print ISBN: 978-3-540-42219-8
Online ISBN: 978-3-540-45517-2