
Regression via Logic Supervised Classification

  • Vladimir Ryazanov
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7914)

Abstract

An approach to the restoration of dependences (regressions) is proposed that is based on solving supervised classification problems. The main task is to find the optimal partition of the range of the dependent variable into a finite number of intervals, i.e. the optimal number of change-points and their positions. This task is formulated as the construction and application of a piecewise constant function. The piecewise constant function is restored by solving a local discrete optimization problem with a model of logic supervised classification in leave-one-out mode. The value of the dependent variable is then computed in two steps: first, the feature vector is classified; second, the dependent variable is taken as half the sum of the change-point values bounding the interval of the predicted class.
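
The two-step scheme sketched in the abstract can be illustrated with a short Python example. This is a minimal sketch under stated assumptions, not the paper's algorithm: equal-frequency quantiles stand in for the optimal change-point search, a scikit-learn decision tree stands in for the logic supervised classification model, and the names fit_rvc and predict_rvc are invented for the illustration.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_rvc(X, y, n_intervals=5):
    # 1. Partition the range of the dependent variable into intervals.
    #    (The paper searches for the optimal number and positions of
    #    change-points; equal-frequency quantiles are a simple stand-in.)
    edges = np.quantile(y, np.linspace(0.0, 1.0, n_intervals + 1))
    edges = np.unique(edges)  # guard against tied quantiles
    labels = np.clip(np.searchsorted(edges, y, side="right") - 1,
                     0, len(edges) - 2)  # interval index per training sample
    # 2. Solve the supervised classification problem on the interval labels
    #    (a decision tree replaces the logic classification model here).
    clf = DecisionTreeClassifier().fit(X, labels)
    return clf, edges

def predict_rvc(clf, edges, X):
    # 3. Classify the feature vector, then return half the sum of the
    #    change-point values bounding the predicted interval.
    k = clf.predict(X)
    return 0.5 * (edges[k] + edges[k + 1])

# Usage sketch on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)
clf, edges = fit_rvc(X, y, n_intervals=8)
print(predict_rvc(clf, edges, X[:5]))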

Keywords

regression, supervised classification, discrete optimization, approximation


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Vladimir Ryazanov
  1. Dorodnicyn Computing Centre of RAS, Institution of Russian Academy of Sciences, Moscow, Russia
