Regression Spline-Model in Machine Learning for Signal Prediction and Parameterization

  • Conference paper
  • Lecture Notes in Computational Intelligence and Decision Making (ISDMCI 2019)

Abstract

Traditionally, polynomial models are used to analyze and simulate time series, with their parameters estimated by the ordinary least squares (OLS) method. As the shape of a time series becomes more complex, the polynomial must become more elaborate and its order must grow, which leads to ill-conditioned equations and an inadequate model. Splines partially resolve these problems. This paper proposes cubic Hermite splines with unbounded first and last fragments for analyzing and predicting time series; the unbounded end fragments allow the spline to be used for prediction beyond the observed interval. The spline parameters are estimated by the least squares method. For placing the nodes (docking points), an algorithm of constrained coordinate-wise optimization and an algorithm of sequential buildup of fragments are proposed. To avoid overfitting or underfitting the spline model, a randomness check of the residuals is proposed. The results are applied to the parameterization of heart sounds.
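The workflow sketched in the abstract, a least-squares fit of a spline with chosen nodes followed by a randomness check of the residuals, can be illustrated as follows. This is a minimal sketch, not the authors' method: it uses SciPy's B-spline least-squares fit (LSQUnivariateSpline) rather than the paper's Hermite formulation with unbounded end fragments, places interior knots uniformly instead of by the proposed constrained coordinate optimization or sequential buildup of fragments, and the synthetic signal and the lag-1 autocorrelation check are assumptions made purely for illustration.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# Synthetic signal standing in for a time series (the paper targets heart
# sounds); the data, knot count, and noise level are illustrative assumptions.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
y = np.sin(2 * np.pi * 3 * x) * np.exp(-2 * x) + 0.05 * rng.standard_normal(x.size)

# Interior knots (docking points). The paper places them by constrained
# coordinate optimization or sequential buildup of fragments; here they are
# simply spaced uniformly as a placeholder.
knots = np.linspace(0.1, 0.9, 7)

# Least-squares cubic spline fit (SciPy's B-spline basis, not the paper's
# Hermite formulation, but the OLS estimation of spline parameters is the same idea).
spline = LSQUnivariateSpline(x, y, knots, k=3)
residuals = y - spline(x)

# Crude randomness check on the residuals: a lag-1 autocorrelation near zero
# suggests the spline is neither underfitting (structure left in the residuals)
# nor overfitting the signal.
r1 = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
print(f"RMSE: {np.sqrt(np.mean(residuals**2)):.4f}, lag-1 autocorrelation: {r1:.3f}")
```

In this sketch, adding knots until the residual autocorrelation stops improving plays the role of the paper's residual-randomness control for choosing model complexity.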



Author information

Correspondence to Ihor Shelevytsky, Victoriya Shelevytska, Kseniia Semenova or Ievgen Bykov.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Shelevytsky, I., Shelevytska, V., Semenova, K., Bykov, I. (2020). Regression Spline-Model in Machine Learning for Signal Prediction and Parameterization. In: Lytvynenko, V., Babichev, S., Wójcik, W., Vynokurova, O., Vyshemyrskaya, S., Radetskaya, S. (eds) Lecture Notes in Computational Intelligence and Decision Making. ISDMCI 2019. Advances in Intelligent Systems and Computing, vol 1020. Springer, Cham. https://doi.org/10.1007/978-3-030-26474-1_12
