Automation and Remote Control, Volume 63, Issue 1, pp 25–35

Estimating the Parameters of Linear Regression in an Arbitrary Noise

  • O. N. Granichin

Abstract

Consideration was given to estimating the parameters of a linear regression under arbitrary noise, that is, noise that either has an unknown nonzero mean value, is a realization of a correlated random process, or is defined by an unknown bounded deterministic function.
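
To make the notion of "arbitrary noise" concrete, the minimal sketch below (Python with NumPy) simulates observations from a linear regression model y_t = φ_tᵀθ + v_t under the three kinds of noise named in the abstract. The model form, the numerical values, and the least-squares reference fit are illustrative assumptions only; they are not taken from the paper and do not represent the author's estimation algorithm.

```python
# Illustration only (assumed model, not the paper's method): observations
# y_t = phi_t^T theta + v_t with the three kinds of "arbitrary" noise named
# in the abstract. Ordinary least squares appears only as a reference fit.
import numpy as np

rng = np.random.default_rng(0)
T, d = 500, 3
theta_true = np.array([1.0, -2.0, 0.5])   # unknown regression parameters (assumed values)
Phi = rng.normal(size=(T, d))             # regressors

# Three examples of noise violating the usual zero-mean i.i.d. assumption:
v_bias = 0.7 + 0.1 * rng.normal(size=T)                             # unknown nonzero mean
v_corr = np.convolve(rng.normal(size=T), np.full(5, 0.5), "same")   # correlated random process
v_det = 0.5 * np.sin(0.01 * np.arange(T))                           # unknown bounded deterministic function

for name, v in [("nonzero mean", v_bias), ("correlated", v_corr), ("deterministic", v_det)]:
    y = Phi @ theta_true + v
    theta_ls, *_ = np.linalg.lstsq(Phi, y, rcond=None)              # least-squares reference fit
    print(f"{name:>13s}: mean(v) = {v.mean():+.2f}, LS estimate = {np.round(theta_ls, 2)}")
```

The printout shows only the sample mean of each noise sequence alongside the reference fit; it is meant to illustrate the noise classes, not the behavior of any particular estimator.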

Keywords

Linear Regression; Mechanical Engineer; System Theory; Random Process; Deterministic Function

Copyright information

© MAIK “Nauka/Interperiodica” 2002

Authors and Affiliations

  • O. N. Granichin
  1. St. Petersburg State University, St. Petersburg, Russia
